• 0 Posts
  • 32 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • Sorry, to clarify: updates come as either security updates or feature updates. If I’ve already got a standard operating environment (SOE) with all the features I and my staff need to do our work, I don’t need new features.

    I then have to watch CVEs with my CVE trackers to know when software updates are needed, so that every device running that software gets updated and the SOE is updated too.

    I could go on a rant about how Linux has recently made my life harder: someone’s policy now is that any Linux bug might be a security vulnerability, so I have infinite noise in my CVE feed, which in turn makes deciding how to mitigate security issues hard. But that is beyond this discussion.

    So, in short, I’m only talking about updating security fixes when you do update, not software features. Live patching security vulnerabilities is pretty much free: low effort, low impact, and, in my personal opinion, absolutely critical. Feature patching, though, can be disruptive, has little to be gained, and should really only be driven by a request for that feature, at which point it would also include an update to the SOE.
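    On Ubuntu that split is straightforward to express; a minimal sketch, assuming the stock unattended-upgrades package is installed:

        // /etc/apt/apt.conf.d/50unattended-upgrades
        Unattended-Upgrade::Allowed-Origins {
            // pull the security pocket automatically...
            "${distro_id}:${distro_codename}-security";
            // ...but not "-updates", so feature changes wait for the next SOE revision
        };

    Security fixes then flow in on their own while feature updates stay pinned to SOE revisions.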





  • They probably have been using it for years. I’ve been using Ubuntu as my main Linux distribution for more than a decade now, since I have work to do and I’ll get to doing that work faster in Ubuntu than in any other distribution.

    Why did I start with Ubuntu? 10+ years ago, Ubuntu was light-years ahead in community support for issues. Again, I had work to do; I wasn’t a hobbyist playing “fuck windows”.

    In fact, look at things like ROS, where you can get going with “apt install ros-noetic-desktop” and start building your robotics stuff instantly. Every dependency and all the other tooling is there from the start too. Sure, a bunch of people would now say “use Nix”, but my autonomous robotics project doesn’t care: I’m trying to get lidar, cameras, motors, and SLAM algorithms working. I don’t want to care or think about compiling ROS for some Arch distribution.
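    As a rough sketch of how little ceremony that is (assuming Ubuntu 20.04 with the OSRF apt repository already configured):

        sudo apt install ros-noetic-desktop
        source /opt/ros/noetic/setup.bash
        roscore &                 # start the ROS master
        rostopic list             # lidar/camera/motor topics appear as drivers come up

    From there you’re writing nodes, not fighting the toolchain.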

    I won’t say I don’t dabble with other distributions, but if I’ve got work to do, I’m going to use the tools I already know better than the back of my hand. And at the time, when selecting those tools, Ubuntu had it all answered, and it has stayed stable enough to be basically unchanged for a decade.

    Oh, and if I needed to, I could pay for support so the CEO could hear that the risk is covered too (despite almost every other vendor we pay never actually resolving an issue before we find and fix it ourselves… though I do like being able to say “we have raised a ticket with vendor X and are waiting on a reply”).


  • From my perspective, if a machine is used for work, automatic security updates should be mandatory. Linux is damn impressive with live patching. With thousands or even tens of thousands of endpoints, it’s negligent not to patch.

    Features? Don’t care. But security updates are essential in a large organisation.
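    For reference, enabling that on Ubuntu is roughly two commands; a sketch assuming an Ubuntu Pro subscription (the token is a placeholder):

        sudo pro attach <your-pro-token>     # placeholder token from your Ubuntu Pro account
        sudo pro enable livepatch
        canonical-livepatch status           # critical kernel fixes now apply without reboots

    Scaled out with your config management of choice, that covers the whole fleet.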

    The worst part of the Linux fan base is the users who hate forced updates and also don’t believe in AV. OK, on your home network that’s not very risky, compared to a corp network holding a million students’ and staff members’ personal records, often with BYO devices only a network segment away and APT groups targeting you because they know your reputation is worth something to ransom.


  • One rich company trying to claim money from the other rich companies using its software. The ROI on enforcing these licences will come only from those that really could have afforded to pay; if they can’t, they shouldn’t have built on the framework. Let them duke it out. I have zero empathy for either side.

    The hopeful other side is that, with a “budget” for the licence, a company can weigh that cost against open source contributions and in-house expertise, allowing those projects to have experts who earn an income. Even if only a few companies then hire for that role of porting over and contributing back the features they need, more of that helps everyone.

    The same happened in security: there used to be no budget for it; it was a cost centre. But then insurance providers wouldn’t offer cyber insurance without minimum standards being met (after they lost billions), and now companies suddenly have a budget. Security is thriving.

    When companies value something, because they need to weigh opportunity cost, they’ll find money.


  • MacBook Pro from 2012, still going, though not strong: Bluetooth barely works, there’s a dying row of pixels on the screen, the CPU doesn’t seem to support any modern video codec in accelerated mode, and the speakers were clearly garbage, which doubles how bad the Bluetooth is. But it’s running Pop!_OS, and running it fine. I mean, as long as you connect via RustDesk to another real machine to do real work; it can’t handle tabs or browser rendering…

    Anyway, even if I retire it today, it’s outlasted 3 work laptops.



  • Hold them all to account, no single points of failure. Make them all responsible.

    When talking about vscode especially, those users aren’t your mum and dad. They’re technology professionals or enthusiasts.

    With respect to vendors (Microsoft), for too long they have lived off the expectation that it’s always an end user’s or publisher’s responsibility, never theirs, even when they’re offering a brokering service (a store or whatever). They’ve tried using words like ‘custodian’ for the service they run to further deflect responsibility and fault.

    Vendors of routers, firewalls, and other network-connected IoT in the consumer space are now being legislatively forced to adhere to bare-minimum responsible practices, such as ‘push to change’ configuration updates, automated security firmware updates, and, oh, the long-awaited mandatory random password with a reset on first configuration (no more admin/Admin).

    It’s clear this burden will cost those providers. Good. Just as we should take a stance against polluters freely polluting, so too should we make providers take responsibility for reasonable security defaults instead of making the world less secure.

    That then makes it all the more the user’s own responsibility when they choose to do something insecure, since security should be the default by design. Going outside those bounds is at your own risk.

    Right now it’s the Wild West, and telling what is and isn’t secure is a roll of the dice, since it’s just users telling users that they think it’s fine. Are you supposed to just trust a publisher? But what if they act in bad faith? That problem needs solving. Once an app/plugin/device has millions of people using it, its reputation is publicly seen as OK even if completely undeserved.

    Hmm rant over. I got a bit worked up.


  • You’re right. Cloud services (like Microsoft 365, measured by licensing) and Azure are each individually about double Windows. Together they make up over half of Microsoft’s earnings, while Windows is around 16%. Then you’ve got games, LinkedIn, and the rest filling out the smaller percentages.

    Microsoft doesn’t need Windows: you can run your Microsoft 365 off Mac or Linux for all they care. Just host all your virtual workloads on Azure, whatever the OS, if they’re not serverless, and Microsoft is fine taking that money.





  • I’ve used virtio on Nutanix before and, using iperf rather than OpenSpeedTest, measured line rate across hosts.

    However, I also know network cards matter a lot. Some network cards, especially the cheap Intel X710, suck: they lack certain compute offloads, so the host CPU itself processes the network traffic, significantly slowing throughput.

    Changing to Mellanox 25G cards brought every VM’s network performance up to the expected line rate, even between VMs on the same host.

    That was not a home lab though, that was production at a client.

    Edit: sorry, I meant to wrap up:

    • to test, use iperf (you could use UDP at 10 Gbit and run it continuously; in UDP mode you need to set the bandwidth you try to send; see the sketch below)
    • while testing, watch CPU usage on the host

    If you want to exclude Proxmox, you could live-boot another Linux from USB and run iperf over the LAN to another device.
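    Concretely, something like this (the address is a placeholder; iperf3 defaults UDP to 1 Mbit/s, hence the -b):

        # on the target VM or device
        iperf3 -s
        # on the machine under test: UDP at 10 Gbit/s for 10 minutes
        iperf3 -c 192.0.2.10 -u -b 10G -t 600
        # meanwhile, on the Proxmox host, watch for a single CPU core pegging
        htop

    If throughput collapses while one host core sits at 100%, you’re looking at missing NIC offloads rather than a misconfigured VM.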





  • AGPS probably does work for location, though. Many work laptops have SIM cards for 5G, and that means permanent connectivity plus assisted GPS from cell-tower triangulation.

    However, I know from testing that things like the M365 login just accept the IP location of the VPN endpoint.

    My advice is: it depends. Mostly on the effort of the sysadmin and the level of logs they dig into, i.e. the timing of your VPN connection logs against your location. And if they own the networks you connected to, those networks will know where you are.

    Use your personal device for personal things. End of story.