• 0 Posts
  • 14 Comments
Joined 1 year ago
Cake day: June 15th, 2023





  • BartyDeCanter@lemmy.sdf.org to Programmer Humor@programming.dev · True? · 144 points · 4 months ago

    And… they’re basically all correct. Linux does run on all sorts of machines, even really ancient ones. It has a solid command line environment, or rather lots of them. And it’s astoundingly powerful. Windows does still blue screen, is currently the best place for gaming, and wow is MS fucking you with Win11. Macs can have a cool setup, are really simplified for most users, and are expensive.




  • When I was a very junior EE I ended up working mostly on microcontroller code. There was one bit of extremely ugly code I inherited that parsed a terribly designed serial communication protocol using a giant maze of nested if statements. I really wanted to rewrite it into something better, but I never quite came up with a solution while I worked there. Years later, after I was no longer at the company, I had a stress dream about it and finally came up with a working solution. I still wish I could go fix it. I really hope it’s no longer used, or that someone else has finally fixed it.
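
    The comment never says what the eventual fix was; a common way to untangle that kind of nested-if serial parser is a table-driven state machine. Here is a minimal sketch in C, with the frame format (start byte, length, payload, checksum) entirely invented for illustration:

```c
/* Hypothetical frame: [0xAA] [len] [payload...] [checksum] -- not the real protocol. */
#include <stdint.h>

typedef enum { WAIT_SOF, READ_LEN, READ_PAYLOAD, READ_CHECKSUM } state_t;

typedef struct {
    state_t state;      /* zero-initialize the struct so parsing starts in WAIT_SOF */
    uint8_t len;        /* expected payload length */
    uint8_t pos;        /* payload bytes received so far */
    uint8_t buf[64];    /* payload buffer */
    uint8_t sum;        /* running checksum over len + payload */
} parser_t;

/* Feed one received byte; returns 1 when a full, valid frame sits in p->buf. */
static int parse_byte(parser_t *p, uint8_t b)
{
    switch (p->state) {
    case WAIT_SOF:
        if (b == 0xAA) { p->state = READ_LEN; p->sum = 0; }
        break;
    case READ_LEN:
        if (b == 0 || b > sizeof p->buf) { p->state = WAIT_SOF; break; }
        p->len = b; p->pos = 0; p->sum += b;
        p->state = READ_PAYLOAD;
        break;
    case READ_PAYLOAD:
        p->buf[p->pos++] = b; p->sum += b;
        if (p->pos == p->len) p->state = READ_CHECKSUM;
        break;
    case READ_CHECKSUM:
        p->state = WAIT_SOF;
        return b == p->sum;   /* valid frame iff checksum matches */
    }
    return 0;
}
```

    Each received byte drives exactly one small, named transition, so resynchronizing after garbage is just falling back to WAIT_SOF instead of unwinding a maze of nested ifs.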



  • First, GPUs and CPUs are very different beasts. GPU workloads are by definition highly parallel, so a GPU has an architecture where it is much easier to just throw more cores at the problem, even if you don’t make each core faster. This means the power issue is less severe; take a look at GPU clock rates vs CPU clocks.

    CPU workloads tend to have much, much less parallelism, so there are diminishing returns on adding more cores (Amdahl’s law; see the sketch below).

    Second, GPUs have started to see lower year-over-year lift. Your chart is almost a decade out of date; take a look at, say, a 2080 vs 3080 vs 4080 and you’ll see that the overall compute lift is shrinking.
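
    To put a number on the diminishing-returns point above: Amdahl’s law caps the speedup from N cores by the serial fraction of the work. A small illustrative C program (the 90%-parallel figure is an assumed example, not a measurement):

```c
#include <stdio.h>

/* Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N),
 * where p is the fraction of the work that can run in parallel. */
static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    const double p = 0.90;  /* assumed: 90% of the workload parallelizes */
    for (int n = 1; n <= 64; n *= 2)
        printf("%2d cores -> %5.2fx speedup\n", n, amdahl(p, n));
    return 0;
}
```

    Even with 90% parallel work, 64 cores yield under a 9x speedup (the limit is 10x no matter how many cores you add); a GPU-style workload with p near 1 keeps scaling almost linearly, which is exactly the asymmetry described above.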


  • It’s a real technical limitation. The major driver of CPU progress was the ability to make transistors smaller, so you could pack more in without needing so much power that they burn themselves up. However, we’ve reached the point where they are so small that it is getting extremely difficult to make them any smaller because of physics (leakage and quantum tunneling at those scales). Sure, there is and will continue to be progress in shrinking transistors even more, but it’s just damn hard.
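
    For context, the standard form of that power argument is Dennard scaling: dynamic power per transistor goes roughly as P ≈ α · C · V² · f (activity factor, capacitance, supply voltage squared, clock frequency). Shrinking a transistor used to shrink C and V together, so you could pack in more transistors and raise f at constant power density; once V stopped scaling down, more transistors simply meant more heat.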