I remember having a few of these for WordPerfect, MS Word for DOS and Lotus 1-2-3.
I use the “short meetings” option in gcal, which shortens meetings by 5-10 minutes to give me a passing period between meetings. Twice this week people have had the audacity to try and schedule a meeting in that break. 😬
Exactly, the same way I handle all my credentials.
And… they’re basically all correct. Linux does run on all sorts of machines, even really ancient ones. It has a solid command line environment, or rather lots of them. And it’s astoundingly powerful. Windows does still blue screen, is currently the best place for gaming, and wow is MS fucking you with Win11. Macs can have a cool setup, are really simplified for most users, and are expensive.
I just tried it and it actually works! 🤣
Oh look, it’s me.
When I was a very junior EE I ended up working mostly on microcontroller code. There was one bit of extremely ugly code I inherited that parsed a terribly designed serial communication protocol by using a giant maze of nested if statements. I really wanted to rewrite it to something better, but I never quite came up with a solution while I worked there. Years later, after I was no longer at the company, I had a stress dream about it and finally came up with a working solution. I still wish I could go fix it. I really hope it’s no longer used, or that someone else has finally fixed it.
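(For the curious: the usual cure for that kind of nested-if parser is an explicit state machine that consumes one byte at a time. This is just a sketch of the shape of the fix, against a made-up framed protocol — start byte, length, payload, checksum — not the actual protocol from the story.)

```c
#include <stdint.h>
#include <stdbool.h>

typedef enum { WAIT_START, READ_LEN, READ_PAYLOAD, READ_CSUM } state_t;

typedef struct {
    state_t state;
    uint8_t len;      /* expected payload length   */
    uint8_t pos;      /* payload bytes received    */
    uint8_t sum;      /* running checksum          */
    uint8_t buf[64];  /* payload buffer            */
} parser_t;

/* Feed one byte (e.g. from the UART RX interrupt).
   Returns true when a complete, checksum-valid frame sits in p->buf. */
bool parse_byte(parser_t *p, uint8_t b)
{
    switch (p->state) {
    case WAIT_START:
        if (b == 0xAA) { p->state = READ_LEN; p->sum = 0; }
        break;
    case READ_LEN:
        if (b == 0 || b > sizeof p->buf) { p->state = WAIT_START; break; }
        p->len = b; p->pos = 0; p->sum = b;
        p->state = READ_PAYLOAD;
        break;
    case READ_PAYLOAD:
        p->buf[p->pos++] = b;
        p->sum += b;
        if (p->pos == p->len) p->state = READ_CSUM;
        break;
    case READ_CSUM:
        p->state = WAIT_START;   /* resync either way          */
        return b == p->sum;      /* frame valid only if match  */
    }
    return false;
}
```

Each byte advances exactly one state, so recovering from line noise is automatic instead of being buried three else-branches deep.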
Which word?
First, GPUs and CPUs are very different beasts. GPU workloads are by definition highly parallelized, so a GPU has an architecture where it is much easier to just throw more cores at the problem, even if you don’t make each core faster. That keeps the power problem in check, since each core can run at a much lower clock. Take a look at GPU clock rates vs CPU clocks.
CPU workloads tend to have much, much less parallelism, so there are diminishing returns on adding more cores.
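Amdahl’s law makes the diminishing returns concrete (the law is standard; the numbers here are just illustrative):

$$ S(n) = \frac{1}{(1 - p) + p/n} $$

where $p$ is the fraction of the work that parallelizes and $n$ is the core count. At $p = 0.5$, even infinitely many cores cap the speedup at 2x, while a GPU-style workload with $p \approx 0.99$ keeps scaling well into the hundreds of cores.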
Second, GPUs have started to show lower year-over-year lift. Your chart is almost a decade out of date. Take a look at, say, a 2080 vs 3080 vs 4080 and you’ll see that the generational compute lift is shrinking.
It’s a real technical limitation. The major driver of CPU progress was the ability to make transistors smaller, so you could pack more in without needing so much power that they burn themselves up. However, we’ve reached the point where they are so small that it is getting extremely difficult to make them any smaller because of physics. Sure, there is and will continue to be progress in shrinking transistors even more, but it’s just damn hard.
Also, me eating a prune.
I love it, though I haven’t tried using it as a daily driver in a long time. I should give it a whirl!
BeOS was the best desktop OS of its day, and from an end-user POV it still has features that I miss every day.
Looking back… that was right, hmm, 7 out of 8 times. The miss was a very chill place that gave out Dells, but I lost my job because the funding round didn’t come in.