

When learning a new human language, it’s good practice to also learn sentences and practice speaking and writing, not just rote vocabulary memorization.
The waste and corruption in private industry are mind-bogglingly huge compared to the public sector.
ADB was superior.
Start by designing a physical card or board game. Play-test it until it’s fun.
For artwork, practice drawing on paper.
For writing, write stories and dialogue.
Make or choose some music.
Use the above to make a video.
No programming required.
Java has been running serious server software since the mid-1990s. Think WebObjects running on Solaris. Lots of business stuff with big databases still runs infrastructure like that.
Java still has the big advantage of being machine-agnostic. No need to recompile for ARM or Intel.
Early Swift was very slow to compile and start. The debugger was nonfunctional.
Otherwise it was pretty usable, especially since it could leverage the huge libraries already written for Objective-C.
That also meant it lacked some basic collection types. A Swift-native Set only arrived in Swift 1.2. Before that you had to bridge back and forth between Swift and Objective-C, sometimes leading to unexpected behavior at runtime.
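A minimal sketch of that bridging dance, assuming Foundation is available (the values are made up): crossing into NSSet erases the element type, and the trip back is only checked at runtime.

```swift
import Foundation

// Today: a native, statically typed Swift Set.
let names: Set<String> = ["alice", "bob"]

// The old bridging dance: NSSet has no element type, so the
// compiler can no longer help you.
let nsNames = names as NSSet

// Coming back is a conditional cast, checked only at runtime --
// exactly where the unexpected behavior used to surface.
if let roundTripped = nsNames as? Set<String> {
    print(roundTripped)
} else {
    print("bridging failed")
}
```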
In Objective-C, if an object reference was nil, you could still send it messages (call methods) without a problem; they were harmless no-ops. Swift did away with this: optionals have to be explicitly unwrapped. So when nullability annotations were wrong, Swift code would crash at runtime where Objective-C would have carried on fine. Lots of bugs came from that.
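A minimal sketch of the difference in Swift; findUser is hypothetical, standing in for an Objective-C API whose nullability annotation is wrong.

```swift
// Hypothetical lookup that can return nil, standing in for an
// Objective-C method mistakenly annotated as nonnull.
func findUser(id: Int) -> String? {
    id == 42 ? "alice" : nil
}

let name = findUser(id: 7)

// Force-unwrapping is what a wrong nonnull annotation amounts to:
// `name!.uppercased()` traps at runtime when name is nil, where
// Objective-C would have silently returned nil and carried on.

// Explicit unwrapping makes the nil path visible and safe.
if let unwrapped = name {
    print(unwrapped.uppercased())
} else {
    print("no user found")
}
```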
Swift peaked around version 4. Since then, they have been adding kitchen-sink features and lots of complexity to feel smart.
I still would have preferred an Objective-C 3.0. Chris Lattner was a C++ guy and never really understood Objective-C’s culture and strengths.
You can generate the code with a simple macro.
Always put a ticket number in the commit message. That can make it much easier later to find out what the context was for some weird solution.
It’s about fame, power, adoration, and legacy.
Great points. In particular, putting in more memory can get you very far.
Database optimization? Nah, just put in 1 TB of RAM to keep the whole DB in memory at all times.
Oh, you need this in garbage-collected languages too, once you run into memory-usage issues. GC languages are notorious for being wasteful with memory, even when working correctly.
Yes, pretty much like UML diagrams: who is responsible for allocating memory, and who for freeing it.
Languages like Swift, Objective-C, and C++ have features that mean you don’t need to do this by hand. But you still have to tell the compiler whether you want to keep an object around and who owns it.
See this article on Objective-C for the different ways that language supports managing memory.
OOP also has object ownership hierarchies. Which object owns which other object is a question always worth answering; see the sketch below.
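A minimal sketch of spelling that out in Swift’s ARC; the Node class is made up for illustration. The parent strongly owns its children, and the back-reference is weak so nothing owns its own owner.

```swift
// The ownership hierarchy stays a tree: strong references point
// down, the weak back-pointer points up, so ARC can free the
// whole structure without a retain cycle.
final class Node {
    let name: String
    var children: [Node] = []   // strong: this node owns its children
    weak var parent: Node?      // weak: refers to the owner without owning it

    init(name: String) { self.name = name }

    func add(_ child: Node) {
        child.parent = self
        children.append(child)
    }
}

let root = Node(name: "root")
root.add(Node(name: "leaf"))
```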
Books, classes, and documentation can also be lacking for new tech.
Those microservices are a mix of a dozen programming languages across dozens of different versions.
JPEG 2000 has been able to do exactly what you want for decades.
Lots of programmers are autists and don’t like new information.
Most pages don’t need dynamic loading.
Never use raw numbers when calculating dates. Use the date formats and constants the calendar library provides.
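A minimal sketch with Foundation’s Calendar in Swift (any decent calendar library has equivalents): letting the calendar do the math stays correct across DST transitions, where adding 86,400 seconds by hand would drift.

```swift
import Foundation

let calendar = Calendar.current
let now = Date()

// "One day later", computed by the calendar -- correct even when
// the day is 23 or 25 hours long because of a DST change.
if let tomorrow = calendar.date(byAdding: .day, value: 1, to: now) {
    print(tomorrow)
}

// Start of the current month, built from date components instead
// of arithmetic on raw seconds.
let components = calendar.dateComponents([.year, .month], from: now)
if let startOfMonth = calendar.date(from: components) {
    print(startOfMonth)
}
```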