
Centralization is likely the unintended end result of the internet. Consider a mesh network where all the links have even throughput. Now one node gets some content that goes viral and everyone wants to access that data, so that node suddenly needs a much wider link because everyone’s requests accumulate there.
Someone goes and upgrades that link. Well, now that node can serve many more requests, so they start advertising to put other people’s viral content on the node with the bigger link.
My friend, let me tell you a story from my studies, when I had to help someone find a bug in their 1383-line-long main() in C… on second thought, I’ll spare you the gruesome details, but it took me 30 hours.
The Test part of TDD isn’t meant to encompass your whole need before developing the application. It’s function-by-function. It also forces you to not have giant functions. Let’s say you’re making a compiler. First you need to parse text. Idk what language structure we’re doing yet, but first we need to tokenize our stream. You write a test that feeds "hello world" into your tokenizer and expects two tokens back. You start implementing your tokenizer. Repeat for the parser. Then you realize you need to tokenize numbers too, so you go back and add a token test for numbers.
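To make that concrete, here’s roughly what that first test could look like (a minimal C++ sketch; the whitespace-splitting tokenize function and string tokens are just placeholders for whatever design your later tests end up driving):

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Deliberately naive first pass: just split on whitespace.
// Later tests (numbers, operators, ...) will force this to grow.
std::vector<std::string> tokenize(const std::string& input) {
    std::istringstream stream(input);
    std::vector<std::string> tokens;
    std::string token;
    while (stream >> token) tokens.push_back(token);
    return tokens;
}

int main() {
    // The smallest possible test: "hello world" comes back as two tokens.
    auto tokens = tokenize("hello world");
    assert(tokens.size() == 2);
    assert(tokens[0] == "hello");
    assert(tokens[1] == "world");
}
```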
So you don’t need to make all the tests ahead of time. You just expand with the smallest test possible at each step.
On regular desktop environments I really like Guake - it’s a drop-down terminal emulator, similar to how old games used to do it. It’s nice for quick use here and there. Though these days I just run a tiling WM with xfce-terminal. It gets the job done and still looks good.
To be fair, new programmers generally don’t know enough to construct a proper Google query either. And yes, there are some lazy people who just don’t try. But sometimes you know what you want to achieve, yet every query you try seems unhelpful. For example, if I want to learn how to store settings in C++, the first result tells me to use Boost. Now I need to learn about linking libraries and 300 other Boost-isms, while anyone with basic knowledge could recommend reading the file line by line and splitting each string on the equals sign.
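Something like this minimal sketch is all that approach needs (the settings.txt file name and key=value format are just assumptions for illustration):

```cpp
#include <fstream>
#include <iostream>
#include <map>
#include <string>

int main() {
    // Read simple key=value settings, one per line, no Boost required.
    std::map<std::string, std::string> settings;
    std::ifstream file("settings.txt");
    std::string line;
    while (std::getline(file, line)) {
        auto pos = line.find('=');
        if (pos == std::string::npos) continue;  // skip lines without '='
        settings[line.substr(0, pos)] = line.substr(pos + 1);
    }
    for (const auto& [key, value] : settings)
        std::cout << key << " -> " << value << "\n";
}
```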
Some context, though: this article was written when cloud computing was all the buzz, like crypto just was and AI is now. A lot of people used the cloud just for the buzz, without understanding the tool (same with crypto and AI now).
Yes and no. The email “big tech club” happened under a pretense of spam blocking, because back then spam and bots were a new concept. We now know a bit more about it and have anti-spam measures built into even Lemmy, for example, so I don’t think big tech will be able to use spam itself to piggyback on. At the same time, Facebook has already announced Fediverse integration, and while there’s a petition to defederate from it as soon as they bring their servers up - what’s going to happen if Facebook + Twitter + Reddit decide to hop on the fediverse bandwagon? There’s just too much juicy content there right now. The FOSS Fediverse will have a tough time choosing between accessing all of that juicy content and keeping its values up. That said, all of the mentioned sites are in decline, so as long as the FOSS fediverse gains momentum faster than Big Tech unites, I think we’re safe.
I think another problem is that since FOSS is not profitable, it mostly attracts people who want to make software “for themselves”: hey, I need a tool that can do X, and if I make it public maybe other people will like it too. And that’s good, but it means the software isn’t made “for people”. And since the authors already know programming, they make UIs that programmers like but the average Joe doesn’t. I think the FSF needs to invest some money into building welcoming UIs for existing, feature-complete tools.
I can’t quite find it, but I saw a blog post where someone used AWS’s MapReduce across multiple servers to process a dataset… and then redid the pipeline using bash, awk, and maybe grep, and a single 8-core machine did it 100 times or so faster.
TL;DR: Matrix is good for text AND binary data (XMPP is text only), but XMPP is a bit more centralized than Matrix, though both work on federation principles. XMPP is more lightweight but supports more config options.
It’s not, I agree, but I think if GPL proponents find revenue streams they can use, open code will see much better adoption.
What the article fails to address, and what I’ve been struggling with personally, is… we all need food. Yeah, it’s great working on GPL code and ensuring it’s all open. But when companies consider your GPL library vs someone else’s MIT library, they will naturally go with MIT. And then they’ll say “well, we’re using this free library already, might as well donate/fund it”. So suddenly this MIT dev is able to put way more time into the MIT library than you can into your GPL library, because it becomes their job. Something that feeds them. Their library gets better faster… and more and more companies use it and fund it. GPL is great if absolutely everyone is on board and everyone is fed. But that’s not the world we live in.
Well, you can’t know what you don’t know. So if you start a project and foresee thousands of people using it in a scalable manner - yeah, you’ll use something faster, but then your project might die before getting “in the wild”… but if you’re just nerding out with your friends, you just want to have fun… and then suddenly thousands of people want to use your project… there’s just no winning.
Do you git clone from the Windows share, or do you all just use the same share as the working tree?
Yes, HTNs are a few computational levels higher than GOAP (HTNs can do everything that GOAP can). I think project fear AI used HTNs.
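For a rough feel of the hierarchical part (and nothing more - this toy sketch omits preconditions, world state, multiple methods, and backtracking entirely, and all task names are made up):

```cpp
#include <iostream>
#include <string>
#include <vector>

// A task is either primitive (directly executable) or compound
// (decomposes into an ordered list of subtasks).
struct Task {
    std::string name;
    bool primitive;
    std::vector<std::string> subtasks;  // only used for compound tasks
};

// Recursively expand a task into a flat plan of primitive actions.
void decompose(const std::string& name,
               const std::vector<Task>& domain,
               std::vector<std::string>& plan) {
    for (const auto& t : domain) {
        if (t.name != name) continue;
        if (t.primitive) {
            plan.push_back(t.name);
        } else {
            for (const auto& sub : t.subtasks) decompose(sub, domain, plan);
        }
        return;
    }
}

int main() {
    // Hypothetical domain: a pawn satisfies "GetFood" by walking to the kitchen and eating.
    std::vector<Task> domain = {
        {"GetFood", false, {"GoToKitchen", "Eat"}},
        {"GoToKitchen", true, {}},
        {"Eat", true, {}},
    };
    std::vector<std::string> plan;
    decompose("GetFood", domain, plan);
    for (const auto& step : plan) std::cout << step << "\n";  // GoToKitchen, Eat
}
```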
I kinda am… but I’m also a researcher, so I’m not particularly making a game but rather trying to make a new game mechanic. I want pawns to have complex decision logic so they can choose between multiple ways of doing something. I’m working on creating a hierarchical task network in Rust. I’ve been testing it in Godot using the GDNative interface. Don’t really have much to show though, and no recent progress… been busy with a newborn.
Ease of use + networked nature was the bane of PHP in terms of security. Everyone and their grandma wanted to make a PHP site, but they weren’t aware of the vulnerabilities they could introduce. Which is partly why PHP has a reputation for being vulnerable - a lot of people used it, some made mistakes, so there were many vulnerable code bases.
I think the challenge arises when your hobby project gets funding and thousands of people start using it… But at that point the codebase is likely locked into many previously made decisions. Locked in as in - it would take too much effort to change it.
Global interpreter lock