Not installing through a package manager brings back many of the disadvantages of Windows.
If the developer ships the binary themselves, I can’t know for sure that nothing “slipped into” the package that isn’t in the source code. Malware is still not that common on Linux, but I prefer the distributor building packages on some kind of build system over the developer building them on a PC where no one knows how careful they are about not installing any “fishy” software.
This is one of those rare times I’d say “no, that thing does not need to change.” The strength of Linux is its lack of centralization, with several strong contenders leading the pack. Packaging is not a problem for expert users, and casual users have options. I personally think flatpak and snap are polluters and wasteful, but haven’t broken one of my systems in a while so I don’t mind using them. Options for packaging benefit both users and maintainers; only someone seeking to monetize it wants to consolidate. Before you know it, graphical installers will have ads. Screw em
I personally think flatpak and snap are polluters and wasteful, but haven’t broken one of my systems in a while so I don’t mind using them.
I’m in the same boat. I have this and that installed via flatpak/snap and they mostly work, but I don’t like them on principle. And while, strictly speaking, they haven’t broken anything, the garage computer I’m writing this on has multiple pieces of software installed both via apt and via snap. The apt ones are obsolete/broken, so I should go through and clean them up, but on the other hand the snap ones (Signal mostly) complain every now and then that a new version has been installed and that it’ll restart automatically after x days. No matter how many times I run updates, the message stays until it magically disappears.
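For the cleanup, something like this rough sketch should at least list the duplicates (untested, off the top of my head; it assumes a Debian/Ubuntu box with snapd, and snap and deb package names don’t always match exactly, Signal being a good example):

```bash
# List installed snaps, then flag any name that is also installed as a deb,
# so the stale apt copies can be reviewed and purged by hand.
for name in $(snap list | awk 'NR>1 {print $1}'); do
    if dpkg -s "$name" >/dev/null 2>&1; then
        echo "installed via both snap and apt: $name"
        # review first, then e.g.: sudo apt purge "$name"
    fi
done
```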
This installation was once Xubuntu 16.04 and it’s been upgraded across different hardware for years. Until recently it was pretty sufficient to just open a console now and then and run apt update && apt dist-upgrade. After that the system would be up to date and would run a browser and Spotify just fine (that’s all I need from a garage computer: play music and offer a way to quickly search whatever online to help with projects), but now it’s in a state where I can’t just let it do its thing. It requires handholding and TLC more and more often, and I don’t like it. Just let me upgrade a system for decades, which used to be possible (and maybe still is) with Debian.
But I’m getting older by the day. I used to have Debian installations which went through 3-4 major releases without major hiccups, and it was wonderful. I like it when things just work and I don’t need to pay attention to the operating system itself; it’s just a platform for me to do whatever I need, and the less it gets in the way the better. Of course things are better now than back when we had to build our own kernels, but I suppose some of you here are younger than the 2.6.0 kernel, so maybe we’ll not go that far into history.
Packaging is no problem at all. That stuff is done automatically nowadays. I’m not sure why that guy mentions that time and time again.
although why you would not want the latest stable version of an app for example is beyond me, like, it’s a stable version, you should want the new features
Because most developers don’t follow Torvalds’ first rule of kernel development: “We don’t cause regressions”. They’re completely fine releasing so-called newer stable versions that are less usable than the earlier ones - removing features, demanding more of the system, letting known bugs slip through because they make assumptions about your use case (“it’s fine~”).
And, contrary to what the guy in the video claims, plenty of users know this: the latest “stable” version might cause a regression. But they usually don’t have the time and/or knowledge to check every single new version of every single piece of software they might use. So it would be great if there were someone, or a group, doing this for them, while taking into account that the difference between “this shit is broken!”, “this shit is usable but worse” and “this is actually better” is subjective and depends on the use case. Right?
Well. That’s what a distributor does. This is a critical role of distributions that the video does not address: they sort and trial software versions for the users, based on use case.
because they depend on all the versions of libraries that you would not be able to install on the distro because they would break your system or conflict with a newer version
If library developers did what the kernel devs do, this would not be a problem. So while the video guy is addressing a real problem, he’s unable to pinpoint where the problem lies: it is not in the distros, but upstream.
duplication, storage, etc.
Is the increased storage requirement a real problem in 2023? I’m not sure, given that storage has become dirt cheap even for users, and the cost is usually spread out among the distro maintainers.
Regarding developers releasing multiple versions: usually the ones doing this are the distro maintainers.
I stopped watching the video at 4:09.
Although why you would not like or want the latest stable of your app, for example, is beyond me. It’s a stable version, you should want the new features.
Call me an old man, but I like it when things are stable. I don’t like starting my computer and finding that the software was updated to a new version and some features disappeared or changed behavior. This is why I hate the web, where people update software right under my nose, with no control on my side.
These repos contain thousands of orphan packages which are not maintained and will never get any update ever again (proceeds to show a list of obscure Go modules)
Have you ever checked how well maintained the dependencies/libraries of your favorite software are? It’s a nightmare as well. The distro is not making anything worse.
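To get a feel for it, here’s a rough spot-check taking an arbitrary Node.js project as the example (the point generalizes to any language ecosystem):

```bash
# Inside a checked-out Node.js project: compare installed dependencies
# against the latest releases on the npm registry, and eyeball the tree size.
npm outdated         # which upstream deps lag behind their current releases
npm ls --depth=1     # direct + first-level dependencies (usually a long list)
```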
You get the duplicated work of maintainers, packaging the same app, multiple times, for multiple supported versions of the distro.
First, the work is not often duplicated. The first maintainer to package something will usually upstream patches which make packaging easier. Packagers also look at how other distros’ packagers packaged the app they’re trying to package.
Also, the duplication only happens a few times. Ubuntu just pulls almost all of its packages from Debian Sid. Same with RHEL/CentOS and Fedora. And so on, and so on.
Also, you’re overestimating how hard packaging is; most of the time it’s scripted (Go modules in Debian are imported in an almost fully automated way, roughly as sketched below).
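For what it’s worth, the mostly-automated Debian Go flow looks roughly like this, from memory; the module path is made up and the generated directory name varies, so treat it as a sketch rather than exact commands:

```bash
# Scaffold Debian packaging straight from an upstream Go module
# using the Debian Go team's dh-make-golang tool.
dh-make-golang make github.com/example/somelib   # generates the debian/ directory and source tree
cd golang-github-example-somelib                 # generated tree (name derived from the import path)
dpkg-buildpackage -us -uc                        # build an unsigned .deb for a quick local test
```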
You know what distros bring?
- Security. (My packages were vetted by packagers)
- Uniformity. (All my software works coherently)
- Stability. (My software doesn’t break at the will of some third party developer)
PeerTube Link: https://tilvids.com/videos/watch/67b6c128-e03e-4a52-a05b-e5fd8061a6fa
@thelinuxexperiment@tilvids.com