Most of the problems in the current internet landscape are caused by the cost of centralized servers. What problems are stopping us from running the fediverse on a peer-to-peer, torrent-based network? I would assume latency, but couldn’t that be solved by more aggressive pre-caching in clients? Of course interaction and authentication should be handled centrally, but media sharing, which is the largest strain on servers, could be eased by clients sending media between each other. What am I missing? Torrenting seems like such an elegant solution.
Authentication and authorization cannot be handled centrally if the host performing the actual action you want to apply them to cannot be trusted.
Media sharing is mainly a legal problem. With decentralized solutions you couldn’t easily delete illegal content, and anyone hosting it would potentially be legally liable.
and anyone hosting it would potentially be legally liable.
Bullshit in many countries. Well, I know two of them: the USA and Russia. The first requires intent; the second explicitly excludes P2P from liability.
Quite a few systems use torrent style distribution.
Heck, even Windows uses a distributed bandwidth system where you can set it to download chunks of updates from local networked systems.
All technologies, like BitTorrent, NoSQL databases, blockchain, AI and the like, become used as an invisible part of systems once the idiotic hype about the technology wanes.
Except blockchain solves no useful problems so you will never find it behind anything that isn’t explicitly using it for marketing.
But what if we used AI driven, cloud integrated blockchain to create a hybrid platform for your business needs?
Synergy that!
Synergy that!
https://www.astralcodexten.com/p/why-im-less-than-infinitely-hostile
tldr: there are some legitimate use cases. But not in the first world. And they are unrelated to what crypto-bros are trying to sell.
disclaimer 1: the javascript for the comments is really bad and may freeze your browser
disclaimer 2: while the ideas in the article are interesting, they are flawed (or at least debatable). See the comment section for details.
Git is a distributed block chain, from a certain point of view.
Distributed but high trust.
Zero-trust blockchain tech has no value. There is no such thing as a zero trust system in real life.
Not in any meaningful sense connected to the technology at all.
Um no.
To me blockchain just means Merkle trees. I know that’s not how everyone else defines it, and I’m not trying to start a debate, but tools like git use that technology as well. (Not that Bitcoin invented them or anything.)
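The Merkle-tree idea is simple enough to sketch in a few lines of Python (an illustrative toy, not git’s actual object format): hash each leaf, then hash pairs of digests together until a single root remains, so a change anywhere in the tree surfaces in the root.

```python
import hashlib

def h(data: bytes) -> str:
    """SHA-1 hex digest, the hash git historically used."""
    return hashlib.sha1(data).hexdigest()

def merkle_root(leaves: list[bytes]) -> str:
    """Toy Merkle tree: hash the leaves, then repeatedly hash
    concatenated pairs of digests until one root remains."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # odd count: carry the last digest up
            level.append(level[-1])
        level = [h((level[i] + level[i + 1]).encode())
                 for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"blob one", b"blob two", b"blob three"])
# Changing any leaf changes the root, which is what lets git (and
# torrents, and blockchains) detect tampering anywhere in the tree.
```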
I didn’t see any hype around BitTorrent.
Better question: how come we aren’t all using XMPP for chat instead of the current mess we have? Why can’t we just bring back XMPP with encryption? I don’t get people, all for federation and shit, but when it comes to messages everyone suddenly forgets that XMPP is the original and truly open messaging solution. You can message anyone by email no matter what or where their server is. Pretty much like Lemmy, true interoperability.
Currently, attempts at bringing back XMPP have been horrible. Some instances have different plugins, which often break things. Some have E2EE, some don’t. XMPP is great but the current attempts are bad. Matrix is good enough for now.
I believe the worst part of XMPP isn’t the instances but the lack of a decent cross-platform client that actually supports everything and has a decent UI. For example, the iOS clients are all shit. Without decent clients and push notifications, people won’t be using XMPP ever.
Matrix is good enough for now.
Questionable…
What’s wrong with Matrix, in your opinion? I found it works fine for chat and group chat. Maybe video and audio calls are lacking, but other than that it works fine.
There are some issues, such as metadata leakage, and the server isn’t the best at being lightweight.
I have read that they have improved the metadata leakage. And a more lightweight server is being worked on too, in the new version.
Use Construct.
Exactly my concerns. Also, Matrix isn’t an open standard, truly standardized; it is way more prone to being taken over by some company or ecosystem later on. For what it’s worth, what even is the Matrix Foundation? Where does the money come from?
The iOS clients have gotten miles better in recent years. It’s still far from perfect, but I’m grateful for the improvements.
What’s the best one in your opinion?
Siskin, but it doesn’t do push with OMEMO perfectly. Monal does that better.
Great, I used Monal a long time ago and it was buggy; maybe I’ll try it again.
I stopped using Monal in favour of Siskin, but I think it’s gotten a lot better recently too. My problem was that Monal started sending notifications for every message in every chat room at some point, so I uninstalled it. I assume this has been resolved.
I feel the same way about RSS feeds. It’s a technology meant to keep up with updates on nearly anything across the internet. Even social media sites. It’s been available for ages. But no one is pushing for sites to provide them. 🤷‍♂️
That’s why I use FreshRSS.
Debatable, likely because Google pretty much killed it.
Meanwhile, IRC just keeps trucking on.
The software landscape for XMPP isn’t the best. I twisted the arms of my immediate family and have them using XMPP messaging with a Snikket server I set up, and we’ve had lots of issues between OMEMO support and the lack of good messaging clients for iOS. It works, but it isn’t the smooth-out-of-the-box experience that non-techies want/need.
That’s another reason why I will never buy an iPhone: there are no apps available for niche stuff such as XMPP or managing torrents for my Linux ISOs.
Do you download ISOs directly on your phone? If not, lots of clients have web interfaces that make it trivial to manage from any device with a browser.
I use Monal on iOS and it’s worked quite well so far. I admit I just joined the XMPP adventure.
SMTP is federation too. But a certain megacorp has basically fenced off a huge chunk of users.
Yes, that’s exactly my point. SMTP is federation: you can set up your own server and have interoperability with others, with 100% of its features working right, so you aren’t locked in to those megacorps. Chat applications should use XMPP to get the same for chat/video.
Torrenting requires way more resources than people realize. It’s easy to look at your torrents’ download speeds and think “oh, that’s less than a normal download, like from Steam, so it must not take nearly as many resources” – it’s not all about bandwidth. The amount of encryption and hashing involved in torrenting is fairly CPU heavy (every ~4 MB piece has to be hashed and verified), especially if your CPU doesn’t have onboard encryption hardware (think mobile devices). The sheer number of connections involved even in just one torrent can also bog down a network like you wouldn’t believe – anyone who runs a home seedbox can attest.
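To make the per-piece cost concrete, here is a minimal Python sketch of the hashing step (the piece size and payload are made-up illustrative numbers; real clients take these from the torrent’s metainfo):

```python
import hashlib

PIECE_SIZE = 4 * 1024 * 1024  # 4 MB, a common BitTorrent piece size

def hash_pieces(data: bytes, piece_size: int = PIECE_SIZE) -> list[bytes]:
    # Every piece must be SHA-1 hashed and checked before it counts
    # as received -- that is the per-piece CPU cost in question.
    return [hashlib.sha1(data[i:i + piece_size]).digest()
            for i in range(0, len(data), piece_size)]

# A 40 MB payload means ten full SHA-1 passes over the data,
# on top of whatever transport encryption the peers negotiated.
digests = hash_pieces(b"\x00" * (40 * 1024 * 1024))
```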
“oh, that’s less than a normal download, like from Steam, so it must not take nearly as many resources”
For me it’s always more.
The amount of encryption and hashing involved in torrenting is fairly CPU heavy
The same amount of encryption HTTPS requires. Hashing is completely optional and is not required for operation; encryption is optional too, but other peers may require it.
every ~4 MB piece has to be hashed and verified
Which is the same. I’m not sure, but Steam probably verifies files at some stage too.
The sheer number of connections involved even in just one torrent can also bog down a network like you wouldn’t believe – anyone who runs a home seedbox can attest.
There is no difference between 4k packets across 500 connections vs 4k packets on 1 connection for the network itself. IP and UDP don’t even have such a concept; the network is stateless. But shitty routers with a small conntrack table, on the other hand…
You’re basically talking about IPFS. I think their problem is that they gave it a local HTTP interface and the documentation is… in need of improvements.
I always shy away from newer tech because of lackluster documentation and poor leadership. The latter is rare enough. Without proper documentation, I feel like I have to read the code and make my own notes to put into their documentation platform, which is not what I want to do as a user. Contributing is nice, but doing work a core member should do, without credit, dissuades me from participating.
I know that feeling. Curiosity often gets the better of me though: I’m a Nix / NixOS user - amongst the worst documented projects I’ve come across.
The short answer is that while torrents show great promise for content distribution (as an alternative to CDNs, for example), they inherently rely on some centralized resources and don’t make sense for a lot of use cases. Most websites are a bunch of small files, and torrenting is really much more useful for offloading large bandwidth loads; on small files, the overhead of torrents is a waste. That’s why your favorite Linux ISO has a torrent but your favorite website doesn’t.
One major issue is the difficulty of accurately tracking the contribution of each member of the swarm. I download a file and I seed it to the next person, sounds great, right? But what if the next person doesn’t come along for a long time? Do I keep that slot open for them just in case? For how long? How do I prove I actually “paid my dues”, whether that was waiting for peers or actually delivering to them? How do we track users across different swarms? Do we want a single user ID to be tracked across all content they’ve ever downloaded? When you get into the weeds with these kinds of questions, you can see how quickly torrenting becomes a poor fit for a number of use cases.
Being somewhat centralized, by the way, is how BitTorrent solved the spam issue which plagued P2P networks before it. Instead of searching the entire network and everything it contains (and everything every spammer added to it), you rely on a trusted messenger like a torrent index to find your content. The torrent file or magnet link points to an entry in a DHT, and there you go: no need to worry about trusting peers, since you are downloading a file by hash, not by name. And you know the hash is right because some trusted messenger gave it to you. Without some form of centralization (as in previous P2P networks), your view of the network was whatever your closest peers wanted it to be, which you essentially got assigned at random and had no reason to trust or distrust. You couldn’t verify they were accurately letting you participate in the wider network. Even a 100% trustworthy peer was only as good as the other peers they were connected to. For every one peer passing you bad data, you needed at least two peers to prove them wrong.
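The “download by hash, not by name” point can be sketched in a few lines of Python (the piece data and hashes here are made up for illustration; in real BitTorrent the trusted hashes come from the `pieces` field of the .torrent file):

```python
import hashlib

# Hypothetical piece hashes obtained out-of-band from a trusted source,
# standing in for the hash list inside a .torrent file.
trusted_piece_hashes = [hashlib.sha1(b"piece-0 data").digest(),
                        hashlib.sha1(b"piece-1 data").digest()]

def accept_piece(index: int, payload: bytes) -> bool:
    # A peer can lie about anything except the payload itself: if the
    # bytes don't hash to the trusted value, the piece is discarded.
    return hashlib.sha1(payload).digest() == trusted_piece_hashes[index]

honest = accept_piece(0, b"piece-0 data")       # True: matches the hash
spam = accept_piece(1, b"spam or malware")      # False: rejected
```

This is why the trust only has to live in whoever handed you the hash, not in any of the peers serving the data.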
Blockchain gets us close to solving some of these problems, as we now have technology for establishing distributed ledgers which could track things like network behavior over time, upload/download ratio, etc. This solves the “who do I trust to say this other peer is a good one?” problem: you trust the ledger. But an underlying problem with applying blockchain here is that ultimately people are just going to be self-reporting their bandwidth. Get another peer to validate it, you say? Of course! But how do we know that peer is not the same person (how do we avoid Sybil attacks)? Until we have a solid way to do “proof of bandwidth” or “proof of network availability”, that problem will remain. Many people are working on it (proof of storage has already been solved, so perhaps this could be solved in a similar way), but as of right now I know of no good working implementation that protects against Sybil attacks. Then again, if you can use blockchain or some other technology to establish some kind of decentralized datastore for humanity, you don’t need torrents at all, as you would instead use that other base-layer protocol for storage and retrieval.
IPFS was intended as a decentralized replacement for much of the way the current internet works. It was supposed to be this “other protocol”, but the system is byzantinely complex and seems to have suffered from a lack of usability, good leadership, and promotion. When you have an awesome technology and nobody uses it, there are always good reasons for the lack of adoption. I don’t know enough about those reasons to really cover them here, but suffice it to say they do exist. Then again, IPFS has been around for a while now (15 years?) and people use it for stuff, so clearly it has some utility.
That said, if you want to code on this problem and contribute to helping solve data storage/transmission problems, there are certainly many OSS projects which could use your help.
I would not want to deal with the security issues, nor pay the data costs, associated with an app being able to connect to my phone to download media.
What if not everyone had to be a seeder?
You rapidly end up with a freeloader issue.
Don’t see how this is much different from today’s way of doing things, where pretty much everyone is a freeloader on the centralized server. The major benefit is that it doesn’t have to be just one server anymore.
not if you give benefits to seeders, just like private trackers do
The trackers themselves are centralized. The .torrent file you download from a private tracker has a unique private ID tied to your account, which the torrent client advertises to the tracker when it phones home to the announce URL, alongside your leech/seed metadata.
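As a rough sketch of what that phone-home looks like, here is how a client might build the announce request (the tracker URL and passkey are made up; the query parameters are the standard BitTorrent announce fields):

```python
from urllib.parse import quote_from_bytes, urlencode

# Hypothetical private tracker: the per-user passkey is baked into the
# path of the announce URL inside the .torrent file you downloaded.
PASSKEY = "a1b2c3d4e5f6"
ANNOUNCE = f"https://tracker.example.org/{PASSKEY}/announce"

def announce_url(info_hash: bytes, peer_id: bytes,
                 uploaded: int, downloaded: int, left: int) -> str:
    # info_hash and peer_id are raw 20-byte values, so they need
    # percent-encoding; the transfer stats are plain integers.
    binary = "&".join(f"{name}={quote_from_bytes(value)}"
                      for name, value in (("info_hash", info_hash),
                                          ("peer_id", peer_id)))
    stats = urlencode({"uploaded": uploaded, "downloaded": downloaded,
                       "left": left, "port": 6881})
    return f"{ANNOUNCE}?{binary}&{stats}"

url = announce_url(b"\x12" * 20, b"-XX0001-" + b"x" * 12,
                   uploaded=2_000_000, downloaded=500_000, left=0)
```

The tracker matches the passkey to your account and records the uploaded/downloaded deltas between announces, which is where your ratio comes from.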
For me, I’ve had issues getting organizational support for using anything close to P2P, with things like “keep that botnet off my system” being said. On the personal side, I had issues with ISPs assuming my traffic was illegal in nature and sending me bogus cease-and-desist notices.
Agreed though. At least WebRTC has a strong market. IPFS and other web3 things have also tried to find footholds in common use, so the fight isn’t over for sure!
Another good example in the fediverse space is PeerTube!
On personal side I had issues with ISPs assuming traffic was illegal in nature and sending me bogus cease and desist notices.
On the other hand, check whether you can sue them for bogus cease-and-desists. If you can, do it after changing ISPs.
In my mind, Veilid (https://veilid.com/) is trying to solve this exact problem.
It’s only been public since the last DefCon, so it’s a little raw at the moment. That said, it is also only a framework - specific applications need to be built.
See also their Discord: https://discord.com/channels/1077244355439509565/ and GitLab: https://gitlab.com/veilid/
Look into zeronet. https://zeronet.io/
Gotta love the heavy use of buzzword technologies and no actual information on what it actually is. Then you click the “How does it work?” button and it takes you to a Google PowerPoint… so much for the sleek website design.
A side effect that is very common with anything related to cryptocurrencies.
This is the only system I am aware of using torrent-based content sharing. It’s not a great system though, as you are essentially downloading a whole archive every time you connect, so it just grows and grows unless you set some retention.
Nice, a loading screen app
Most clients are web browsers, and their support for torrents over HTTP is the same as for every other file.
So that would only give us a use for torrents as a content distribution platform to get the actual files closer to the client.
In cases where we have actual non-browser clients: I like to curate what I am distributing and don’t want to distribute anything I happen to stumble upon. Or would you be willing to store and, more importantly, share everything you find on 4chan or that might show up in your Mastodon feed?
there’s https://www.scuttlebutt.nz/ but it’s not very user friendly.
at least we could try to eliminate the need for TLDs.
edit: and there’s also https://getaether.net/
IPFS is the most relevant project for media sharing; there’s an in-browser JavaScript library that can make all desktop users seeders. PeerTube is already encouraged for use as well.
The instance is the aggregator; if it’s P2P, then the aggregation is done by the client. In a torrent swarm you contribute bandwidth, not processing power.
I’m not necessarily experienced with server hosting, but isn’t bandwidth the primary cost of fediverse instances, for example? There shouldn’t be too much logical work compared to delivering content?
Fediverse instances with image hosting are bandwidth limited, but that’s just a normal result of image hosting. If you remove image hosting then the bottleneck becomes processing power again.
AFAIK, the main bottleneck is data storage. Related to processing power, but also I/O and having a central source of truth.
Sure, but data storage is quite cheap these days. I’m not saying it isn’t a problem, but a 12 TB RAID goes a really long way, or AWS S3 charges pennies per GB per month and solves all your problems if you’re prepared to spend tens of dollars per month.
Bandwidth, on the other hand, is either inaccessible (read: you have a normie ISP that has at most two speeds to sell you, and neither of them comes with guarantees) or extremely expensive, on the order of thousands per month. On top of that, if you happen to pay AWS for storage, each request must be forwarded to AWS, converted in some way by your server, then sent to the client, which means it eats both up and down bandwidth. Of course, if you know what you’re doing you can use Amazon’s CDN, but at that point administering your instance is a full-time job and your expenses are those of a small company.
Sure, but this is largely because currently each client doesn’t need to aggregate the whole fediverse. In a decentralised network, you can’t split the sum total of processing required to run the fediverse equally amongst peers. Each peer would need to do more or less the same aggregation job, so the total processing required would scale with the number of peers rather than staying roughly constant. You could still argue it’s a negligible processing cost per client, but it’s certainly way less efficient overall, even if we assume perfect I/O etc. in the P2P system and even if the client only needs to federate the user-selected content.
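A toy back-of-envelope model makes the scaling point concrete (all numbers here are made-up assumptions, not measurements):

```python
# Hypothetical cost of one full aggregation pass over the feed, in
# arbitrary work units, and a hypothetical number of active clients.
W = 1_000_000
clients = 50_000

# Server model: one host aggregates once; clients just fetch the result.
server_total = W

# P2P model: every peer repeats (roughly) the same aggregation job.
p2p_total = clients * W

# Total work grows linearly with the number of peers.
overhead_factor = p2p_total // server_total
```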
Also, just practically, deploying a client app that can federate/aggregate constantly in the background (kind of required for full participation) and scale with the growth of the fediverse without becoming a resource hog would, I imagine, be pretty tough. Maybe possible, yeah, but I feel like it makes sense why it isn’t done that way.
I was thinking about the possibility of torrent-based public-repo git clones, but that isn’t going to pan out.
Why not? Peer-to-peer sounds like a good idea: https://radicle.xyz/
Because it became heavily associated with illegal activity.