Lemmy.world is very popular, and one of the largest instances. They do a great job with moderation. There are a lot of positives with lemmy.world.
Over the last month, though, federation issues have become more and more drastic. Some comments from lemmy.world take days to synchronize with other instances, or never arrive at all.
The current incarnation of ActivityPub as implemented in Lemmy has rate issues with a very popular instance. So now lemmy.world is becoming an island. This is bad because it fractures the discussion and encourages more centralization on lemmy.world, which actually weakens the ability of the federated universe to survive a single instance failing or just turning off.
For the time being, I encourage everyone to post to communities hosted on other instances so that the conversation can be consistently accessed by people across the entire Fediverse. I don't think it's necessary to move your user account, because as I understand it your client posts to the community's host instance when you comment in that community.
Decentralization should be a background thing, with hosts providing server space (like you would get from a service like AWS) and the front end being a single website; users wouldn't know which server their content is hosted and backed up on.
You don't have "somebody running the front end" though; the hosting is all done by the people providing hosting services.
Think "crypto philosophy as a message board" but instead of having everyone sync the whole history you split all data randomly in a way that guarantees it is stored on three servers at all time.
Heck, you could also have multiple front ends if you wanted, all pulling and pushing data to the same servers. That way you could log in from any of them; the front end would only influence UI/UX. In the background the data would always come from the same places, so a front-end dev wouldn't have the power to block communities.
Yep, divide everything so no one has real power and it's the users who decide what they want and don't want on their feed. Allow the hosts to decide whether they want to host NSFW content (and let the users make the same choice). Make it so users don't have to decide which instance they register with; their credentials are just stored in the database.
For the front end you then have two ways to deal with it: a single one, where the hosts "vote" on how to run it (crypto style), or the hosts are just the database that anyone can access, which lets anyone create a front end...
Not really though; with Lemmy the host and front end are the same. You access Lemmy from hacker talks, I access it from shitjustworks. Both our respective instances host our respective data and provide us a UI to access their instance, and either could decide to only give us access to the content it hosts, as some instances already do.
What I'm talking about is front-end devs not having to host any of the content themselves: they'd just access the database hosted by others, show that info in the UI they developed, and push changes to the database when users sign up or post comments.
The front end doesn't control where the info is stored and doesn't store anything locally; the back end doesn't control who's pulling and pushing data. A host can choose to filter NSFW content off its own servers, but that just means the system won't pick it to host that content and will instead pick servers that don't mind hosting it.
In this way the hosting is like any hosting service but completely decentralized; the data is open to all, and no single host can wipe it because of the backups (contrary to Lemmy, where if an instance disappears, all the content it hosted is gone).
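The "opting out of NSFW just means you don't get picked for that content" idea can be sketched as a filter step before placement. Everything here is hypothetical (the host registry, the `nsfw` flag, the `place` helper); it's just one way the policy could compose with deterministic placement:

```python
import hashlib

# Hypothetical host registry: each host declares whether it accepts NSFW.
HOSTS = {
    "host-a": {"nsfw": True},
    "host-b": {"nsfw": False},
    "host-c": {"nsfw": True},
    "host-d": {"nsfw": True},
    "host-e": {"nsfw": False},
}

def place(key: str, nsfw: bool, n=3):
    """Deterministically pick n replica hosts, skipping hosts whose
    policy excludes this content. Opting out only removes a host from
    the candidate set; it can't remove the content from the network."""
    candidates = [h for h, p in HOSTS.items() if p["nsfw"] or not nsfw]
    ranked = sorted(
        candidates,
        key=lambda h: hashlib.sha256(f"{h}:{key}".encode()).digest(),
        reverse=True,
    )
    return ranked[:n]

print(place("post/nsfw-1", nsfw=True))   # drawn only from nsfw-accepting hosts
print(place("post/sfw-1", nsfw=False))   # drawn from every host
```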
RAID doesn't work in this context, because we're assuming we have antagonistic peers. Central control of any element gives away control of the whole system.
In a redundant array of inexpensive disks, there's the assumption that a beneficent administrator is organizing everything.
In RAID the admin supplies the disks and creates the pools, and the RAID platform does the rest. Is this really different?
In the analogy, an admin starts a pool of 1, other admins join their nodes into the pool, and the system handles distributing content across the nodes in the pool. No RAID level selection; the system aims for optimal redundancy.
I just expect this setup to run into issues around equitable data and load distribution, since not all nodes will be equal in power, storage capacity, bandwidth, and so on; something actual RAID arrays shouldn't have to deal with.
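Uneven node capacity is a known problem with a known mitigation: give each node a weight and make its share of the data proportional to it. Here's a minimal sketch of weighted rendezvous hashing under assumed node names and weights (all hypothetical); it doesn't solve bandwidth scheduling, but it shows that unequal nodes don't have to mean unequal pain:

```python
import hashlib
import math

# Hypothetical nodes with relative capacity weights (disk, bandwidth, ...).
NODES = {"big-node": 8.0, "mid-node": 4.0, "small-node": 1.0, "tiny-node": 0.5}

def weighted_replicas(key: str, n=2):
    """Weighted rendezvous hashing: a node's chance of holding any given
    key is proportional to its weight, so big-node (8.0) ends up storing
    roughly 16x as many keys as tiny-node (0.5)."""
    def score(node, weight):
        h = int.from_bytes(hashlib.sha256(f"{node}:{key}".encode()).digest()[:8], "big")
        u = (h + 0.5) / 2**64          # map the hash to a uniform value in (0, 1)
        return -weight / math.log(u)   # higher weight -> higher expected score
    return sorted(NODES, key=lambda nd: score(nd, NODES[nd]), reverse=True)[:n]
```

Over many keys, the replica counts track the weights, so heterogeneous nodes each carry a load they can actually afford.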
With a single front end, you have a bottleneck. If there's one domain (website) that everybody goes to for the front end, that domain is a single point of failure.
In my line of work, we use load balancers and subdomains to divide the work and provide resilience (high availability), but at the end of the day, if the DNS for that site goes down, we're down.
Also, as Jet mentioned, whoever controls the domain (website) controls the content. You can't have multiple groups controlling a single domain; whoever buys it controls it. If they don't like content, they could easily block access to it.
I'm oversimplifying the inner workings, so if you want more details, let me know.
EDIT: subtext called me out on my crap English. Have nobody to blame but myself. English is my first language.
In this case the solution, as I mentioned in other comments, is to make the back end a decentralized database that's accessible to anyone. The people developing a front end don't host the data, and you can use any of the available front ends to connect to your account, since it's not attached to any specific front end (your info is in the database).
Front-end devs would be competing to provide the best UI/UX, but in the end everyone would have access to the same data, and front-end devs couldn't get in the way of it; if one did, people could just go to another website without losing anything.
Decentralization should be a background thing, with hosts providing server space (like you would get from a service like AWS) and the front end being a single website; users wouldn't know which server their content is hosted and backed up on.
You could potentially run into issues with data storage reliability:
What happens if a server host simply deletes their hosted data? Does the data just cease to exist? You would need to duplicate it some number of times to statistically ensure some level of durability, and even then it's not a guarantee.
How do you ensure that the data doesn't get tampered with when it's stored on other people's untrusted servers? You would need some way to digitally sign data with a user key, which carries many potential catches.
You would also need to make sure that the data, and the networking load, are distributed according to what each server is able to provide.
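On the tampering point: real designs would use per-user signatures (e.g. an Ed25519 keypair) for anything mutable, but the integrity half alone is cheap to illustrate with content addressing, where the key of a blob is the hash of its bytes. This is a toy sketch using a plain dict to stand in for an untrusted host; `store` and `fetch` are made-up helpers:

```python
import hashlib

def store(blobs: dict, data: bytes) -> str:
    """Content addressing: the key IS the SHA-256 of the data, so an
    untrusted host can't silently swap the bytes behind a key."""
    key = hashlib.sha256(data).hexdigest()
    blobs[key] = data
    return key

def fetch(blobs: dict, key: str) -> bytes:
    """Re-hash on retrieval; any mismatch means the host tampered."""
    data = blobs[key]
    if hashlib.sha256(data).hexdigest() != key:
        raise ValueError("host returned tampered data")
    return data

host = {}                        # stands in for an untrusted remote host
key = store(host, b"original comment text")
assert fetch(host, key) == b"original comment text"

host[key] = b"tampered!"         # a malicious host rewrites the blob...
try:
    fetch(host, key)
except ValueError as e:
    print(e)                     # ...and every reader detects it
```

This only proves the bytes are unchanged; it doesn't say who wrote them, which is exactly where the user-key signatures (and their key-management catches) would come in.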
I understand that these things could still happen, to a similar extent, with the current model of Lemmy, but they are less likely to occur, given that you can choose which instance to join. These are all not unsolvable issues, but this is not a simple "better" alternative — it's more complicated than that.
All this being said, there is a service I have heard a little bit about that is sort of similar to what you appear to be looking for, called Nostr.