In the last year, I’ve spent about 200,000 words on a kind of personal journey where I’ve tried again and again to work out why everything digital feels so broken, and why it seems to keep getting worse, despite what tech’s “brightest” minds might promise.
Pretty convincing explanation of why we should hold Silicon Valley morally accountable for their shitty management. As usual, Ed could have stood to be a little more Marxist; in particular, there is a reason why enshittification and rot naturally happen to companies as they age, and it's precisely the tendency for profits to decline over time!
Bonus sneer: it got double-secret-shadow-banned on HN; here is the submission. Cowards.
That's gotta be one of my favorite Zitron pieces to date. Ed managed to articulate some points which have been floating around in my mind for a while which I did not have the words to explain. Especially how using any form of out-of-the-box computer these days is just a completely user-hostile pile of steaming horseshit, and why I am anal-retentive about what software gets installed on my devices and how exactly my window manager has to work, &c.
I mean, it's probably because I'm an obsessive nerd, but the fact that it makes me feel in control when I can rip shit out of the source code that bugs me (or put shit in that I miss) is a major factor, too.
Btw, a slur against people who refuse to use AI has apparently dropped: "refuseniks". A very "Bruce, noo" moment.
Also a very "google that new word you just invented, because this is a very weird reference" moment: https://en.m.wikipedia.org/wiki/Refusenik (even if the term itself is very grokAI (aka wordplay that 60-year-olds think is edgy or cool, but is actually so without edge it will not cut warm butter)).
E: whoops wrong thread, imagine this was in the random sneers thread
I honestly had no idea of the original Russian meaning of the gloss. To me "refusenik" implies some sort of hard-left hippie.
Edit: finally went and read the linked article.
Schneier and Sanders:
We agree with Morozov that the “refuseniks,” as he calls them, are wrong to see AI as “irreparably tainted” by its origins.
Morozov:
Meanwhile, a small but growing group of scholars and activists are taking aim at the deeper, systemic issues woven into AI’s foundations, particularly its origins in Cold War–era computing. For these refuseniks, AI is more than just a flawed technology; it’s a colonialist, chauvinist, racist, and even eugenicist project, irreparably tainted at its core.
But the original term was not for people refusing to take an action - it was the state refusing to allow their actions! It's done a 180, but considering no one now remembers the plight of Soviet Jews attempting to emigrate to Israel, it's not that strange.
Apropos Bruce, I have this writeup sitting around half-finished, where I go over the AI chapters of A Hacker's Mind and try to pinpoint his naïveté (however you spell that) about the subject. I really should dump that in a Snubstack.
Where The Rot Economy separates is that growth is, in and of itself, the force that drives companies to enshittify.
Not finished reading the article yet, but my reaction to reading this line is that Zitron is missing the mark in the same way he says Doctorow's Enshittification does.
I don't think growth directly drives companies to enshittify. Rather, infinite (and especially constant-or-better) growth in a finite space is only possible when things degrade. Physical widgets do this pretty decently on their own, though we (humans) had to come up with planned obsolescence to keep degradation above a certain threshold. Software, on the other hand, doesn't naturally degrade over time. It only seems to do so because the different actors in its ecosystem, from the software "bricks" to the underlying hardware, are similarly incentivized to churn out new things that deprecate the old, indirectly degrading them.
We'll see where Zitron goes from here to the end of the article.
I will never forgive these people for what they’ve done to the computer, and the more I learn about both their intentions and actions the more certain I am that they are unrepentant and that their greed will never be sated.
These men lace our digital lives with asbestos and get told they’re geniuses for doing so because money comes out.
I care about you. The user. The person reading this. The person that may have felt stupid, or deficient, or ignorant, all because the services you pay for or that monetize you have been intentionally rigged against you.
You aren't the failure. The services, the devices, and the executives are.
I don't feel like Zitron completely addressed my remark in the parent comment, but the end result/destination is the same.
But even if we assume that this is the case, and even if there are a lot of people that simply don’t try [to learn new technology]…should companies really take advantage of them?