Pulling a random post from Bluesky:
The linked rant is a damn good read, BTW.
NASB: Baldur Bjarnason's released a second edition of The Intelligence Illusion
It probably is total bullshit
On the one hand, that spectacular failure could potentially dissuade the military from buying in and prolonging this bubble. On the other hand, having an accountability sink for war crimes would be a tempting offer to your average army.
Starting things off with a fresh post from Brian Merchant: Tech under Trump, part 1
Stubsack: weekly thread for sneers not worth an entire post, week ending 1st December 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this - this one was a bit late, I got distracted)
You just know Netflix's inbox is getting flooded with the absolute worst shit League of Legends players can come up with right now
I don't run any websites, what are you coming at me for
If H5N1 does turn into a full-blown outbreak, part of me expects it'll rack up a heavier death toll than COVID.
If we came across very mentally disabled people or extremely early babies (perhaps in a world where we could extract fetuses from the womb after just a few weeks) that could feel pain but only had cognition as complex as shrimp, it would be bad if they were burned with a hot iron, so that they cried out. It’s not just because they’d be smart later, as their hurting would still be bad if the babies were terminally ill so that they wouldn’t be smart later, or, in the case of the cognitively enfeebled who’d be permanently mentally stunted.
wat
This entire fucking shrimp paragraph is what failing philosophy does to a mf
Character.AI Is Hosting Pedophile Chatbots That Groom Users Who Say They're Underage
Three billion dollars and it's going into Character AI AutoGroomer 4000s. Fuck this timeline.
Stubsack: weekly thread for sneers not worth an entire post, week ending 24th November 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
Wired tried to put out a defence of tech and the result is incoherent drivel. See for yourself.
This is pure gut instinct, but I suspect Twitter's probably gonna die sometime during Trump's term - the banks which funded Musk's takeover consider it their worst deal since the Great Recession, and the rapid exodus of users is gonna further cripple Twitter's ability to attract advertising revenue.
Part of me suspects we're gonna see Twitter getting banned somewhere during Trump's term as well, a la Musk's tangle with Brazil.
Dorsey jumping ship's about as good a reason as any to jump to Bluesky, especially considering Twitter's currently haemorrhaging users from the one-two-three punch of the AI training, the crippled blocking and THE ELECTION™.
By my guess, the AI training is probably doing the most damage - that one's prompting artists to bolt for the exits, and if Tumblr's NSFW ban taught me anything, it's that if the artists start leaving, they're gonna start taking their fans with them.
- following on from 1, it’s kind of funny that the EAs, who you could pattern match to a “high school nerd” stereotype, are intellectually beaten out by an analog of the “jock” stereotype of sports fans: fantasy league participants who understand the concept of “intangibles” that EAs apparently cannot grasp.
On a wider note, it feels like the "geek/nerd" moniker's lost a whole lot of cultural cachet since its peak in the mid-'10s. It is a topic Sarah Z has touched on, but I could probably make a full goodpost about it.
Got recommended a video from a friend, and it's a damn good sneer at the state of YouTube sponsorships:
⢀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠘⣿⣿⡟⠲⢤⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠈⢿⡇⠀⠀⠈⠑⠦⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⣠⠴⢲⣾⣿⣿⠃
⠀⠀⠈⢿⡀⠀⠀⠀⠀⠈⠓⢤⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠚⠉⠀⠀⢸⣿⡿⠃⠀
⠀⠀⠀⠈⢧⡀⠀⠀⠀⠀⠀⠀⠙⠦⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⡤⠖⠋⠁⠀⠀⠀⠀⠀⠀⣸⡟⠁⠀⠀
⠀⠀⠀⠀⠀⠳⡄⠀⠀⠀⠀⠀⠀⠀⠈⠒⠒⠛⠉⠉⠉⠉⠉⠉⠉⠑⠋⠁⠀⠀⠀⠀⠀⠀⠀⠀⠀⣰⠏⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠘⢦⡀⠀⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⡴⠃⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠙⣶⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠰⣀⣀⠴⠋⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⣰⠁⠀⠀⠀⣠⣄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⣀⣤⣀⠀⠀⠀⠀⠹⣇⠀⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⢠⠃⠀⠀⠀⢸⣀⣽⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⣧⣨⣿⠀⠀⠀⠀⠀⠸⣆⠀⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⡞⠀⠀⠀⠀⠘⠿⠛⠀⠀⠀⢀⣀⠀⠀⠀⠀⠀⠙⠛⠋⠀⠀⠀⠀⠀⠀⢹⡄⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⢰⢃⡤⠖⠒⢦⡀⠀⠀⠀⠀⠀⠙⠛⠁⠀⠀⠀⠀⠀⠀⠀⣠⠤⠤⢤⡀⠀⠀⢧⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⢸⢸⡀⠀⠀⢀⡗⠀⠀⠀⠀⢀⣠⠤⠤⢤⡀⠀⠀⠀⠀⢸⡁⠀⠀⠀⣹⠀⠀⢸⠀⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⢸⡀⠙⠒⠒⠋⠀⠀⠀⠀⠀⢺⡀⠀⠀⠀⢹⠀⠀⠀⠀⠀⠙⠲⠴⠚⠁⠀⠀⠸⡇⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⢷⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠙⠦⠤⠴⠋⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⢳⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⢸⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢸⠀⠀⠀⠀⠀⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠾⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠤⠦⠤⠤⠤⠤⠤⠤⠤⠼⠇⠀⠀⠀
(your shocked Pikachu was borked, I ended up fixing it out of frustration. Took me a couple attempts - standard formatting fucks it up no matter what, but code's kinder to it)
This hits a lot worse after JD Vance became Vice President-elect
Where's Your Ed At - Lost In The Future
Soundtrack: Post Pop Depression - Paraguay. I haven't wanted to write much in the last week. Seemingly every single person on Earth with a blog has tried to drill down into what happened on November 5 — to find the people to blame, to somehow explain what could've been done differently,
Four Passengers Die in Burning Tesla After Electronic Doors Seemingly Won't Open
I always had a hunch Teslas were death traps but Jesus Christ
It's gonna be the largest embezzlement scheme in US history, that much I'm certain of. How much damage the pair will do to the federal gov I'm not sure, but I expect there won't be much left of it once they're done.
Elon Musk, Ramaswamy land Trump admin roles
President-elect Trump has tapped tech entrepreneurs Elon Musk and Vivek Ramaswamy to lead an advisory group focused on cutting federal spending and reducing the size of the government.
Trump announced Tuesday that Musk and Ramaswamy would lead his “Department of Government Efficiency” (DOGE), an initiative meant to “slash excess regulations, cut wasteful expenditures” and restructure federal agencies.
We live in the dumbest timeline
Stubsack: weekly thread for sneers not worth an entire post, week ending 17th November 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
Some Off-The-Cuff Predictions on Trump's Presidency
Well, Trump got elected, I'm deeply peeved about the future, and I need to feel like I can see the future coming.
Fuck it. Here's some off-the-cuff predictions, because I need to
- There's a Spike in Infanticides
Under normal circumstances, anyone who doesn't want to deal with raising crotch goblins has two major options: use contraception to keep the pregnancy from happening at all, or shitcan the unborn fetus by getting an abortion.
Abortion's been on the way out since Roe v. Wade got double-tapped, and contraception's probably gonna get nuked as well, so sending the baby to an early grave is probably gonna be the only option left for most unwanted births.
Chance: Low/Moderate. This is more-or-less pure gut feeling I'm going on, and chances are a lot of abortions are gonna get recorded as infanticides and skew the numbers.
- America's Reputation Nosedives
Gonna just copy-paste istewart's comment for this, because bro said it better than I could:
> I am left thinking that many people here in the US are going to have a hard time accepting that having this person, at this stage in his life, as the national figurehead will do permanent damage to the US’ prestige and global standing. Doesn’t matter if Reagan was sundowning, media was more controlled then and his handlers were there to support the institution of the presidency at least as much as they were there to support a narcissist. In 2016, other countries could look at Trump as a temporary aberration and wait him out. This time, it’s clear that the US is no longer a reliable partner.
Chance: Near-Guaranteed. With 2016, the US could point to Hillary winning the popular vote to blunt the damage. 2024 gives no real way to spin it - a majority of Americans explicitly wanted Trump as president. Whatever he does, it's in their name.
- Violence Against Silicon Valley
I've already predicted this before, but I'm gonna predict it again. Whether it's because Trump scapegoats the Valley as someone else predicted, labor comes to believe peace is no longer an option, someone decides "fuck it, time to make Uncle Ted proud", or God-knows-what-else, I fully expect there's gonna be direct and violent attacks on tech at some point.
No clue on the exact form either - could be someone getting punched for wearing Ray-Ban Autodoxxers, could be rank-and-file employees getting bombed, could be Trumpies pulling a Pumped Up Kicks, fuck if I know.
Chance: Moderate. The tech industry's managed to piss off both political wings for one reason or another, though the left's done a good job not resorting to pipe bombs during my lifetime.
- Another Assassination Attempt
With Trump having successfully evaded justice on his criminal convictions (which LegalEagle's discussed in depth), any hope that he's going to see the inside of a jail cell is virtually zero. This leaves vigilante justice as the only option for anyone who wants this man to face anything approaching consequences for his actions.
Chance: Low. Presidents are very well-protected these days, though Trump's own ego and stupidity may end up opening him up to an attempt.
- A Total Porn Ban
Project 2025 may be defining everything even remotely queer as "pornography", but nothing's stopping a Trump presidency from putting regular porn on the chopping block as well. With the existing spate of anti-porn laws in the US, there's already some minor precedent.
Chance: Moderate/High. Porn doesn't enjoy much political support from either the Dems or the Reps, though the porn industry could probably get the Dems' support if it knows what it's doing.
Stubsack: weekly thread for sneers not worth an entire post, week ending 10th November 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
Stubsack: weekly thread for sneers not worth an entire post, week ending 3rd November 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
Some Quick-and-Dirty Thoughts on the Character.ai lawsuit
(This is an expanded version of two of my comments [Comment A, Comment B] - go and read those if you want)
Well, Character.ai got themselves into some real deep shit recently - repeat customer Sewell Setzer shot himself, and his mother, Megan Garcia, is suing the company, its founders and Google as a result, accusing them of "anthropomorphising" their chatbots and offering "psychotherapy without a license," among other things, and demanding a full-blown recall.
Now, I'm not a lawyer, but I can see a few aspects which give Garcia a pretty solid case:
- The site has "mental health-focused chatbots like “Therapist” and “Are You Feeling Lonely,” which Setzer interacted with", as Emma Roth noted writing for The Verge.
- Character.ai has already had multiple addiction/attachment cases like Sewell's - I found articles from Wired and news.com.au, plus a few user testimonies (Exhibit A, Exhibit B, Exhibit C) about how damn addictive the fucker is.
- As Kevin Roose notes for the NYT, "many of the leading A.I. labs have resisted building A.I. companions on ethical grounds or because they consider it too great a risk". That could be used to suggest Character.ai was being particularly reckless.
Which way the suit's gonna go, I don't know - my main interest is in the potential fallout.
Some Predictions
Win or lose, I suspect this lawsuit is going to sound character.ai's death knell - even if they don't get regulated out of existence, "our product killed a child" is the kind of Dasani-level PR disaster few companies can recover from, and news of this will likely prompt any would-be investors to run for the hills.
If Garcia does win the suit, it'd more than likely set a legal precedent which denies Section 230 protection to chatbots, if not AI-generated content in general. If that happens, I expect a wave of lawsuits against other chatbot apps like Replika, Kindroid and Nomi at the minimum.
As for the chatbots themselves, I expect they're gonna rapidly lock their shit down hard and fast, to prevent themselves from having a situation like this on their hands, and I expect their users are gonna be pissed.
As for the AI industry at large, I suspect they're gonna try and paint the whole thing as a frivolous lawsuit and Garcia as denying any fault for her son's suicide, a la the "McDonald's coffee case". How well this will do, I don't know - personally, considering the AI industry's godawful reputation with the public, I expect they're gonna have some difficulty.
You Can't Make Friends With The Rockstars - Ed Zitron on the tech press
You cannot make friends with the rock stars...if you're going to be a true journalist, you know, a rock journalist. First, you never get paid much, but you will get free records from the record company. [There’s] fuckin’ nothin' about you that is controversial. God, it's gonna get
Gonna add the opening quote, because it is glorious:
> You cannot make friends with the rock stars...if you're going to be a true journalist, you know, a rock journalist. First, you never get paid much, but you will get free records from the record company.
>
> [There’s] fuckin’ nothin' about you that is controversial. God, it's gonna get ugly. And they're gonna buy you drinks, you're gonna meet girls, they're gonna try to fly you places for free, offer you drugs. I know, it sounds great, but these people are not your friends. You know, these are people who want you to write sanctimonious stories about the genius of the rock stars and they will ruin rock 'n' roll and strangle everything we love about it.
>
> Because they're trying to buy respectability for a form that's gloriously and righteously dumb.
>
> Lester Bangs, Almost Famous (2000)
EDITED TO ADD: If you want a good companion piece to this, Devs and the Culture of Tech by @UnseriousAcademic is a damn good read, going deep into the cultural issues which lead to the fawning tech press Zitron so thoroughly tears into.
Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 20 October 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
Some Quick-and-Dirty Thoughts on AI's Future
(This is basically an expanded version of a comment on the weekly Stubsack - I've linked it above for convenience's sake.)
This is pure gut instinct, but I’m starting to get the feeling this AI bubble’s gonna destroy the concept of artificial intelligence as we know it.
On the artistic front, there's the general tidal wave of AI-generated slop (which I've come to term "the slop-nami") drowning the Internet in zero-effort garbage - interesting only when the art's utterly insane or its prompter gets publicly humiliated, and, to quote Line Goes Up, "derivative, lazy, ugly, hollow, and boring" the other 99% of the time.
(And all while the AI industry steals artists' work, destroys their livelihoods and shamelessly mocks their victims throughout.)
On the "intelligence" front, the bubble's given us public and spectacular failures of reasoning/logic like Google gluing pizza and eating onions, ChatGPT sucking at chess and briefly losing its shit, and so much more - even in the absence of formal proof LLMs can't reason, its not hard to conclude they're far from intelligent.
All of this is, of course, happening whilst the tech industry as a whole is hyping the ever-loving FUCK out of AI, breathlessly praising its supposed creativity/intelligence/brilliance and relentlessly claiming that they're on the cusp of AGI/superintelligence/whatever-the-fuck-they're-calling-it-right-now, they just need to raise a few more billion dollars and boil a few more hundred lakes and kill a few more hundred species and enable a few more months of SEO and scams and spam and slop and soulless shameless scum-sucking shitbags senselessly shitting over everything that was good about the Internet.
----
The public's collective consciousness was ready for a lot of futures regarding AI - a future where it took everyone's jobs, a future where it started the apocalypse, a future where it brought about utopia, etcetera. A future where AI ruins everything by being utterly, fundamentally incompetent, like the one we're living in now?
That's a future the public was not ready for - sci-fi writers weren't playing much with the idea of "incompetent AI ruins everything" (Paranoia is the only example I know of), and the tech press wasn't gonna run stories about AI's faults until it became unignorable (like that lawyer who got in trouble for taking ChatGPT at its word).
Now, of course, the public's had plenty of time to let the reality of this current AI bubble sink in, to watch as the AI industry tries and fails to fix the unfixable hallucination issue, to watch the likes of CrAIyon and Midjourney continually fail to produce anything even remotely worth the effort of typing out a prompt, to watch AI creep into and enshittify every waking aspect of their lives as their bosses and higher-ups buy the hype hook, line and fucking sinker.
----
All this, I feel, has built an image of AI as inherently incapable of humanlike intelligence/creativity (let alone Superintelligence™), no matter how many server farms you build or oceans of water you boil.
Especially so on the creativity front - publicly rejecting AI, like what Procreate and Schoolism did, earns you an instant standing ovation, whilst openly shilling it (like PC Gamer or The Bookseller) or showcasing it (like Justine Moore, Proper Prompter or Luma Labs) gets you publicly and relentlessly lambasted. To quote Baldur Bjarnason, the “E-number additive, but for creative work” connotation of “AI” is more-or-less a permanent fixture in the public’s mind.
I don't have any pithy quote to wrap this up, but to take a shot in the dark, I expect we're gonna see a particularly long and harsh AI winter once the bubble bursts - one fueled not only by disappointment in the failures of LLMs, but widespread public outrage at the massive damage the bubble inflicted, with AI funding facing heavy scrutiny as the public comes to treat any research into the field as done with potentially malicious intent.
Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 13 October 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
"The Subprime AI Crisis" - Ed Zitron on the bubble's impending collapse
None of what I write in this newsletter is about sowing doubt or "hating," but a sober evaluation of where we are today and where we may end up on the current path. I believe that the artificial intelligence boom — which would be better described as a generative AI boom
> None of what I write in this newsletter is about sowing doubt or "hating," but a sober evaluation of where we are today and where we may end up on the current path. I believe that the artificial intelligence boom — which would be better described as a generative AI boom — is (as I've said before) unsustainable, and will ultimately collapse. I also fear that said collapse could be ruinous to big tech, deeply damaging to the startup ecosystem, and will further sour public support for the tech industry.
Can't blame Zitron for being pretty downbeat in this - given the AI bubble's size and side-effects, it's easy to see how its bursting could have some cataclysmic effects.
(Shameless self-promo: I ended up writing a bit about the potential aftermath as well)
Stubsack: weekly thread for sneers not worth an entire post, week ending Sunday 9 September 2024
Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.
> The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)
>
> Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.
(Semi-obligatory thanks to @dgerard for starting this)
Ed Zitron on Google's antitrust loss
Last week, in the midst of the slow, painful collapse of the generative AI hype cycle, something incredible happened. On Monday, a Federal Judge delivered a crushing ruling in the multi-year-long antitrust case filed against Google by the Department of Justice. In 300-pages of dense legal text, Jud...
Humane was founded by former Apple employees Imran Chaudhri and Bethany Bongiorno. They wanted something that would rival the iPhone. The Ai Pin (that’s “Ai”, not “AI”) would take commands by…
Some Quick and Dirty Thoughts on "The empty brain"
Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer
This started as a summary of a random essay Robert Epstein (fuck, that's an unfortunate surname) cooked up back in 2016, and evolved into a diatribe about how the AI bubble affects how we think of human cognition.
This is probably a bit outside awful's wheelhouse, but hey, this is MoreWrite.
The TL;DR
The general article concerns two major metaphors for human intelligence:
- The information processing (IP) metaphor, which views the brain as some form of computer (implicitly a classical one, though you could probably cram a quantum computer into that metaphor too)
- The anti-representational metaphor, which views the brain as a living organism, which constantly changes in response to experiences and stimuli, and which contains jack shit in the way of any computer-like components (memory, processors, algorithms, etcetera)
Epstein's general view is, if the title didn't tip you off, firmly on the anti-rep metaphor's side, dismissing IP as "not even slightly valid" and openly arguing for dumping it straight into the dustbin of history.
His main piece of evidence for this is a basic experiment, where he has a student draw two images of dollar bills - one from memory, and one with a real dollar bill as reference - and compare the two.
Unsurprisingly, the image made with a reference blows the image from memory out of the water every time, which Epstein uses to argue against any notion of the image of a dollar bill (or anything else, for that matter) being stored in one's brain like data in a hard drive.
Instead, he argues that the student drawing from memory was re-experiencing the sight of the bill, an ability they only have because their brain has been changed by seeing many a dollar bill up to this point.
Another piece of evidence he brings up is a 1995 paper from Science by Michael McBeath regarding baseballers catching fly balls. Where the IP metaphor reportedly suggests the player roughly calculates the ball's flight path with estimates of several variables ("the force of the impact, the angle of the trajectory, that kind of thing"), the anti-rep metaphor (given by McBeath) simply suggests the player catches them by moving in a manner which keeps the ball, home plate and the surroundings in a constant visual relationship with each other.
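(For the curious, here's a crude toy sketch in Python of the difference between the two approaches - the numbers are made up, and the bang-bang "keep the angle rising steadily" controller is only a loose stand-in for McBeath's actual model, but it shows the contrast between solving the physics up front and just reacting to what the ball looks like it's doing.)

```python
import math

G = 9.81  # gravity, m/s^2


def ip_style_landing_spot(speed, angle_deg):
    """'IP metaphor' version: estimate the launch variables, then solve the
    projectile equations for where the ball will come down."""
    angle = math.radians(angle_deg)
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    return vx * (2 * vy / G)  # range = horizontal speed * total flight time


def heuristic_fielder(speed, angle_deg, start_x, run_speed=7.0, dt=0.05):
    """Loose stand-in for McBeath's idea: never compute a trajectory, just watch
    how fast the ball's elevation (well, its tangent) is rising. If that rate
    keeps picking up, the ball will sail past you, so back up; if it eases off,
    it'll drop short, so run in. A fielder standing on the landing spot sees the
    tangent rise at a perfectly steady rate, so chasing "steady rise" chases the
    catch. Crude bang-bang control, so expect some chatter."""
    angle = math.radians(angle_deg)
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    fielder, t = start_x, 0.0
    prev_tan, prev_rate = None, None
    while True:
        t += dt
        x, y = vx * t, vy * t - 0.5 * G * t * t
        if y <= 0:  # ball is down; wherever we are is where we "catch" it
            return fielder
        tan_elev = y / max(abs(fielder - x), 1e-6)
        if prev_tan is not None:
            rate = (tan_elev - prev_tan) / dt
            if prev_rate is not None:
                fielder += run_speed * dt if rate > prev_rate else -run_speed * dt
            prev_rate = rate
        prev_tan = tan_elev


if __name__ == "__main__":
    # Toy numbers: 30 m/s off the bat at 45 degrees, fielder starting at 80 m out.
    print("physics says the ball lands at about", round(ip_style_landing_spot(30, 45), 1), "m")
    print("the heuristic fielder ends up near", round(heuristic_fielder(30, 45, start_x=80.0), 1), "m")
```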
The final piece I could glean from this is a report in Scientific American about the Human Brain Project (HBP), a $1.3 billion project launched by the EU in 2013, made with the goal of simulating the entire human brain on a supercomputer. Said project went on to become a "brain wreck" less than two years in (and eight years before its 2023 deadline) - a "brain wreck" Epstein implicitly blames on the whole thing being guided by the IP metaphor.
Said "brain wreck" is a good place to cap this section off - the essay is something I recommend reading for yourself (even if I do feel its arguments aren't particularly strong), and its not really the main focus of this little ramblefest. Anyways, onto my personal thoughts.
Some Personal Thoughts
Personally, I suspect the AI bubble's made the public a lot less receptive to the IP metaphor these days, for a few reasons:
- Artificial Idiocy
The entire bubble was sold as a path to computers with human-like, if not godlike intelligence - artificial thinkers smarter than the best human geniuses, art generators better than the best human virtuosos, et cetera. Hell, the AIs at the centre of this bubble are running on neural networks, whose functioning is based on our current understanding of how the brain works. [Missed this incomplete sentence first time around :P]
What we instead got was Google telling us to eat rocks and put glue in pizza, chatbots hallucinating everything under the fucking sun, and art generators drowning the entire fucking internet in pure unfiltered slop, identifiable in the uniquely AI-like errors it makes. And all whilst burning through truly unholy amounts of power and receiving frankly embarrassing levels of hype in the process.
(Quick sidenote: Even a local model running on some rando's GPU is a power-hog compared to what it's trying to imitate - digging around online indicates your brain uses only 20 watts of power to do what it does.)
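(Putting extremely rough numbers on that - the GPU wattage below is my own ballpark assumption, not a measurement:)

```python
# Back-of-envelope only: the 20 W brain figure is the commonly cited estimate,
# and the GPU figure is an assumed board power for a beefy consumer card
# running a local model flat out - swap in your own numbers.
BRAIN_WATTS = 20
GPU_WATTS = 350

print(f"Brain: {BRAIN_WATTS} W")
print(f"GPU:   {GPU_WATTS} W")
print(f"Roughly {GPU_WATTS / BRAIN_WATTS:.0f}x the draw, before counting the rest "
      f"of the machine, let alone datacentre-scale training runs.")
```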
With the parade of artificial stupidity the bubble's given us, I wouldn't fault anyone for coming to believe the brain isn't like a computer at all.
- Inhuman Learning
Additionally, AI bros have repeatedly and incessantly claimed that AIs are creative and that they learn like humans, usually in response to complaints about the Biblical amounts of art stolen for AI datasets.
Said claims are, of course, flat-out bullshit - last I checked, human artists only need a few references to actually produce something good and original, whilst your average LLM will produce nothing but slop no matter how many terabytes upon terabytes of data you shovel into its dataset.
This all arguably falls under the "Artificial Idiocy" heading, but it felt necessary to point out - these things lack the creativity or learning capabilities of humans, and I wouldn't blame anyone for taking that to mean that brains are uniquely unlike computers.
- Eau de Tech Asshole
Given how much public resentment the AI bubble has built towards the tech industry (which I covered in my previous post), my gut instinct's telling me that the IP metaphor is also starting to be viewed in a harsher, more "tech asshole-ish" light - not merely as a reductive/incorrect view of human cognition, but as a sign you put tech over human lives, or don't see other people as human.
Of course, AI providing a general parade of the absolute worst scumbaggery we know (with Mira Murati being an anti-artist scumbag and Sam Altman being a general creep as the biggest examples) is probably helping that along, alongside all the active attempts by AI bros to mimic real artists (exhibit A, exhibit B).
Ed Zitron on what might come next for the AI bubble and the tech industry
Soundtrack: Masters of Reality - High Noon Amsterdam. I have said almost everything in this piece in every one of these articles for months. I am not upset, but just stating an obvious truth. The current state of affairs effectively pushes against the boundaries of good sense, logic and reason,
Some Thoughts On "The Grimy Residue of the AI Bubble"
What kind of residue will the AI bubble's popping leave behind? By Alex. Q2 earnings are in. According to Pitchbook data,...
Whilst going through MAIHT3K's backlog, I ended up running across a neat little article theorising on the possible aftermath, which left me wondering precisely what the main "residue", so to speak, would be.
The TL;DR:
To cut a long story far too short, Alex, the writer, theorised the bubble would leave a "sticky residue" in the aftermath, "coating creative industries with a thick, sooty grime of an industry which grew expansively, without pausing to think about who would be caught in the blast radius" and killing or imperilling a lot of artists' jobs in the process - all whilst producing metric assloads of emissions and pushing humanity closer to the apocalypse.
My Thoughts
Personally, whilst I can see Alex's point, I think the main residue from this bubble is going to be large-scale resentment of the tech industry, for three main reasons:
- AI Is Shafting Everyone
It's not just artists who have been pissed off at AI fucking up their jobs, whether freelance or corporate - as Upwork, of all places, has noted in their research, pretty much anyone working right now is getting the shaft:
- Nearly half (47%) of workers using AI say they have no idea how to achieve the productivity gains their employers expect
- Over three in four (77%) say AI tools have decreased their productivity and added to their workload in at least one way
- Seventy-one percent are burned out and nearly two-thirds (65%) report struggling with increasing employer demands
- Women (74%) report feeling more burned out than do men (68%)
- 1 in 3 employees say they will likely quit their jobs in the next six months because they are burned out or overworked (emphasis mine)
Baldur Bjarnason put it better than me when commenting on these results:
> It’s quite unusual for a study like this on a new office tool, roughly two years after that tool—ChatGPT—exploded into people’s workplaces, to return such a resoundingly negative sentiment.
>
> But it fits with the studies on the actual functionality of said tool: the incredibly common and hard to fix errors, the biases, the general low quality of the output, and the often stated expectation from management that it’s a magic fix for the organisational catastrophe that is the mass layoff fad.
>
> Marketing-funded research of the kind that Upwork does usually prevents these kind of results by finessing the questions. They simply do not directly ask questions that might have answers they don’t like.
>
> That they didn’t this time means they really, really did believe that “AI” is a magic productivity tool and weren’t prepared for even the possibility that it might be harmful.
Speaking of the general low-quality output:
- The AI Slop-Nami
The Internet has been flooded with AI-generated garbage. Fucking FLOODED.
Doesn't matter where you go - Google, DeviantArt, Amazon, Facebook, Etsy, Instagram, YouTube, Sports Illustrated, fucking 99% of the Internet is polluted with it.
Unsurprisingly, this utter flood of unfiltered unmitigated endless trash has sent AI's public perception straight down the fucking toilet, to the point of spawning an entire counter-movement against the fucking thing.
Whether it be Glaze and Nightshade directly sabotaging datasets, "Made with Human Intelligence" and "Not By AI" badges proudly proclaiming human-done production or Cara blowing up by offering a safe harbour from AI, it's clear there's a lot of people out there who want abso-fucking-lutely nothing to do with AI in any sense of the word as a result of this slop-nami.
- The Monstrous Assholes In AI
On top of this little slop-nami, those leading the charge of this bubble have been generally godawful human beings. Here's a quick highlight reel:
- Microsoft’s AI boss thinks it’s perfectly okay to steal content if it’s on the open web - it was already a safe bet anyone working in AI thought that, considering they stole from literally everyone, whether large or small, but it's nice for someone to spell it out like that.
- The death of robots.txt - OpenAI publicly ignored it, Perplexity lied about their user agent to deceive it, and Anthropic spammed crawlers to sidestep it. Results were predictable. Mostly. (There's a quick sketch of how honouring the file is supposed to work just below this list.)
- Mira Murati saying "some creative jobs shouldn't exist" - Ed Zitron called this "a declaration of war against creative labor" when talking to Business Insider, which sums this up better than I ever could.
- Sam Altman's Tangle with Her - Misunderstanding the basic message? Check. Bullying a dragon (to quote TV Tropes)? Check. Embarrassing your own company? Check. Being a creepy-ass motherfucker? Fucking check.
- "Can Artificial Intelligence Speak for Incapacitated Patients at the End of Life?" - I'll let Amy and David speak for me on this because WHAT THE FUCK.
I'm definitely missing a lot, but I think this sampler gives you a good gist of the kind of soulless ghouls who have been forcing this entire fucking AI bubble upon us all.
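As for that robots.txt sketch: here's a minimal Python example of how honouring the file is supposed to work. The file contents and crawler names are illustrative assumptions on my part rather than anything from the linked stories, and the whole point is that nothing forces a scraper to respect any of it:

```python
import urllib.robotparser

# An example robots.txt a site might publish to try to opt out of AI scrapers.
# The crawler names here are assumptions for illustration; the mechanism is the point.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: *
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for agent in ["GPTBot", "PerplexityBot", "ClaudeBot", "SomeRegularBrowser"]:
    verdict = "allowed" if parser.can_fetch(agent, "https://example.com/my-art/") else "blocked"
    print(f"{agent}: {verdict}")

# A well-behaved crawler checks this before fetching anything. One that ignores
# the file, or lies about its user agent, sails straight through - which is
# exactly what the bullet above is complaining about.
```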
Eau de Tech Asshole
There are many things I can't say for sure about the AI bubble - when it will burst, how long and harsh the next AI/tech winter will be, what new tech bubble will pop up in its place (if any), etcetera.
One thing I feel I can say for sure, however, is that the AI bubble and its myriad harms will leave a lasting stigma on the tech industry once it finally bursts.
Already, it seems AI has a pretty hefty stigma around it - as Baldur Bjarnason noted when discussing AI's sentiment disconnect between tech and the public:
> To many, “AI” seems to have become a tech asshole signifier: the “tech asshole” is a person who works in tech, only cares about bullshit tech trends, and doesn’t care about the larger consequences of their work or their industry. Or, even worse, aspires to become a person who gets rich from working in a harmful industry.
>
> For example, my sister helps manage a book store as a day job. They hire a lot of teenagers as summer employees and at least those teens use “he’s a big fan of AI” as a red flag. (Obviously a book store is a biased sample. The ones that seek out a book store summer job are generally going to be good kids.)
>
> I don’t think I’ve experienced a sentiment disconnect this massive in tech before, even during the dot-com bubble.
On another front, there's the cultural reevaluation of the Luddites - once brushed off as naught but rejectors of progress, they are now coming to be viewed as folk heroes in a sense, fighting against misuse of technology to disempower and oppress, rather than technology as a whole.
There's also the rather recent SAG-AFTRA strike which kicked off just under a year after the previous one, and was started for similar reasons - to protect those working in the games industry from being shafted by AI like so many other people.
With how the tech industry was responsible for creating this bubble at every stage - research, development, deployment, the whole nine yards - it is all but guaranteed they will shoulder the blame for all that it's unleashed. Whatever happens after this bubble, I expect hefty scrutiny and distrust of the tech industry for a long, long time.
To quote @datarama, "the AI industry has made tech synonymous with “monstrous assholes” in a non-trivial chunk of public consciousness" - and that chunk is not going to forget any time soon.
Some Off-The-Cuff Predictions about the AI Bubble
I've been hit by inspiration whilst dicking about on Discord - felt like making some off-the-cuff predictions on what will happen once the AI bubble bursts. (Mainly because I had a bee in my bonnet that was refusing to fuck off.)
- A Full-Blown Tech Crash
It's no secret the industry's put all their chips into AI - basically every public company's chasing it to inflate their stock prices, Nvidia's making money hand over fist playing gold-rush shovel seller, and every exec's been hyping it like it's gonna change the course of humanity.
Additionally, going by Baldur Bjarnason, tech's chief goal with this bubble is to prop up the notion of endless growth so it can continue reaping the benefits for just a bit longer.
If and when the tech bubble pops, I expect a full-blown crash in the tech industry (much like Ed Zitron's predicting), with revenues and stock prices going through the floor and layoffs left and right. Additionally, I expect those stock prices will take a while to recover, if ever - either tech comes to be viewed as a stable, mature industry that's no longer experiencing nonstop growth, or as an industry in a full-blown malaise era, with valuations getting savaged as Wall Street comes to see tech companies as high-risk investments at best and money pits at worst. (Missed this incomplete sentence several times)
Chance: Near-Guaranteed. I'm pretty much certain on this, and expect it to happen sometime this year.
- A Decline in Tech/STEM Students/Graduates
Extrapolating a bit from Prediction 1, I suspect we might see a lot fewer people going into tech/STEM degrees if tech crashes like I expect.
The main thing which drew so many people to those degrees, at least from what I could see, was the notion that they'd make you a lotta money - if tech publicly crashes and burns like I expect, it'd blow a major hole in that notion.
Even if it doesn't kill the notion entirely, I can see a fair number of students jumping ship at the sight of that notion being shaken.
Chance: Low/Moderate. I've got no solid evidence this prediction's gonna come true, just a gut feeling. Epistemically speaking, I'm firing blind.
- Tech/STEM's Public Image Changes - For The Worse
The AI bubble's given us a pretty hefty amount of mockery-worthy shit - Mira Murati shitting on the artists OpenAI screwed over, Andrej Karpathy shitting on every movie made pre-'95, Sam Altman claiming AI will soon solve all of physics, Luma Labs publicly embarrassing themselves, ProperPrompter recreating motion capture, But Worse™, Mustafa Suleyman treating everything on the 'Net as his to steal, et cetera, et cetera, et fucking cetera.
All the while, AI has been flooding the Internet with unholy slop, ruining Google search, cooking the planet, stealing everyone's work (sometimes literally) in broad daylight, supercharging scams, killing livelihoods, exploiting the Global South and God-knows-what-the-fuck-else.
All of this has been a near-direct consequence of the development of large language models and generative AI.
Baldur Bjarnason has already mentioned AI being treated as a major red flag by many - a "tech asshole" signifier to be more specific - and the massive disconnect in sentiment tech has from the rest of the public. I suspect that "tech asshole" stench is gonna spread much quicker than he thinks.
Chance: Moderate/High. This one's also based on a gut feeling, but with the stuff I've witnessed, I'm feeling much more confident with this than Prediction 2. Arguably, if the cultural rehabilitation of the Luddites is any indication, it might already be happening without my knowledge.
If you've got any other predictions, or want to put up some criticisms of mine, go ahead and comment.