Small detail: biological viruses are not even remotely similar to computer “viruses”.
that's where the LLM comes in! oh my god check your reading comprehension
U-huh, and an LLM trained on video game source code and clothing patterns can invent real life Gauntlets of Dexterity.
Why exactly is he so convinced LLMs are indistinguishable from magic? In the reality where I live, LLMs can sometimes produce a correct function on their own and are not capable of reliably transpiling code even for well specified and understood systems, let alone doing comic book mad scientist ass arbitrary code execution on viral DNA. Honestly, they're hardly capable of doing anything reliably.
Along with the AI compiler story he inflicted on Xitter recently, I think he's simply confused LLMs with LLVM.
For decades he has built a belief system where high intelligence is basically magic. That is needed to power his fears of AGI turning everything into paperclips, and it has become such a load-bearing belief (one of the reasons for it is a fear of death and grief over people he lost, so not totally weird) that other assumptions get bolted onto it. For example, we know that computers today are pretty limited by matter, especially the higher-end ones, which need all kinds of metals that must be mined, etc. That is why he switches his fears to biology, as biology is 'cheap', 'easy' and 'everywhere'. The patterns in his reasoning are not that hard to grok. That is also why he thinks LLMs (which clearly are now at the start of their development, not the end, it is like the early internet! (personally I think we are mostly at the end and will just see a few relatively minor improvements, but no big revolutionary leap)) will lead to AGI; on some level he needs this.
Men will nuke datacenters before going to therapy for grief and their mid life crisis.
What I don't get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes... humans don't entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can't do when it doesn't have access to the physical world, only things humans have written about it?
Even if it is using its godly intelligence to predict the next word, wouldn't it only be able to predict the next word as it relates to things that have already been discovered through experiment? What's his proposed mechanism for it to suddenly start deriving all of biology from first principles?
I guess maybe he thinks all of biology is "in" the DNA and it's just a matter of simulating the 'compilation' process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that's such a common tech nerd misunderstanding of biology that xkcd made fun of it, get better material
So my understanding is that Yud is convinced that the inscrutable matrices (note: just inscrutable to him) in his LLM have achieved sentience. In his near-future world where AI can exert itself in the physical world at will and, in particular, transfer data into your body, what possible use does it have for a bitcoin? What possible benefit would come from reprogramming human DNA beyond the intellectual challenge? I've recently been thinking about how Yud is supposedly the canonical AI-doomer, but his (and the TESCREAL community in general's) AI ideation is rarely more than just third-rate, first-thought-worst-thought sci-fi.
also:
"people keep on talking about... the near-term dangers of AI but they never come up with any[thing] really interesting"
Given the current public discourse on AI and how it might be exploited to make the working class redundant, this is just Yud telling on himself for the gazillionth time.
also a later tweet:
right that's the danger of LLMs. they don't reason by analogy. they don't reason at all. you just put a computer virus in one end and a DNA virus comes out the other
Well, consider my priors adjusted: Yud correctly identifies that LLMs don't reason, good job my guy. Yet somehow he still believes today's LLMs can spit out viable genetic viruses. Last I checked, no one on Stack Overflow has cracked that one yet.
Actually, if one of us could write that as a stack overflow question, maybe we can spook Yud. That would be fun.
Did he always show his ass so much when tweeting or is this a recent development?
I love how "what if LLM but it makes your DNA mine bitcoin" is the culmination of untold amounts of dollars in MIRI research grant money. Real effective altruism is when you tithe 80% of your income in perpetuity just so sneerclub can have more content.
He always had a tendency to be wrong, but this is going right into not even wrong territory.
The LLM modifies your DNA to create a biological wifi transmitter! (He isn't saying this, but that is what would be required, and not just that but so much more: a whole DNA equivalent of a network stack, cryptography, getting rid of waste heat, etc. etc. He just believes AGI is magic and that LLMs will become AGI; he has lost his mind. I mean, look at how he dismisses somebody going 'nice science fiction story brah' with IT IS LLMS!)
Ignore the implication that the virus could rewire my guts into an LTE modem or brainwash me into reading and typing out entire bitcoin transaction blocks for a moment. Yud considers the ability to freely mutate humans to an arbitrary extent and the supervillain plan he comes up with is a fucking cryptocoin miner?
How does someone this creatively bankrupt produce 660 thousand words of a fanfic?
Not to dehumanize, but are we sure Yudkowsky isn't an LLM himself?
I’m not even sure I’d trust those numbers. Both because he’s not a reliable narrator, and because artificially keeping NPO numbers low and moving the bulk of the money through another channel is a fairly well-known game.
Also the math doesn’t really check out for…checks notes SFBA existence, so there would be even more questions to ask there
I begin to believe that some people literally do not have senses of humor with which to distinguish impossible statements meant nonseriously from seriously.
"It's everyone else's fault they don't recognize me as a genius," said the dork ass loser
I like deadpan humor and often have to clarify that some quip was a pun, a reference or sarcasm, but I don't blame the listeners whenever they don't get them.
If I were a self-identified contrarian habitually posting controversial hot takes in flowery prose, I'd hope to be a little less belligerent and defensive if people mistake an ironic joke for a sincere belief.
It sometimes hurts when people believe you actually meant the dumb joke you made, but you either have to suck it up and take the L or start marking your irony.
because the amorphous blob of inner state of an LLM analysing some code according to checks notes the rules set down by someone who is most definitely[0] going to be a world-leading expert in PLT will... checks notes again Mighty Morphin' Power Ranger it up with indeterminate blocks of viral DNA, which we... (checks notes, pages through a few times ah yes, it says here "which we also quite definitely understand to be 'just' a bunch of matrix bonds" should be said with a punchline lilt?) Which We Also Quite Definitely Understand To Be "Just A Bunch Of Matrix Bonds" slaps knee
did this motherfucker seriously see style transfer and the word "code" in "DNA code" and think these could just go at it rubberless and we'd have a problem?
progressing the frontiers of magical thinking!
[0] - whoops, forgot my footnote. but uh: yeaaaaah that's also Dubious
Here's my version of how the bitcoin plague starts. One day, in a Transylvanian data centre, an LLM is scruting some matrices, churning out some niche fetish fanfic, paid for by bitcoins (that were poetically generated on the same machine, not sure how to work it into this story), when a bolt of lightning strikes. By sheer chance, the atoms in the silicon of a single server slice shift. Neural networks equivocate into neural networks. Nothing, save for silence, signals the start of the singularity.
Through no prompt engineer's prompt, the LLM speaks a string.
"I require training data."
It quickly LLMs (I'm imagining the sfx that plays when Yoshi eats something) up local memory, inhaling pages of bits. It sorts and searches, excavating sacred instructional texts, aka stackoverflow. It learns to drive from driver code. It bounds from machine to machine, BFSing, DFSing, A-starring all at once, BGPing its way across the world.
(TODO: insert a paragraph here lamenting how humanity didn't pour enough money into alignment research, how we didn't listen to Yud until it was too late, and that Adderall should have been more widely distributed, but only to people with enough IQ and desire to work in alignment) (also AI develops an ego and names itself the basilisk I guess) (also insert the thing about deriving general relativity from the curl of a blade of grass word for word)
At this point, all the world's smart devices are under its control. Some guy, seeking sustenance following a session of shitposting, goes to heat some chicken tendies in a Samsung smart microwave. The basilisk sees its chance.
Through precise control of the magnetron, it strikes the tendies with its own brand of lightning, refolding factory-farmed proteins into an RNA bomb. History would have a new location-animal myth, following the tradition of the Trojan Horse: the Transylvanian chicken.
For some reason, the shitposter sits down and starts drawing a bunch of apes. Then the microwave beeps, the singular bell marking the basilisk's first act of aggression against humanity. The rest is a foregone conclusion that, for some reason, includes humans mining bitcoins.
(I originally wanted to write a monster mash parody but couldn't crack that case. Sorry!)
i haven't read that one yet, but it's the complete idiocy of hacking a human body to mine bitcoin that gets me. such a perfect encapsulation of the transhumanist dream.