“Model collapse” threatens to kill progress on generative AIs
Generative AIs start churning out nonsense when trained on synthetic data — a problem that could put a ceiling on their ability to improve.
I've been assuming this was going to happen ever since generative AI started getting haphazardly implemented across the web. Are people just now realizing it?
People are just now acknowledging it. Execs tend to have a disdain for the minutiae. They're like kids that only want to do the exciting bits. As a result things get fucked because they don't really understand what they're doing. As Muskrat would say, "move fast and break things." It's a terrible mindset.
39 1 Reply"Move Fast and Break Things" is Zuckerberg/Facebook motto, not Musk, just to note.
Oh, I stand corrected.
It is very much the motto this idiot lives by. He just wasn't the first to coin that phrase.
No, researchers in the field knew about this potential problem ages ago. It's easy enough to work around and prevent.
People who are just on the lookout for the latest "aha, AI bad!" headline, on the other hand, rediscover this every couple of months.
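A minimal sketch of what "easy enough to work around" typically looks like: keep a slice of real, human-made data in every training round so the model isn't refit purely on its own output. The toy Gaussian model, sample sizes, generation count, and 10% real-data fraction below are illustrative assumptions, not any lab's actual pipeline.

```python
# Toy illustration of model collapse and the usual mitigation: anchor each
# training round with some real data instead of refitting on pure synthetic output.
import numpy as np

rng = np.random.default_rng(0)
real_data = rng.normal(loc=0.0, scale=1.0, size=5_000)  # stand-in for human-made data


def fit(sample):
    """'Train' the toy model: estimate a Gaussian's mean and std from the sample."""
    return sample.mean(), sample.std()


def run(generations, real_fraction, sample_size=500):
    """Refit the model on its own output each generation, mixing in real data."""
    mu, sigma = fit(real_data)
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, size=sample_size)  # the model's own output
        n_real = int(real_fraction * sample_size)
        mix = np.concatenate([synthetic[n_real:], rng.choice(real_data, n_real)])
        mu, sigma = fit(mix)  # retrain on the mixed dataset
    return mu, sigma


# With no real data, estimation errors compound generation after generation
# (the drift behind "model collapse"); with 10% real data mixed in each round,
# the estimates stay anchored near the true (0, 1).
print("pure synthetic:", run(200, real_fraction=0.0))
print("10% real data: ", run(200, real_fraction=0.1))
```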