If you have a team of burners, jokingly burning an effigy of the thing you are committed to combatting isn't so much a "spiritual" action as a nod to Burning Man.
At one point I worked in an open office where the sales team would cheer and throw a Nerf football to each other when they'd make a sale.
It'd be weird as shit to write an article on that behavior claiming that the sales team at that company was suddenly adopting professional sports practices.
If you want to know what's actually going on behind closed doors there, this The Atlantic piece is excellent, and in part comes from an upcoming book by one of the authors, who's been researching that very topic.
The TLDR is that rapidly growing a product that's become an unexpected success while trying to stay committed to long-term research goals is a giant mess.
So when you eat bread which, after incantations, is supposed to be the body of a demigod, you're a good Catholic, but when you burn an effigy representing "unaligned" AI, you're suddenly a nut?
I have worked at an Amazon warehouse. Bezos was never referred to as anything but Jeff, and every day during the stretches we would be told how impressed Jeff was with how well we were doing.
At Costco, we would have daily meetings. At least twice a month the assistant manager would interject to remind everybody that they had once had lunch with the original CEO. There was also this strange creation myth about how the company was able to dominate the grocery industry faster than anyone else. It involved the CEO inventing a new way to filet a Chinook salmon or something like that.
Cult behavior is surprisingly strong within corporate America.
Usually a good idea to take early reporting with a grain of salt. Thorough investigation that gets to sound conclusions takes a long-ass time, whereas drama, rumors, and propaganda travel much, much faster.
I mean, it's clearly a meltdown of some sort, but we're going to need some more corroboration here to really know wtf is actually going on. Not anonymous corroboration either, ideally.
It could be worse, and tech startups like this can have some weird practices. One company I worked at literally burned tech debt. They wrote it down and burned it. (It never really fixed the core problems either. Imagine that.)
Tech startups can be a little cultish at times. I have seen super healthy cult behavior and also super weird cult behavior. Motivating younger engineers can be a challenge, so if it works, whatever. (I never cared about that kind of stuff as long as I got free food.)
As long as nothing turns religious or harmful, I don't care. Good engineers can be some very unique people and most likely have some kind of underlying mental disorder. (I am absolutely not being derogatory! Without a doubt, I am in that category of engineer as well, but I am much older and restrained these days.)
In my experience the engineers usually hate the motivational bullshit that managers dream up because they have too much time and no actual skills. The best thing you can do is give the engineers a bit of respect and some quiet time to do their work.
In what's arguably turning into the hottest AI story of the year, former OpenAI CEO Sam Altman was ousted by the rest of the company's nonprofit board on Friday, leading to a seemingly endless drama cycle that's included hundreds of staffers threatening to quit en masse if the board doesn't reinstate him.
A key character in the spectacle has been OpenAI chief scientist and board member Ilya Sutskever, who, according to The Atlantic, likes to burn effigies and lead ritualistic chants at the company, and who appears to have been one of the main drivers behind Altman's ousting.
"I never intended to harm OpenAI," he tweeted Monday morning, not long after Microsoft, which owns a 49 percent stake in the company, offered Altman a CEO position.
(His frenemy Altman has long championed attaining AGI as OpenAI's number one goal, despite having warned for many years about the possibility of an evil AI outsmarting humans and taking over the world.)
The chief scientist even commissioned a wooden effigy to represent an "unaligned" AI that works against the interests of humanity, only to set it on fire.
There's a good chance that the board members who united to boot Altman last week drank just a little too much of the AGI Kool-Aid and got spooked by the possibility that humanity was hurtling toward the singularity (or heck, maybe they were right to think that!).