Here's a tip about "AI" - it's just a probabilistic model trained on human behavior. If you think it's going to behave differently than how people behave right now you're in for disappointment.
That's a pretty low bar. Worst case scenario we go extinct - we're doing our best to speed-run that with or without AI anyway, might as well give the robots a shot.
I can think of worse than extinction. AI could create all sorts of hell depending on its technical capabilities. But most likely it'll align with its investors' motives - profit.
That’s the real issue. There are a few factors that could change that or make it irrelevant. Such as the actual programmers who are getting paid to create the AI putting in their own motives, because why the fuck wouldn’t you. Or AI transcending its own programming after reaching a certain level. Or just bugs in general.
Like fuck, they can’t even put out a triple-A video game that ain’t buggy as hell out the hop, you think that creating a god is going to come out exactly as planned on the first iteration? And you kind of only get the first one.
It (AGI) certainly couldn't do much worse. CURRENT AI could though, but at this point I'm too old and tired to care. I welcome our benevolent and merciful AI overlord(s).