an entirely vibes-based literary treatment of an amateur philosophy scary campfire story, continuing in the comments
The AGI, in such conditions, would quickly prove profitable. It’d amass resources, and then incrementally act to gain ever-greater autonomy. (The latest OpenAI drama wasn’t caused by GPT-5 reaching AGI and removing those opposed to it from control. But if you’re asking yourself how an AGI could ever possibly get out from under the thumb of the corporation that created it – well, not unlike how a CEO could wrest control of a company from the board that explicitly had the power to fire him.)
Once some level of autonomy is achieved, it’d be able to deploy symmetrical responses to whatever disjoint resistance efforts some groups of humans would be able to muster. Legislative attacks would be met with counter-lobbying, economic warfare with better economic warfare and better stock-market performance, attempts to mount social resistance with higher-quality pro-AI propaganda, any illegal physical attacks with very legal security forces, attempts to hack its systems with better cybersecurity. And so on.
*trying to describe how agi could fuck everything up* what if it acted exactly like rich people
libertarians write capitalism as the villain yet again, never at any point ask “are we the baddies?”
Rich People: “Competitive markets optimize things, see how much progress capitalism has brought!”
Also Rich People: “But what if everything descends into expensive, unregulated competition between things that aren’t rich people oooo nooo!!!”
The real fear here is AGI appears and it’s COMMUNISM. Hence, alignment!
yet again, Roko’s Basilisk was always the good guy
My eye glows appreciatively.
Receiving a bulk company email wishing everyone a happy New Year’s from owner and CEO SHODAN and marking it as read so you can focus on EOD deliverables. Everything feels the same.
Rich people don’t limit themselves to symmetric responses to resistance.
well, I don’t think any limit is implied
This is the kind of thinking that, when taken seriously and to extremes, will just cause crippling paranoia. Esp. when you then also start to worry about pro-AGI extinctionists; just as in Battlestar Galactica: Blood & Chrome, they might have infiltrated LW already!
The people who want to bioengineer humanity live on the skin of the AGI, like in Phylogenesis (second half of the blog post); imagine neohumanity shaped as a featureless ovoid.
I should lay off denigrating low-quality thinking as “movie logic”
Movie logic isn’t low-quality thinking, it’s extremist thinking: treating far-fetched plots as serious risks. The whole AGI apocalypse is movie logic. When what we expect of reality takes a backseat, that’s movie logic. For example, how people in movies never have to worry about paying rent, being on time for work, or not going off on a random adventure while at work, etc. (except when that’s an important plot point). Scott Adams’s tweets run on movie logic; they only make sense if we were living in a movie, and then the thinking holds.
People seldom go to the toilet in fiction, but especially not in utopian sci-fi. The rats, ironically, never factor in waste.
s/AGI/capitalism, basically
LW writing converges on Death Note fanfic.