Daniel Kokotajlo
And it wasn't the result of some one person... I mean, I don't think it was the result of... I hope it wasn't the result of one person being like, I want to do this evil thing.
It was a result of...
mechanization and certain economies of scale.
Incentives.
Yeah, allowing that like, oh, you can do cost-cutting in this way, you can make more efficiencies this way.
And what you get at the end result of that process is this incredibly efficient factory of torture and suffering.
I would want to avoid that kind of outcome with beings that are even more sophisticated and are more numerous.
There's billions of factory-farmed animals.
There might be trillions of digital people in the future.
What should we be thinking about in order to avoid this kind of ghoulish future?
I mean, the worry there is... maybe I should have defended this view more through this entire episode, but because I don't fully buy the intelligence explosion, I do think there's the possibility of multiple people deploying powerful ASIs at the same time, and having a world that has ASIs but is also decentralized the way the modern world is decentralized.
That's the world I really worry about, because you could just be like, oh, classical liberal utopia achieved.
But I worry about the fact that you can just have these torture chambers for much cheaper and in a way that's much harder to monitor.
You can have millions of beings that are being tortured and it doesn't even have to be some huge data center.
Future distilled models could, you know, literally be running in your backyard.
Yeah.
I don't know.
And then there's more speculative worries about I had this physicist on who was talking about the possibility of creating vacuum decay where you literally just destroy the universe.
And he's like, as far as I know, it seems totally plausible.
Can I ask a little bit more about the... Kelsey Piper is a journalist at Vox who published this exchange you had with the OpenAI representative.