Eliezer Yudkowsky
We're talking about taking over the world here.
Shutting down the factory farms.
You know, you say control.
Don't think of it as world domination.
Think of it as world optimization.
You want to get out there and shut down the factory farms and make the aliens' world not be what the aliens wanted it to be.
They want the factory farms and you don't want the factory farms because you're nicer than they are.
But those all went through slowly in our world.
And if you go through that with the aliens, millions of years are going to pass before anything happens that way.
Yeah, you want to leave the factory farms running for a million years while you figure out how to design new forms of social media or something?
What I'm trying to convey is the notion of what it means to be in conflict with something that is smarter than you.
And what it means is that you lose.
But this is more intuitively obvious to some people than others. For some people, that's intuitively obvious.
For some people, it's not intuitively obvious.
And we're trying to cross the gap of...
I'm asking you to cross that gap by using the speed metaphor for intelligence.
Sure.
I'm asking you how you would take over an alien world where you can do a whole lot of cognition at John von Neumann's level, as many of you as it takes, and the aliens are moving very slowly.
I do not think it is possible to understand the full depth of the problem that we are inside without understanding the problem of facing something that is actually smarter than you. Not a malfunctioning recommendation system, not something that isn't fundamentally smarter than you but is trying to steer you in some direction. If we solve the weak-ass problems, the strong problems will still kill us. That is the thing. And I think that to understand the situation that we're in, you want to tackle the conceptually difficult part.