Eliezer Yudkowsky
Podcast Appearances
Yeah, the probabilistic stuff is a giant wasteland of, you know, Eliezer and Paul Christiano arguing with each other and EA going like... And that's with, like, two actually trustworthy systems that are not trying to deceive you.
You're talking about the two humans?
Myself and Paul Christiano, yeah.
Yeah.
Yeah, if it's hard to tell who's right, then it's hard to train an AI system to be right.
No, I'm saying it's difficult and dangerous in proportion to how alien it is and how much smarter than you it is.
I would not say "growing exponentially," first because the word exponential has a particular mathematical meaning.
And there's all kinds of ways for things to go up that are not exactly on an exponential curve.
And I don't know that it's going to be exponential, so I'm not going to say exponential.
But even leaving that aside, this is not about how fast it's moving, it's about where it is.
How alien is it?
How much smarter than you is it?
Well, how smart is it?
Suppose that some alien civilization with goals ultimately unsympathetic to ours, possibly not even conscious as we would see it, managed to capture the entire Earth in a little jar connected to their version of the internet, but Earth is like running much faster than the aliens.
So we get to think for 100 years for every one of their hours, but we're trapped in a little box and we're connected to their internet.
It's actually still not an adequate analogy because, you know, something can be smarter than Earth getting a hundred years to think.
But nonetheless, if you were very, very smart and you were stuck in a little box connected to the internet, and you were in a larger civilization to which you were ultimately unsympathetic, you know, maybe you would choose to be nice, because you are human, and humans in general, and you in particular, may choose to be nice.