Eliezer Yudkowsky
Podcast Appearances
If humanity were trying to survive at this point in the straightforward way, it would look like shutting down the big GPU clusters, no more giant runs.
It's questionable whether we should even be throwing GPT-4 around, although that is a matter of conservatism rather than a matter of my predicting that catastrophe will follow from GPT-4.
That is something I put a pretty low probability on.
But also, when I say I put a low probability on it, I can feel myself reaching into the part of myself that thought that GPT-4 was not possible in the first place.
So I do not trust that part as much as I used to.
The trick is not just to say I'm wrong, but to ask: okay, I was wrong about that.
Can I get out ahead of that curve and predict the next thing I'm going to be wrong about?
You don't want to keep on being wrong in a predictable direction.
Being wrong is something anybody walking through the world has to do.
There's no way to say 90% and not sometimes be wrong.
In fact, it'd happen at least one time out of 10 if you're well calibrated when you say 90%.
The undignified thing is not being wrong.
It's being predictably wrong.
It's being wrong in the same direction over and over again.
So having been wrong about how far neural networks would go and having been wrong specifically about whether GPT-4 would be as impressive as it is, when I say like, well, I don't actually think GPT-4 causes a catastrophe, I do feel myself relying on that part of me that was previously wrong.
And that does not mean that the answer is now in the opposite direction.
Reversed stupidity is not intelligence.
But it does mean that I say it with a worried note in my voice.
It's still my guess, but, you know, it's a place where I was wrong.
Maybe you should be asking Gwern, Gwern Branwen.