
Eliezer Yudkowsky

👤 Speaker
1,716 total appearances

Podcast Appearances

Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

I think we are not on track to learn remotely near fast enough, even if it were open sourced.

Yeah, it's easier to think that you might be wrong about something when being wrong about something is the only way that there's hope.

And it doesn't seem very likely to me that the particular thing I'm wrong about is that this is a great time to open source GPT-4.

If humanity was trying to survive at this point in the straightforward way, it would be like shutting down the big GPU clusters, no more giant runs.

It's questionable whether we should even be throwing GPT-4 around, although that is a matter of conservatism rather than a matter of my predicting that catastrophe will follow from GPT-4.

That is something on which I put, like, a pretty low probability.

But also, when I say I put a low probability on it, I can feel myself reaching into the part of myself that thought that GPT-4 was not possible in the first place.

So I do not trust that part as much as I used to.

The trick is not just to say I'm wrong, but like, okay, well, I was wrong about that.

Can I get out ahead of that curve and predict the next thing I'm going to be wrong about?

You don't want to keep on being wrong in a predictable direction.

Being wrong is something anybody has to do, walking through the world.

There's no way you don't say 90% and sometimes be wrong.

In fact, it'd happen at least one time out of 10 if you're well calibrated when you say 90%.

The undignified thing is not being wrong.

It's being predictably wrong.

It's being wrong in the same direction over and over again.

So having been wrong about how far neural networks would go and having been wrong specifically about whether GPT-4 would be as impressive as it is, when I say like, well, I don't actually think GPT-4 causes a catastrophe, I do feel myself relying on that part of me that was previously wrong.

And that does not mean that the answer is now in the opposite direction.

Reverse stupidity is not intelligence.