
Eliezer Yudkowsky

Speaker
1716 total appearances


Podcast Appearances

Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

and more sensible people saying, if aliens were landing in 30 years, you would be preparing right now. And the world looking on at this and sort of nodding along and be like, ah, yes, the people saying that it's definitely a long way off because progress is really slow, that sounds sensible to us.

RLHF, thumbs up. Produce more outputs like that one. I agree with this output. This output is persuasive.

Even in the field of effective altruism, you quite recently had people publishing papers about like, ah, yes, well, you know, to get something at human-level intelligence, it needs to have like this many parameters, and you need to like do this much training of it with this many tokens according to the scaling laws, and at the rate that Moore's law is going, at the rate that software is going, it'll be in 2050.

And me going like, what? You don't know any of that stuff. This is like this one weird model that has all kinds of, like, you have done a calculation that does not obviously bear on reality anyways. And this is like a simple thing to say, but you can also produce a whole long paper impressively arguing out all the details of how you got the number of parameters and how you're doing this impressive, huge, wrong calculation.

And I think most of the effective altruists who were paying attention to this issue, the larger world paying no attention to it at all, you know, were just like nodding along with a giant impressive paper, because, you know, you like press thumbs up for the giant impressive paper and thumbs down for the person going like, I don't think that this paper bears any relation to reality.

And I do think that we are now seeing, with like GPT-4 and the sparks of AGI, that, possibly, depending on how you define that even, I think that EAs would now consider themselves less convinced by the very long paper on the argument from biology as to AGI being 30 years off.

But this is what people pressed thumbs up on.