
Eliezer Yudkowsky

Speaker
1713 total appearances

[Chart: Appearances Over Time]

Podcast Appearances

Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

I have been struggling for years to convey this intuition. The most success I've had so far is, well, imagine that the humans are running at very high speeds compared to very slow aliens. Because people understand the power gap of time. They understand that today we have technology that was not around 1,000 years ago, and that this is a big power gap, and that it is bigger than... Okay, so what does smart mean?

When you ask somebody to imagine something that's more intelligent, what does that word mean to them, given the cultural associations that that person brings to that word? For a lot of people, they will think of, like, well, it sounds like a super chess player that went to double college. And because we're talking about the definitions of words here, that doesn't necessarily mean that they're wrong. It means that the word is not communicating what I want it to communicate. The thing I want to communicate is the sort of difference that separates humans from chimpanzees. But that gap is so large that you ask people to be like, well, human, chimpanzee, go another step along that interval of around the same length, and people's minds just go blank. Like, how do you even do that?

And I can try to break it down and consider what it would mean to send a schematic for an air conditioner 1,000 years back in time. Now, I think that there is a sense in which you could redefine the word magic to refer to this sort of thing. And what do I mean by this new technical definition of the word magic? I mean that if you send a schematic for the air conditioner back in time, they can see exactly what you're telling them to do. But having built this thing, they do not understand how it outputs cold air, because the air conditioner design uses the relation between temperature and pressure, and this is not a law of reality that they know about. They do not know that when you compress something, when you compress air or, like, coolant, it gets hotter, and then you can transfer heat from it to room-temperature air, and then expand it again, and now it's colder, and then you can transfer heat to that and generate cold air to blow out.
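The compress-heats, expand-cools cycle in that last quote is standard thermodynamics: the adiabatic relation T2 = T1 · (P2/P1)^((γ−1)/γ). As a minimal sketch of the principle the quote relies on, and not anything from the episode itself, here is a toy Python calculation assuming an ideal diatomic gas; the function name, pressure ratio, and temperatures are illustrative assumptions:

```python
# Toy illustration of the principle in the quote: compressing a gas heats it,
# expanding it cools it (ideal-gas adiabatic relation). All numbers here are
# illustrative assumptions, not from the episode.

GAMMA = 1.4  # heat-capacity ratio for a diatomic gas like air (assumption)

def adiabatic_temperature(t_kelvin: float, pressure_ratio: float) -> float:
    """Temperature after an adiabatic pressure change: T2 = T1 * r**((g-1)/g)."""
    return t_kelvin * pressure_ratio ** ((GAMMA - 1) / GAMMA)

room = 293.0                                 # ~20 °C room air
hot = adiabatic_temperature(room, 4.0)       # compress 4x -> the gas gets hotter
print(f"after compression: {hot - 273:.0f} °C")   # ~162 °C, hotter than the room

# Dump heat into room-temperature air until the compressed gas cools back to
# ~room temperature, then expand it 4x: it comes out colder than the room.
cold = adiabatic_temperature(room, 1 / 4.0)
print(f"after expansion:  {cold - 273:.0f} °C")   # ~-76 °C, colder than the room
```

Run as-is, this prints roughly 162 °C after compression and −76 °C after expansion: the "gets hotter, transfer heat away, expand, now it's colder" sequence described above, which is exactly the step a schematic alone would not explain to someone who has never seen the temperature-pressure relation.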