
Eliezer Yudkowsky

👤 Speaker
1713 total appearances

Appearances Over Time

Podcast Appearances

Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

There's your red fire alarm of like, oh, no, alignment is difficult.

Is everybody going to shut everything down now?

For you.

So you put a line there and everybody else puts a line somewhere else and there's like, yeah, and there's like no agreement.

We have had a pandemic on this planet with a few million people dead, which we may never know whether or not it was a lab leak, because there was definitely a cover-up.

We don't know if there was a lab leak, but we know that...

The people who did the research put out the whole paper about how this definitely wasn't a lab leak and didn't reveal that they had sent off coronavirus research to the Wuhan Institute of Virology after it was banned in the United States.

After the gain-of-function research was temporarily banned in the United States.

And

The same people who exported gain-of-function research on coronaviruses to the Wuhan Institute of Virology after that gain-of-function research was temporarily banned in the United States are now getting more grants to do more gain-of-function research on coronaviruses.

Maybe we do better in this than in AI, but we cannot take for granted that there's going to be an outcry.

People have different thresholds for when they start to outcry.

Nothing like the world in front of us right now.

You've already seen that GPT-4 is not turning out this way.

And there are basic obstacles where you've got the weak version of the system that doesn't know enough to deceive you, and the strong version of the system that could deceive you if it wanted to do that, if it was already sufficiently unaligned to want to deceive you.

There's the question of how, on the current paradigm, you train honesty when the humans can no longer tell if the system is being honest.

I think they could be answered in 50 years with unlimited retries, the way things usually work in science.

Are Earth's billionaires going to put up the giant prizes that would maybe incentivize young hotshot people who just got their physics degrees to not go to the hedge funds and instead put everything into interpretability, this one small area where we can actually tell whether or not somebody has made a discovery?

I think so.

When?