Eliezer Yudkowsky

👤 Speaker
1713 total appearances

[Chart: Appearances Over Time]

Podcast Appearances

Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

…being optimistic about what the utterly alien optimization process of natural selection will produce in the way of how it optimizes its objectives.

In the early days, you had biologists arguing that organisms will restrain their own reproduction when resources are scarce, so as not to overfeed the system. And this is not how natural selection works. Natural selection is about whose genes are relatively more prevalent in the next generation. If you restrain reproduction, those genes get less frequent in the next generation compared to your conspecifics, and natural selection doesn't do that. In fact, predators overrun prey populations all the time and have crashes. That's just a thing that happens.

Many years later, people said, well, but what about group selection? What about groups of organisms? And the answer there is that the math of group selection almost never works out in practice.

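One standard way to see why that math is so unforgiving is the multilevel Price equation, which splits the total change in a trait into a between-group term and a within-group term. This is a textbook identity added here for context, not something stated in the episode:

```latex
% Multilevel Price equation for the change in the mean trait \bar{z}:
% k indexes groups, i indexes individuals within a group;
% w is fitness, z is the trait value (e.g., reproductive restraint).
\bar{w}\,\Delta\bar{z}
  \;=\; \underbrace{\operatorname{Cov}_k\!\bigl(w_k,\, z_k\bigr)}_{\text{between-group selection}}
  \;+\; \underbrace{\operatorname{E}_k\!\bigl[\operatorname{Cov}_i\bigl(w_{ik},\, z_{ik}\bigr)\bigr]}_{\text{within-group selection}}
```

For a trait like reproductive restraint, the within-group covariance is negative, since restrained individuals leave fewer offspring than their unrestrained neighbors. Restraint can only spread if the between-group term outweighs that, which in practice demands unusually strong differences in group survival and very little migration between groups.
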

But also, years later, somebody actually ran the experiment: they took populations of insects and selected whole populations to have lower sizes. You just take pop one, pop two, pop three, pop four, look at which has the lowest total number of individuals in the next generation, and select that one. What do you suppose happens when you select populations of insects like that? What happens is not that the individuals in the population evolve to restrain their breeding, but that they evolve to kill the offspring of other organisms, especially the girls.

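The experiment being described is most likely Michael Wade's group-selection work on Tribolium flour beetles. As a toy illustration of the selection procedure itself, here is a minimal simulation sketch; all names, parameters, and the two-trait model below are invented for the example, not taken from the actual study:

```python
import random

# Toy sketch of the group-selection procedure described above:
# four populations breed, and only the population with the LOWEST
# total headcount seeds the next round.

N_POPS, POP_SIZE, GENERATIONS = 4, 60, 40

def new_individual():
    # Two heritable tendencies, both starting near zero:
    # 'restraint'   -> produce fewer of your own offspring
    # 'infanticide' -> kill other adults' offspring
    return {"restraint": random.uniform(0.0, 0.05),
            "infanticide": random.uniform(0.0, 0.05)}

def mutate(parent):
    # Children inherit each trait with a little Gaussian noise, clamped to [0, 1].
    return {trait: min(1.0, max(0.0, value + random.gauss(0, 0.02)))
            for trait, value in parent.items()}

def next_generation(pop):
    total_inf = sum(ind["infanticide"] for ind in pop)
    offspring = []
    for ind in pop:
        # Restraint lowers your OWN offspring count (individually costly)...
        n_children = max(0, round(3.0 * (1.0 - ind["restraint"])
                                  + random.gauss(0, 0.5)))
        # ...while a child is only at risk from OTHER adults' infanticide,
        # so killing neighbours' offspring has no individual cost here.
        risk = (total_inf - ind["infanticide"]) / max(1, len(pop) - 1)
        for _ in range(n_children):
            if random.random() >= risk:  # this child survives
                offspring.append(mutate(ind))
    return offspring

pops = [[new_individual() for _ in range(POP_SIZE)] for _ in range(N_POPS)]
for _ in range(GENERATIONS):
    pops = [next_generation(p) for p in pops]
    alive = [p for p in pops if p]
    if not alive:
        break
    # Group-selection step: the smallest surviving population "wins"
    # and reseeds every slot for the next round.
    winner = min(alive, key=len)
    pops = [random.sample(winner, min(POP_SIZE, len(winner)))
            for _ in range(N_POPS)]

adults = [ind for pop in pops for ind in pop]
for trait in ("restraint", "infanticide"):
    mean = sum(ind[trait] for ind in adults) / max(1, len(adults))
    print(f"mean {trait}: {mean:.3f}")
```

Because a parent's own infanticide never touches its own offspring in this toy model, killing neighbors' young shrinks the group (which the group-level step rewards) without shrinking the killer's relative share, while restraint shrinks both. Runs of the sketch therefore tend to push infanticide up much faster than restraint, which is the asymmetry the quote is pointing at.
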

So people imagined this lovely, beautiful, harmonious output of natural selection: these populations restraining their own breeding so that groups of them would stay in harmony with the resources available. And mostly the math never works out for that. But if you actually apply the weird, strange conditions needed to get group selection that beats individual selection, what you get is female infanticide.