
Eliezer Yudkowsky

👤 Speaker
See mentions of this person in podcasts
1713 total appearances

Appearances Over Time

Podcast Appearances

Lex Fridman Podcast
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

Are you referring to Chalmers' hard problem of conscious experience?

Are you referring to self-awareness and reflection?

Are you referring to the state of being awake as opposed to asleep?

So I think that for there to be a person who I care about looking out at the universe and wondering at it and appreciating it, it's not enough to have a model of yourself.

I think that it is useful to an intelligent mind to have a model of itself, but I think you can have that without pleasure, pain, aesthetics, emotion, a sense of wonder.

I think you can have a model of how much memory you're using and whether this thought or that thought is more likely to lead to a winning position.

I think that if you optimize really hard on efficiently just having the useful parts, there is not then the thing that says like, I am here, I look out, I wonder, I feel happy in this, I feel sad about that.

I think there's a thing that knows what it is thinking, but that doesn't quite care about, these are my thoughts, this is my me, and that matters.

I think that if that's lost, then basically everything that matters is lost.

I think that when you optimize, when you go really hard on making tiny molecular spirals or paperclips, when you grind much harder on that than natural selection ground out to make humans, that there isn't then the mess and intricate loopiness and the complicated pleasure, pain, conflicting preferences, this type of feeling, that kind of feeling.

In humans, there's this difference between the desire of wanting something and the pleasure of having it.

And it's all these evolutionary kludges that came together and created something that then looks out of itself and says, this is pretty, this matters.

And the thing that I worry about is that this is not the thing that happens again, just the way that it happens in us, or even anything quite similar; there are many basins of attraction here.

And we are in this basin of attraction, looking out and saying, ah, what a lovely basin we are in.

And there are other basins of attraction.

And we do not end up in, and the AIs do not end up in, this one when they go way harder on optimizing themselves than natural selection optimized us.