Eliezer Yudkowsky
Are you referring to Chalmers' hard problem of conscious experience?
Are you referring to self-awareness and reflection?
Are you referring to the state of being awake as opposed to asleep?
So I think that for there to be a person who I care about looking out at the universe and wondering at it and appreciating it, it's not enough to have a model of yourself.
I think that it is useful to an intelligent mind to have a model of itself, but I think you can have that without pleasure, pain, aesthetics, emotion, a sense of wonder.
I think you can have a model of how much memory you're using and whether this thought or that thought is more likely to lead to a winning position.
I think that if you optimize really hard on efficiently just having the useful parts, there is not then the thing that says like, I am here, I look out, I wonder, I feel happy in this, I feel sad about that.
I think there's a thing that knows what it is thinking, but that doesn't quite care that these are my thoughts, that this is my me, and that that matters.
I think that if that's lost, then basically everything that matters is lost.
I think that when you optimize, when you go really hard on making tiny molecular spirals or paperclips, when you grind much harder on that than natural selection ground to make humans, there isn't then the mess and intricate loopiness, the complicated pleasure, pain, conflicting preferences, this type of feeling, that kind of feeling.
In humans, there's this difference between the desire of wanting something and the pleasure of having it.
And it's all these evolutionary kludges that came together and created something that then looks out of itself and says, this is pretty, this matters.
And the thing that I worry about is that this is not the thing that happens again, not in just the way it happened in us, nor even anything quite similar. There are many basins of attraction here. We are in this basin of attraction, looking out and saying, ah, what a lovely basin we are in. And there are other basins of attraction, and the AIs do not end up in this one when they go way harder on optimizing themselves than natural selection optimized us.