Eliezer Yudkowsky:
Humans can decide to build a dam out of, I don't know, hexagonal tiles. And we will do this even though at no point during our ancestry was any human optimized to build hexagonal dams. Or, to take a more clear-cut case: we can go to the moon.
There's a sense in which we were, on a sufficiently deep level, optimized to do things like going to the moon, because if you generalize sufficiently far and sufficiently deeply, chipping flint hand axes and outwitting your fellow humans is, you know, basically the same problem as going to the moon.
And if you optimize hard enough for chipping flint hand axes and throwing spears and, above all, outwitting your fellow humans in tribal politics, then the skills you entrain that way, if they run deep enough, let you go to the moon, even though none of your ancestors tried repeatedly to fly to the moon and got further each time, and the ones who got further each time had more kids.
No, it's not an ancestral problem.
It's just that the ancestral problems generalize far enough.
So this is humanity's significantly more generally applicable intelligence.
If you boil a frog gradually enough, if you zoom in far enough, it's always hard to tell around the edges.
With GPT-4, people are saying right now: this looks to us like a spark of general intelligence. It is able to do all these things it was not explicitly optimized for.
Other people are being like, no, it's too early.
It's like 50 years off.
And if they say that, they're kind of whack, because how could they possibly know that even if it were true?
But, you know, not to strawman: some people may say, like, "that's not general intelligence," and not, furthermore, append, "it's 50 years off."