Eliezer Yudkowsky
But some of them would be more cooperative than us.
Some of them would be smarter than us.
Hopefully some of the ones who are smarter and more cooperative than us are also nice, and hopefully there are some galaxies out there full of things that say, I am, I wonder.
But it doesn't seem like we're on course to have this galaxy be that.
No, if the nice aliens were already here, they would have stopped the Holocaust.
That's a valid argument against the existence of God.
It's also a valid argument against the existence of nice aliens.
And un-nice aliens would have just eaten the planet.
So, no aliens.
The thing I would say is that among the things that humans can do is design new AI systems.
And if you have something that is generally smarter than a human, it's probably also generally smarter at building AI systems.
This is the ancient argument for FOOM put forth by I. J. Good, and probably some science fiction writers before that, but I don't know who they would be.
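A minimal way to formalize that feedback loop, as a sketch under assumed notation (I(t) for capability at AI design, k for a constant return on capability; neither symbol is from the conversation): if the rate of improvement scales with current capability, growth is exponential.

```latex
% Toy model of the recursive self-improvement loop (I. J. Good, 1965).
% I(t): capability at designing AI systems.
% k: assumed constant return on capability (illustrative, not from the talk).
\frac{dI}{dt} = k \, I(t)
\qquad\Longrightarrow\qquad
I(t) = I(0)\, e^{k t}
```

On this toy reading, the dispute that follows is over whether k stays roughly constant or shrinks as intelligence rises.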
Well, what's the argument against FOOM?
Various people have various different arguments, none of which I think hold up.
There's only one way to be right and many ways to be wrong.
An argument that some people have put forth is like, well, what if intelligence gets exponentially harder to produce as a thing needs to become smarter?
And to this, the answer is, well, look at natural selection spitting out humans.
We know that it does not take exponentially more resource investment to produce linear increases in competence in hominids, because of how each mutation rises to fixation: if the impact a mutation has is small enough, it will probably never reach fixation.
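The population-genetics step this argument leans on is a standard result, Haldane's approximation, stated here explicitly as a sketch (s is the mutation's selection coefficient; the approximation holds for a new beneficial mutation in a large population):

```latex
% Haldane's approximation: a new beneficial mutation with selection
% coefficient s fixes with probability roughly
P_{\text{fix}} \approx 2s
% As s -> 0, the fixation probability vanishes. So each mutation that
% actually fixed in hominids carried a non-negligible fitness advantage,
% while selection's per-generation "investment" stayed roughly constant,
% giving linear gains in competence without exponentially growing cost.
```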