Sam Harris
So what are you expecting in the near term?
Let's leave concerns about alignment aside, unless you think we're going to plunge into superintelligence in the next 12 months.
What will you be unsurprised to see in the next year or two?
And what are you most worried about?
No, I didn't see that.
Well, it all falls out of what we mean by the concept of general intelligence, right?
So once you admit that we're building something that by definition is more intelligent than we are, right?
And any increment of progress, provided we just keep making that progress, is eventually going to deliver that result.
Leaving aside the alignment problem, let's say it's just perfectly aligned, right?
We build it perfectly the first time.
It does exactly what we want or what we think we want.
It should be obvious that this is unlike any other technology because intelligence is the basis of everything else we do.
I mean, it's science.
It's the generation of each new technology.
It will build the future machine that will build the future machine.
Right.
And then the only thing that's left standing is...
what we care about still having a human provenance, right?
I'm not even sure nurses, in the end, survive contact with this principle, but there are those things where we are always going to want the human in the loop, right?
Or the human to be the origin of the product, whether it's, you know, music or novels or, you know,