Sam Harris
I need to be regulated.
Yeah.
And one thing to point out, I think it was more or less explicit at one point in this conversation but might have gone by unnoticed, is that the alignment problem is arguably the scariest problem.
It's where we ruin everything, but it is fully divorceable from all these other problems, which in their totality are still quite bad, right?
So imagine we were living in a world where we were simply handed by God a perfectly aligned superintelligent AI.
It's going to do exactly what we want.
It's never going to go rogue.
The world is not going to be tiled with, you know, solar arrays and servers.
It would still have all of these unintended effects that we have to figure out how to mitigate.
Yes.
Wealth concentration, mass unemployment.
That's right.
The political instability of all of that.
And even in the case of alignment, you still have technology that can be maliciously used, the bad actor problem.
I mean, if you can cure cancer, you can also spread some heinous virus that you've synthesized.
So we have an immense problem to solve even if there were no concern about anything going rogue on us.
Well, Y2K is kind of an unhappy precedent, because it was a very clear landmark on the calendar.