Tristan Harris
And I heard you on a recent podcast be a realist about the nature of it: we need maximum deterrence, and you have to match the capabilities of your adversary in autonomous weapons.
And you can walk and chew gum at the same time.
I don't want to live in a world with autonomous weapons.
I would much prefer to go back in time.
But we can acknowledge the need for maximum deterrence while also acknowledging that mutually assured loss of control is a failure scenario, and that we don't want to use them.
And make sure that we carve out no AI in nuclear command and control systems.
I think you can carve out some kind of agreement that humans need to be in control of AI.
Right.
And where we are building AIs that are demonstrating these behaviors and have the power not just to copy their own code but even to protect their peers, which we didn't talk about yet, we should be able to agree on human control of AI.
And I know that that sounds very difficult.
All of this is difficult.
It is the hardest coordination problem that we have ever faced.
And we still have to try.
That's right.
We just have to recognize that.
In a way, it's like an asteroid.
It's an actual asteroid coming toward Earth that's going to wipe us out.
Except, ironically, we're the ones conjuring and creating the asteroid.
And just to say, if literally every person on planet Earth was like, you know what, I really don't want this asteroid to exist.