Tristan Harris
When there's the risk of existential destruction, when this film called The Day After came out, it showed people what would actually happen in a nuclear war.
And once that was crystal clear to people,
including in the Soviet Union, where the film was aired in 1987, it helped set the conditions for Reagan and Gorbachev to reach the first arms control agreements, once we had clarity about an outcome that we wanted to avoid.
And I think the current problem is that we're not having an honest conversation in public about the world we're heading toward, a world that is not in anyone's interest.
The closer the technology that needs to be governed is to the center of GDP, to the lifeblood of your economy, the harder it is to come to international negotiation and agreement.
And oil and fossil fuels were the pumping heart of the economic superorganisms that are currently competing for power.
And so coming to agreements on that is really, really hard.
AI is even harder because AI pumps not just economic growth, but scientific, technological, and military advantages.
And so it will be the hardest coordination challenge that we will ever face.
But if we don't face it, if we don't make some kind of choice, it will end in tragedy.
We're not in a race just to have technological advantage.
We're in a race for who can better govern that technology's impact on society.
So, for example, the United States beat China to social media, to that technology.
Did that make us stronger?
Did that make us weaker?
We have the most anxious and depressed generation of our lifetime.
We have the least informed and most polarized generation.
We have the worst critical thinking.