Tristan Harris
Everyone's going to go extinct.
If anyone builds it, everyone dies.
And those conversations don't converge, right?
And so everyone's just kind of confused: how can it be infinite promise and, at the same time, infinite peril?
And what I wanted to do today is to really clarify for people what the incentives point us towards, which is a future that I think people, when they see it clearly, would not want.
So what are the incentives pointing us towards in terms of the future?
Yeah.
So first, if you believe that this is, metaphorically, like the ring from Lord of the Rings.
It's the ring that creates infinite power.
Because if I have AGI, I can apply that to military advantage.
I can have the best military planner, one that can beat anyone's battle plans.
And we already have AIs that can obviously beat Garry Kasparov at chess, beat humans at Go, the Asian board game, and now beat them at StarCraft.
So you have AIs that are beating humans at strategy games.
Well, think about StarCraft compared to an actual military campaign in Taiwan or something like that.
If I have an AI that can out-compete humans in strategy games, that lets me out-compete them in everything.
Or take business strategy.
If I have an AI that can do business strategy and figure out supply chains and figure out how to optimize them and figure out how to undermine my competitors...
and I have a step function level increase in that compared to everybody else, then that gives me infinite power to undermine and outcompete all businesses.
If I have a super-programmer, then I can outcompete everyone at programming.