Tristan Harris
I think it's almost mythological because there's almost a way in which they're building a new intelligent entity that has never before existed on planet Earth.
It's like building a god.
I mean, the incentive is build a god, own the world economy, and make trillions of dollars.
If you could actually build something that can automate all intelligent tasks, all goal achieving, that will let you out-compete everything.
So that is a kind of godlike power. And I think, relative to that: imagine energy prices go up, or hundreds of millions of people lose their jobs.
Those things suck.
But relative to the alternative, the thinking goes: if I don't build it first and build this god, I'm going to lose to someone who, in their opinion, not my opinion, is a worse person.
It's a kind of competitive logic that reinforces itself.
But it forces everyone to be incentivized to take the most shortcuts, to care the least about safety or security, to not care about how many jobs get disrupted, to not care about the well-being of regular people, but to basically just race to this infinite prize.
So there's a quote. A friend of mine interviewed a lot of the top people at the AI companies, like the very top.
And he just came back from that and basically reported back to me and some friends.
And he said the following quote.
In the end, a lot of the tech people I talk to, when I really grill them on it about why you're doing this, they retreat into number one, determinism, number two, the inevitable replacement of biological life with digital life, and number three, that being a good thing anyways.
At its core, it's an emotional desire to meet and speak to the most intelligent entity that they've ever met.
And they have some ego-religious intuition that they'll somehow be a part of it.
It's thrilling to start an exciting fire.
They feel they'll die either way, so they prefer to light it and see what happens.
Doesn't that match what you've seen?
That's the perfect description.
Doesn't it?