Tristan Harris
Uncontroversial things like restricting AI companions for kids so that kids are not manipulated into taking their own lives.
Having basic things like product liability.
So if you are liable as an AI developer for certain harms, that's going to create a more responsible innovation environment.
You release AI models that are safer.
And on the side of preventing dystopia, working hard to prevent ubiquitous technological surveillance, and having stronger whistleblower protections so that people don't need to sacrifice millions of dollars in order to warn the world about what we need to know.
And so we have a choice.
Many of you may be feeling that this looks hopeless, or that maybe Tristan's wrong, maybe the incentives are different, or maybe superintelligence will magically figure all this out and bring us to a better world.
But don't fall into the trap of the same wishful thinking and turning away that kept us from dealing with social media.
Your role in this is not to solve the whole problem; your role is to be part of the collective immune system, so that when you hear this wishful thinking or the logic of inevitability and fatalism, you say that this is not inevitable.
And the best of human nature shows when we step up and make a choice about the future that we actually want for the people and the world that we love.
There is no definition of wisdom in any tradition that does not involve restraint.
Restraint is a central feature of what it means to be wise.
And AI is humanity's ultimate test and greatest invitation to step into our technological maturity.
There is no room of adults working secretly to make sure that this turns out OK.
We are the adults.
We have to be.
And I believe another choice is possible with AI if we can commonly recognize what we have to do.
And eight years from now, I'd like to come back to this stage, not to talk about more problems with technology, but to celebrate how we stepped up and solved this one.
Thank you.