Tristan Harris
And so it's like, oh, this is spreading throughout the whole company.
And that's what led to me becoming a design ethicist where I had to research and ask the questions, what does it mean to ethically design and persuade people's psychological vulnerabilities when you can't not make choices about the psychological habitat?
You have to make a choice about whether you're going to do infinite scroll or not or autoplay or not or notifications or not or these 10 people followed you or not.
What does it mean to ethically make those choices?
Yeah. And I think people are afraid to say it, but when you make a bridge, there's a physics to whether that bridge will hold up or whether it will fall apart, right?
And it's not magic.
We don't say, oh, like, who would have known that that bridge would fall apart?
We have a science of bridges and mechanical engineering and civil engineering.
And with technology and human psychology, there is a science to the dopamine system.
There is a science to confirmation bias in our psychology and how we tend to perceive information through our tribal in-group, like we see things through the political tribe that we're a part of.
And if you understand that science, you can understand whether or not technology is manipulating that.
So one of the core things I think we were trying to do in that first chapter of work, and this, again, started in 2013...
is break through this idea that technology is neutral and that we could never know what's good for people or that something could be bad for people.
Like, I saw people deliberately make short-form autoplaying videos that then created the brain-rot economy that we're now living in.
Not technology use, but technology designed for certain outcomes of usage. That's a really critical thing, because we want to put attention on the design, not just on how people are using it.
Yeah, understood.
Yeah.
Well, what happened was my team at Center for Humane Technology, our nonprofit, we got calls from people inside of the AI labs.