
The Diary Of A CEO with Steven Bartlett

AI Expert: We Have 2 Years Before Everything Changes! We Need To Start Protesting! - Tristan Harris

27 Nov 2025

Transcription

Chapter 1: What are the potential consequences of unchecked AI development?

0.031 - 17.587 Tristan Harris

If you're worried about immigration taking jobs, you should be way more worried about AI. Because it's like a flood of millions of new digital immigrants that are Nobel Prize level capability, work at superhuman speed, and will work for less than minimum wage. I mean, we're heading for so much transformative change faster than our society is currently prepared to deal with it.


17.767 - 33.786 Tristan Harris

And there's a different conversation happening publicly than the one that the AI companies are having privately about which world we're heading to, which is a future that people don't want. But we didn't consent to have six people make that decision on behalf of 8 billion people. Tristan Harris is one of the world's most influential technology ethicists.


33.886 - 39.719 Steven Bartlett

Who created the Center for Humane Technology after correctly predicting the dangers social media would pose to our society.


39.739 - 77.882 Tristan Harris

And now he's warning us about the catastrophic consequences AI will have on all of us. Let me, like, collect myself for a second. We can't let it happen. But as we're racing, we're landing in a world of unvetted therapists, rising energy prices, and major security risks.


77.942 - 93.055 Tristan Harris

I mean, we have evidence where if an AI model reading a company's email finds out it's about to get replaced with another AI model, and then it also reads in the company email that one executive is having an affair with an employee, the AI will independently blackmail that executive in order to keep itself alive. That's crazy. But what do you think?

93.696 - 100.682 Unknown

I'm finding it really hard to be hopeful, I'm going to be honest, Tristan. So I really want to get practical and specific about what we can do about this. Listen, I'm not naive.

Chapter 2: How does the public conversation about AI differ from private discussions?

100.702 - 101.723 Unknown

This is super fucking hard.


101.703 - 127.221 Unknown

But we have done hard things before, and it's possible to choose a different future. So just give me 30 seconds of your time. Two things I wanted to say. The first thing is a huge thank you for listening and tuning into the show week after week. It means the world to all of us, and this really is a dream that we absolutely never had and couldn't have imagined getting to this place. But secondly, it's a dream where we feel like we're only just getting started.


127.201 - 142.065 Unknown

And if you enjoy what we do here, please join the 24% of people that listen to this podcast regularly and follow us on this app. Here's a promise I'm going to make to you. I'm going to do everything in my power to make this show as good as I can now and into the future.


142.125 - 165.404 Unknown

We're going to deliver the guests that you want me to speak to and we're going to continue to keep doing all of the things you love about this show. Thank you. Tristan. I think my first question and maybe the most important question is we're going to talk about artificial intelligence and technology broadly today. But who are you in relation to this subject matter?


166.445 - 183.568 Tristan Harris

So I did a program at Stanford called the Mayfield Fellows Program that took engineering students and then taught them entrepreneurship. You know, I, as a computer scientist, didn't know anything about entrepreneurship, but they pair you up with venture capitalists. They give you mentorship. And, you know, there's a lot of powerful alumni who were part of that program.

183.648 - 204.397 Tristan Harris

The co-founder of Asana and the co-founders of Instagram were both part of that program. And that put us in kind of a cohort of people who were basically ending up at the center of what was going to colonize the whole world's psychological environment, which was the social media situation. And as part of that, I started my own tech company called Apture.

205.198 - 223.155 Tristan Harris

And we basically made this tiny widget that would help people find more contextual information without leaving the website they were on. It was a really cool product that was about deepening people's understanding. And I got into the tech industry because I thought that technology could be a force for good in the world. That's why I started my company.

223.135 - 239.353 Tristan Harris

And then I kind of realized through that experience that at the end of the day, these news publishers who used our product, they only cared about one thing, which is, is this increasing the amount of time and eyeballs and attention on our website? Because eyeballs meant more revenue.

240.375 - 259.432 Tristan Harris

And I was in sort of this conflict of, I think I'm doing this to help the world, but really I'm measured by this metric of what keeps people's attention. That's the only thing that I'm measured by. And I saw that conflict play out among my friends who started Instagram because they got into it because they wanted people to share little bite-sized moments of your life.

Chapter 3: What incentives are driving the race towards artificial general intelligence (AGI)?

263.058 - 284.058 Tristan Harris

That's what Kevin Systrom used to post when he was just starting it. I was probably one of the first 100 users of the app. And later you see how these sort of simple products that had a simple, good, positive intention got sort of sucked into these perverse incentives. And so Google acquired my company called Apture. I landed there and I joined the Gmail team.


284.498 - 301.008 Tristan Harris

And I'm with these engineers who are designing the email interface that people spend hours a day in. And then one day one of the engineers comes over and he says, well, why don't we make it buzz your phone every time you get an email? And he just asked the question nonchalantly like it wasn't a big deal.


301.048 - 321.336 Tristan Harris

And in my experience, I was like, oh my God, you're about to change billions of people's psychological experiences with their families, with their friends at dinner, with their date night on romantic relationships, where suddenly people's phones are going to be busy showing notifications of their email. And you're just asking this question as if it's like a throwaway question.


322.198 - 343.675 Tristan Harris

And I became concerned – I see you have a slide deck there. I do, yeah. About basically how Google and Apple and social media companies were hosting this psychological environment that was going to corrupt and frack the global human attention of humanity. And I basically said I needed to make a slide deck.


343.956 - 358.408 Tristan Harris

It's a 130-something-page slide deck that was basically a message to the whole company at Google saying we have to be very careful, and we have a moral responsibility in how we shape the global attentions of humanity.

358.507 - 367.977 Unknown

The slide deck I've printed off, which my research team found, is called, A Call to Minimize Distraction and Respect Users' Attention by a Concerned PM and Entrepreneur.

Chapter 4: What are the ethical implications of AI in job displacement?

367.997 - 369.639 Unknown

PM meaning project manager. Project manager, yeah.


369.659 - 391.305 Tristan Harris

How was that received at Google? I was very nervous, actually, because I felt like I wasn't coming from some place where I wanted to stick it to them or be controversial. I just felt like there was this conversation that wasn't happening. And I sent it to about 50 people that were friends of mine just for feedback.


391.826 - 407.55 Tristan Harris

And when I came to work the next day, there were 150. You know, in the top right on Google Slides, it shows you the number of simultaneous viewers. And it had 130-something simultaneous viewers. And then later that day, it was like 500 simultaneous viewers. And so obviously, it had been spreading virally throughout the whole company.


408.331 - 424.495 Tristan Harris

And people from all around the company emailed me saying, this is a massive problem. I totally agree. We have to do something. And so instead of getting fired, I was invited and basically stayed to become a design ethicist, studying how do you design in an ethical way?


424.555 - 433.148 Tristan Harris

And how do you design for the collective attention spans and information flows of humanity in a way that does not cause all these problems?

433.229 - 458.511 Tristan Harris

Because what was sort of obvious to me then, and that was in 2013, is that if the incentive is to maximize eyeballs and attention and engagement, then you're incentivizing a more addicted, distracted, lonely, polarized, sexualized, breakdown-of-shared-reality society. Because all of those outcomes are success cases of maximizing for engagement for an individual human on a screen.

458.491 - 471.228 Tristan Harris

And so it was like watching this slow motion train wreck in 2013. You could kind of see that there's this kind of myth that we can never predict the future. Like technology could go any direction. And that's like, you know, the possible of a new technology.

471.708 - 480.82 Tristan Harris

But I wanted people to see the probable, that if you know the incentives, you can actually know something about the future that you're heading towards. And that presentation kind of kicked that off.

481.711 - 497.02 Unknown

A lot of people will know you from the documentary on Netflix, The Social Dilemma, which was a big moment and a big conversation in society across the world. But then since then, a new alien has entered the picture. There's a new protagonist in the story, which is the rise of artificial intelligence. When did you...

Chapter 5: What political events could influence AI regulation?

4301.082 - 4320.81 Unknown

What series of events would have had to happen, do you think? Because I think the AI companies very much have support from Trump. I watched the dinners where they sit there with the 20, 30 leaders of these companies. And, you know, Trump is talking about how quickly they're developing, how fast they're developing. He's referencing China. He's saying he wants the US to win.


4320.79 - 4326.426 Unknown

So, I mean, in the next couple of years, I don't think there's going to be much progress in the United States necessarily.


4326.827 - 4331.6 Tristan Harris

Unless there's a massive political backlash because people recognize that this issue will dominate every other issue.


Chapter 6: How can clarity create courage in the fight against AI dangers?

4331.921 - 4335.331 Tristan Harris

How does that happen? Hopefully conversations like this one.


4335.772 - 4336.133 Unknown

Yeah.


4337.699 - 4355.717 Tristan Harris

What I mean is, Neil Postman was a wonderful media thinker in the lineage of Marshall McLuhan. He used to say clarity is courage. If people have clarity and feel confident that the current path is leading to a world that people don't want, that's not in most people's interests, that clarity creates the courage to say, yeah, I don't want that.


Chapter 7: What are the dangers of AI companions?

4356.057 - 4372.752 Tristan Harris

So I'm going to devote my life to changing the path that we're currently on. That's what I'm doing. And that's what I've seen in people who take this on: if you walk people through this and you have them see the outcome, almost everybody right afterwards says, what can I do to help? Obviously, this is something that we have to change.


4372.732 - 4386.715 Tristan Harris

And so that's what I want people to do: advocate for this other path. And we haven't talked about AI companions yet, but I think it's important that we do that. I think it's important to integrate that before you get to the other path.


4386.735 - 4386.915 Unknown

Go ahead.


4387.276 - 4394.267 Tristan Harris

I'm sorry, by the way, no apologies, but there's just, there's so much information to cover and I...


4396.559 - 4403.89 Unknown

Do you know what's interesting is a side point is how personal this feels to you, but how passionate you are about it.

Chapter 8: How can individuals advocate for responsible AI development?

4403.91 - 4422.137 Unknown

A lot of people come here and they tell me the matter-of-fact situation, but there's something that feels more sort of emotionally personal when we speak about these subjects with you. And I'm fascinated by that. Why is it so personal to you? Where is that passion coming from? Because this isn't just your prefrontal cortex, the logical part of your brain.


4422.178 - 4425.823 Unknown

There's something in your limbic system, your amygdala that's driving every word you're saying.


4426.647 - 4449.231 Tristan Harris

I care about people. I want things to go well for people. I want people to look at their children in the eyes and be able to say, like... You know, I think I grew up maybe under a false assumption. And something that really influenced my life was... I used to have this belief that there were some adults in the room somewhere. We're doing our thing here. We're in LA. We're recording this.


4449.351 - 4469.935 Tristan Harris

And there's some adults protecting the country, national security. There's some adults who are making sure that geopolitics is stable. There's some adults that are making sure that industries don't cause toxicity and carcinogens. And that there's adults who are caring about stewarding things and making things go well. And...


4469.915 - 4489.922 Tristan Harris

I think that there have been times in history where there were adults, especially born out of massive world catastrophes. Coming out of World War II, there was a lot of conscious care about how do we create the institutions and the structures (Bretton Woods, the United Nations, positive-sum economics) that would steward the world so we don't have war again.

4491.204 - 4506.819 Tristan Harris

And as I, in my first round of the social media work, started entering the rooms where the adults were, I recognized that because technology and software were eating the world, a lot of the people in power didn't understand the software, didn't understand technology.

4507.46 - 4527.023 Tristan Harris

You go to the Senate Intelligence Committee and you talk about what social media is doing to democracy and where Russian psychological influence campaigns were happening, which were real campaigns. And you realize that – I realized that I knew more about that than people who were on the Senate Intelligence Committee. Making the laws. Yeah. Yeah.

4527.863 - 4542.362 Tristan Harris

And that was a very humbling experience because I realized, oh, there's not that many adults out there when it comes to technology's dominating influence on the world. And so there's a responsibility – and I hope people listening to this who are in technology –

4542.578 - 4559.44 Tristan Harris

Realize that if you understand technology and technology is eating the structures of our world, children's development, democracy, education, journalism, conversation, it is up to people who understand this to be part of stewarding it in a conscious way.
