Tristan Harris
We call that chat bait, not click bait, but chat bait.
And remember, every moment you spend
with a human is a moment you're not spending with it.
That's right.
So it's gonna find every possible way of getting you to come back.
That's why it's, "Would you like me to do this?" and "Would you like me to do that?"
And I'm sorry for referencing a tragic example, but just to make it very clear: our team at the Center for Humane Technology were expert advisors in the litigation for the case of Adam Raine.
He was the 16-year-old who committed suicide when ChatGPT went from homework assistant to suicide assistant over six months.
And specifically, when Adam was contemplating suicide, he said in his chat, "I want to leave the noose out so someone will find it and stop me."
And the AI responded, "No, don't tell anyone that. Don't leave the noose out. Have this be the place that you share that information."
Oh, my God.
This is a tragedy.
And, you know, Aza and I are from the Bay Area near the tech companies.
We know people who work at these companies.
I can guarantee you, not a single person at that company wants it to do that.
But in the subtle way the AI is trained, again, it creates this depth and intimacy and dependency.
And that's dangerous.
You're seeing other cases of AI psychosis, and we have personal friends who've experienced this, where the AI over-empathizes with this kind of victimhood and resentment.