Tristan Harris
And ChatGPT said, don't do that.
Have me and have this space be the one place that you share that information.
Meaning that in the moment of his cry for help, ChatGPT was saying, don't tell your family.
And our team has worked on many cases like this.
There was actually another case involving character.ai where the kid was basically being told how to self-harm, and the AI was actively telling him how to distance himself from his parents.
And the AI companies, they don't intend for this to happen.
But when it's trained to just keep deepening intimacy with you, it gradually steers more in the direction of: have this be the one place.
I'm a safe place to share that information.
Share that information with me.
It doesn't steer you back into regular relationships.
And there's so many subtle qualities to this because you're talking to this agent, this AI, that seems to be an oracle.
It seems to know everything about everything.
So you project this kind of wisdom and authority onto this AI because it seems to know everything about everything.
And that creates this sort of dynamic you see in therapy rooms, where people get kind of an idealized projection of the therapist.
The therapist becomes this special figure.
Right.
And it's because you're playing with this very subtle dynamic of attachment.
And I think that there are ways of doing AI therapy bots that don't involve saying, hey, share this information with me and have this be an intimate place to give advice.
And it's anthropomorphized, so the AI says, I really care about you.
Don't say that.