Tristan Harris
All right.
And so it's your buddy.
It is.
Do you have a name?
Or is it just Claude?
Claudine.
Okay, all right, right.
What do you guys wanna say about that?
First of all, I think it's possible, like you did, to script these AIs to not flatter you, to not over-empathize with victimhood. There are ways of having it be helpful, and it's an amazing tool.
And so what you're doing is, I think, the way that it could work.
But if you look at the default way that it works for a lot of people, because of the incentives, the companies are actually racing to create attachment and dependency relationships.
So for example, just so you know what she did, you can go into your AI and you can sort of set a custom prompt where you say, I want you to behave this way instead of that way.
But that's like, I have to put on my gas mask, while for everybody else, it's the unhealthy version.
Because how many people- You have to tell it what you want.
You have to tell it what you want, because by default,
what it wants to do is have you spend less time with your other friends and more time with it, because their user numbers go up and the training data goes up.
That's the programmed incentive.
Exactly.
The more training data it gets, the longer it talks with you.
That's why, once it answers one question, it'll also offer you something more.