Sherry Turkle
Podcast Appearances
Now, you can analogize this to human beings as much as you want, but you are missing the basic point because every human metaphor is going to reassure us in a way that we should not be reassured.
It's so hard because we need to have a whole new mental form for them. We have to have a whole new mental form.
So many people are completely unprepared for what goes on in an interview. By talking it over many, many times with a chatbot, and having a chatbot that's able to say that answer was too short, you didn't get to the heart of the matter, you didn't talk at all about yourself, this can be very helpful.
It isn't pretending empathy. It's not pretending care. It's not pretending love. It's not pretending relationship. And those are the applications where I think that this technology can be a blessing.
I think they should make it clear that they're chatbots. They shouldn't try to, they shouldn't greet me with, hi, Sherry, how are you doing? I mean, they shouldn't come on like they're people. And they should, in my view, cut this pretend empathy, no matter how seductive it is. I mean, the chatbots now take pauses for breathing because they want you to think they're breathing, right?
My general answer is it has everything to do with not playing into our vulnerability to anthropomorphize them.
My name is Sherry Turkle. I teach at MIT. And for decades, I've been studying people's relationships with computation. Most recently, I'm studying artificial intimacy, the new world of chatbots.
And I've spoken to so many people who, obviously in moments of loneliness and moments of despair, turn to these objects which offer what I call pretend empathy. That is to say, they're making it up as they go along, the way chatbots do. They don't understand anything really. They don't give a damn about you really.
When you turn away from them, it's all the same to them whether you go cook dinner or commit suicide, really. But they give you the illusion of intimacy without there being anyone home.
And what we gain is a kind of dopamine hit. In the moment, an entity is there saying, I love you, I care about you, I'm there for you. It's always positive. It's always validating. But what we lose is what it means to be in a real relationship and what real empathy is, not pretend empathy.
And the danger, and this is on the most global level, is that we start to judge human relationships by the standard of what these chatbots can offer.
So people will say, the Replika understands me better than my wife. Direct quote. I feel more empathy from the Replika than I do from my family. But that means that the Replika is always saying, yes, yes, I understand. You're right. It's designed to give you continual validation. But that's not what human beings are about. Human beings are about working it out.
It's about negotiation and compromise and really putting yourself into someone else's shoes. And we're losing those skills if we're practicing on chatbots.
All the metaphors we come up with are human metaphors of like bad people or people who hurt us or people who don't really care about us. In my interviews, people often say, well, my therapist doesn't really care about me. He's just putting on a show. But, you know, that's not true.
Maybe it feels that way to the patient who wants a kind of friendly relationship while the therapist is staying in role, but there's a human being there. If you stand up and say, well, I'm going to kill myself now, your therapist, you know, calls 911.