The Last Show with David Cooper

Is Empathy Possible Between Humans and AI?

10 Apr 2026

Transcription

Chapter 1: What does true empathy mean in the context of AI?

3.389 - 28.342 David Cooper

The Last Show with David Cooper, where we utilize nonlinear reverse inverse backward thinking protocols. It sounds like this. Artificial intelligence, large language models, they can apologize, they can comfort you, they can even sound emotionally supportive. But there's a strange psychological question here. When a machine shows empathy, what does that even mean in the deep sense?

28.362 - 37.918 David Cooper

A robot can't be feeling anything on the other side, right? Well, that's what we're going to discuss here with psychology professor at Penn State University, Daryl Cameron. Daryl, welcome to the show.

37.958 - 39.42 C. Daryl Cameron

Thanks for having me.

39.4 - 51.351 David Cooper

Empathy is one of those words that I feel like we all kind of know what it means, but in the psychological sense, before we talk about large language models having it, how would you define it? What would empathy mean if we saw it in a chatbot?

51.5 - 63.113 C. Daryl Cameron

Yeah, so empathy is often characterized as having a few different meanings. So in psychology, they often talk about emotion sharing. So if someone shares in your experiences, you're feeling sad, I'm feeling sad.

Chapter 2: How do large language models demonstrate empathy?

63.153 - 80.933 C. Daryl Cameron

There's often perspective taking. So if I try to understand what you're thinking and cognitively predict your beliefs and mental states, that doesn't have to involve emotion. And then there's compassion. So having a kind of a warm sense, a motivation to help you feel better, to improve your well-being.

80.913 - 94.45 C. Daryl Cameron

And with large language models and AI, there's been much writing in psych and ethics about trying to figure out to what extent, when it gives you a message, how can we code whether it's displaying each of those three pieces?

94.48 - 104.111 David Cooper

Well, the first and the last, this idea of feeling that emotion, my understanding is there's no way that could be possible. This is a computer. So I'm going to write that one off unless you have a response to that.

104.431 - 122.631 C. Daryl Cameron

My only response would be that, you know, you'll sometimes see things that sound like that verbally. You know, I feel your pain. I know just what you're going through. And, you know, of course, without sentience, that's functionally impossible. But people still, when they read such language, they can still respond as if someone's expressing that to them.

122.611 - 138.983 David Cooper

Fair enough. But this idea of perspective taking, like understanding and then responding to it, that starts to sound eerily like something that a chatbot could potentially do. Are chatbots doing that? Or, as some experts say, are they only good at predicting the next word, and there's nothing more to it than that?

139.148 - 158.169 C. Daryl Cameron

Well, you know, I think that if you think about what perspective taking is as a prediction of what someone or something's going to do in our environment, I mean, I think of the three empathy facets that I talked about, that's the one that many scholars have argued is the closest to what we might say AI is capable of doing.

158.609 - 172.918 C. Daryl Cameron

Although, of course, how it's actually doing that is different than what you and I might do when we try to perspective take things. But in terms of just predicting what will happen next, you know, functionally, it is the closest of those three empathy facets many have argued.

173.033 - 191.494 David Cooper

Well, here's an uncomfortable one. If I'm kind of hurting and I'm telling a large language model about it, you know, psychologically hurting, and the bot just simply says, I'm sorry that you're experiencing pain. I know it's just code, but if I perceive it as the robot being empathetic, does that make it empathy in any meaningful sense?

191.975 - 196.56 C. Daryl Cameron

Well, so this is where it gets really, really philosophically and psychologically interesting, I think.

Chapter 3: Can chatbots genuinely understand human emotions?

196.76 - 217.759 C. Daryl Cameron

So if I'm empathizing with you, it's a two-way street, right? So I'm as the expressor sending you warm feelings of regard or trying to share in your feelings, but it's also you're receiving that empathy, and so you are catching that. And so if you think of it as what's called a dyad, you really are... the receiving end of things does matter quite a bit too.

218.4 - 241.858 C. Daryl Cameron

And so if someone reads a message from a chatbot and it makes them feel a certain way in normal, you know, human to human interactions, that is a critical part of the empathetic process. And so much has been focused on, well, can AI feel? And that's certainly half of the story. But I think an important other half of that story is the recipient, the recipient's perspective.

242.179 - 247.911 C. Daryl Cameron

How does it make them feel? And to what extent does that play a role in defining something as empathy in the first place?

248.06 - 264.246 David Cooper

To give a contrived example, I can imagine sitting in a psychologist's office, a psychiatrist's office maybe, and they deeply don't care about me, but they know it's their job to seem like they do, and I feel like they do. How is that any different from a chatbot who actually doesn't care?

264.428 - 286.225 C. Daryl Cameron

You know, I love that comparison because in many cases, human empathy expression can be fickle and unreliable. And, you know, in clinical medical settings, there is much discussion of things like caregiver burnout and empathy management and empathy fatigue, compassion fatigue. And, you know, the human empathy we get in everyday life is wildly variable.

286.506 - 294.46 C. Daryl Cameron

It can be really deeply rich and authentically felt, or it can be kind of like what you were describing, kind of surface acting, so to speak.

294.921 - 302.755 David Cooper

To be clear, I'm not saying all therapists are empathy-less. I'm just saying there could be one out there that's good at pretending and their patients feel like they're empathetic.

302.819 - 318.027 C. Daryl Cameron

Yeah, and so I think that with that sort of example, I mean, you can see that human empathy, as it's expressed to us, can range so much in terms of the quality of the experience of the person doing the expressing of it.

318.007 - 335.699 C. Daryl Cameron

But if we think about what it means to receive empathy, I think that there could be a lot more attention paid to the value that you and I might have or ascribe to receiving empathy from someone. And that part of the process, I think, needs to be foregrounded a little bit more in terms of how we think about AI empathy.
