Laurel van der Toorn
Now, after spending a lot of time studying this, is your feeling that AI companions are mostly helpful, mostly risky, or something in between for people's mental health?
Now, Talia, thank you so much for sharing your research with me and having this discussion.
A lot of important info you've given me.
I appreciate your time.
Talia Ale-Davud is a computer science researcher at Aalto University in Finland.
Coming up after the break, we continue our conversation about AI, looking at what happens when people use it as a companion through therapy apps and friend apps.
If they're feeling lonely, a new study shows that can actually deepen their distress over time.
Can a lonely person chatting online with a robot end up feeling lonelier?
That is the question we'll answer in a few moments' time.
Artificial intelligence, large language models, they can apologize, they can comfort you, they can even sound emotionally supportive.
But there's a strange psychological question here.
When a machine shows empathy, what does that even mean in the deep sense?
A robot can't be feeling anything on the other side, right?
Well, that's what we're going to discuss here with psychology professor at Penn State University, Daryl Cameron.
Daryl, welcome to the show.
Empathy is one of those words that I feel like we all kind of know what it means, but in the psychological sense, before we talk about large language models having it, how would you define it?
What would empathy mean if we saw it in a chatbot?
Well, on the first and the last, this idea of actually feeling that emotion, my understanding is there's no way that could be possible.
This is a computer.
So I'm going to write that one off unless you have a response to that.