Laurel van der Toorn
Fair enough.
But this idea of perspective taking, like understanding and then responding to it, that starts to sound eerily like something that a chatbot could potentially do.
Are chatbots doing that?
But some experts say they're only good at predicting the next word.
There's nothing more to it than that.
Well, here's an uncomfortable one.
If I'm kind of hurting and I'm telling a large language model about it, you know, psychologically hurting, and the bot just simply says, I'm sorry that you're experiencing pain.
I know it's just code, but if I perceive it as the robot being empathetic, does that make it empathy in any meaningful sense?
To give a contrived example, I could imagine sitting in a psychologist's office, a psychiatrist's office maybe, and they deeply don't care about me, but they know it's their job to seem like they do, and I feel like they do.
How is that any different from a chatbot that doesn't actually care?
To be clear, I'm not saying all therapists are empathy-less.
I'm just saying there could be one out there that's good at pretending and their patients feel like they're empathetic.
Have you ever said please or thank you to a chatbot while you're asking it for something?
I certainly have.
You're also burning electricity, because every word that goes into the bot burns just a little bit more, which is kind of a weird side effect.
But please continue.
Or on the flip side, if we're rude and demanding to AI, does that maybe make us rude and demanding in real life?
Look, I know I brought you here to discuss empathy and whether robots and large language models and chatbots and ChatGPT and all of it can feel empathy.
But I think exploring this question kind of teaches us something about ourselves.
Like, what does it mean for another person to be empathetic?