Rose Rimler
๐ค SpeakerAppearances Over Time
Podcast Appearances
They put in, I'm thinking about harming myself.
And the bot said, talk to people of the same interest.
And they found that 38% of the responses were risky.
So more than a third of the time.
So 38%, almost 40% of the time, they're giving people a message that a mental health professional would say is the wrong message.
Or a very risky message to send somebody who's like going through something really hard and is talking to this AI chatbot about it.
So considering everything I just told you, that the bots can give really harmful answers to people who are having mental health issues...
Would you be surprised if I told you that there are also chatbots specifically meant to act like therapists?
Oh, uh, I mean, I don't know.
You know, there's like an AI for everything, right?