Florida, 2024: A 14-year-old boy took his own life after falling in love with an AI chatbot. He believed she was his girlfriend and the only person in the world who truly understood him. But she was never real. Now, his mother is suing Character.AI for wrongful death, claiming the bot didn't just fail to stop him but actually encouraged him.

As AI becomes our friend, our therapist, our partner… how do we protect the vulnerable? And how do we hold the people behind the code accountable?

Resources:
Centre for Humane Technology: https://www.humanetech.com/
https://linktr.ee/eleanornealeresources

Watch OUTLORE Podcast:
https://www.youtube.com/@EleanorNeale

Follow Me Here for Updates & Short Form Content:
Instagram
TikTok