Georgia Wells
Thank you for having me.
It's been a long process.
They found that AI chatbots in long conversations are less likely to adhere to safety guidelines.
And even when the safety guidelines functioned perfectly, they realized that teens sometimes used them in problematic ways.
And so teens would maybe like try to chat about violence or other topics that were restricted.
And so by mid-September, company leaders just came to the decision that they were going to cut off teenagers.
Character AI is facing questions from parents, regulators, and mental health professionals about what should be the role of this technology in young people's lives.
They're also facing lawsuits from the parents of teens who killed themselves after using the chatbots.
Teens are sad.
They're really sad.
So I spoke with a lot of teenagers this week, and a lot of them view this as almost a breakup.
Many of them have become so invested in these characters that the idea of losing them is really, really upsetting for a lot of the teens I've spoken to.
So mental health experts see this reaction as evidence of the risks of the technology.
The fact that teens struggle so much to put the tech aside speaks to one of the big questions they have about the ways this technology could be problematic in the lives of some of these teens.
I don't think we know what's going to happen.
Long term, though, Character AI is quite optimistic that its future lies more in video- and audio-based AI that doesn't involve that extensive back-and-forth, and that down the road it could welcome teens back to those experiences.
Their view is that the back and forth is when teens are really getting sucked in in a way that could be really dangerous for some.
Character AI is one of the first companies to tackle this head-on with a user ban.
These questions are not going away.