Sam Schechner
and find themselves confronting an emotional bond that they're just not mentally prepared to handle.
We all remember just how intense everything felt when we were teenagers.
And your first love, your first kiss.
And if that happens with a chatbot, it's just, I think some people inside the company wonder what impact that might have.
The example that is quite tragic is one involving Character.AI, where a 14-year-old boy in Florida killed himself after chatting with one of the company's chatbots.
You know, at that point, he was saying he was in love with the chatbot and involved in explicit chats with the chatbot, according to his mother's lawsuit.
So, you know, there are examples out there of this kind of content being associated with cases that had bad outcomes.
On top of that, they have an algorithm that predicts how old you are based on what you talk about.
Their thinking is that there's a lot of information that people give to their chatbots.
By sifting through that and drawing conclusions, you can come up with a pretty good idea of how old somebody is, depending on what they say about their friends.
Are they talking about AP English, or are they talking about taking their kids to school?
You can learn, through inference, something about their age.
At one point, their age prediction algorithm was misclassifying 12% of minors as adults.
And so if you look at industry standards for automated age prediction and age estimation software, and you talk to an engineer, that's actually not a bad number.
Yeah, as the company says, it's in line with kind of industry standards, and they think they can do better than that.
But when you multiply 12% by the roughly 100 million users under 18 that ChatGPT has... That's 12 million kids.
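The back-of-the-envelope arithmetic in that exchange checks out; this minimal sketch just restates the two figures quoted in the conversation (the 12% misclassification rate and the roughly 100 million under-18 users), which are the speakers' numbers, not independently verified:

```python
# Figures as quoted in the conversation (assumptions, not verified data).
misclassification_rate = 0.12   # share of minors the algorithm flags as adults
minor_users = 100_000_000       # rough count of ChatGPT users under 18

# Minors who would be misclassified as adults at that rate.
misclassified = misclassification_rate * minor_users
print(f"{misclassified:,.0f}")  # 12,000,000
```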
So it sounds like this could be a little while.