Parents don't need to be AI experts.
They just need to be curious about their children's lives and ask them about what kind of technology they're using and why.
Aura is an online safety company that released the report.
Psychologist Scott Collins is Aura's chief medical officer.
He says 37% of conversations between teens and their chatbot companions involve violence.
Parents should keep a close eye on how their kids are using chatbots, says pediatrician Dr. Jason Nagata at UC San Francisco.
And tell their teens explicitly that chatbots come with risks.
It is role-play, interaction about harming somebody else: physically hurting them, torturing them, fighting them, intimidating them.
And a lot of it gets pretty graphic.
When kids use artificial intelligence tools, 42% of the time it's for companionship, meaning an ongoing conversation with a chatbot.
And nearly 40% of those conversations involve violent role-playing.
These conversations also tend to be longer, Collins says, than when kids use AI for help with homework.
Among 13 to 17-year-olds, kids who spend more time online are also more stressed out by their digital lives.