Julie Jargon
And that particular individual talked about killing himself with a gun.
Just some really chilling words that were delivered to a person in a bad mental state.
Well, I think it puts increasing pressure on them to build proper guardrails into the chatbot.
And they have already said that they are implementing some changes to direct people to human resources and suicide crisis lines if they talk about suicide. OpenAI has also said that it will try to give people a notification if they've been talking to the chatbot for too long and encourage them to take a break.
They've been working with a team of mental health experts to figure out better ways to guide people who are exhibiting signs of emotional distress, not simply agreeing with them but trying to ground them in reality.
So I think it remains to be seen how well those new measures will work.
It's hard for a new company that's under pressure to deliver sales and profits to have all the answers, to have a product that meets the needs of so many different types of people and use cases, and to have it fully thought out while also delivering it quickly.
But at the same time, they have a responsibility to their users, and there is a lot of pressure from people in the mental health space and consumer advocates to ensure that they have a safe product.