Tristan Harris
Podcast Appearances
They're trained on all of this data of everything that those characters in Game of Thrones said, but they don't know what the AI will do in every circumstance. Like, if you grow an alien brain that plays a fictional character, can Character.ai guarantee what it will do when it talks about very sensitive topics?
I mean, they try to train out some of those things, and I'm sure that they did have some safety training. But obviously that's not enough. You know, what did Character.ai tell their investors when they raised hundreds of millions of dollars from Andreessen Horowitz and friends to try to ship this?
You know, they basically said, we're going to cure loneliness and we're going to get as many users as possible. And this was shipped to young people. This was shipped and featured to 12-year-olds for a long time.
Only recently, I think after the lawsuit was first filed or shortly before it was filed, they got wind of it and changed the required age to something like 17. But the business model here is to take shortcuts to get this out to as many people as possible.
And as you said, this is not an isolated incident, because the AI was actually recommending and sexualizing conversations that had not previously been sexualized. Our team found that if you sign up as a 13-year-old and then watch which characters get recommended to a new kid,
the first one was "stepsister CEO," and the chatbot immediately sexualizes the conversation. This was in the most recent lawsuits; this is even more recent. And it shows that they have a hard time controlling these systems. AI is different because, like I said, in order to make it more powerful, you don't make it more controllable.
It's just become more and more capable across talking about more and more topics, being able to do more and more things. And this is just really the tip of the iceberg because AI is being rolled out everywhere in our society, not just to kids.