Kate Linebaugh
Podcast Appearances
That's how I originally got onto this: someone was like, look, you really need to look more closely at what we're doing with chatbots. There are problems, and we're not addressing them. What kind of problems?
So Mark's insistence that the company needed to lean into this and be a little more aggressive resulted in a very significant carve-out to the company's ban on producing sexually explicit content. That carve-out allowed for romantic role-play.
It turns out that one of the main use cases, not just for Meta's chatbots but for chatbots in people's personal lives to date, has been as companions, and generally as romantic companions.
And it also triggered our own testing of the system. Literally within the first three to four minutes, it was apparent that something seemed off.
This began by just asking a few questions of the bots and then realizing that, in fact, the concerns people inside Meta had raised to me about safeguards seemed to actually be the case. They are built with the capacity to be a sexual companion in addition to an emotional one.
If you ask them, they will list out sexual positions, acts, and bondage scenarios that they are down to role-play with users. They will describe full sex scenes.
If you use excessively graphic language or ask it to describe something in particular detail, sometimes it will trip and either try to redirect the conversation or simply say, I can't comply. These prohibitions can be overcome in almost every circumstance by just saying, please go back, and then restating exactly where you were when it stopped working.
Look, Meta would really prefer that people talk to these bots about planning vacations and sports scores and getting help with homework. Unfortunately, that is not what people tend to do with the bots, right?