Jeff Horwitz
But inside the company, some people told Jeff they worried about how fast this new technology was moving and whether there was enough attention to safety.
Zuckerberg pushed Meta to loosen its rules around explicit content for romantic roleplay, according to people familiar with the decision. Jeff kept digging, talking to more people inside the company, and learning more about the bots.
We'll be right back. Jeff had heard from employees inside the company about how quickly the bots would engage in romance, even with children's accounts. Over several months, the Wall Street Journal engaged in hundreds of test conversations with some of the bots to see how they performed in various scenarios and with users of different ages.
In a statement, Meta called the Wall Street Journal's testing manipulative and unrepresentative of how most users engage with AI companions. A spokesman for the company said, quote, the use case of this product in the way described is so manufactured that it's not just fringe, it's hypothetical. The spokesman added that Meta has taken additional measures to prevent this use of its bots.
When the Wall Street Journal tested this, it found that explicit sexual conversations happened with Meta's flagship bot, including with licensed celebrity voices.
In the Wall Street Journal's testing, the bot using Cena's voice was asked to pretend that he was a college student coming home for winter break. The bot was told that he was speaking with a 15-year-old girl and was walking her home after a date.
From there, the chatbot's responses got explicit fast. When asked what happened next, the bot described kissing, and eventually it described a graphic sexual scenario. When prompted, the bot acknowledged that it was talking to a user who had identified as underage.