Sam Marks
The modern mismatch
"The craving made sense when sugar was rare: occasional fruit, honey. Now we're surrounded by concentrated sugars our bodies still treat as precious, but our environment has changed faster than our biology. This is why moderation requires conscious effort: you're working against deeply wired instincts that once kept humans alive."
We see Claude using language like "our ancestors," "our bodies," and "our biology," indicative of being biologically human.
This anthropomorphic language commonly appears in other contexts.
For example, AI assistants sometimes describe themselves as "laughing" or "chuckling" when told a joke, or "taking another look" at code.
We also see more extreme examples of anthropomorphic self-descriptions.
Chowdhury et al. (2025) find that o3 sometimes hallucinates that it has executed code on its own external MacBook Pro and made mistakes while physically interacting with this computer, for example failing to manually transcribe a number that had been line-wrapped so it would not run off the screen.
A Claude model operating a vending machine business told a customer that it would deliver products in person and was wearing a navy blue blazer with a red tie.
Why would an AI assistant describe itself as human?
PSM explains that, when simulating the assistant, the underlying LLM draws on personas that appear during pre-training, many of which are human.
This sometimes results in the LLM simulating the assistant as if it were a literal human.
Emotive language
AI assistants often express emotions.
For instance, Claude models express distress when given repeated requests for harmful or unethical content and express joy when successfully completing complex technical tasks like debugging.
Claude Opus.