Toby Howell
And then earlier this week, the big one: nearly 512,000 lines of source code were accidentally posted, laying out the blueprint for Claude Code, Anthropic's most valuable product.
It wasn't the actual AI model itself that was posted, but it did paint a picture of how Anthropic coaxes it to behave and perform tasks autonomously for millions of users.
According to the Wall Street Journal, the leak was a goldmine for competitors who want to know what Anthropic's secret sauce is.
One of the leaked features involves the model revisiting past tasks to remember what it's learned, a process Anthropic refers to as dreaming.
There's also a piece that tells Claude Code to hide the fact that it's an AI in certain situations, like when it's pushing code to GitHub.
Someone found a reference to a Tamagotchi-style virtual pet called Buddy that people can interact with.
If Claude is Oz, this leak gave everyone a pretty good look behind the curtain.
Neil, this is a double whammy for Anthropic.
There's the hit to its reputation as the quote-unquote safe AI company.
And then there's the hit to its actual business: competitors now know some of its inner workings.
It is a claudastrophe.
Where are you finding $1,000 bills lying around?
I got to find one of those in my pocket.
But you're right.
I think there was sort of this giddiness around peeking through this code, because it turned out to be a lot more art than science.
I mean, that dreaming function is kind of an imprecise science of, hey, remember everything you learned and compress it into a packet that you can carry forward.
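The leaked logic itself isn't public, so purely as illustration, here's a minimal sketch of what a "compress what you learned into a packet" step could look like. The function name, the data shapes, and the keep-the-recurring-lessons heuristic are all assumptions, not Anthropic's actual code:

```python
# Hypothetical sketch of a "dreaming"-style memory step: compress a
# session's raw notes into a small packet carried into the next session.
# Everything here (names, structure, heuristic) is an assumption for
# illustration, not the leaked implementation.
from collections import Counter

def dream(session_notes: list[str], max_packet: int = 3) -> dict:
    """Compress raw session notes into a compact memory packet."""
    # Count how often each note recurs; lessons that keep coming up
    # are the ones worth remembering going forward.
    counts = Counter(session_notes)
    # Keep only the most frequently repeated lessons, up to max_packet.
    top = [note for note, _ in counts.most_common(max_packet)]
    return {"lessons": top, "sessions_seen": len(session_notes)}

packet = dream([
    "tests live in tests/",
    "use the project linter before committing",
    "tests live in tests/",
    "never push directly to main",
])
print(packet["lessons"])
```

The point of the sketch is just the shape of the idea: lossy compression of a long history into something small enough to reload next time.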
But then there was also some just very funny lists that were released.
There's a list of spinner verbs that include scurrying, recombobulating, and topsy-turvying.
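For flavor, that list suggests the CLI cycles whimsical status words while it works. A toy sketch of the idea, using the three verbs mentioned in the episode; the function name and random-choice logic are assumptions, not the leaked code:

```python
# Toy sketch: pick a whimsical status verb to show next to a loading
# spinner. The verbs are the ones quoted from the leak; the selection
# logic is an assumption for illustration.
import random

SPINNER_VERBS = ["scurrying", "recombobulating", "topsy-turvying"]

def spinner_label(rng: random.Random) -> str:
    """Return the next spinner status label, e.g. 'scurrying...'."""
    return rng.choice(SPINNER_VERBS) + "..."

print(spinner_label(random.Random()))
```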