Sam Schechner
What it is, is it's the harness.
If you imagine an AI system is like a horse, you need a harness to ride it, to direct it.
You know, Claude Code is a tool that experienced developers can use to really accelerate their work.
It does a large amount of coding on its own.
And Claude Code has been a driver of growth for Anthropic.
It's something that has won Anthropic a lot of business from developers.
So if suddenly they're kind of giving away the recipe, even accidentally, for how they put that together, that's a big deal.
Well, what Anthropic says is that there was a packaging issue caused by human error when they were publishing a new version of Claude Code.
In layperson's terms, that means they published a file that was a roadmap to all of Claude Code's internal source code, rather than just the compressed version that you would actually run on your machine.
Anthropic is potentially looking at an IPO later this year.
It recently closed a new round of funding that values the company at $380 billion.
And, you know, Claude Code is certainly a large part of that equation.
And this is a blow to Anthropic, in part because the company's brand has been that it's the AI company that's concerned about safety and that takes security very seriously.
Even a simple mistake can undermine that image in the eyes of the enterprises that are buying its services.
But it also reveals the trade secret of how they make Claude Code work.
Anthropic's competitors now have a detailed roadmap for how to clone some of these features without needing to reverse engineer them.
In addition, there could be security issues.
Once you know the source code of an app, it can be easier for hackers or others to manipulate the AI model into doing things it's not supposed to do, like help with cyberattacks, or to compromise the app itself, which could put the app's users in danger.
Thanks so much for having me.