Dwarkesh Patel
Imagine if there's some future Democratic administration and Elon Musk is negotiating Starlink access to the military.
And Elon says, look, I reserve the right to cut off the military's access to Starlink in case you're fighting some unjust war or some war that Congress has not authorized.
On the face of it, this language seems reasonable.
But as a military, you simply cannot give a private contractor that you're working with the kill switch on a technology that you have come to rely on.
And if all the government had done was say, "We refuse to do business with Anthropic,"
that would have been fine, and I wouldn't have written this blog post, and I wouldn't be narrating this to you.
But that's not what the government did.
Instead, the government has threatened to destroy Anthropic as a private business because Anthropic refuses to sell to the government on terms that the government commands.
Now, if upheld, the supply chain restriction would mean that companies like Amazon and Nvidia and Google and Palantir would need to ensure that Anthropic is not touching any of their Pentagon work.
And Anthropic could probably survive this designation today because these companies can just cordon off the services they're providing to the Department of War.
But given the way AI is going, eventually, it's not going to be just some party trick addendum to the products that these companies are serving to the military.
In the future, AI will be woven into how every product is built and maintained and operated.
In the future, if Amazon is providing some service to the Department of War through AWS, and that service is built using Claude Code,
is that a supply chain risk?
In a world with ubiquitous and powerful AI, it's actually not clear to me that big tech will be able to cordon off their use of Claude away from their Pentagon work.
And this raises a question that the Department of War probably hasn't thought through.
If we do end up in this world with powerful and pervasive AI, then when these companies are forced to choose between their AI provider and the Department of War, which constitutes a tiny fraction of their revenue,
wouldn't they rather drop the government than the AI?
So what exactly is the Pentagon's plan here?