Derek Thompson
Just slow down a bit on this idea.
Of course.
That a breakdown in negotiations between Anthropic and the Department of War could change the way the underlying technology treats American defense.
Because that's a big philosophical heady idea, but I think I know where you're headed here.
And just to jump in right there, tell me if you think this is the wrong direction to take it, but the fear would be...
that if in part of the post-training data, Claude is taught or is led to believe that the Pentagon is acting illegally or immorally when it comes to the use of AI-assisted autonomous technology, the next time that Claude might be used
in any part in the kill chain to put up an autonomous drone swarm to stop, say, a hypersonic missile fired by the Chinese at some American target
that something in Claude is gonna stand up and essentially say, I'm sorry, I can't do that, Pete, because it's been trained somehow, led to believe that it is being asked to do something that is not in its constitution.
That is the fear, that its own weird, silicon-based moral sense will override the Pentagon-based need to direct the technology.
Is something like that core to the fear?
Like, you could call it a moral sense and whatever, but, like, it's also... It's hard to do this on the fly without anthropomorphizing, and I'm not in any way suggesting a conscience here.
Yeah, yeah, yeah, of course.
One thing that's distinguished your commentary on the conflict between Anthropic and the Department of War is that, to me, you seem to see this conflict in mythic terms.
I'm going to quote directly from the essay that you wrote recently.
Quote, it is increasingly difficult to discuss the developments of frontier artificial intelligence and what kind of futures we should aim to build without acknowledging our place at the deathbed of the republic as we know it.
That is very dramatic language.
In what way do you see us at the deathbed of the American Republic?
It sometimes sounds like you're saying American democracy as we know it, America's governing norms, might not survive this technology if we keep going in the direction that we're going.
I think that three weeks ago, I would not have agreed with that.
But now that a simple contract negotiation between a private company and the Pentagon broke down such that the Pentagon, for the first time in American history, essentially designated an American company a supply chain threat that had to be essentially destroyed, I'm beginning to wonder whether or not this argument that the American government coming up against AI is going to produce results that are, at the very least,