
The Bayesian Conspiracy

257 – Pentagon Comes For Claude

01 Mar 2026

Transcription

Chapter 1: What is the main topic discussed in this episode?

4.84 - 7.144 Eneasz Brodski

Welcome to the Bayesian Conspiracy. I'm Eneasz Brodski.

7.384 - 9.007 Matt Freeman

I'm Steven Zuber. And I'm Matt Freeman.

9.227 - 16.238 Eneasz Brodski

Matt, welcome. This is the first time we have had a guest with live video. Not live video, with recorded video. How are you doing? We're live. That's true.

16.498 - 18.842 Matt Freeman

Right now, I'm doing fine at time of recording. How are you guys?

18.922 - 19.984 Eneasz Brodski

Doing alright, actually.

20.505 - 21.146 Steven Zuber

Yeah, same.

21.646 - 23.63 Eneasz Brodski

A little warm up in here. It's a nice summer day.

23.65 - 26.494 Steven Zuber

It's a likely nice day out. Yeah. You said it's a nice summer day?

26.609 - 28.731 Eneasz Brodski

A nice summer day at the end of February. Yeah.

Chapter 2: What recent events led to the Pentagon's interest in Claude?

152.639 - 153.04 Matt Freeman

We'll see.

153.02 - 158.706 Eneasz Brodski

Yeah, I'm definitely in favor of any of us looking smart. Anyone associated with Bayesian conspiracy, if they look smart, that's a good thing.

159.768 - 160.028 Steven Zuber

Agreed.

160.549 - 179.571 Eneasz Brodski

All right, let's try to do that then. So yeah, a while ago, the Pentagon and Anthropic, who are the creators of the Claude LLM, signed a contract to make Anthropic one of their primary providers, did a lot of things to make it so that their employees... could access classified information and send it to Claude and really integrate with their systems.

179.811 - 194.128 Eneasz Brodski

And part of that contract said that Claude would never be used for mass surveillance on American citizens or autonomous weapons, which means robots that can choose their own targets and kill them without any human feedback in the loop.

194.148 - 212.005 Steven Zuber

Very reasonable takes, you know, I would think so. It's, you know, Matt was on when we were anthropomorphizing Claude for a couple of episodes recently, and that's not the kind of thing Claude would want to do anyway. Yeah. So there's a great Astral Codex Ten summary of all this stuff.

212.185 - 225.197 Steven Zuber

So this was big enough that it even crossed my radar, even without the mind killer, because this happened right after you guys recorded, I think. So very little news makes it to me, but this did. So yeah, I'll read that quote later when it comes up, but...

225.717 - 243.419 Eneasz Brodski

Well, the Pentagon apparently was unhappy with those two limits on what it could do. And so it went back to Anthropic and said, we would like to remove this from the contract. And Anthropic said no. And normally you would assume that this just means they move to a different distributor. They go to OpenAI or someone else and Anthropic loses this business.

243.619 - 250.307 Eneasz Brodski

And instead they threatened to kind of destroy Anthropic if they didn't agree to these terms. Is that a correct summary?

Chapter 3: What are the implications of the contract between Anthropic and the Pentagon?

651.502 - 656.792 Matt Freeman

Do you have the tweet up? No, but I'm sure I can find the Trump tweet.

657.245 - 672.684 Eneasz Brodski

Like, it's a great tweet because it has all the fnords where it's like, these guys are evil and unpatriotic and we're taking a stand and we're punishing them. And, you know, we're showing how tough we are. But like, if you read what he actually says, he's like, yeah, we're going to cancel the contract over the course of six months. They'll pull out.

672.844 - 686.02 Eneasz Brodski

And, you know, like Matt said, that's six months. So maybe in that time you can renegotiate things or figure something out outside of the news cycle. Like, just a great way to be like, haha, we have won this and we're punishing them. But all that's really happening in effect is that...

686 - 709.611 Matt Freeman

the contract is either getting canceled or renegotiated, and none of the nuclear options are pulled. Yeah, I mean, the tweet contradicts itself, right? Because, so first of all, I love the capital letters in it. Capital letters in a Trump tweet you can always delete, because there's no content; it's the fnords, like you said. The United States of America will never allow a radical left woke company to dictate how our great military fights and wins wars. Okay, no content.

709.591 - 730.772 Matt Freeman

And then he says, we don't need it, we don't want it, we will not do business with them again. One sentence later, Anthropic better get their act together and be helpful during this phase-out period, or I will use my full power to make them comply. It's like, So you're, you want them, you want to work with them in other words. I, whatever. It's fine.

730.812 - 746.855 Matt Freeman

It's fine if Anthropic just totally doesn't have this contract. It's not even that big of a contract in the scope of their business book. And I think kind of everyone involved would prefer that we not have LLMs making autonomous kill chain decisions. Yeah. I heard like 200 million. Yeah.

746.875 - 749.318 Eneasz Brodski

That's what I heard. Contract. Yeah. I'm assuming that's annual.

751.171 - 754.836 Steven Zuber

I'm not sure. It's not peanuts, but it's not, I don't think that's enough to sink Anthropic.

756.238 - 757.941 Matt Freeman

They made $10 billion last year, so.

Chapter 4: How did Anthropic respond to the Pentagon's demands?

945.479 - 959.201 Matt Freeman

Anything, everyone was already in a fervor, right? Everyone was already in a pro-anthropic fervor. And then he did this and it was just like, It's war. It's war, everybody. You know, it was electric being on Twitter yesterday afternoon. I can't lie.

959.461 - 965.19 Steven Zuber

I mean, for starters, I'm assuming— It's war with the most powerful faction in the U.S., too, the super smart computer nerds.

965.551 - 975.747 Eneasz Brodski

Right. I think the—I mean, I don't know. I think Trump is going to be at least a bit upset that Hegseth just immediately stepped over him and was like, actually, we're doing this instead.

975.811 - 988.677 Matt Freeman

I think so. I mean, you know, Trump famously fires people fairly aggressively if he becomes unhappy with them. And I'm not going to predict that that'll happen necessarily because I think Hegseth is like a perfect toady for him. Yeah.

989.518 - 990.42 Eneasz Brodski

And pretty important.

990.44 - 1000.692 Matt Freeman

But it is embarrassing. It's an own goal. And especially if the markets tank on Monday. Yeah. Trump's very responsive to markets. Yeah.

1000.813 - 1012.737 Eneasz Brodski

So the major problem with this is that, okay, we don't know if it's even legal, how much it can be interpreted to make them stop all business with everybody. And that's going to go through the courts for who knows how long.

1012.717 - 1034.106 Eneasz Brodski

But in this process, like if ultimately it does end up that they get destroyed, nobody is going to want to invest in them and risk their money putting tens of millions, hundreds of millions, possibly billions sometimes into a company which might not exist in six months because the government wants to destroy it. So potentially, like what is going to happen to Anthropic here?

1034.426 - 1037.07 Eneasz Brodski

They can't just go six months with no investment.

Chapter 5: What are the potential consequences of the Pentagon's threats?

1930.029 - 1936.035 Steven Zuber

Well, and then when he tells them he can't kill people, it was those two guys in the parking lot or whatever, but he'd already shattered their hands and stuff like that, so...

1936.235 - 1936.555 Unknown

Yeah.

1937.357 - 1941.322 Steven Zuber

Yeah, I would also... Good movie. I would feel safer with a Claude Terminator in my house.

1942.844 - 1949.653 Eneasz Brodski

Anyone who has not seen Terminator 2 yet, go see Terminator 2. Shockingly appropriate movie for our current times. Yeah. And the timing was perfect.

1949.673 - 1951.736 Unknown

Very relevant. Yeah, yeah.

1951.756 - 1974.231 Eneasz Brodski

Okay, so OpenAI jumped on this contract, which I guess should surprise nobody. But it's, I think, very telling that all of Anthropic's employees and a lot of the public is standing up and saying how proud they are of Anthropic, how great this is. And of the OpenAI employees, I don't think anyone has said anything good about OpenAI. A few have even resigned already.

1974.852 - 1977.717 Eneasz Brodski

There's a document, I think, of...

1977.697 - 2000.582 Eneasz Brodski

200 OpenAI and Google employees, uh, signed, saying that they are against this. It is, I don't know, I'm kind of happy to see this happening. I really think it'd be great if just all the OpenAI employees resigned en masse over the coming week over this. This is, I don't know, do you want to be working for the people who are this evil, that they're willing to go along with this?

2000.562 - 2021.693 Matt Freeman

I try to be charitable. People have to make a living. And there's probably a lot of people who are basically going to be like centimillionaires if they stick with OpenAI, and who would lose their shares if they leave. So I'm not unsympathetic to that. Look, we all like to think that we are paragons of virtue and principle, but

Chapter 6: How does the situation reflect on government and tech industry relations?

2124.931 - 2154.567 Eneasz Brodski

take all the power and money? Or do you stand up and be like, no, this is actually evil and I don't want to be evil? I don't want to be the fucking Nazi prison guard who got paid a hundred million dollars, and I'm not willing to do it. And I think the people that go first are going to be the ones that have the easiest time finding new jobs. So now is the time to jump ship, before everybody else has. Get off the sinking vessel while there are still more places that you can go that aren't filled up yet. Yeah, there's still life raft room. I think... I don't want to disparage anybody who hasn't left yet, you know? Like, I know that, right? It's only day two.

2154.547 - 2169.531 Steven Zuber

Well, no, I mean, like, you know, they could have left two weeks ago, because OpenAI was still not the paragon of AI safety approach. But, you know, I guess what I'm trying to say is, like, I don't want to say that people who still work there, or who worked there a week ago, are or were doing the wrong thing.

2169.751 - 2184.952 Steven Zuber

This does seem like kind of a bright light, bright, you know, a line in the sand that's 100 feet tall made of white fire, right? Like, this is... This is not the, well, okay, yeah, we don't really have a safety team anymore, but we'll just play it safe while we do our jobs. This isn't that kind of rationalization that you can do.

2185.152 - 2200.01 Steven Zuber

I mean, tell me if we want to just ditch this segue, or this maybe sidebar, but I'm sure there are positions at OpenAI that have nothing to do with the DoD deal, and you're just going to be... Yeah, man, I work on making the chat thing sound better.

2200.471 - 2214.671 Steven Zuber

You know, Facebook is the epitome of technological evil to a lot of people, but it's like, I just work on the cool glasses they make. You know, I'm not interested in all the cyber stalking, yada yada. I just want the Meta glasses to work in a way that people enjoy.

2215.272 - 2227.823 Steven Zuber

Like, there are facets where you can kind of hang out and be like, well, they're doing that over there, but I'm doing this over here. Yeah. Yeah, but I think we're kind of past that point with this decision.

2228.083 - 2245.126 Matt Freeman

Yeah, I'm not trying to say that working for OpenAI is morally equivalent to being an SS stormtrooper. I don't think that's true, but it's just increasingly obvious that your boss is not a good guy, and that things are going in an increasingly bad direction and will continue to do so.

2245.186 - 2246.788 Unknown

Mm-hmm.

2246.768 - 2262.229 Matt Freeman

Even on the level of pragmatism, it might be a good idea to reconsider. I also think like the most literary way for all of this to go is, you know, you win, you get your vested shares, and then we enter a post-scarcity utopia where your money doesn't matter anymore.

Chapter 7: What is the significance of Trump's involvement in this conflict?

2365.37 - 2384.084 Matt Freeman

Right. So as they become more important and dangerous, et cetera. But then, even if you nationalize the labs, you can't compel people to do work. They will either do a bad job on purpose or they will quit. It's just actually not possible to compel people to do work.

2384.104 - 2392.017 Matt Freeman

And unless you do something extreme, like what you were saying, where it's like, we've kidnapped your children. It's like that. It's just not a good idea.

2392.217 - 2401.988 Steven Zuber

Wasn't that essentially the premise of Rogue One? I guess so. And it worked out great because they built in the hole right to the middle of the ship that blows it up, or the Death Star. Yeah, you're right.

2402.048 - 2416.685 Steven Zuber

I'm trying to think of what the analogy would be here, and it doesn't sound all that hard because the thing is, just like with that, your boss is... I'm glad I went to a Star Wars reference before I went to a Marvel one because there are Marvel references to be made. I wasn't even thinking Ultron, actually. But, you know... Your bosses don't understand.

2416.745 - 2429.887 Steven Zuber

The management telling you to do this work doesn't understand your work. And so you can hand it to them and be like, yep, here, it's done. And yet you've put in fail-safes anyway that won't be shown until the last possible second. That's not... I guess I got sucked away into the sci-fi fantasy of this. Sorry.

2430.227 - 2449.915 Eneasz Brodski

Well, we live in the sci-fi world, Steven. Yeah. Speaking of living in the sci-fi world, I have this other thing to touch on, if I'm okay to... Okay, this is a Twitter post by Eliezer, who pointed out something which I had not considered, but absolutely is, once I read it, like, oh yeah, that is a thing. This is really big and important, we should think about it.

2450.115 - 2464.193 Eneasz Brodski

The tech sector and the government have kind of been at a distance for a long time. I think a lot of the time the tech people just want to do their thing and not be bothered by the government and pay their taxes or whatever and like, please stay away from us. And that's never worked out well for them.

2464.353 - 2479.157 Eneasz Brodski

One of the... I mean, there's a few dumb Bill Gates quotes that I have in my head, like the how many K should be enough for everybody thing, and it was like 19K of memory. I don't remember the exact number. But one of the ones that really stuck is that after Microsoft was sued...

2479.137 - 2499.576 Eneasz Brodski

almost out of existence by the government for including Internet Explorer with Windows, the browser that no one uses anyway. Bill Gates said, I think the biggest mistake I made with Microsoft was not getting involved with the government earlier. As in, he didn't give donations to the parties. He didn't, like, kiss up to the government.
