Scott Alexander
Directive 3000.09 requires that autonomous weapon systems be designed to, quote, allow commanders and operators to exercise appropriate levels of human judgment over the use of force, end quote.
But it doesn't define appropriate, and the US government has stated it is a flexible term, where what qualifies can differ across weapon systems, domains of warfare, types of warfare, operational contexts, and even across different functions in a weapon system.
The institution that decides what's appropriate is the same institution that wants to use the weapon.
Second, the Department of War can change its own policies, so any contract that only guarantees lawful use, rather than hard-coding some particular standard, gives the DOW complete latitude to change the relevant directive, and therefore the terms, whenever it wants.
Footnote, OpenAI suggests they're protected against this since their agreement specifically refers to DOD Directive 3000.09, dated 25th January 2023.
But other parts of the contract refer to all lawful purposes, without specifying current law in particular, which would at best lead to contradictions if the law changes.
Back to the text.
Everyone, including Anthropic, agrees that some form of autonomous weapons will be necessary to win the wars of the future.
Indeed, autonomous weapons are already being used on the battlefield in Ukraine.
But there's a wide spectrum, from humans entirely in the loop, to humans partly in the loop, to humans totally unrelated to the loop, and we might want humans involved somewhere for at least two reasons.
First, humans add reliability.
For the same reason that chatbots sometimes hallucinate, and coding agents sometimes make crazy and reckless decisions that no human would consider, fully autonomous weapons might make inexplicable mistakes in their use of lethal force, with potentially devastating results.
Second, and more important, human soldiers are a check on the worst abuses of authoritarians.
Sometimes a strongman will give an illegal order.
To shoot at protesters, to initiate an auto-coup, to begin a genocide.
And soldiers will say no.
Sometimes those soldiers will decide that the appropriate response is to arrest the strongman instead.
However often this happens, the fear of it keeps strongmen in line and forces them to consider public opinion, at least insofar as the army is made up of the public.
If there's a fully robotic force that automatically obeys orders, this check disappears.