Scott Alexander
You simply cannot look through every text message sent over the course of a month to see which ones mention a certain dissident.
There are hacks.
You can perform an automated search for the dissident's name, but there are also obvious ways around that hack.
The dissident can simply not mention their own name in plain text.
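To illustrate the point, here is a minimal sketch of why a plain keyword search fails. The names and messages are entirely hypothetical, invented for this example:

```python
# Hypothetical messages: one mentions the dissident by name,
# the other refers to them only indirectly.
messages = [
    "Did you see what Ivanov said yesterday?",      # direct mention
    "Our friend from the north is speaking at 8.",  # indirect, evades the search
]

def keyword_search(msgs, name):
    """Return messages containing the name as a plain substring (case-insensitive)."""
    return [m for m in msgs if name.lower() in m.lower()]

hits = keyword_search(messages, "Ivanov")
print(len(hits))  # only the direct mention is found; the indirect one slips through
```

A search like this catches only literal mentions, which is exactly the gap an AI reading for meaning rather than strings would close.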
AI solves these scale and cost problems.
An AI could perform meaningful search of all messages in a large database, piecing together patterns to, for example, give each citizen a presumed loyalty score.
This is currently a lawful use of AI, and one of the ones Dario Amodei's letter says that he's worried about.
As far as we can tell, Altman's contract with the Department of War doesn't contain any provisions preventing them from using ChatGPT this way.
For more details on mass domestic surveillance, see this doc.
Link in post.
Autonomous weapons.
More than you wanted to know.
Let's now turn to autonomous weapons.
The authors of this section are not themselves experts, but they consulted with an expert in national security law.
There is hard congressional law regulating the use of armed force in general.
For example, you're not allowed to shoot innocent Americans.
But to our knowledge, autonomous weapons in particular are only regulated by Department of War policy.
In particular, DoD Directive 3000.09.
These policies don't impose meaningful constraints for two reasons.
First, the policies are vague.