The pitch is that it helps allocate police resources more efficiently.
Cities like Chicago, Los Angeles, and London have experimented with this technology.
That means predictive policing can reinforce racial and socioeconomic biases, unfairly targeting minority and lower-income communities.
And because the system is opaque, citizens don't know why they're being surveilled.
Predictive policing is less about predicting crime and more about predicting where the police will go next.
That's a dangerous distortion of justice.
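The feedback loop behind that distortion is easy to sketch. In this hypothetical simulation (the district names, rates, and patrol rule are all invented for illustration, not taken from any real system), two districts have the same true crime rate, but the one with more historical records keeps attracting the patrol, and therefore keeps generating more records:

```python
import random

random.seed(0)

# Two districts with the SAME underlying crime rate.
true_crime_rate = 0.1          # identical in both districts
recorded = {"A": 20, "B": 10}  # district A merely starts with more records

for day in range(200):
    # send the single patrol wherever past recorded crime is highest
    patrolled = max(recorded, key=recorded.get)
    # crime is only recorded where an officer is present to observe it
    if random.random() < true_crime_rate:
        recorded[patrolled] += 1

print(recorded)  # district A accumulates all the new records; B stays frozen
```

The model isn't learning where crime happens; it's learning where its own data collection happens, which is exactly the "predicting where the police will go next" problem.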
A ProPublica investigation revealed that COMPAS gave Black defendants higher reoffending-risk scores than white defendants under similar circumstances.
That means the algorithm wasn't neutral.
It amplified existing systemic biases.
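One way to see that amplification is to compare false positive rates across groups, which is the disparity the COMPAS investigation highlighted. This sketch uses small synthetic records (not real COMPAS data; the field names and counts are made up for illustration):

```python
def false_positive_rate(records):
    """Share of people who did NOT reoffend but were still labeled high risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    flagged = sum(1 for r in non_reoffenders if r["high_risk"])
    return flagged / len(non_reoffenders)

# Toy data: both groups have 10 non-reoffenders and 5 true reoffenders,
# yet the model wrongly flags group A's non-reoffenders far more often.
group_a = (
    [{"high_risk": True,  "reoffended": False}] * 4   # wrongly flagged
  + [{"high_risk": False, "reoffended": False}] * 6
  + [{"high_risk": True,  "reoffended": True}]  * 5
)
group_b = (
    [{"high_risk": True,  "reoffended": False}] * 1
  + [{"high_risk": False, "reoffended": False}] * 9
  + [{"high_risk": True,  "reoffended": True}]  * 5
)

print(false_positive_rate(group_a))  # 0.4
print(false_positive_rate(group_b))  # 0.1
```

A score like this can look accurate overall while still making one group bear four times the cost of wrongful high-risk labels.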
That raises the core question.
Should algorithms influence decisions about human freedom at all?
Or should they be used only as advisory tools, never the final say?
That could clear backlogs and speed up court systems bogged down by paperwork.
In some regions, court cases can drag on for years, so efficiency would be a real win.
And there are cultural differences too.
In some countries, citizens might accept AI arbitration as normal.
In others, the lack of a human judge could feel deeply unjust.