Michael
Now let's talk about sentencing.
Some courts use algorithms like COMPAS to recommend bail, parole, or prison sentences based on risk scores.
On paper, it sounds like an objective system: remove human bias and rely on data.
But in practice, these tools have shown glaring issues.
Even worse, defendants often can't even challenge their scores because the algorithm is proprietary.
Imagine being sentenced to more prison time because of an algorithm you're not even allowed to question.
Because justice isn't just about statistics; it's about compassion, mercy, and understanding context, things no algorithm can replicate.
Some countries are testing AI judges for small claims or routine disputes.
Estonia has piloted an AI system to handle minor financial cases.
Do people want their disputes settled by a machine?
Would you feel truly heard if your divorce case or child custody dispute was decided by a computer algorithm?
The balance may lie in hybrid systems: AI for clerical tasks, humans for the actual judgment calls.
Machines can process thousands of documents instantly, but empathy and discretion still need to come from people.
AI reflects the data it's trained on, and justice data is messy, historical, and unequal.
If an AI system makes harmful decisions, who is responsible?