Michael
Today, we shift gears into something equally impactful: AI in law and justice.
The big question is, can machines ever be truly fair?
Or will they just replicate the biases already baked into our legal system?
Tools like Casetext's CoCounsel, Harvey AI, and other law-specific large language models, or LLMs, are already being piloted in firms.
They promise to cut costs and even make legal resources more accessible to clients who might otherwise be shut out.
When AI generates a flawed argument or references a hallucinated court case
that doesn't actually exist, and yes, that has happened, who's responsible?
The lawyer, the developer, or the AI itself?
Still, if done right, AI can massively democratize access to justice.
For people who can't afford traditional legal help, an AI system could mean the difference between having a defense and facing the system alone.
One of the most controversial uses of AI in justice is predictive policing.
Algorithms analyze crime data to predict where future crimes might happen.
The problem is that the data itself is biased.
If one neighborhood has historically been over-policed, then of course the data will show higher crime rates there, leading to even more policing in the same area.
It's basically a feedback loop.
Some cities have banned predictive policing altogether after public outcry.
Others are pushing for transparency, requiring algorithmic audits and public reporting on how these predictions are actually made.