Adam Kucharski
And would it matter whether the people are old or young?
These kinds of decisions can sometimes crop up in real life with human drivers.
In 2020, a car in Michigan swerved to avoid a truck and hit a young couple walking on the pavement, putting them in hospital for several months.
Would AI have reacted differently?
Well, it turned out that the car was also racing side by side with another vehicle at the time, and the driver didn't have a valid license.
Before we get too deep into theoretical dilemmas, we should remember that humans often aren't very good drivers.
If we could ensure there were far fewer accidents on our roads, would you mind being unable to explain the ones that did happen?
In this complex world of ours, maybe we should just abandon the pursuit of explanation altogether.
After all, many data-driven areas of science increasingly focus on prediction because it's fundamentally an easier problem than explaining.
As with anesthesia, we can often make useful predictions about what something will do without fully understanding it.
But explanation can sometimes really matter if we want a better world.
The focus on prediction is particularly troubling in the field of justice.
Increasingly, algorithms are used to decide whether to release people on bail or parole.
The computer isn't deciding whether they've committed a crime.
In effect, it's predicting whether they'll commit one in future.
But ideally, we wouldn't just try and predict future crimes using an opaque algorithm; we'd try and prevent them.
And that means understanding why people reoffend and what we can do to stop that happening.
A lack of interest in explanation leaves a gap, and in this situation that gap creates room for injustice.
But it's not the only thing that can emerge in the gap between what is happening and why it's happening.