Jessica Murray
Yeah, so police forces are really optimistic about this.
They say that it is going to help them massively and that, you know, it's already leading to a number of arrests.
So I think the Metropolitan Police in London are the police force that, I'd say, has used this the most.
In the first five months of this year, they scanned over 1.7 million faces, which was an 87% increase from the same period last year.
And between September 2024 and September 2025, almost 1,000 people were arrested following the use of live facial recognition technology.
So, yeah, the police say it's really effective.
And, for example, they say over 100 sex offenders have been arrested as a result of using this technology.
They gave one instance where a registered sex offender was found alone with a young child as a result of being detected by live facial recognition cameras.
So, yeah, the police say that it's helping them massively. But it's worth noting that other experts say, yes, it is a useful tool, but it has its limitations, and we shouldn't get carried away with how transformative this could be.
Yeah, so there's no denying it is rare, but there have been cases where people have been falsely apprehended or falsely arrested as a result of their face being picked up on live facial recognition technology.
And early evidence suggests that it is people of colour who are more likely to be falsely arrested, because the technology isn't as effective at detecting those faces.
So one case that hit headlines recently was Alvi Chowdhury, who was arrested at his home in Southampton for a burglary in Milton Keynes.
And that was because his face had been picked up on live facial recognition technology that was being used by Thames Valley Police.
But it was completely incorrect.
It was another person of South Asian heritage who had committed that crime.
And the software had obviously just made a mistake and made a false match.
He had never been to Milton Keynes before.
It was 100 miles away from where he lived.
You would think that some kind of human judgment would have gone into that to determine that it was an incorrect match.