London police chief ‘completely comfortable’ using facial recognition with 98 percent false positive rate

Authored by theverge.com and submitted by redkemper

The head of London’s Metropolitan Police force has defended the organization’s ongoing trials of automated facial recognition systems, despite legal challenges and criticisms that the technology is “almost entirely inaccurate.”

According to a report from The Register, UK Metropolitan Police commissioner Cressida Dick said on Wednesday that she did not expect the technology to lead to “lots of arrests,” but argued that the public “expect[s]” law enforcement to test such cutting-edge systems.

Facial recognition is used to scan the faces of crowds at public events

The Met’s use of automated facial recognition technology (AFR) is controversial. The London force is one of several in the UK trialling the technology, which is deployed at public events like concerts, festivals, and soccer matches. Mobile CCTV cameras scan crowds, and the system tries to match the faces it captures against mugshots of wanted individuals.

But while facial recognition performs well in controlled environments (like photos taken at borders), such systems struggle to identify faces in the wild. According to data released under the UK’s Freedom of Information laws, the Metropolitan’s AFR system has a 98 percent false positive rate — meaning that 98 percent of the “matches” it makes are of innocent people.
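A rough sketch of the arithmetic behind that figure may help. Note that the statistic quoted is the share of *alerts* that were wrong (sometimes called the false discovery rate), not the fraction of all scanned faces misidentified. The counts below are illustrative assumptions chosen to produce the quoted ratio; the article confirms only the 98 percent share and the two correct matches.

```python
# Illustrative sketch: how "98 percent of matches were of innocent people"
# can coexist with only two correct matches. Counts are hypothetical,
# chosen so the ratio matches the article's figure.

true_matches = 2          # correct identifications (stated in the article)
false_matches = 98        # hypothetical: false alerts needed for a 98% share

total_alerts = true_matches + false_matches
wrong_share = false_matches / total_alerts

print(f"Total alerts: {total_alerts}")
print(f"Share of alerts that were wrong: {wrong_share:.0%}")
```

The key point is that the denominator is the number of alerts the system raised, not the number of people scanned — so a system can flag very few people overall and still have almost all of its flags be wrong.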

The Met’s technology has made only two correct matches to date, and neither resulted in an arrest. One match was for an individual on an out-of-date watch list; the other for a person with mental health issues who frequently contacts public figures, but is not a criminal and not wanted for arrest. The Met says that AFR systems are constantly monitored by police officers, and that no individuals have been arrested because of a false match.

Despite this, Big Brother Watch, the organization that requested the UK data, warns that facial recognition technology is being deployed without proper scrutiny or public debate. The non-profit says automated facial recognition risks turning any and all public spaces into biometric checkpoints, and that the technology could have a chilling effect on free society, with individuals scared to join protests for fear of being misidentified and arrested.

Similar fears are being voiced in the US, where easy-to-use facial recognition tech like Amazon’s Rekognition system is being marketed and sold to law enforcement agencies around the country. A recent report on the topic from advocacy group the EFF said “face recognition is poised to become one of the most pervasive surveillance technologies.”

In the UK, there are two legal challenges underway questioning whether facial recognition technology undermines human rights to privacy and free expression. As The Register reports, when commissioner Dick was asked about this at a hearing this week, she replied that she was “completely comfortable” with the technology’s use, and that the Met’s lawyers were “all over it and have been from the beginning.”

EldBjoern on July 5th, 2018 at 13:05 UTC »

So the system scans 1k people. Of those it flags 100 people. And of the flagged people, 98 are falsely flagged? Right?

C-G-B_Spender on July 5th, 2018 at 11:45 UTC »

It might be of interest to some to know that Dick rose to prominence when an operation she was in charge of led to the shooting of an innocent man in a case of mistaken identity.

Lanhdanan on July 5th, 2018 at 10:28 UTC »

She's being paid very well. Understandable that she's completely comfortable. Until one of her family members or friends gets tagged.

Edit: Gender