U.K. Police Need to Put the Brakes on Facial Recognition

Police in the U.K., backed by the government, are testing a facial-recognition system that is 20 percent accurate and treating those who avoid its gaze as potential suspects.

By Adam Smith

This story originally appeared on PCMag


The United Kingdom has a close relationship with security cameras. London alone has one of the highest ratios of surveillance cameras per citizen in the developed world. Estimates from 2002 put the number of surveillance cameras in Greater London at more than 500,000; around 110 are used by the City of London Police, according to data obtained through a 2018 Freedom of Information request.

Being recorded apparently is not enough; London's Metropolitan Police Service has been testing the use of facial-recognition cameras, and the effort has the support of Home Secretary Sajid Javid -- who oversees immigration, citizenship, the police force and the security service. "I think it's right they look at that," he said, according to the BBC.

Although the upcoming election will decide the new leader of the Conservative Party, who will also become Prime Minister, it is unlikely that government attitudes toward facial recognition will change. Javid might move to another part of the government, but the civil libertarian side of the Conservative Party has been relatively quiet of late.

The problem is, facial recognition -- as it currently stands -- is often inaccurate. London police have been using facial recognition since 2016, but an independent report released last week revealed that four out of five people identified by facial recognition as possible suspects were actually innocent -- a distinct failing in the machine learning used to train the system.

Professor Pete Fussey and Dr. Daragh Murray, from the University of Essex, analyzed the accuracy of six of the 10 police trials. Of 42 matches, only eight were correct, and four of the 42 could never be verified because the individuals were lost in the crowd.

Nevertheless, the Metropolitan Police sees the trials as a success and was "disappointed with the negative and unbalanced tone of this report," a deputy assistant commissioner told Sky News. The Met measures accuracy by comparing successful and unsuccessful matches to the total number of faces processed; by this rubric, the error rate was only 0.1 percent.
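The gulf between "four out of five wrong" and "0.1 percent error" comes down to the denominator. A minimal sketch of the arithmetic is below; the 42 alerts and eight confirmed hits come from the Essex report cited above, while the total number of faces scanned is not stated in this article, so that figure is an assumed, illustrative value only.

```python
# Two ways of framing the same trial results.
total_matches = 42          # alerts raised across the analyzed trials (from the Essex report)
correct_matches = 8         # alerts confirmed as genuine identifications (from the Essex report)
faces_processed = 34_000    # ASSUMED total faces scanned, purely for illustration

false_matches = total_matches - correct_matches

# The report's framing: of the people the system flagged, how many were actually right?
precision = correct_matches / total_matches
print(f"Share of flags that were correct: {precision:.0%}")      # ~19%
print(f"Share of flags that were wrong:   {1 - precision:.0%}")  # ~81%, i.e. "four out of five"

# The Met's framing: false matches as a share of every face the cameras processed.
met_error_rate = false_matches / faces_processed
print(f"False matches per face processed: {met_error_rate:.1%}") # ~0.1% with the assumed denominator
```

Because almost everyone walking past a camera is not on a watchlist, dividing by every face processed will always produce a reassuringly tiny number, regardless of how often the alerts themselves are wrong.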

That was not the only error, however. The database used by the police was out of date, so it flagged people whose cases had already been closed. There is also "significant ambiguity" over the criteria for placing a person on the watchlist, the report states.

The Metropolitan Police informed citizens about the trials by handing out leaflets and tweeting, but the report deems this insufficient. "The information provided transparency regarding the time and location of the [live facial recognition] test deployments yet there was less clarity over the purpose of the deployment, who was likely to be the subject of surveillance, and how additional information could be ascertained," the report says. Moreover, treating those who tried to avoid cameras "as suspicious ... undermines the premise of informed consent."

The report concludes that it's "highly possible [the trial] would be held unlawful if challenged before the courts." The implicit legal authority "coupled with the absence of publicly available, clear, online guidance is likely inadequate" when compared to human rights law, which requires that interference with individuals' human rights be "in accordance with the law, pursue a legitimate aim, and be 'necessary in a democratic society.'"

Controversy across the pond

The United Kingdom isn't the only country struggling with this problem. In the United States, facial-recognition algorithms have drawn criticism after research by the Government Accountability Office found that the systems used by the FBI were inaccurate 14 percent of the time. Moreover, that number does not take the false-positive rate into account, which "presents an incomplete view of the system's accuracy" and can adversely affect minorities due to systemic bias.

Microsoft also rejected a request by California law enforcement to use its facial-recognition system in police cars and body cameras, because of concerns that its algorithm was not sophisticated enough. "Anytime they pulled anyone over, they wanted to run a face scan," Microsoft President Brad Smith said. "We said this technology is not your answer."

A number of cities have rejected it; San Francisco outlawed facial-recognition technologies for government use, and Somerville, Mass., voted unanimously to pass anti-facial-recognition legislation because of its potential to "chill" protected free speech.

And while the United Kingdom's government can be much more compliant regarding potentially oppressive technology -- from internet regulation to stop malicious content on social media to a near-ban on pornography -- its citizens should be careful about letting it propagate.

Even if its accuracy improves, the difficulty of knowing when, where, and how facial-recognition software is being used means citizens cannot give adequate consent to being constantly recorded and identified. There are too many warranted concerns about the data the algorithms are trained on, the spaces being surveilled, and the effect on civil liberties for people to let facial recognition interfere with their right to a private life.

Adam Smith

Contributing Editor PC Mag UK

Adam Smith is the Contributing Editor for PCMag UK, and has written about technology for a number of publications including What Hi-Fi?, Stuff, WhatCulture, and MacFormat, reviewing smartphones, speakers, projectors, and all manner of weird tech. Always online, occasionally cromulent, you can follow him on Twitter @adamndsmith.
