Police in the United Kingdom have been testing the effectiveness of live facial recognition (LFR) for several years now, but future uses of the technology have been called into question.
The Information Commissioner’s Office (ICO), an independent authority that seeks to uphold information rights in the public interest, has weighed in on issues of data privacy related to LFR, and Members of Parliament (MPs) have called for a moratorium on uses of the technology. The big question is whether the benefits of LFR outweigh its impact on privacy rights.
Live facial recognition
The House of Commons Science and Technology Committee has expressed concerns about the bias, privacy and accuracy of facial recognition systems and urged the U.K. government to issue a moratorium on further live facial recognition trials until regulations are in place to address bias and data retention.
According to Elizabeth Denham, U.K. Information Commissioner: “[Police trials of LFR] represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all.” Denham says live facial recognition (LFR) is a high priority area for ICO. “I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering [its] invasiveness,” she says.
Potential public distrust
“Any organisation using software that can recognise a face amongst a crowd and then scan large databases of people to check for a match in a matter of seconds, is processing personal data,” says Denham. General Data Protection Regulation (GDPR) wording specifies biometric data as a ‘sensitive’ category of personal information.
London’s Metropolitan Police Service performed 10 trials of live facial recognition at various venues in 2016, 2017 and 2018. The London Policing Ethics Panel reviewed the trials and concluded that further use of the technology would be supported if certain conditions were met. One condition is that the “overall benefits to public safety [are] great enough to outweigh any potential public distrust in the technology.” Each deployment should be assessed and authorised as necessary and proportionate. Operators should be trained to understand associated risks and to be accountable, and there should be evidence that the technology does not promote gender or racial bias.
Develop strict guidelines
The Ethics Panel also specified that both the Met Police and the Mayor’s Office for Policing and Crime should develop strict guidelines to ensure that deployments balance the benefits of the technology against the potential intrusion on the public. “We want the public to have trust and confidence in the way we operate as a police service, and we take the report’s findings seriously,” said Detective Chief Superintendent Ivan Balhatchet, who led the trials.
In its 10 trials of live facial recognition, Met Police used NEC’s NeoFace technology to analyse images of the faces of people on a watch list. The system measured the structure of each face, including the distance between eyes, nose, mouth and jaw, to create facial data, which was matched against the watch list. The system kept only faces matching the watch list, and only for 30 days; non-matches were deleted immediately.
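The matching and retention policy described above can be sketched in code. This is a minimal illustration, not NEC's implementation: the feature vectors, the cosine-similarity comparison and the `MATCH_THRESHOLD` value are all hypothetical stand-ins for the proprietary face-measurement and matching logic; only the retention behaviour (matches kept for up to 30 days, non-matches never stored) follows the policy stated in the trials.

```python
import math
from dataclasses import dataclass, field
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # matches retained for 30 days, per the Met's stated policy
MATCH_THRESHOLD = 0.6           # hypothetical similarity cut-off, for illustration only


def similarity(a, b):
    """Cosine similarity between two (hypothetical) facial feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


@dataclass
class WatchList:
    entries: dict                                 # name -> reference feature vector
    matches: list = field(default_factory=list)   # (name, timestamp) pairs, kept <= 30 days

    def process_face(self, features, now):
        """Compare one detected face against the watch list.

        A match is stored with a timestamp so it can later be purged;
        a non-match returns None and nothing about it is retained.
        """
        for name, ref in self.entries.items():
            if similarity(features, ref) >= MATCH_THRESHOLD:
                self.matches.append((name, now))
                return name
        return None  # non-match: deleted immediately (never stored)

    def purge_expired(self, now):
        """Delete stored matches older than the 30-day retention window."""
        self.matches = [(n, t) for n, t in self.matches if now - t <= RETENTION]
```

The key design point is that retention is asymmetric: only positive matches ever enter storage, and even those are subject to a hard expiry enforced by `purge_expired`.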
More accurate identification
An independent review of the trials, commissioned by the Metropolitan Police, concluded it is ‘highly possible’ that the Met’s ‘trial’ deployments would not satisfy the key legal test of being considered ‘necessary in a democratic society’ if challenged in the courts, according to U.K. human rights advocacy group Liberty.
South Wales Police have partnered with NEC to formally pilot facial recognition technology. NEC’s real-time solution enables trained officers to monitor movement of people at strategic locations. “Facial recognition technology enables us to search, scan and monitor images and video of suspects against offender databases, leading to faster and more accurate identification of persons of interest,” says Assistant Chief Constable Richard Lewis. “The technology can also enhance our existing CCTV network in the future by extracting faces in real time and instantaneously matching them against a watch list of individuals, including missing people.”
Intrusive technology
“We are very cognisant of concerns about privacy, and we are building in checks and balances into our methodology to reassure the public that the approach we take is justified and proportionate,” says Lewis.
U.K. human rights advocacy group Liberty has taken legal action on behalf of one Cardiff resident against South Wales Police over its use of facial recognition. “Facial recognition is an inherently intrusive technology that breaches our privacy rights,” says lawyer Megan Goulding at Liberty. “It risks fundamentally altering our public spaces, forcing us to monitor where we go and who with, seriously undermining our freedom of expression.” ICO’s Denham says any judgment resulting from the legal action will form an important part of ICO’s investigation and will be considered before ICO’s final findings are published.
Information management
South Wales Police offers the following assurance: “Data will only be retained as long as is necessary for a policing purpose, as per guidance within the Authorised Policing Practice on information management.”
One concern is that live facial recognition ‘discriminates’ against women and people of colour because it disproportionately misidentifies them, thus making them more likely to be subject to police attention. ICO’s Elizabeth Denham comments: “Facial recognition systems are yet to fully resolve their potential for inherent technological bias; a bias which can see more false positive matches from certain ethnic groups.”
Taking regulatory action
ICO has also considered data protection ramifications of commercial companies using LFR. Denham says: “The technology is the same and the intrusion that can arise could still have a detrimental effect. In recent months, we have widened our focus to consider use of LFR in public spaces by private sector organisations, including where they are partnering with police forces. We will consider taking regulatory action where we find non-compliance with the law.”
A 27-page U.K. Home Office Biometrics Strategy sets out an overarching framework within which organisations in the Home Office sector will consider and make decisions on the use and development of biometric technology. However, Biometrics Commissioner Paul Wiles says the document “doesn’t propose legislation to provide rules for the use and oversight of new biometrics, including facial images. Given that new biometrics are being rapidly deployed or trialled, this failure to set out more definitively what the future landscape will look like in terms of the use and governance of biometrics appears to be short-sighted.”