The UK data watchdog has warned police forces testing live facial recognition (LFR) technology that there are still significant privacy and data protection issues which must be addressed.
South Wales Police and the Met Police have been trialling LFR technology as a possible way to reduce crime, but the move has been divisive.
Facial recognition technology maps faces in a crowd by measuring the distance between facial features, then compares results with a “watch list” of images, which can include suspects, missing people and persons of interest.
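The matching step described above can be sketched in code. This is a minimal illustration only, not how any police system actually works: the feature vectors, names and threshold below are all invented for the example, and real systems use learned embeddings rather than raw landmark distances.

```python
import math

# Hypothetical feature vectors: distances between facial landmarks
# (e.g. eye-to-eye, nose-to-mouth), normalised to a common scale.
watch_list = {
    "suspect_A": [0.42, 0.31, 0.27],
    "missing_person_B": [0.38, 0.35, 0.30],
}

def euclidean(a, b):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(face, threshold=0.05):
    """Return the closest watch-list entry within the threshold, else None."""
    best_name, best_dist = None, float("inf")
    for name, features in watch_list.items():
        d = euclidean(face, features)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None
```

The threshold is the crux: set it loosely and the system produces more false positives (the bias concern raised later in this article), set it tightly and it misses genuine matches.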
Information Commissioner: “A potential threat to privacy that should concern us all”
The Information Commissioner’s Office (ICO) said it understood the legitimate aims of the controversial system, but it told police forces that they need to do more to demonstrate their compliance with data protection law, including in how watch lists are compiled and what images are used.
Data protection law applies if an organisation uses software that can recognise a face in a crowd, then scans large databases of people to check for a match. Our latest blog explains more: https://t.co/LLd7HDnpe3
— ICO (@ICOnews) July 9, 2019
“We understand the purpose is to catch criminals,” Information Commissioner Elizabeth Denham said.
“But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives.
“And that is a potential threat to privacy that should concern us all.
“I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR.”
The watchdog – which is currently investigating the use of live facial recognition – also raised concerns about technological bias, which can produce more false positive matches for people from certain ethnic groups.
A breach of human rights law?
In May, a court heard from an activist who claimed that the use of facial recognition technology by the police is a breach of human rights law.
Lawyers representing Ed Bridges claim South Wales Police violated his privacy and data protection rights by processing an image taken of him in public, an action which he says caused him “distress”.
South Wales Police argues that its use of facial recognition technology does not infringe Mr Bridges’s privacy or data protection rights, as the technology is used in the same way as photographing a person’s activities in public, and the force does not retain the data of those not on its watch list.
But it does keep CCTV images from the scanning process for up to 31 days.
The ICO said the resulting judgment will form an important part of its investigation, and it will need to consider it before publishing its findings.
“In recent months we have widened our focus to consider the use of LFR in public spaces by private-sector organisations, including where they are partnering with police forces,” Denham continued.
“We’ll consider taking regulatory action where we find non-compliance with the law.”