Police use of facial recognition technology ‘should concern us all’, says data watchdog


The UK data watchdog has warned police forces testing live facial recognition (LFR) technology that there are still significant privacy and data protection issues which must be addressed.

South Wales Police and the Met Police have been trialling LFR technology as a possible way to reduce crime, but the move has been divisive.

Facial recognition technology maps faces in a crowd by measuring the distance between facial features, then compares results with a “watch list” of images, which can include suspects, missing people and persons of interest.
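In rough terms, the matching step described above amounts to reducing each scanned face to a set of numbers and flagging anyone whose numbers fall close enough to a watch-list entry. The short sketch below is illustrative only: the embeddings, names and threshold are invented placeholders, not details of the systems used by South Wales Police or the Met.

```python
import numpy as np
from typing import Optional

# Hypothetical watch list: each identity is reduced to a numeric "embedding"
# derived from facial measurements. Real systems use trained models; these
# values are made up for illustration.
WATCH_LIST = {
    "suspect_A": np.array([0.12, 0.48, 0.33, 0.91]),
    "missing_person_B": np.array([0.72, 0.05, 0.64, 0.22]),
}

# Assumed tolerance: lowering it reduces false positives but misses more matches.
MATCH_THRESHOLD = 0.25


def match_face(embedding: np.ndarray) -> Optional[str]:
    """Return the closest watch-list identity, if it is within the tolerance."""
    best_name, best_distance = None, float("inf")
    for name, reference in WATCH_LIST.items():
        distance = np.linalg.norm(embedding - reference)  # Euclidean distance
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < MATCH_THRESHOLD else None


# A face scanned in the crowd whose measurements happen to sit near "suspect_A".
print(match_face(np.array([0.10, 0.50, 0.30, 0.90])))  # -> suspect_A
```

The choice of threshold in a sketch like this is exactly where the bias concern raised later in the article arises: if the underlying measurements are less reliable for some groups, the same tolerance will produce more false positive matches for them.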

Information Commissioner: “A potential threat to privacy that should concern us all”

The Information Commissioner’s Office (ICO) said it understood the legitimate aims of the controversial system, but it told police forces that they need to do more to demonstrate their compliance with data protection law, including in how watch lists are compiled and what images are used.


“We understand the purpose is to catch criminals,” Information Commissioner Elizabeth Denham said.

“But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives.

“And that is a potential threat to privacy that should concern us all.

“I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR.”

The watchdog – which is currently investigating the use of live facial recognition – also raised concern about technological bias, which can produce more false positive matches for people from certain ethnic groups.

Facial recognition technology being trialled in Leicester Square, London (Kirsty O’Connor/PA)

A breach of human rights law?

In May, a court heard from an activist who claimed that the use of facial recognition technology by the police is a breach of human rights law.

Lawyers representing Ed Bridges claim South Wales Police violated his privacy and data protection rights by processing an image taken of him in public, an action he says caused him “distress”.

South Wales Police argue that the use of facial recognition technology does not infringe Mr Bridges’ privacy or data protection rights, because it is used in the same way as photographing a person’s activities in public and the force does not retain the data of those not on its watch list.

But it does keep CCTV images from the scanning process for up to 31 days.

The ICO said the resulting judgment will form an important part of its investigation, and it will need to consider it before publishing its findings.

“In recent months we have widened our focus to consider the use of LFR in public spaces by private-sector organisations, including where they are partnering with police forces,” Denham continued.

“We’ll consider taking regulatory action where we find non-compliance with the law.”

