Police force use of facial recognition technology ruled unlawful

The use of facial recognition technology by police interfered with privacy rights and breached data protection laws, the Court of Appeal has ruled.

Civil rights campaigner Ed Bridges brought a legal challenge against South Wales Police, arguing the force’s use of automatic facial recognition (AFR) had caused him “distress”.

He had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.

In a ruling on Tuesday, three Court of Appeal judges found the force’s use of AFR to be unlawful, allowing Bridges’ appeal on three of the five grounds he raised.

In the judgment, the judges said that there was no clear guidance on where AFR Locate – the system being trialled by South Wales Police – could be used and who could be put on a watchlist.

The court ruled that “too much discretion is currently left to individual police officers”.

The court also found that a data protection impact assessment of the scheme was deficient, and that the force had not done all it could to verify that the AFR software “does not have an unacceptable bias on grounds of race or sex”.

The judgment noted, however, that there was no clear evidence the software was biased on grounds of race or sex.

Bridges took his case – believed to be the world’s first over police use of such technology – to the Court of Appeal after it was rejected by the High Court.

In a statement after the ruling, Bridges said he was “delighted” the court had found that “facial recognition clearly threatens our rights”.

South Wales Police said the testing of its “ground-breaking use of this technology” by the courts had been a “welcome and important step in its development”.

At a three-day Court of Appeal hearing in June, lawyers for Bridges argued the facial recognition technology interferes with privacy rights, breaches data protection laws and is potentially discriminatory.

They said the technology, which is being trialled by the force with a view to rolling it out nationally, is used to capture the facial biometrics of large numbers of people in real time and compare them against a “watchlist”.

The force does not retain the facial biometric data of anyone whose image is captured on CCTV but doesn’t generate a match, the court heard.

AFR technology maps faces in a crowd by measuring the distance between features, then compares the results with a “watchlist” of images – which can include suspects, missing people and persons of interest.
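
To give a rough sense of how such a comparison works (this is not a description of South Wales Police’s actual system; the embeddings, names and matching threshold below are invented purely for illustration), the core step can be sketched as measuring the distance between a scanned face’s feature vector and each entry on a watchlist:

```python
import numpy as np

# Hypothetical, randomly generated face embeddings standing in for a small
# "watchlist"; a real AFR system would derive these from a trained
# face-recognition model applied to watchlist photographs.
rng = np.random.default_rng(seed=0)
watchlist = {
    "suspect_A": rng.normal(size=128),
    "missing_person_B": rng.normal(size=128),
}

def best_watchlist_match(probe, watchlist, threshold=0.6):
    """Return the closest watchlist entry if its cosine distance to the
    probe embedding falls below the threshold, otherwise None (no match,
    in which case the probe's biometric data would not be retained)."""
    best_name, best_distance = None, float("inf")
    for name, reference in watchlist.items():
        # Cosine distance between the two embedding vectors.
        cosine_similarity = np.dot(probe, reference) / (
            np.linalg.norm(probe) * np.linalg.norm(reference)
        )
        distance = 1.0 - cosine_similarity
        if distance < best_distance:
            best_name, best_distance = name, distance
    return best_name if best_distance < threshold else None

# A face scanned from a crowd (again, just a random stand-in vector).
scanned_face = rng.normal(size=128)
print(best_watchlist_match(scanned_face, watchlist))  # most likely None
```

The bias question raised in the judgment maps onto this kind of step: whether match rates at a given threshold differ systematically by race or sex.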

South Wales Police has been conducting a trial of the technology since 2017.
