Police force use of facial recognition technology ruled unlawful

The Canary

The use of facial recognition technology by police interfered with privacy rights and breached data protection laws, the Court of Appeal has ruled.

Civil rights campaigner Ed Bridges brought a legal challenge against South Wales Police, arguing that its use of automatic facial recognition (AFR) had caused him “distress”.

He had his face scanned while he was Christmas shopping in Cardiff in 2017 and at a peaceful anti-arms protest outside the city’s Motorpoint Arena in 2018.

In a ruling on Tuesday, three Court of Appeal judges found the force’s use of AFR was unlawful, allowing Bridges’ appeal on three of the five grounds he raised in his case.

In the judgment, the judges said that there was no clear guidance on where AFR Locate – the system being trialled by South Wales Police – could be used and who could be put on a watchlist.

The court ruled that “too much discretion is currently left to individual police officers”.

The court also found that a data protection impact assessment of the scheme was deficient and that the force had not done all it could to verify that the AFR software “does not have an unacceptable bias on grounds of race or sex”.

The judgment notes that there was no clear evidence that the software was biased on grounds of race or sex.

Bridges took his case – believed to be the world’s first over police use of such technology – to the Court of Appeal after his challenge was rejected by the High Court.

In a statement after the ruling, Bridges said he was “delighted” the court had found that “facial recognition clearly threatens our rights”.

South Wales Police said the test of their “ground-breaking use of this technology” by the courts had been a “welcome and important step in its development”.

At a three-day Court of Appeal hearing in June, lawyers for Bridges argued that the facial recognition technology interferes with privacy rights, breaches data protection laws and is potentially discriminatory.

They said the technology, which is being trialled by the force with a view to a national roll-out, captures the facial biometrics of large numbers of people in real time and compares them with people on a “watchlist”.

The force does not retain the facial biometric data of anyone whose image is captured on CCTV but does not generate a match, the court heard.

AFR technology maps faces in a crowd by measuring the distance between facial features, then compares the results with a “watchlist” of images – which can include suspects, missing people and persons of interest.

South Wales Police has been conducting a trial of the technology since 2017.
