The Met’s ‘dangerous experimentation’ with facial recognition technology is causing ‘significant concerns’

Van with facial recognition camera

A report has found that in the Metropolitan Police’s trials of live facial recognition (LFR), four out of five people flagged by the technology were wrongly identified. It said there are “significant concerns” about the technology. As a result, Big Brother Watch said the Met must stop LFR deployment “urgently”.

Inaccurate and misleading

Researchers from the Human Rights, Big Data and Technology Project were present for six LFR trials by the Met starting in June 2018. They found it “highly possible” that the trials would not stand up to legal challenge, because the technology wrongly identified people four out of five times. As Sky News reported, researchers could verify just eight of the 42 people flagged up during the trials. The report also highlighted the Met’s use of outdated watch lists.

The report also aired concerns over consent. This included questioning whether the Met made information sufficiently available for members of the public to give informed consent over entering a camera’s field of view. It also raised issues around withdrawing or refusing consent. One example included the police intervening with a person covering their face while walking past LFR. The report said:

treating LFR camera avoidance as suspicious behaviour undermines the premise of informed consent. In addition, the arrest of LFR camera avoiding individuals for more minor offences than those used to justify the test deployments raise clear issues regarding the extension of police powers and of ‘surveillance creep’.

“Utterly damning”

Responding to the findings, Silkie Carlo of privacy group Big Brother Watch said:

This report is an utterly damning conclusion to the police’s dangerous experimentation with live facial recognition. It confirms what we have long warned – it’s inaccurate, lawless, and must be stopped urgently. This message is now coming not just from us, but from the independent reviewers commissioned by the Metropolitan Police themselves. The only question that remains is when will the police finally drop live facial recognition? The public’s freedoms are at stake and it is long overdue.

Carlo went on to say that the Met’s LFR cameras have “no place in Britain”. She also said that, in light of Big Brother Watch’s legal challenge against the Met, she hopes “the force will now decide not to use live facial recognition any further”.

Dangerous technology

The Met claimed its own analysis produced very different results. Its method, which Sky News said compares “successful and unsuccessful matches with the total number of faces processed”, led to just a 0.1% error rate. As a result, Duncan Ball, the Met’s deputy assistant commissioner, said:

We are extremely disappointed with the negative and unbalanced tone of this report… We have a legal basis for this pilot period and have taken legal advice throughout.

We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.

But LFR technology is facing criticism internationally for inaccurate and misleading results, as well as for police misuse of data. This report drives home just how dangerous it is, and the impact it will have on real people’s lives.
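The gulf between the researchers’ four-in-five figure and the Met’s 0.1% comes down to the choice of denominator: the researchers divided incorrect flags by the number of people flagged, while the Met’s method, per Sky News, divides against the total number of faces processed. A rough sketch of the arithmetic (the total-faces figure below is illustrative only, not taken from the report):

```python
# Two ways of measuring the same trial data, using the figures reported.
alerts = 42        # people flagged by LFR across the trials (from the report)
verified = 8       # flags researchers could verify as correct matches
false_alerts = alerts - verified

# Researchers' measure: what share of alerts were wrong?
alert_error_rate = false_alerts / alerts
print(f"Error rate per alert: {alert_error_rate:.0%}")  # roughly four in five

# Met's measure (per Sky News): errors over all faces scanned.
# 34,000 is a purely illustrative total chosen to show how a large
# denominator shrinks the headline figure; the real total wasn't published here.
faces_processed = 34_000
face_error_rate = false_alerts / faces_processed
print(f"Error rate per face processed: {face_error_rate:.2%}")
```

The same 34 false matches can therefore be presented as an 81% error rate or a 0.1% one, depending on whether the denominator is people flagged or faces scanned.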

Featured image via YouTube – JSUK News
