Met Police facial recognition technology will hit BAME communities the hardest


London's Metropolitan Police has announced plans to roll out facial recognition technology. Under these plans, suspects will be placed on “watchlists” and approached by officers if spotted on cameras.

Civil liberties group Big Brother Watch has criticised the decision, calling it “an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK”. There are also concerns about the accuracy of the technology. Big Brother Watch director Silkie Carlo said the rollout:

flies in the face of the independent review showing the Met’s use of facial recognition was unlawful, risked harming public rights and was 81% inaccurate.

She added:

This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the Home Secretary.

Racism

As well as the issue of civil liberties, there are serious concerns about how the software will impact BAME communities. The racism inherent in facial recognition software has been well-reported. What's more, over-surveilled groups such as Muslims, Black people and Asian people are at even greater risk.


What's more, the Met's announcement came in the same week it was reported that the EU is considering a ban on facial recognition “to prevent the technology being abused”.

Precedents

Sadly, the misuse of facial recognition to criminalise minorities and vulnerable groups is not merely hypothetical. Examples already abound, for instance in China, where the technology has been used to target Uyghur Muslims and other minorities.

Similarly, in August 2019, misuse of facial recognition against Black people in the US city of Detroit came to light. Immigration rights groups have also flagged serious concerns over the way US immigration control has used the technology.

So for those worried about the use of facial recognition signalling a dystopian future, there’s bad news. For criminalised communities and People of Colour, that future is already here.

Featured image via Piqsels
