FOI requests reveal which police forces use crime prediction software

A line of police tape at a shooting in Minneapolis, Minnesota, at night.

Freedom of information (FOI) requests submitted by The Canary have revealed which police forces in the UK use crime prediction software or predictive policing models. Coverage of predictive policing, also known as pre-crime, is part of our #ResistBigBrother series. Data Justice Lab researcher Fieke Jansen specialises in data-driven police operations. Jansen told The Canary that predictive policing has two main strands:

the first is location policing. It looks at where crime is most likely to occur in the near future. And the other one is predictive identification which is seeing who is potentially most likely to be engaged in a certain criminal activity in the near future.
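Jansen’s two strands can be made concrete with a simplified sketch. The short Python example below illustrates the “location” strand only: it counts historical incident reports in coarse grid cells and flags the highest-scoring cells as future “hotspots”. This is a toy illustration under assumed data and grid sizes, not the software used by any police force or vendor mentioned in this article.

```python
# Toy illustration of the "location" strand of predictive policing.
# This is NOT any force's or vendor's actual model; the data, grid size and
# scoring rule are assumptions made purely for explanation.
from collections import Counter

# Hypothetical historical incident records: (x, y) map coordinates of past reports.
past_incidents = [(3, 7), (3, 7), (3, 8), (10, 2), (3, 7), (10, 2), (5, 5)]

CELL_SIZE = 2  # assumed grid resolution

def to_cell(point):
    """Map a coordinate onto a coarse grid cell."""
    x, y = point
    return (x // CELL_SIZE, y // CELL_SIZE)

# Score each grid cell by how many recorded incidents fall inside it.
cell_scores = Counter(to_cell(p) for p in past_incidents)

# "Predict" future hotspots: simply the cells with the most recorded history.
print(cell_scores.most_common(2))  # these cells would be flagged for extra patrols
```

Even this crude sketch shows the limitation critics raise below: the model can only flag places that already appear in police records, so wherever past enforcement concentrated is exactly where future patrols are sent.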

This type of policing strategy is notorious for being discriminatory and for setting a worrying precedent for data mining. Civil liberties organisation Liberty explains:

The predictive programs aren’t neutral. They are trained by people and rely on existing police data, and so they reflect patterns of discrimination and further embed them into police practice.

Andrew Ferguson, author of The Rise of Big Data Policing, points out that police could hold information on people who have not committed a crime, and that the use of data could encourage police violence, amongst other problems:

the growing web of surveillance threatens to chill associational freedoms, political expression and expectations of privacy by eroding public anonymity… even with the best use policies in place, officers have access to vast amounts of personal information of people not suspected of any crime… without carefully chosen data inputs, long-standing racial, societal and other forms of bias will be reified in the data.

Predictive policing is relatively sophisticated in the United States, where the Los Angeles Police Department recently came under criticism for its use of these programs:

newly revealed public documents detail how PredPol and Operation Laser, the department’s flagship data-driven programs, validated existing patterns of policing and reinforced decisions to patrol certain people and neighborhoods over others, leading to the over-policing of Black and brown communities in the metropole

What’s the situation in the UK? Our investigations unit sent FOI requests to police forces asking whether they used any data-driven techniques, what type of software they used, and how much it cost.

West Yorkshire Police

West Yorkshire Police told us that they receive support from the College of Policing to use “advanced technology” called Patrol-Wise, which is funded by the Home Office and developed by University College London. They explained to us that crime data is analysed in Patrol-Wise and communicated to officers on handheld devices.

Humberside Police

Humberside Police told us that:

As part of a predictive policing programme Humberside Police is currently trialling a predictive algorithm.

There is currently no further information available.

West Midlands Police

West Midlands Police told us:

WMP have developed (and are in the process of developing) some predictive models – notably around most serious violence (locations and number); knife crime – used where causing injury (location and number) and estimating the probability of individuals moving from low / middling levels of harm into high harm.

The policing of knife crime is notorious for racially targeting Black and Brown people. Incorporating data algorithms into an already racist structure is a prime example of how data-led policing strategies embed racism even further into forces.

Avon and Somerset Police

Avon and Somerset Police told us they use:

– Data visualisation/dashboards

– Production reporting/ad hoc querying

– Predictive analytics/risk models/ETL (extraction-transformation-loading)

– Social network analytics

Hampshire Police

Hampshire Police told us they use Demar Forecasting, owned by Process Evolution.

Data-led techniques

Police forces in Scotland, Cumbria, Lancashire, Cheshire, Kent, North Wales, Gloucestershire, South Yorkshire, Staffordshire, Warwickshire, Derbyshire, Bedfordshire, West Mercia, Surrey, Northumbria, Hertfordshire, Sussex, Wiltshire, Lincolnshire, South Wales, the Thames Valley, and the Metropolitan Police all responded that they either held no information on predictive policing or did not use it. A number of police forces had not responded to the FOI request at the time of publication, even though public authorities are required to respond within 20 working days.

Liberty

A 2019 report from Liberty listed a number of the above police forces as using predictive policing. The Canary asked these forces why their responses to our FOI requests said they did not use predictive policing when Liberty’s report listed them as doing so. Kent, Warwickshire, and Cheshire police all explained that they had used predictive policing in the past but no longer do.

Kent Police said:

The article you mention does say that we did use Predictive Policing in 2013 but in 2018 we decided not to renew the contract, this information is correct.

Warwickshire Police said:

Historically, Warwickshire Police was briefly involved in a project as part of its strategic alliance with West Mercia.  However, we no longer have an active role as a stand-alone force.

Cheshire Police said:

Cheshire Constabulary took part in a short trial back in 2015 and a decision was taken not to use the software.

This tracks with what data expert Jansen told us:

police budget cuts in the UK have meant a lot of predictive policing models stop and start frequently.

Level of sophistication

Predictive policing in the UK appears to be less widespread and sophisticated than in the US. There is also a pattern of forces taking up predictive policing models in the short term and then dropping them due to funding. This stop-start landscape means that forces rely on government funding and development work from university departments, which in turn means the threat to civil liberties can shift with changes in policy and funding priorities.

As the Network for Police Monitoring’s (Netpol) Kevin Blowe points out:

predictive technology is better at identifying patterns than individuals and is only likely to work at all if it involves gathering vast amounts of data over a long period of time. As people from poorer communities are more likely to have data held about them by public services, this is likely to mean they are more likely to become classified as a risk.

Predictive policing itself requires the police to hold data on individuals, but the abuse of the technology is not a problem with the technology itself; it is a problem with racist structures in policing. As Blowe explains:

The problem is that these are simply tools – and bias comes from the choices made about what crimes to focus on and where. Essentially, if the police’s perception of a problem (and therefore the data it goes looking for) is about ‘gang crime’ then it is likely to replicate the preexisting racial bias on this issue with policing.
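Blowe’s point about recorded data can be illustrated with a second toy sketch, this time of the “identification” strand: a naive risk score that simply counts how many records public services hold about a person. The field names and scoring rule here are hypothetical and not drawn from any force’s actual model; the point is only that someone with more recorded contact with public services mechanically comes out as higher “risk”.

```python
# Toy illustration of the "predictive identification" strand.
# Field names and the scoring rule are hypothetical, for explanation only.

def risk_score(person: dict) -> int:
    """Naive score: count the records held about a person across services."""
    return (
        len(person.get("police_contacts", []))
        + len(person.get("housing_records", []))
        + len(person.get("benefits_records", []))
    )

# Two hypothetical people: one has far more data held about them by public
# services, purely because they have had more contact with those services.
person_a = {
    "police_contacts": ["stop_2019"],
    "housing_records": ["h1", "h2"],
    "benefits_records": ["b1", "b2", "b3"],
}
person_b = {"police_contacts": [], "housing_records": [], "benefits_records": []}

print(risk_score(person_a), risk_score(person_b))  # 6 vs 0: more data held, higher "risk"
```

Under this kind of scoring, the bias does not come from the arithmetic but from which records exist in the first place, which is exactly the concern Blowe describes.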

Why does any of this matter?

Events in 2021 have highlighted the institutional sexism and racism in UK policing: the murder of Sarah Everard by a serving police officer; police officers taking pictures with the murdered bodies of Nicole Smallman and Bibaa Henry; crackdowns in the Police, Crime, Sentencing and Courts Bill; and the promise of further restrictions in amendments to the Official Secrets Act.

These restrictions come in the context of increasing privatisation of the NHS, deep cuts to the welfare state, the botched handling of the coronavirus (Covid-19) pandemic, and anti-immigrant rhetoric. A swing to far-right policies spells grave trouble for all our civil liberties.

The level of sophistication is not necessarily the problem when it comes to predictive policing. The problem is institutionally racist and corrupt police using data as another tool in their arsenal to suppress dissent, and to control and surveil citizens. There are already concerns about police leaning on the NHS and mental health services for data. Crackdowns on civil liberties almost always pave the way for even more regressive policies. Resisting police violence means resisting the use of data and technology that infringe privacy rights.

Featured image via Wikimedia Commons/Tony Webster
