Freedom of information (FOI) requests carried out by The Canary have revealed which police forces in the UK use crime prediction software or predictive policing models. Coverage of predictive policing, also known as pre-crime, is part of our #ResistBigBrother series. Data Justice Lab researcher Fieke Jansen specialises in data-driven police operations. Jansen told The Canary that predictive policing has a couple of different strands:
the first is location policing. It looks at where crime is most likely to occur in the near future. And the other one is predictive identification which is seeing who is potentially most likely to be engaged in a certain criminal activity in the near future.
This type of policing strategy is notorious for being discriminatory and setting a worrying trend for data mining. Civil liberties organisation Liberty explains:
The predictive programs aren’t neutral. They are trained by people and rely on existing police data, and so they reflect patterns of discrimination and further embed them into police practice.
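Liberty's point describes a feedback loop: a model trained on recorded crime sends patrols where crime was recorded before, and officers then record more crime wherever they are sent. The toy simulation below is purely illustrative, with invented numbers and no connection to any real force's model; it assumes two neighbourhoods with identical true crime rates but unequal historical records.

```python
# Illustrative sketch only: a toy version of the feedback loop Liberty
# describes. All numbers are invented; this is not any force's real model.
import random

random.seed(0)

# Two neighbourhoods with the SAME true underlying crime rate.
true_crime_rate = {"A": 0.1, "B": 0.1}

# Historical bias: area A starts with more *recorded* crime because it
# was patrolled more heavily in the past.
recorded_crime = {"A": 50, "B": 10}

for day in range(200):
    # "Predictive" step: send today's patrol to the area with the most
    # recorded crime -- i.e. the model is trained on police data.
    patrolled = max(recorded_crime, key=recorded_crime.get)
    # Crime is only recorded where officers are present to observe it.
    if random.random() < true_crime_rate[patrolled]:
        recorded_crime[patrolled] += 1

print(recorded_crime)  # area A's recorded total grows; area B's stays flat
```

Even though both areas have identical true crime rates, area A's head start means it is patrolled every day, its recorded total climbs, and area B's crime remains invisible to the data — the pattern of discrimination is "embedded into police practice" exactly as Liberty describes.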
Andrew Ferguson, author of The Rise of Big Data Policing, points out that police could hold information on people who have not committed a crime, and that the use of data could encourage police violence, amongst other problems:
the growing web of surveillance threatens to chill associational freedoms, political expression and expectations of privacy by eroding public anonymity… even with the best use policies in place, officers have access to vast amounts of personal information of people not suspected of any crime… without carefully chosen data inputs, long-standing racial, societal and other forms of bias will be reified in the data.
Predictive policing is relatively sophisticated in the United States. The Los Angeles Police Department recently came under criticism for its use of predictive policing:
newly revealed public documents detail how PredPol and Operation Laser, the department’s flagship data-driven programs, validated existing patterns of policing and reinforced decisions to patrol certain people and neighborhoods over others, leading to the over-policing of Black and brown communities in the metropole
What’s the situation in the UK? Our investigations unit sent out FOI requests to police forces asking them if they used any data-driven techniques, what type of software they used, and how much it cost.
West Yorkshire Police
West Yorkshire Police told us they receive support from the College of Policing to use “advanced technology” called Patrol-Wise, which is funded by the Home Office and developed by University College London. They explained that crime data is analysed in Patrol-Wise and the results are communicated to officers on handheld devices.
Humberside Police

Humberside Police told us that:
As part of a predictive policing programme Humberside Police is currently trialling a predictive algorithm.
There is currently no further information available.
West Midlands Police
West Midlands Police told us:
WMP have developed (and are in the process of developing) some predictive models – notably around most serious violence (locations and number); knife crime – used where causing injury (location and number) and estimating the probability of individuals moving from low / middling levels of harm into high harm.
The policing of knife crime is notorious for racially targeting Black and Brown people. Incorporating data algorithms into an already racist structure is a prime example of how data-led strategies embed racism even further into police forces.
Avon and Somerset Police
Avon and Somerset Police told us they use:
– Production reporting/ad hoc querying
– Predictive analytics/risk models/ETL (extraction-transformation-loading)
– Social network analytics
Hampshire Police

Hampshire Police told us they use Demar Forecasting, owned by Process Evolution.
Police forces in Scotland, Cumbria, Lancashire, Cheshire, Kent, North Wales, Gloucestershire, South Yorkshire, Staffordshire, Warwickshire, Derbyshire, Bedfordshire, West Mercia, Surrey, Northumbria, Hertfordshire, Sussex, Wiltshire, Lincolnshire, South Wales, and the Thames Valley, along with the Metropolitan Police, all returned responses saying they either held no information on predictive policing or did not use it. A number of police forces had not responded to the FOI request at the time of publication, even though public authorities are required to respond to FOI requests within 20 working days.
A 2019 report from Liberty lists a number of these police forces as using predictive policing. The Canary contacted them to ask why their responses to our FOIs claimed otherwise. Kent, Warwickshire, and Cheshire police all explained that they had used predictive policing in the past but no longer do.
Kent Police said:
The article you mention does say that we did use Predictive Policing in 2013 but in 2018 we decided not to renew the contract, this information is correct.
Warwickshire Police said:
Historically, Warwickshire Police was briefly involved in a project as part of its strategic alliance with West Mercia. However, we no longer have an active role as a stand-alone force.
Cheshire Police said:
Cheshire Constabulary took part in a short trial back in 2015 and a decision was taken not to use the software.
This tracks with what data expert Jansen told us:
police budget cuts in the UK have meant a lot of predictive policing models stop and start frequently.
Level of sophistication
It’s clear that predictive policing in the UK is not as widespread or sophisticated as in the US. There is also a pattern of forces taking up predictive policing models in the short term and then dropping them for lack of funding. This stop-start landscape means forces rely on government funding and development work from university departments, so shifts in policy directly shape the threat these tools pose to civil liberties.
As the Network for Police Monitoring’s (Netpol) Kevin Blowe points out:
predictive technology is better at identifying patterns than individuals and is only likely to work at all if it involves gathering vast amounts of data over a long period of time. As people from poorer communities are more likely to have data held about them by public services, this is likely to mean they are more likely to become classified as a risk.
Predictive policing itself requires police to hold data on individuals, but the abuse of the technology stems not from the technology itself but from racist structures in policing. As Blowe explains:
The problem is that these are simply tools – and bias comes from the choices made about what crimes to focus on and where. Essentially, if the police’s perception of a problem (and therefore the data it goes looking for) is about ‘gang crime’ then it is likely to replicate the preexisting racial bias on this issue with policing.
Why does any of this matter?
Events in 2021 have highlighted the institutional sexism and racism in the UK police force: the murder of Sarah Everard by a serving police officer; police officers taking pictures with the murdered bodies of Nicole Smallman and Bibaa Henry; crackdowns in the Police, Crime, Sentencing and Courts Bill; the promise of further restrictions in amendments to the Official Secrets Act.
These restrictions come in the context of increasing privatisation of the NHS, deep cuts to the welfare state, the botched handling of the coronavirus (Covid-19) pandemic, and anti-immigrant rhetoric. A swing to far-right policies spells grave trouble for all our civil liberties.
The level of sophistication is not necessarily the problem when it comes to predictive policing. The problem is institutionally racist and corrupt police using data as another tool in their arsenal that can be used to suppress dissent, and to control and surveil citizens. There are already concerns about police leaning on the NHS and mental health services for data. Crackdowns on civil liberties almost always pave the way for a smoother road to even more regressive policies. Resisting police violence means resisting the use of data and technology that infringe privacy rights.
Featured image via Wikimedia Commons/Tony Webster