The use of computer programs to predict crime hotspots and people who are likely to re-offend risks locking discrimination into the criminal justice system, a report by the human rights group Liberty has warned.
At least a dozen police forces are using or considering predictive analytics. But the report indicates that these programs encourage racial profiling and discrimination, and threaten privacy and freedom of expression.
According to Hannah Couchman, a policy and campaigns officer at Liberty, a key risk with these systems is that they add a technological veneer to biased policing practices. She observed that people assume computer programs are neutral, but in her view they merely entrench the pre-existing biases the police have always shown.
Using data obtained through Freedom of Information requests, the report finds that at least 14 UK forces are using algorithmic programs known as “black boxes” for policing, have previously done so, or have conducted research and trials into them.