WTF?! There have been several stories over the years about different governments creating crime-predicting algorithms, leading to comparisons to the 2002 movie Minority Report – even though that film involved clairvoyant humans. The UK government is the latest to come under the spotlight for working on this technology, but officials insist it is only a research project – at least for now.
The UK government’s program, originally called the “homicide prediction project,” uses algorithms to analyze data on hundreds of thousands of people, including victims of crime, in the hope of identifying those most likely to commit serious violent offences, writes The Guardian.
Civil liberties group Statewatch uncovered the project through the Freedom of Information Act. The group claims the tool was developed using data from between 100,000 and 500,000 people, including not only those with criminal convictions but also victims of crime. Officials deny this, saying the project uses only existing data on convicted offenders.
The data includes names, dates of birth, gender, ethnicity, and a number that identifies people on the Police National Computer. It also covers sensitive information such as mental health, addiction, suicide and vulnerability, self-harm, and disabilities.
“The Ministry of Justice’s attempt to build this murder prediction system is the latest chilling and dystopian example of the government’s intent to develop so-called crime ‘prediction’ systems,” said Sofia Lyall, a researcher for Statewatch.
“Time and again, research shows that algorithmic systems for ‘predicting’ crime are inherently flawed.”
“This latest model, which uses data from our institutionally racist police and Home Office, will reinforce and magnify the structural discrimination underpinning the criminal legal system.”
Officials say the program is an extension of existing risk-prediction tools, which are often used to estimate the likelihood of a prisoner reoffending as they approach their release date. They added that the project is designed to test whether adding new sources, such as police and custody data, would improve risk assessment.
A Ministry of Justice spokesperson said the project is being conducted for research purposes only.
There’s a long history of crime-predicting algorithms that draw comparisons to Minority Report, including South Korea’s “Dejaview,” an AI system that analyzes CCTV footage for patterns and signs of impending crimes in an effort to prevent them.
In 2022, university researchers said they had developed an algorithm that could predict future crime one week in advance with an accuracy of 90%.
Also in 2022, it was reported that China was looking at ways to build profiles of its citizens, from which an automated system could predict potential dissidents or criminals before they have a chance to act on their impulses.