In the latest “we have created the Torment Nexus from the classic sci-fi novel Don’t Create the Torment Nexus” news, the Guardian reported that the UK government is developing a predictive algorithm aimed at identifying people who are likely to commit murder. What could possibly go wrong?
The report, which cited documents obtained through freedom of information requests by the transparency group Statewatch, found that the Ministry of Justice has been tasked with designing a profiling system that can flag people who appear capable of committing serious violent crimes before they actually do. The so-called homicide prediction project (since renamed “sharing data to improve risk assessment,” presumably because the original name was a bit too on the nose) ingests data on somewhere between 100,000 and 500,000 people in an attempt to develop models that can identify “predictors in the data for homicide risk.”
The project includes data from the Ministry of Justice (MoJ), the Home Office, Greater Manchester Police (GMP), and London’s Metropolitan Police. The records reportedly are not limited to people with criminal records, but also include data on suspects who were never convicted, as well as victims, witnesses, and missing persons. They also include details about a person’s mental health, addiction, self-harm, suicide, vulnerability, and disability, “health markers” that the MoJ reportedly expects to have “significant predictive power.” Per the Guardian, government officials denied that data on victims or vulnerable populations is being used, and insisted that only data from people with at least one criminal conviction has been used.
It doesn’t take much to see what a bad idea this is, and what the likely end result will be: the disproportionate targeting of poor and marginalized people. But if that isn’t already obvious, you need only look at the predictive justice tools the UK Ministry of Justice has rolled out before and the results they produced.
Take, for example, the government’s Offender Assessment System, which the justice system uses to “predict” whether someone is likely to reoffend, a prediction that judges then factor into sentencing decisions. A government review of the system found that across all offenders, actual reoffending rates were far lower than the predicted rates, especially for nonviolent offenses. And, as you might imagine, the algorithm assessed Black offenders less accurately than white offenders.
This is not just Britain’s problem, of course. Predictive policing tools regularly harm people wherever they are deployed, with the risks falling hardest on marginalized communities. That is a consequence of the racist biases embedded in the data itself, which stem from a history of over-policing of communities of color and low-income communities, a history that produces more police contact, higher arrest rates, and harsher sentencing. Those outcomes get baked into the data, and the algorithms processing that information then amplify them, reinforcing the very behaviors that lead to unequal results.
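To see how that feedback loop sustains itself, here is a toy simulation, a purely hypothetical sketch with made-up numbers that have nothing to do with the MoJ project or the Guardian’s reporting. Two districts have identical true offense rates, but District A starts with twice as many recorded incidents simply because it was policed more heavily in the past:

```python
import random

random.seed(0)

# Two districts with the SAME underlying offense rate. District A starts with
# twice the recorded incidents only because it was historically over-policed.
# (All numbers here are invented for illustration.)
TRUE_RATE = {"A": 0.05, "B": 0.05}
recorded = {"A": 100, "B": 50}

for year in range(1, 6):
    total = sum(recorded.values())
    # The "predictive" model: allocate 1,000 patrols in proportion to the
    # historical record, i.e., in proportion to its own past output.
    patrols = {d: round(1000 * recorded[d] / total) for d in recorded}
    # Each patrol observes an incident with the same probability in both
    # districts, so any gap in new records comes purely from where patrols go.
    for d in recorded:
        recorded[d] += sum(random.random() < TRUE_RATE[d] for _ in range(patrols[d]))
    share_a = recorded["A"] / sum(recorded.values())
    print(f"year {year}: patrols={patrols}, records={recorded}, "
          f"A's share of records={share_a:.0%}")

# Despite identical true rates, District A's roughly 2:1 share of patrols and
# recorded incidents never washes out: the historical bias is baked in, and
# each retraining cycle treats the model's own output as fresh evidence.
```

Run it and District A keeps absorbing about two-thirds of the patrols year after year, even though nothing about the districts actually differs. That is the core problem with training on records produced by biased enforcement.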
Anyway, just as a reminder: we weren’t supposed to embrace the crime-prediction system in Minority Report. We were supposed to be skeptical of it.