'Murder prediction' algorithms echo some of Stalin's most horrific policies — governments are treading a very dangerous line in pursuing them
Describing the horrors of communism under Stalin and others, Nobel laureate Aleksandr Solzhenitsyn wrote in his magnum opus, "The Gulag Archipelago," that "the line dividing good and evil cuts through the heart of every human being." Indeed, under the communist regime, citizens were removed from society before they could cause harm to it. This removal, which often entailed a trip to a labor camp from which many did not return, took place in a manner that deprived the accused of due process. In many cases, the mere suspicion, or even a hint, that an act against the regime might occur was enough to earn a one-way ticket with little to no recourse. The underlying premise was that officials knew when someone might commit a transgression. In other words, law enforcement knew where that line lay in people's hearts.
The U.K. government has decided to chase this chimera by investing in a program that seeks to preemptively identify who might commit murder. Specifically, the project uses government and police data to profile people and "predict" who has a high likelihood of committing murder. The program is currently in its research stage, although similar programs are already used to inform probation decisions.
Such a program, which reduces individuals to data points, carries enormous risks that might outweigh any gains. First, the output of such programs is not error-free, meaning it might wrongly implicate people. Second, we will never know when a prediction was incorrect, because there is no way to verify something that doesn't happen: whether a murder was prevented, or would never have taken place at all, remains unanswerable. Third, the program can be misused by opportunistic actors to justify targeting people, especially minorities; the ability to do so is baked into any bureaucracy.
Consider: a bureaucratic state rests on its ability to reduce human beings to numbers. In doing so, it offers the advantages of efficiency and fairness, since no one is supposed to get preferential treatment. Regardless of a person's status or income, the DMV (DVLA in the U.K.) treats an application for a driver's license, or its renewal, the same way. But mistakes happen, and navigating the labyrinth of bureaucratic procedures to rectify them is no easy task.
In the age of algorithms and artificial intelligence (AI), this problem of accountability and recourse in case of errors has become far more pressing.
Mathematician Cathy O'Neil has documented cases of schoolteachers being wrongfully fired over poor scores calculated by an AI algorithm. The algorithm, in turn, was fueled by what could be easily measured (e.g., test scores) rather than by actual teaching effectiveness, such as whether a poorly performing student improved significantly or how much a teacher helped students in non-quantifiable ways. The algorithm also glossed over whether grade inflation had occurred in previous years. When the teachers questioned the authorities about the performance reviews that led to their dismissal, the explanation they received amounted to "the math told us to do so," even after the authorities admitted that the underlying math was not 100% accurate.
As such, the use of algorithms creates what journalist Dan Davies calls an "accountability sink" — it strips accountability by ensuring that no one person or entity can be held responsible, and it prevents the person affected by a decision from being able to fix mistakes.
This creates a twofold problem: an algorithm's estimates can be flawed, and the algorithm is never corrected because no one is held accountable. No algorithm can be expected to be accurate all the time, but it can, in principle, be recalibrated with new data. Even that is an idealistic view that does not always hold true in science: scientists can resist updating a theory or schema, especially when they are heavily invested in it. Unsurprisingly, bureaucracies are no more willing to update their beliefs.
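To make the recalibration point concrete, here is a minimal Python sketch with made-up numbers; it models no real government system. A risk estimate that is fit once on historical data keeps repeating its original error as the world changes, while one that is updated with new outcomes tracks reality.

```python
# Hypothetical toy example: a "frozen" risk estimate versus one that is
# recalibrated as new outcome data arrives. All numbers are invented.
import random

random.seed(42)

def observe(true_rate, n=500):
    """Simulate n observed outcomes (1 = event occurred) at a given rate."""
    return [1 if random.random() < true_rate else 0 for _ in range(n)]

# Historical data: the event rate used to be about 30%.
history = observe(true_rate=0.30)
frozen_estimate = sum(history) / len(history)  # fit once, never revisited

# The world drifts: the true rate falls year after year.
recalibrated = frozen_estimate
for year, true_rate in enumerate([0.25, 0.20, 0.15, 0.10], start=1):
    new_data = observe(true_rate)
    # Recalibration: blend the old estimate with the new evidence.
    recalibrated = 0.5 * recalibrated + 0.5 * sum(new_data) / len(new_data)
    print(f"year {year}: true={true_rate:.2f}  "
          f"frozen={frozen_estimate:.2f}  recalibrated={recalibrated:.2f}")
# The frozen estimate stays near 0.30 long after reality has moved on.
# Without an accountable owner, nothing forces that update to happen.
```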
Using an algorithm in an attempt to predict who is at risk of committing murder is perplexing and unethical. Not only could it be inaccurate, but there is also no way to know whether the system was right. In other words, if a potential future murderer is preemptively arrested, "Minority Report"-style, how can we know if the person may have decided on their own not to commit murder? The U.K. government has yet to clarify how it intends to use the program, beyond stating that the research is being carried out for the purposes of "preventing and detecting unlawful acts."
We're already seeing similar systems being used in the United States. In Louisiana, an algorithm called TIGER (short for "Targeted Interventions to Greater Enhance Re-entry") predicts whether an inmate might commit a crime if released, and its output serves as a basis for parole decisions. Recently, a 70-year-old, nearly blind inmate was denied parole because TIGER predicted he had a high risk of re-offending.
In another case, which eventually went to the Wisconsin Supreme Court (State v. Loomis), an algorithm was used to guide sentencing. Challenges to the sentence, including a request for access to the algorithm to determine how it reached its recommendation, were denied on the grounds that the technology was proprietary. In essence, the system's technological opacity was compounded by its proprietary status in a way that potentially undermined due process.
Equally, if not more, troubling: the dataset underlying the U.K. program, initially dubbed the Homicide Prediction Project, contains records on hundreds of thousands of people who never granted permission for their data to be used to train the system. Worse, the dataset, compiled from the Ministry of Justice, Greater Manchester Police and the Police National Computer, includes personal data covering, but not limited to, addiction, mental health, disabilities, previous instances of self-harm, and whether the individuals had been victims of a crime. Indicators such as gender and race are also included.
These variables naturally increase the likelihood of bias against ethnic minorities and other marginalized groups. So the algorithm's predictions may simply reflect policing choices of the past — predictive AI algorithms rely on statistical induction, so they project past (troubling) patterns in the data into the future.
In addition, the data overrepresents Black offenders from affluent areas, as well as offenders of all ethnicities from deprived neighborhoods. Past studies show that AI algorithms that predict behavior work less well for Black offenders than for other groups. Such findings do little to allay genuine fears that racial minorities and other vulnerable groups will be unfairly targeted.
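The mechanics of this feedback loop are easy to demonstrate. Below is a deliberately simplified Python sketch using invented numbers, modeling no real data or system: two neighborhoods with identical underlying offense rates, one patrolled far more heavily than the other, and a naive predictor trained on the resulting arrest records.

```python
# Hypothetical toy example of biased "training data": arrests reflect
# where police patrol, not how people behave. All numbers are invented.
import random

random.seed(1)

TRUE_OFFENSE_RATE = 0.05                   # identical in both neighborhoods
patrol_intensity = {"A": 0.9, "B": 0.3}    # "A" is historically over-policed

def yearly_arrests(intensity, population=10_000):
    """Offenses occur at the same rate everywhere, but an arrest also
    requires a patrol to be present when the offense happens."""
    offenses = sum(1 for _ in range(population)
                   if random.random() < TRUE_OFFENSE_RATE)
    return sum(1 for _ in range(offenses) if random.random() < intensity)

# The predictor's "training data" = historical arrest counts.
arrests = {hood: yearly_arrests(i) for hood, i in patrol_intensity.items()}
total = sum(arrests.values())
predicted_risk = {hood: count / total for hood, count in arrests.items()}

print("arrest records:", arrests)
print("model's relative 'risk':", predicted_risk)
# Both neighborhoods offend at the same rate, yet the model rates "A"
# about three times riskier, justifying still more patrols there: the
# past pattern is projected straight into the future.
```

Because the records measure arrests rather than offenses, the model learns the patrol schedule, not the behavior, which is exactly how past policing choices get laundered into "objective" predictions.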
In his book, Solzhenitsyn informed the Western world of the horrors of a bureaucratic state grinding down its citizens in service of an ideal, with little regard for the lived experience of human beings. The state was almost always wrong (especially on moral grounds), but, of course, there was no mea culpa. Those who were wronged were simply collateral damage to be forgotten.
Now, half a century later, it is rather strange that a democracy like the U.K. is revisiting a horrific and failed project from an authoritarian Communist country as a way of "protecting the public." The public does need to be protected — not only from criminals but also from a "technopoly" that vastly overestimates the role of technology in building and maintaining a healthy society.