Predictive policing — the use of machine-learning algorithms to fight crime — risks unfair discrimination on the basis of characteristics including race, sexuality and age, a security think tank said.
Such algorithms, used to mine insights from data collected by police, are deployed for purposes including facial recognition, mobile phone data extraction, social media analysis, predictive crime mapping and individual risk assessment.
Researchers at the Royal United Services Institute, commissioned by the British government’s Centre for Data Ethics and Innovation, focused on predictive crime mapping and individual risk assessment and found that algorithms trained on police data might replicate — and in some cases amplify — biases inherent in the data set, such as over or under-policing of certain communities.
The paper, Data Analytics and Algorithmic Bias in Policing, by Alexander Babuta and Marion Oswald, summarizes the interim findings of an ongoing independent study into the use of data analytics for policing within England and Wales and explores different types of bias that can arise.
“The effects of a biased sample could be amplified by algorithmic predictions via a feedback loop, whereby future policing is predicted, not future crime,” the authors said.
The paper said that police officers interviewed for the research are concerned about the lack of safeguards and oversight regarding the use of predictive policing.
One officer told the researchers that “young black men are more likely to be stopped and searched than young white men, and that’s purely down to human bias.”
“That human bias is then introduced into the data sets and bias is then generated in the outcomes of the application of those data sets,” the officer said.
Another officer said police forces “pile loads of resources into a certain area and it becomes a self-fulfilling prophecy, purely because there’s more policing going into that area, not necessarily because of discrimination on the part of officers.”
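The feedback loop the paper and the officers describe can be illustrated with a deliberately simple simulation. Everything here is invented for illustration — the two areas, the starting figures, the squared "hotspot" weighting rule and the records-per-patrol constant are assumptions, not RUSI's model — but it shows the mechanism: two areas with identical true crime rates diverge because one starts with more recorded crime, and recording rises with patrol presence.

```python
# Hypothetical sketch of the feedback loop: patrols follow recorded
# crime, and recorded crime follows patrols, so a small initial bias
# in the data grows into a large gap in policing.

TOTAL_PATROLS = 20
RECORDS_PER_PATROL = 0.5   # offences recorded per patrol unit (assumed)

recorded = {"A": 6.0, "B": 4.0}   # biased historical data (assumed)
patrols = {"A": 10, "B": 10}

for period in range(20):
    # "Predictive" allocation: prioritize hotspots. Squaring recorded
    # crime is an assumed hotspot-weighting rule for illustration only.
    wa, wb = recorded["A"] ** 2, recorded["B"] ** 2
    patrols["A"] = round(TOTAL_PATROLS * wa / (wa + wb))
    patrols["B"] = TOTAL_PATROLS - patrols["A"]
    # Future policing drives future records, not future crime: both
    # areas have the same true crime rate throughout.
    for area in recorded:
        recorded[area] += RECORDS_PER_PATROL * patrols[area]

print(patrols)   # nearly all patrols end up in area A
```

Despite equal underlying crime, the initial 6-to-4 gap in recorded data pulls almost every patrol into area A within a few periods — the "self-fulfilling prophecy" the officer describes.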
The technological landscape was described by one officer as a “patchwork quilt, uncoordinated and delivered to different standards in different settings and for different outcomes.”
The briefing paper also found that individuals from disadvantaged socioeconomic backgrounds are “calculated as posing a greater risk” of criminal behavior by such algorithms.
This bias exists because individuals from this group are more likely to have frequent contact with public services and, in doing so, generate higher levels of data, which the police often have access to, the paper said.
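The sampling bias described above — more contact with public services means more data, which a naive scoring tool reads as more risk — can be sketched as follows. The records, names and scoring rule are all hypothetical, invented purely to show the mechanism.

```python
# Hypothetical illustration: a naive risk score that counts recorded
# contacts ranks higher anyone who simply generates more data,
# independent of behaviour.

records = [
    {"person": "P1", "source": "police",  "event": "stop"},
    {"person": "P1", "source": "housing", "event": "visit"},
    {"person": "P1", "source": "welfare", "event": "appointment"},
    {"person": "P2", "source": "police",  "event": "stop"},
]

def naive_risk_score(person: str) -> int:
    # Counts every recorded contact, regardless of source or relevance.
    return sum(1 for r in records if r["person"] == person)

# P1 and P2 have identical police contact, but P1's heavier footprint
# in public-service data inflates the score.
print(naive_risk_score("P1"), naive_risk_score("P2"))
```

The point is not that any real force scores people this way, but that any model trained on such records inherits the uneven data footprint as if it were signal.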
The implications are serious both practically and legally, the paper said: police resources might be allocated ineffectively because they are based on flawed calculations, and “discrimination claims could be brought by individuals scored ‘negatively’ in comparison to others of different ages or genders” under British anti-discrimination regulations.
The paper also said there was a risk of “automation bias,” whereby police officers become overreliant on the use of analytical tools, undermining their discretion and causing them to disregard other factors.