Damien Sardjoe was 14 when the Amsterdam police put him on the city’s Top 600 criminals list, which sets people thought to be at risk of committing “high-impact crimes” such as robbery, assault and murder on a regime of care and punishment.
That was when his life began to fall apart.
Sardjoe had previously been arrested for two street robberies — one of which was violent. His inclusion on the list meant police would raid his home whenever a crime happened in his area, while officers routinely stopped him on the street asking for identification, he said.
“I felt very spied on,” said Sardjoe, now a 20-year-old youth worker. “I didn’t feel comfortable walking on the street.”
Sardjoe’s older brother was placed on another automated list — the Top 400 children at risk of criminal behavior — before he had ever committed a crime, and then went on to become involved in stealing scooters, he added.
The “top criminal” tag felt “almost like a self-fulfilling prophecy,” Sardjoe said.
Amid warnings from rights groups that artificial intelligence (AI) technologies reinforce prejudice in policing, the debate over systems such as the Top 600 list kicked up a notch this month, when members of the European Parliament voted for a report proposing strict regulation of predictive policing.
Officials said the nonbinding report, which also calls for outlawing the use of mass biometric surveillance, should become the European Parliament’s position for coming negotiations on a new AI law.
“We want to make sure certain types of AI, like facial recognition or predictive policing, cannot be used easily because they affect fundamental human rights,” said Bulgarian member Petar Vitanov, who wrote the report.
The use of automated risk modeling and profiling systems to predict future criminal activity has already been banned in cities such as Santa Cruz, California, and New Orleans, Louisiana, amid accusations that they reinforce racist policing patterns.
“They treat everyone as suspects to some extent,” said Sarah Chander, senior policy adviser at the European Digital Rights network.
“But [they] will be disproportionately used against racialized people ... who are perceived to be potential migrants, terrorists, poor and working-class people, in poor, working-class areas,” she said.
The Dutch police declined to comment on the Top 600 and 400 schemes, referring inquiries to Amsterdam’s city council, which in turn said they were the responsibility of the police.
Europe’s law enforcement and criminal justice authorities are increasingly using technologies such as predictive policing to profile people and assess their likelihood of committing crimes, according to Fair Trials, a partly EU-funded civil rights group.
One much-criticized Dutch project, which ran from January 2019 to October last year, aimed to counter crimes such as shoplifting in the southeastern city of Roermond.
The Sensing Project used remote sensors in and around the city to detect the make, color and route of vehicles carrying people suspected of what police call “mobile banditry.”
Merel Koning, senior policy officer at human rights group Amnesty International, said the system mainly targeted people from east European countries and specifically Roma, referring to members of Europe’s largest ethnic minority.
The focus was not in line with internal police crime figures for previous years, Amnesty said.
Dutch police spokeswoman Mireille Beentjes said the project’s scope went beyond pickpockets and was not predictive as the data used “always [had] a human check.”
“We know these kinds of criminals often come from eastern Europe,” she said in an e-mail. “However, an eastern European license by itself was never enough to draw our special attention. More features were needed for that.”
The program ended because the police did not have enough capacity to follow up project data, the Dutch police said.
In Denmark, the POL-INTEL project, based on the Gotham system designed by US data analytics firm Palantir and operational since 2017, uses a mapping system to build a “heat map” identifying areas with higher crime rates.
The data appears to include citizenship information, such as whether a person in the system is “a non-Western Dane,” said Matt Mahmoudi, an affiliated lecturer and researcher on digital society at the University of Cambridge.
“We want an indication of why citizenship data — or non-Western data — matters in being able to produce a heat map,” he said.
Magnus Andresen, a senior Danish National Police officer, confirmed that POL-INTEL contains nationality and citizenship data, but would not comment on why.
The police do not have any statistics on the system’s effectiveness in combating terrorism or crime, Andresen said.
He added that it is being used to support most of the force’s operational decisions, such as stop-and-searches, through a “finder function,” which quickly locates data on people, places, objects and events.
Courtney Bowman, Palantir’s director of privacy, said decisions on the data gathered by the Gotham system — which has also been used by the European police agency Europol and the Hesse state police in Germany — were “always determinations made by customers.”
“The software is designed to enable human-driven analysis for a posteriori investigations [based on prior evidence] and not to provide algorithmic-based predictive policing capabilities,” he said.
Pushback against institutions and companies linked to “predictive policing” has gone so far that digital experts say even the US firm that pioneered the technology, formerly called PredPol — short for predictive policing — now distances itself from the term.
The company’s system uses algorithms to analyze police records and identify crime-ridden areas to proactively determine when and where officers patrol.
“However, what we do isn’t ‘predictive,’ what we do is create location-based risk assessments based on historical crime patterns. This is why we changed our name from PredPol to Geolitica earlier this year,” CEO Brian MacDonald said.
“Any of these approaches using open, auditable data and algorithms to identify crime hot spots will always be better than relying on officer ‘intuition,’” he added in e-mailed comments.
Police use of AI is still “extremely controversial,” said Tom McNeil, assistant police and crime commissioner for the West Midlands Police in England, which is working with about eight types of automated modeling system.
He called for more oversight of the way the technology is used by authorities, adding that he personally favors a ban on the use of live facial recognition surveillance, as envisaged in the European Commission’s proposed artificial intelligence act.
“We need a [British] law to clarify what should and shouldn’t be allowed, including red lines over when you shouldn’t be allowed to use facial recognition or [predictive] analytics,” McNeil said.
In Amsterdam, Sardjoe has been off the Top 600 list for three years and works in a program encouraging others to shun crime.
“There was a moment in my life where I thought: ‘They already think I’m a criminal, so I might as well do criminal stuff’ — because they don’t expect anything better,” he said.
“Right now, I’m helping boys or kids who are going through what I went through because all I needed then was a person who understood that [experience],” he added.