Damien Sardjoe was 14 when the Amsterdam police put him on the city’s Top 600 criminals list, which places people deemed at risk of committing “high-impact crimes” such as robbery, assault and murder under a regime of care and punishment.
That was when his life began to fall apart.
Sardjoe had previously been arrested for two street robberies — one of which was violent. His inclusion on the list meant police would raid his home whenever a crime happened in his area, while officers routinely stopped him on the street to ask for identification, he said.
“I felt very spied on,” said Sardjoe, now a 20-year-old youth worker. “I didn’t feel comfortable walking on the street.”
Sardjoe’s older brother was placed on another automated list — the Top 400, for children at risk of criminal behavior — before he had ever committed a crime, and then went on to become involved in stealing scooters, he added.
The “top criminal” tag felt “almost like a self-fulfilling prophecy,” Sardjoe said.
Amid warnings from rights groups that artificial intelligence (AI) technologies reinforce prejudice in policing, the debate over systems such as the Top 600 list kicked up a notch this month, when members of the European Parliament voted for a report proposing strict regulation of predictive policing.
Officials said the nonbinding report, which also calls for outlawing the use of mass biometric surveillance, should become the European Parliament’s position for coming negotiations on a new AI law.
“We want to make sure certain types of AI, like facial recognition or predictive policing, cannot be used easily because they affect fundamental human rights,” said Petar Vitanov, the Bulgarian member of the European Parliament who wrote the report.
The use of automated risk modeling and profiling systems to predict future criminal activity has already been banned in cities such as Santa Cruz, California, and New Orleans, Louisiana, amid accusations that they reinforce racist policing patterns.
“They treat everyone as suspects to some extent,” said Sarah Chander, senior policy adviser at the European Digital Rights network.
“But [they] will be disproportionately used against racialized people ... who are perceived to be potential migrants, terrorists, poor and working-class people, in poor, working-class areas,” she said.
The Dutch police declined to comment on the Top 600 and 400 schemes, referring inquiries to Amsterdam’s city council, which in turn said they were the responsibility of the police.
Europe’s law enforcement and criminal justice authorities are increasingly using technologies such as predictive policing to profile people and assess their likelihood of committing crimes, according to Fair Trials, a partly EU-funded civil rights group.
One much-criticized Dutch project, which ran from January 2019 to October last year, aimed to counter crimes such as shoplifting in the southeastern city of Roermond.
The Sensing Project used remote sensors in and around the city to detect the make, color and route of vehicles carrying people suspected of what police call “mobile banditry.”
Merel Koning, senior policy officer at human rights group Amnesty International, said the system mainly targeted people from east European countries and specifically Roma, referring to members of Europe’s largest ethnic minority.
The focus was not in line with internal police crime figures for previous years, Amnesty said.
Dutch police spokeswoman Mireille Beentjes said the project’s scope went beyond pickpockets and was not predictive as the data used “always [had] a human check.”
“We know these kinds of criminals often come from eastern Europe,” she said in an e-mail. “However, an eastern European license by itself was never enough to draw our special attention. More features were needed for that.”
The program ended because police lacked the capacity to follow up on its data, the Dutch police said.
In Denmark, the POL-INTEL project, based on the Gotham system designed by US data analytics firm Palantir and operational since 2017, uses a mapping system to build a “heat map” identifying areas with higher crime rates.
The data appears to include citizenship information, such as whether a person in the system is “a non-Western Dane,” said Matt Mahmoudi, an affiliated lecturer and researcher on digital society at the University of Cambridge.
“We want an indication of why citizenship data — or non-Western data — matters in being able to produce a heat map,” he said.
Magnus Andresen, a senior Danish National Police officer, confirmed that POL-INTEL contains nationality and citizenship data, but would not comment on why.
The police do not have any statistics on the system’s effectiveness in combating terrorism or crime, Andresen said.
He added that it is used to support most of the force’s operational decisions, such as stop-and-search operations, through a “finder function,” which quickly locates data on people, places, objects and events.
Courtney Bowman, Palantir’s director of privacy, said decisions on the data gathered by the Gotham system — which has also been used by the European police agency Europol and the Hesse state police in Germany — were “always determinations made by customers.”
“The software is designed to enable human-driven analysis for a posteriori investigations [based on prior evidence] and not to provide algorithmic-based predictive policing capabilities,” he said.
Pushback against institutions and companies linked to “predictive policing” has gone so far that digital experts say even the US firm that pioneered the technology, formerly called PredPol — short for predictive policing — now distances itself from the term.
The company’s system uses algorithms to analyze police records and identify crime-ridden areas to proactively determine when and where officers patrol.
“However, what we do isn’t ‘predictive,’ what we do is create location-based risk assessments based on historical crime patterns. This is why we changed our name from PredPol to Geolitica earlier this year,” CEO Brian MacDonald said.
“Any of these approaches using open, auditable data and algorithms to identify crime hot spots will always be better than relying on officer ‘intuition,’” he added in e-mailed comments.
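MacDonald’s description, like the “heat map” attributed to POL-INTEL above, points at a simple underlying idea: count past incidents by location, weight recent ones more heavily, and rank the resulting map cells. The Python sketch below illustrates only that general hot-spot technique; the grid size, decay half-life and incident format are illustrative assumptions, not details of Geolitica’s, Palantir’s or any vendor’s actual system.

```python
# A minimal grid-based hot-spot scoring sketch. Illustrative only: the cell
# size, half-life and incident format are assumptions, not any vendor's system.
from collections import Counter
from datetime import datetime, timedelta
from math import exp, log

CELL_DEG = 0.005        # assumed grid cell size, roughly 500 m of latitude
HALF_LIFE_DAYS = 30.0   # assumed half-life: a 30-day-old incident counts half

def cell(lat: float, lon: float) -> tuple[int, int]:
    """Snap a coordinate onto a regular lat/lon grid."""
    return (int(lat / CELL_DEG), int(lon / CELL_DEG))

def hot_spots(incidents, now, top_n=5):
    """Rank grid cells by a time-decayed count of historical incidents."""
    scores = Counter()
    for lat, lon, when in incidents:
        age_days = (now - when).total_seconds() / 86400
        scores[cell(lat, lon)] += exp(-log(2) * age_days / HALF_LIFE_DAYS)
    return scores.most_common(top_n)

# Three recent incidents clustered in one cell, one stale incident elsewhere.
now = datetime(2021, 10, 15)
history = [
    (52.371, 4.896, now - timedelta(days=1)),
    (52.372, 4.897, now - timedelta(days=3)),
    (52.373, 4.898, now - timedelta(days=7)),
    (52.400, 4.950, now - timedelta(days=300)),
]
print(hot_spots(history, now))  # the recent cluster's cell scores highest
```

Even this toy version shows why critics worry: the rankings simply mirror whatever is in the historical records, so if past policing was concentrated on particular neighborhoods or groups, the resulting “hot spots” reproduce that pattern.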
Police use of AI is still “extremely controversial,” said Tom McNeil, assistant police and crime commissioner for the West Midlands Police in England, which is working with about eight types of automated modeling system.
He called for more oversight in the way the technology is used by authorities, adding that he personally favors a ban on the use of live facial recognition surveillance, as proposed in the European Commission’s draft artificial intelligence act.
“We need a [British] law to clarify what should and shouldn’t be allowed, including red lines over when you shouldn’t be allowed to use facial recognition or [predictive] analytics,” McNeil said.
In Amsterdam, Sardjoe has been off the Top 600 list for three years and works in a program encouraging others to shun crime.
“There was a moment in my life where I thought: ‘They already think I’m a criminal, so I might as well do criminal stuff’ — because they don’t expect anything better,” he said.
“Right now, I’m helping boys or kids who are going through what I went through because all I needed then was a person who understood that [experience],” he added.