It sounds like something from the outer reaches of science fiction: battlefield robots waging constant war, algorithms that determine who to kill, face-recognition fighting machines that can ID a target and take it out before you have time to say “Geneva Conventions.”
This is no film script, however, but an ominous picture of future warfare that is moving ever closer.
“Killer robots” is shorthand for a range of tech that has generals salivating and peace campaigners terrified at the ethical ramifications of warfare waged via digital proxies.
Now, two women armed with nothing more than a Nobel prize, know-how and a lot of conviction are standing in the way of the march of the killer robots. They want them banned. And they have done this kind of thing before.
Jody Williams won the Nobel Peace Prize in 1997 for leading the long, global effort to get anti-personnel landmines banned. Mary Wareham was a prominent supporter in that campaign.
“We were there at the Nobel peace prize ceremony, and I said to Jody, ‘This is how you finish your career, not start it! What are we going to do now?’” Wareham said.
The answer? Lethal autonomous weapon systems, also known as LAWs. The women expect the struggle to be far harder.
“In relative terms, landmines are chump change,” Williams said, pointing to the billions of dollars manufacturers could make selling artificial intelligence (AI)-enhanced weapons.
The big question is: What would stop armies from deploying upgraded drone bots to search for, identify, and then take out every man in a village between the ages of 18 and 50? Or to send a killer drone to ID and assassinate a head of state?
Weapons manufacturers are riding the same artificial intelligence wave as other industries. Militaries, eyeing each other in a quiet but fierce arms race, are funding some of the most cutting-edge trials.
To some, the advantages are clear: Killer robots would never fatigue like a human soldier. They could potentially stay out on the battlefield for months. They would never get angry or seek revenge. They would never defy an officer’s orders. They would remove the imperfect human from the equation. Algorithms would determine who to kill.
However, some military experts have expressed concerns.
“There are not only legal and ethical concerns about lethal autonomy, but practical ones as well,” said Paul Scharre, a former US
Army Ranger who wrote the Pentagon’s earliest policy statement on killer robots. “How does one control an autonomous weapon? What happens if there’s a glitch in the system or someone hacks it?”
To Williams, the machines represent the very definition of cold-blooded slaughter. A war fought with killer robots would leave little space for whatever shred of humanity surfaces in conflict.
“It’s men getting hard-ons over new weapons,” Williams said. “They’re doing it simply because they can. They’re doing it because they want to see where it can go.”
Israel already has some of the most advanced machines, including an armed ground robot that has patrolled the Gaza border and the Harpy, a missile that circles the skies until it finds its target.
Although the Harpy is ostensibly marketed as a weapon for destroying enemy radars, no technical barrier stops engineers in the industry from developing similar weapons that would one day attack people.
In the valleys of central California, the US military is running drone swarm experiments. Russia has declared its desire to form an entire battalion of killer robots. No one really knows what China is doing.
No law governs this AI arms race. Countries currently face a free-for-all.
Scientists have sounded the alarm, and more than 250 research and academic institutions and 3,000 prominent players in the field have called for a ban on killer robots.
However, beyond a petition, activists reckon the best way to stop this technology in its tracks is through the tedious, unheroic task of passing an international treaty.
That is the strategy of the Campaign to Stop Killer Robots (CSKR). More than 100 organizations in 54 countries have joined the coalition, with the aim of getting a deal by 2021.
Williams is an idealist, but she is not naive. She has battled the military-industrial complex since her days protesting the Vietnam War. Sceptics had thought banning landmines would be impossible.
“Anything is inevitable if you do nothing to stop it,” she said. “When they were drumming that at us — ‘it’s inevitable, it’s inevitable’ — the reason people do that is to disempower you.”
In that campaign, Williams and others had lobbied the UN to pass an agreement. When that process flagged, they took negotiations outside the UN framework and began bringing countries on board, one by one, until a historic deal in Ottawa in 1997, when more than 120 nations committed to eradicating anti-personnel landmines.
Today, the Campaign to Stop Killer Robots is following a similar roadmap. The UN has held several rounds of talks in Geneva, including a session at the end of last month.
Williams and Wareham hope Germany will take the lead.
The country joined the UN Security Council at the start of the year, and German Minister of Foreign Affairs Heiko Maas recently called killer robots “nothing less than an attack on humanity itself.”
Behind the scenes, Berlin has reached out to other states to push for more progress.
Some German politicians recognize that Germany has the opportunity to be the first country to ban killer robots, and that if it did, other European states would likely follow suit.