Tue, Apr 09, 2019 - Page 6

Nobel winner targets killer robots

Jody Williams and Mary Wareham were leading lights in the effort to ban landmines, and Williams won a Nobel. Now they want autonomous weapons banned

The Guardian, BERLIN

It sounds like something from the outer reaches of science fiction: battlefield robots waging constant war, algorithms that determine who to kill, face-recognition fighting machines that can ID a target and take it out before you have time to say “Geneva Conventions.”

This is no film script, however, but an ominous picture of future warfare that is moving ever closer.

“Killer robots” is shorthand for a range of tech that has generals salivating and peace campaigners terrified at the ethical ramifications of warfare waged via digital proxies.

Now, two women armed with nothing more than a Nobel prize, knowhow and a lot of conviction are standing in front of the march of deadly killer robots. They want them banned. And they have done this kind of thing before.

Jody Williams won the Nobel Peace Prize in 1997 for leading the long, global effort to get anti-personnel landmines banned. Mary Wareham was a prominent supporter in that campaign.

“We were there at the Nobel peace prize ceremony, and I said to Jody, ‘This is how you finish your career, not start it! What are we going to do now?’” Wareham said.

The answer? Lethal autonomous weapon systems, also known as LAWs. The women expect the struggle to be far harder.

“In relative terms, landmines are chump change,” Williams said, pointing to the billions of dollars manufacturers could make selling artificial intelligence (AI)-enhanced weapons.

The big question is: What would stop armies from deploying upgraded drone bots to search for, identify, and then take out every man in a village between the ages of 18 and 50? Or to send a killer drone to ID and assassinate a head of state?

Weapons manufacturers are riding the same artificial intelligence wave as other industries. Militaries, eyeing each other in a quiet but fierce arms race, are funding some of the most cutting-edge trials.

To some, the advantages are clear: Killer robots would never fatigue like a human soldier. They could potentially stay out on the battlefield for months. They would never get angry or seek revenge. They would never defy an officer’s orders. They would remove the imperfect human from the equation. Algorithms would determine who to kill.

However, some military experts have expressed concerns.

“There are not only legal and ethical concerns about lethal autonomy, but practical ones as well,” said Paul Scharre, a former US Army Ranger who wrote the Pentagon’s earliest policy statement on killer robots. “How does one control an autonomous weapon? What happens if there’s a glitch in the system or someone hacks it?”

To Williams, the machines represent the very definition of cold-blooded slaughter. With killer robots, World War III would leave little room for whatever shred of humanity surfaces in war.

“It’s men getting hard-ons over new weapons,” Williams said. “They’re doing it simply because they can. They’re doing it because they want to see where it can go.”

Israel already has some of the most advanced machines, including an armed ground robot that has patrolled the Gaza border and the Harpy, a missile that circles the skies until it finds its target.

Though the Harpy is marketed ostensibly as a weapon to destroy enemy radars, no technical barriers stop engineers in the industry from developing similar weapons that would one day attack people.

In the valleys of central California, the US military is running drone swarm experiments. Russia has declared its desire to form an entire battalion of killer robots. No one really knows what China is doing.
