Fri, Apr 06, 2018

Scientists boycott S Korea’s KAIST over AI weapons

‘MEANINGFUL CONTROL’: The university said the new lab would focus on using AI for command and control systems, navigation of large drones and ‘smart’ training

Reuters, BERLIN

More than 50 top researchers in the field of artificial intelligence (AI) on Wednesday announced a boycott of KAIST, South Korea’s top university, after it opened what they called an AI weapons lab with one of South Korea’s largest companies.

The researchers, based in 30 countries, said they would refrain from visiting KAIST, hosting visitors from the university, or cooperating with its research programs until it pledged to refrain from developing AI weapons without “meaningful human control.”

KAIST, which opened the center in February with Hanwha Systems, one of two South Korean makers of cluster munitions, responded within hours, saying that it had “no intention to engage in development of lethal autonomous weapons systems and killer robots.”

The university was “significantly aware” of ethical concerns regarding AI, KAIST president Shin Sung-chul said, adding: “I reaffirm once again that KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control.”

The university said the Research Center for the Convergence of National Defense and Artificial Intelligence would focus on using AI for command and control systems, navigation for large uncrewed undersea vehicles, “smart” aircraft training and tracking and recognition of objects.

Toby Walsh, the professor at the University of New South Wales in Sydney who organized the boycott, said the university’s quick response was a success, but he needed to speak with all those who signed the letter before calling off the boycott.

“KAIST has made two significant concessions: not to develop autonomous weapons and to ensure meaningful human control,” Walsh said, adding that KAIST’s response would add weight to UN discussions taking place next week on the overall issue.

It remained unclear how meaningful human control of an uncrewed submarine — one of the launch projects — could be established when it was under the sea and unable to communicate, he added.

‘WEAPONS OF TERROR’

In an open letter announcing the boycott, the researchers said: “If developed, autonomous weapons will ... permit war to be fought faster and at a scale greater than ever before. They will have the potential to be weapons of terror.”

They cited effective bans on previous arms technologies and urged KAIST to ban any work on lethal autonomous weapons, and to refrain from AI uses that would harm human lives.

The letter, also signed by top experts on deep learning and robotics, was released ahead of a meeting in Geneva, Switzerland, on Monday by 123 UN member countries on the challenges posed by lethal autonomous weapons, which critics have described as “killer robots.”

Walsh told reporters that there were many potential good uses of robotics and AI in the military, including removing humans from dangerous tasks, such as clearing minefields.

“But we should not hand over the decision of who lives or dies to a machine. This crosses a clear moral line,” Walsh said. “We should not let robots decide who lives and who dies.”
