Wed, Jan 23, 2019 - Page 9 News List

Winning the AI arms race

While Western democracies have traditionally been more adept at tapping innovation, China and Russia could gain an advantage by being willing to remove the human component from decision-making

By Peter Apps  /  Reuters

Illustration: Mountain People

In October last year, 31 Chinese teenagers reported to the Beijing Institute of Technology, one of the country’s premier military research establishments. Selected from more than 5,000 applicants, the students are expected by Chinese authorities to design a new generation of artificially intelligent weapons systems that could range from microscopic robots to computer worms, submarines, drones and tanks.

The program is a potent reminder of what could be the defining arms race of the century, as greater computing power and self-learning programs create new avenues for war and statecraft. It is an area in which technology might now be outstripping strategic, ethical and policy thinking — but also where the battle for raw human talent may be just as important as getting the computer hardware, software and programming right.

Consultancy PwC estimates that by 2030 artificial intelligence products and systems will contribute up to US$15.7 trillion to the global economy, with China and the US likely the two leading nations.

However, it is the potential military consequences that have governments most worried, fearful of falling behind — but also nervous that untested technology could bring new dangers.

In the US, Pentagon leaders have asked the Defense Innovation Board — a collection of senior Silicon Valley figures who provide the US military with tech advice — to come up with a set of ethical principles for the use of artificial intelligence (AI) in war. Last month, France and Canada announced they were setting up an international panel to discuss broadly similar questions.

So far, Western states have stuck to the belief that decisions of life and death in conflict should always be made by humans, with computers and algorithms simply supporting those decisions. Other nations — particularly Russia and China — are flirting with a different path.

Russia — which last year announced it was doubling AI investment — said this month it would publish a new AI national strategy “roadmap” by the middle of the year. Russian officials say they see AI as a key to dominating cyberspace and information operations, with suspected Russian online “troll farms” thought to already be using automated social media feeds to push disinformation.

Beijing is seen as even further ahead in developing AI, to the extent that some experts believe it may already be beating the US.

Experts say achieving mastery in AI comes down to having sufficient computer power, enough data to learn from, and the human talent to make those systems work. As the world’s most powerful autocratic states, Russia and China have that capability and intent, both to use AI to maintain government dominance at home and beat enemies beyond.

Already, Beijing is using mass automated surveillance — including facial recognition software — to crack down on dissent, particularly in its ethnic Uighur Muslim northwest. Along with Russia, China has far fewer scruples and controls than Western states when it comes to monitoring its citizens’ communications. Such systems will likely become more powerful as technology improves.

Traditionally, Western democracies — particularly the US — have proved more adept than dictatorships at tapping new technology and innovation. However, on AI, Washington’s efforts to build links between Silicon Valley and the military have been far from trouble-free. In June, employee pressure at Google pushed the firm not to renew its contract with the Pentagon. Many tech researchers are reluctant to work on defense projects, nervous they will end up building out-of-control robots that kill.


