As the World Economic Forum's annual Global Risks report warns that an arms race is close at hand, should we be welcoming autonomous weapons?

Jai Galliott and Paul Scharre
Such systems would be particularly unproblematic if used in clearly demarcated conflict zones such as the Korean Demilitarised Zone, says Galliott (Source: Getty)

Dr Jai Galliott, of the Australian Centre for Cyber Security at the University of New South Wales in Canberra, Australia, says Yes.

Robotics mitigates the human cost of war, physically, psychologically, or otherwise, and the level of this mitigation depends largely on the degree of automation and the removal of the human operator. Put simply: humans make mistakes and machines do not. If we can eliminate human error in the design and programming of lethal autonomous weapons systems, states would do well to employ these robots.

Such systems would be particularly unproblematic if used in clearly demarcated conflict zones such as the Korean Demilitarised Zone, or when enemy targets are located far from noncombatants and civilian infrastructure. Such situations are indeed rare in the modern age, in which the good and bad guys commingle, but it would be unwise to prevent states from using autonomous systems when the circumstances arise, especially in a context of improving overall military effectiveness and efficiency, and limiting the damaging human footprint of more conventional war.

Paul Scharre, a senior fellow at the Center for a New American Security and former Army Ranger who served multiple tours in Iraq and Afghanistan, says No.

As with self-driving cars, automation could help reduce civilian deaths in war, but human judgement is also essential. Some decisions in war have a correct answer: is a person holding a rifle or a rake? Machines can help answer those questions. But whether someone is a combatant who should be killed may depend on context, which is extremely difficult for machines to assess. Other decisions in war are moral judgments. How much civilian collateral damage is acceptable when destroying a military target? This requires a judgement call. Even if machines could make these decisions, do we want them to? It's true that humans would still programme and launch autonomous weapons, but we would lose the ability to evaluate each target based on its specific context. We should not be rushing to fully autonomous weapons.

Machines can answer factual questions, but human judgement is needed to understand the context and resolve moral dilemmas.

City A.M.'s opinion pages are a place for thought-provoking views and debate. These views are not necessarily shared by City A.M.
