
U.N. stalemate over autonomous weapons enters second decade

Delegates in Geneva mustered a non-binding report that essentially prolongs a decade-old geopolitical impasse.

The U.S. military's X-45 concept for an autonomous unmanned military aircraft, on display in 2000 at the National Air and Space Museum in Washington. (AN/Flickr)

More than a decade ago, two United Nations human rights investigators put the threat of autonomous weapons systems such as armed robots and drones under an international spotlight.

The first was Philip Alston, a U.N. special rapporteur on extrajudicial, summary or arbitrary executions, who reported in 2010 on the dangers of "automated technologies" making life-and-death decisions based on artificial intelligence reasoning.

Then came Christof Heyns, a U.N. special rapporteur on extrajudicial killings, who had a provocative question in a 2013 report to the U.N. Human Rights Council. Is it "inherently wrong," he asked, if autonomous machines decide who and when to kill?

If the answer is "yes, it's wrong," he argued, then the world must logically conclude that developing such weapons cannot be justified "no matter the level of technical competence with which they operate."

His report urged nations to impose a moratorium on autonomous weapons to allow more time for the instruments of global governance to catch up with the ethical implications of the technology. But he warned that the "power of vested interests" might be too strong to overcome.

Flash forward 10 years, and the issue is as much a sticking point as ever. It's also evident just how prescient Heyns' warning turned out to be.

Over this past weekend, a group of governmental experts on lethal autonomous weapons systems, known as LAWS or (lethal) AWS, wrapped up its second meeting of the year. Delegates met until the early morning hours at the Palais des Nations in Geneva, but could only muster a non-binding report that essentially prolongs a decade-old stalemate.

"Once again, the outcome report is hollow on the content and fails to set a course for the safeguards against AWS that the world urgently needs," Ousman Noor, the Geneva-based Campaign to Stop Killer Robots' government relations manager, said on Monday.

The consensus-based approach fails

The latest meeting, chaired by Brazilian diplomat Flávio Soares Damico, marks the fourth consecutive year in which no constructive action has emerged. It pitted some 91 nations that favor an outright ban against 10 nations that want more limited action.

Noor said a draft agreement put forward by Damico with the support of 52 nations would not have been legally binding, but it contained useful policy elements, such as how to characterize, regulate and develop the weapons based on human controls.

"However, once discussions began over the adoption of the final report, it was clear that the flow of discussions, and the content of the report would largely be determined by Russia and other states already investing in autonomous weapons technologies," he said. "Civil society and the International Committee of the Red Cross were barred from commenting on the report, and the European Union was interrupted several times by Russia and ultimately prevented from speaking."

The last concrete outcome came in 2019, when nations agreed on 11 guiding principles for the use and development of LAWS, a document that human rights groups described as toothless. Last year, Human Rights Watch urged nations to give up on U.N. negotiations – which they noted haven't even produced any non-binding commitments – and "find an effective forum to negotiate a treaty."

Richard Moyes, director of the U.K.-based Article 36, said the latest draft has no useful content. "The kind of level of quality that we're working with here is basically spending weeks coming to an agreement that it's illegal to use things that are illegal," he said. "I think these discussions, frankly, have run out of road."

The consensus-based rules of the U.N. forum are widely blamed for blocking progress. Some 91 of the 126 nations that signed onto the Convention on Certain Conventional Weapons, or CCW, support amending it with legally binding rules for LAWS, according to International Committee of the Red Cross data.

Another 34 mostly European nations are "undeclared." Ten countries – Australia, Estonia, India, Israel, Japan, South Korea, Poland, Russia, the United States and the United Kingdom – are opposed but for varying reasons. A U.S.-led coalition, which includes Australia, Japan, South Korea and the U.K., proposed banning certain types of LAWS – but without any legally binding rules.

Nations such as India, Israel and Russia, which are developing more military applications for artificial intelligence, told the forum earlier in the week they would not negotiate a legal instrument, sinking hopes for a binding treaty from the start.

Russia, whose leader Vladimir Putin said five years ago that whichever nation achieves a breakthrough in developing AI will come to dominate the world, drew protests from dozens of other nations when it forced the expulsion of civil society observers from the forum during the last hours of negotiations.

Momentum for action growing outside the CCW

The Mine Ban Treaty and the Convention on Cluster Munitions both languished in the CCW before gaining approval in 1997 and 2008, respectively, once negotiations were freed from the limitations of the forum's consensus-based format.

The Treaty on the Prohibition of Nuclear Weapons, which entered into force in 2021, also was spearheaded by pressure from civil society. LAWS could follow a similar path.

Andrea Bilgeri, a disarmament expert with Austria's foreign ministry who took part in the latest round of negotiations, said the "disappointing" result did "not do justice to the evolution of positions."

Verity Coyle, a senior advisor for Amnesty International who attended the latest negotiations, said many nations "genuinely want to tackle autonomy in weapon systems" and could find their footing outside the consensus-based U.N. forum.

"It has been proven, again, that the Convention of Certain Conventional Weapons is not the place to do it," she said.

In a sign of investors' growing appetite for AI-backed military systems, shares in Palantir Technologies Inc., the U.S. government-linked surveillance company founded by billionaire Peter Thiel, jumped by more than 20% earlier this month after the company previewed a new AI-integrated military platform.

Reports of autonomous weapons systems used in Ukraine and Libya have added to the pressure for global governance.

"When we address these issues in another 10 years, we should not still be rehashing the problems of autonomous weapons systems and the need for new law," Human Rights Watch senior researcher Bonnie Docherty told the U.N. forum. "We should be assessing the implementation of a long-adopted, well-negotiated, and widely embraced treaty."

The Palantir pavilion at the 2023 World Economic Forum in Davos, Switzerland (AN/J. Heilprin)
