
Majority of nations pass measure setting up U.N. vote on 'killer robots'

The first-of-its-kind provision cites an “urgent need for the international community to address the challenges and concerns raised by autonomous weapons systems."

Campaigners at the U.N. (AN/Stop Killer Robots/Clare Conboy)

A U.N. committee easily passed a provision seeking a vote in the General Assembly later this year on how to control autonomous weapons.

The 193-nation assembly's First Committee, which handles issues related to disarmament and international security, "retained the provision" in a 164-5 vote on Wednesday. Belarus, India, Mali, Niger and Russia were opposed.

Eight nations abstained: China, Iran, Israel, North Korea, Saudi Arabia, Syria, Turkey and the United Arab Emirates.

The vote advances a first-of-its-kind resolution – drafted by Austria and co-sponsored by 43 nations including Belgium, Costa Rica, Germany, Ireland, Mexico, New Zealand, Philippines, Sierra Leone, Sri Lanka and Switzerland – that cites an “urgent need for the international community to address the challenges and concerns raised by autonomous weapons systems."

The resolution reflects growing concerns about the degree to which artificial intelligence-assisted weapons could compromise international security and cross humanitarian and ethical lines. It calls on the U.N. to prepare a report on how nations approach lethal autonomous weapons systems, known as LAWS.

Such weapons systems could have "negative consequences" for regional and international stability, it said, including "the risk of an emerging arms race.”

Those that opposed or abstained from the vote included five of the world's nine nuclear-armed nations: China, India, Israel, North Korea and Russia. The other four nuclear-armed nations are France, Pakistan, the U.S. and the U.K.


A 'confidence-building' step

Two United Nations human rights investigators put the threat of autonomous weapons systems such as robotics and drones under an international spotlight more than a decade ago.

Philip Alston, a U.N. special rapporteur on extrajudicial, summary or arbitrary executions, reported in 2010 on the dangers of "automated technologies" making life-and-death decisions based on artificial intelligence reasoning.

Then in 2013, Christof Heyns, a U.N. special rapporteur on extrajudicial killings, posed a provocative question in a report to the U.N. Human Rights Council: Is it "inherently wrong" to let autonomous machines decide who and when to kill?

His report urged nations to impose a moratorium on autonomous weapons to allow more time for the instruments of global governance to catch up with the ethical implications of LAWS.

On Wednesday, the Geneva-based Campaign to Stop Killer Robots, representing some 250 advocacy and civil society organizations across 70 countries, welcomed the First Committee's vote at U.N. headquarters in New York.

"After 10 years of international discussions and in the face of rapid technological developments, the adoption of this resolution is a step forward, lighting a path towards a legal framework to ensure meaningful human control over the use of force," the campaign said.

"While it does not go far enough to call for negotiations," it said, "this resolution does build international confidence, and signals that urgent political action must be taken to safeguard against the serious risks posed by autonomous weapons systems."

The issue of autonomous weapons systems has taken on new urgency as AI-based tools are adopted more quickly than anticipated. U.N. Secretary-General António Guterres has called for the creation of a new international AI watchdog agency.

A key concern is the degree to which AI-assisted weapons can make battlefield judgments that people ordinarily make. In the absence of a treaty on LAWS, Russia's war in Ukraine has become a testing ground for prototype systems.

Russia, China and the U.S. – three of the U.N. Security Council's five permanent members; the others are France and the U.K. – have been developing them.
