With experts warning that time is running out to impose restrictions on deadly new technology, nations gather at the United Nations on Monday to renew efforts to govern the AI-controlled autonomous weapons increasingly used in modern combat.
In conflicts from Gaza to Ukraine, autonomous and AI-assisted weapons systems are already becoming more prevalent, and rising global defense spending is expected to give emerging AI-assisted military technology a further push. The U.N. General Assembly will hold its first meeting dedicated to autonomous weapons on Monday in New York.
How are AI weapons spreading without global oversight?
Efforts to establish international rules governing their development and use have not kept pace, and there are still almost no binding international norms. Parties to the Convention on Conventional Weapons (CCW) have been meeting in Geneva since 2014 to discuss a possible ban on fully autonomous systems that operate without meaningful human control, and to regulate others.
How urgent is the push for AI safeguards?
U.N. Secretary-General Antonio Guterres has set a 2026 deadline for governments to establish clear rules on the use of AI weapons. Human rights groups, however, warn that consensus among governments is lacking.
Alexander Kmentt, head of arms control at Austria’s foreign ministry, said that must quickly change. “Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don’t come to pass,” he said.
Human Rights Watch warned in a report last month that unregulated autonomous weapons pose a range of human rights risks and, left unchecked, could trigger an arms race, adding that key questions of accountability under international law remain unresolved.
Campaigners such as Laura Nolan of Stop Killer Robots say there is currently little in place to ensure that defense companies will develop AI-driven weapons responsibly. "In general, we don't trust industries to govern themselves," she said, asserting there is no reason to extend defense or technology firms any more credibility.
Will the UN move regulation efforts forward?
The New York talks follow a 2023 U.N. General Assembly resolution, backed by 164 states, that urged the international community to urgently confront the risks posed by autonomous weapons.
Although the talks are not legally binding, diplomatic officials hope they will increase pressure on militarily powerful states that resist regulation for fear that rules could blunt the technology's wartime advantages.
Campaign groups hope the meeting will push nations toward agreeing on a legally binding instrument. It will also take up key issues the CCW does not cover, including human rights and ethical concerns and the use of autonomous weapons by non-state actors. Ahead of the next round of CCW talks in September, they see it as an important litmus test of whether states can bridge their differences.
“This issue needs clarification through a legally binding treaty. The technology is moving so fast,” said Patrick Wilcken, Amnesty International’s Researcher on Military, Security and Policing. “The idea that you wouldn’t want to rule out the delegation of life or death decisions … to a machine seems extraordinary.”
Who is resisting a global treaty on AI arms?
According to Amnesty International, while many nations back a legally binding global framework, the United States, Russia, China, and India favor national rules or existing international law. A Pentagon official said, "We have not been convinced that existing law is insufficient," adding that autonomous weapons may even pose less risk to people than conventional ones.
How are killer drones used in Gaza and Ukraine?
In the absence of regulation, autonomous systems are proliferating. Weapons specialists at the Future of Life Institute think tank have tracked about 200 autonomous weapon systems deployed across Ukraine, the Middle East, and Africa.
For instance, according to its data, Russian forces have deployed more than 3,000 Veter kamikaze drones in Ukraine, which can identify and engage targets on their own, while Ukraine has used semi-autonomous drones in the conflict. Ukrainian authorities did not respond to a request for comment.
Israel has used AI tools to identify targets in Gaza. Its mission in Geneva says it uses data technologies in full compliance with international law and supports multilateral discussions.