By Aurora Weiss
Vienna (INPS Japan) – More than 1000 guests, including representatives from 144 countries, international organizations, industry, academia, science, and civil society, gathered in Vienna to discuss how Autonomous Weapons Systems can be regulated. The first international conference on Autonomous Weapons Systems (AWS), titled ‘Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation,’ took place from 29 to 30 April 2024, organized by the Ministry of Foreign Affairs of Austria.
The increasing autonomy of weapons through the introduction of artificial intelligence (AI) has fundamentally changed armed conflicts. Despite years of efforts and discussions, international regulations have not kept pace with the rapid technological progress. Autonomous Weapons Systems are already being used in current conflicts, such as Israel’s war in Gaza and Russia’s war of aggression against Ukraine. Regulations are urgently needed, as the use of AI in armed conflicts raises profound questions of international law, morality, humanitarianism, and security.
“Autonomous weapons systems will soon fill the world’s battlefields. We already see this with AI-enabled drones and AI-based target selection. Technology is moving ahead with racing speed, while politics is lagging behind. We are faced with profound legal, ethical, and security questions: How can we prevent ceding life and death decisions to machines? How can we deal with algorithms prone to mistakes and bias? How can we stop an AI-driven arms race and keep this technology out of the hands of terrorists? I cannot overstate the urgency. This is the ‘Oppenheimer moment’ of our generation. Now is the time to agree on international rules and norms to ensure human control,” stressed Austrian Foreign Minister Alexander Schallenberg in his opening speech. He appealed that every human life lost is one too many and that we must always ensure that decisions about life and death are not made by machines.
Austria has long been striving for international regulation of AWS and is playing a pioneering role in this area. In 2023, Austria coordinated the first UN resolution on autonomous weapons systems, underlining the need for regulations. Developing international laws and standards takes time, as the adoption of a treaty typically results from decades of work, close partnerships, and collective mobilization. Agreements also require effective support after they have been signed.
An autonomous weapon system is pre-programmed to kill a specific ‘target profile.’ The weapon is then deployed into an environment where its AI searches for that ‘target profile’ using sensor data, such as facial recognition. Autonomous weapons are an example of digital dehumanization at its most extreme. Giving machines the power to make life-or-death decisions undermines human dignity and denies us our rights. Instead of being seen as people, individuals are processed as objects. When an autonomous weapon is activated, we do not specifically know who or what it will strike, nor precisely where or when that strike will occur.
Austria has a long-standing tradition of working on disarmament issues and creating international legally binding treaties, such as the Treaty on the Prohibition of Nuclear Weapons (TPNW), which challenged the established nuclear order. One of the architects of the Humanitarian Initiative that led to the TPNW is Alexander Kmentt, Director for Disarmament, Arms Control, and Non-Proliferation at the Austrian Ministry of Foreign Affairs. During the conference, many speakers pointed out that we are once again in a historic Oppenheimer moment. We asked Ambassador Kmentt to explain the comparison between nuclear and autonomous weapons.
“The comparison to the Oppenheimer moment is that after Hiroshima and Nagasaki, many people, including Oppenheimer and Einstein, warned of the implications of nuclear weapons and pushed for regulation. Today, we have key AI experts warning about the possible existential risks of AI and AWS and asking for regulation, but the current geopolitical situation makes it very difficult to agree on international rules. We must try not to miss the moment when preventative action is still possible,” explains Alexander Kmentt.
There are several significant challenges associated with AWS from the perspective of arms control experts like Kmentt. Increasing autonomy (through the use of AI) in weapons systems will fundamentally change armed conflicts. We are already witnessing some of these changes. One major concern is when machines make life-and-death decisions based on pre-programmed algorithms.
“When machines learn and communicate with each other, what is, or should be, the role of humans when weapons make such decisions? We already see signs of an AI arms race. Soon, these weapons will be in many arsenals worldwide and also in the hands of non-state actors, such as terrorists,” an alarmed Austrian diplomat said.
Currently, there are no specific rules to deal with the legal, ethical, and security policy challenges posed by weapons systems like AWS. Kmentt stressed that Austria wants to raise the political profile of this issue and create momentum for progress on international rules for AWS. “Austria initiated a resolution in the UNGA last year, and we have now organized this conference. It was significant as the largest international meeting so far specifically on this issue, and we hope that it will be a step towards more political momentum for international rules,” he said. Knowing about his great contribution to the creation of the TPNW, we asked him whether this conference is a sign that a treaty on AWS will soon be prepared.
“The challenge at the moment is to move from discussions, which have been ongoing for years, to actually negotiating a treaty. The UN Secretary-General has challenged the international community to do this by 2026. If we get to negotiations, we should aim to explicitly prohibit systems that cannot be used in accordance with international law or that contravene basic ethical principles, and to regulate other systems so that a meaningful level of human control can be maintained,” concluded Alexander Kmentt.
Innovation in science and technology (S&T) is progressing at a rapid pace. Some advances have applications in weapons development, from directed energy weapons to nanoweapons, neuroweapons, and swarms of autonomous robotic platforms. Such developments can challenge established norms for the maintenance of international peace and security, the protection of human rights, and the achievement of sustainable development goals.
That’s why Richard Moyes, Director of Article 36 and a founder of the Stop Killer Robots campaign, thinks that the Vienna conference was significant in continuing to build the partnerships among states, international organizations, and civil society that are needed to make these treaties happen. Stop Killer Robots is a coalition of civil society organizations from around the world concerned with human rights, conflict, technology, and the protection of civilians. It is a partnership of organizations working together to push states to develop a new international treaty.
“We are not at the stage of drafting treaty text yet, but we are building confidence across different partners that a treaty is possible. In the end, it is only states that can agree on new international law, but we can work together to make sure they act,” Richard Moyes, an expert on the impact of conflict on civilians and the international regulation of weapon technologies, told us.
He has worked on the creation of a number of international legal and political instruments relating to weapons and conflict. Regarding AWS, he sees a particular threat in the removal of human control and accountability from the use of force. “People make mistakes, and people sometimes do terrible things, but all of our legal frameworks around the use of force are built on the foundation that people make decisions and people can be held responsible. We need to maintain meaningful human control if we are going to preserve the concept of law in armed conflict. Handing life and death decisions over to machines is also dehumanizing and will further devalue human life, especially for those who are already marginalized,” says Moyes, pointing out how challenging it is to regulate new technologies before they become a widespread problem; by the time autonomous weapons reach that point, it will be too late. Moyes stressed that a legal treaty should contain prohibitions on systems that cannot be used with meaningful human control, and rules to ensure that control in practice. It should also prohibit autonomous systems that would target people directly. From his perspective, these are the key rules that will influence how technologies are developed in the future.
“A key challenge is that highly militarized states don’t want to accept any constraint on their military options. We need to get a wider majority of states to draw the lines that can guide society in a safer direction,” the director of the UK non-governmental organization Article 36 told us.
The importance of building a legal framework, specifically an international treaty, was also emphasized by the NGO Soka Gakkai International (SGI), a member of the Stop Killer Robots campaign.
“We therefore join a growing number of stakeholders in calling for an international treaty to prohibit and regulate autonomy in weapons systems, to safeguard the rights and dignities of humanity in the face of rapidly advancing technological change,” Hayato Yamashita, Program Coordinator for Disarmament at SGI from Japan, appealed in his statement.
INPS Japan
This article is brought to you by INPS Japan, in collaboration with Soka Gakkai International in consultative status with UN ECOSOC.