Our “Oppenheimer Moment” – Autonomous Weapons Systems
- Joseph Lisa
- Mar 11
- 6 min read
How the international community is responding to the global threat posed by Autonomous Weapons Systems.

Few issues in the realm of modern international law have concerned world leaders as greatly as the emerging threat posed by Autonomous Weapons Systems (AWS). Developments in artificial intelligence, military technology, computer science, and machine learning are converging to create a novel form of weaponry capable of destruction at an unprecedented scale. Diplomats and legal experts are cognizant of these systems’ potential to violate international humanitarian law and escalate geopolitical tensions, and have begun to discuss how the world should approach AWS.[2] In a recent survey conducted by the United Nations Office of the Secretary-General, states and multinational organizations voiced their concerns about the future of these powerful technologies in the international legal sphere. Among other issues, the importance of global collaboration and the need to extend or establish regulation are constant preoccupations of world leaders.
One of the foremost legal concerns states voiced to the Secretary-General was the lack of a universally accepted definition of AWS. Autonomous weapons – those that function without continuous human input – are not necessarily new to the military landscape. Landmines, for instance, were first developed and used during World War I and have since been recognized as “rudimentary autonomous weapons.”[3] Although states have yet to reach a consensus on a precise definition, the emergent class of AWS discussed here is most aptly described by the International Committee of the Red Cross’s working definition: “weapons systems ... designed to select and engage one or more targets without the need for human intervention after activation.”[4] Under this definition, AWS are best understood as weapons systems designed to perform the entire sequence of deployment, from target selection to elimination, without substantial human input. In the current military technology environment, drones, missile systems, and unmanned aerial vehicles (UAVs) are common examples of AWS.[5]
However, controversy arises in the search for even a general definition of AWS. Several states have argued that any definition should exclude defensive weapons platforms.[6] As the United States asserts, the International Committee of the Red Cross’s definition would encompass legally uncontroversial autonomous defensive systems such as the AEGIS Weapon System and the PATRIOT Air and Missile Defense System, which can select and engage targets without requiring substantial operator input.[7] Proponents of the defensive-weapons exclusion have also argued that defensive AWS are algorithmically distinguishable from offensive AWS.[8] Under this approach, defensive AWS implementing deterministic algorithms should be exempted from future legal prohibitions specific to offensive AWS implementing probabilistic models.[9] Briefly, the key difference between these computational approaches is the treatment of uncertainty: given the same input, a deterministic algorithm will always produce the same output, whereas a probabilistic algorithm incorporates uncertainty and is therefore less predictable, but can account for a wider range of potential input errors and output contingencies.[10]
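To make this algorithmic distinction concrete, the minimal Python sketch below is a hypothetical illustration only; the function names, inputs, and thresholds are invented for exposition and do not model any real weapons platform. The deterministic rule returns the same decision every time it receives the same inputs, while the probabilistic rule folds modeled sensor uncertainty into its decision and can therefore reach different outcomes near its decision boundary.

```python
import random

# Hypothetical, simplified illustration of the deterministic/probabilistic
# distinction discussed above -- not a model of any real weapons system.

def deterministic_engage(radar_cross_section: float, closing_speed: float) -> bool:
    """Deterministic rule: identical inputs always yield identical outputs.
    Engages only when fixed, pre-set thresholds are both exceeded."""
    return radar_cross_section > 1.0 and closing_speed > 300.0

def probabilistic_engage(sensor_confidence: float, threshold: float = 0.9) -> bool:
    """Probabilistic rule: the decision depends on an estimated probability
    that the object is a valid target, so identical nominal inputs can yield
    different outcomes once sensor noise is modeled."""
    noisy_estimate = sensor_confidence + random.gauss(0.0, 0.05)  # modeled uncertainty
    return noisy_estimate > threshold

if __name__ == "__main__":
    # The deterministic rule is perfectly repeatable ...
    print([deterministic_engage(1.5, 400.0) for _ in range(3)])  # [True, True, True]
    # ... while the probabilistic rule, near its threshold, is not.
    print([probabilistic_engage(0.88) for _ in range(3)])        # e.g. [False, True, False]
```

The repeatability of the first rule is what makes defensive systems comparatively predictable and auditable; the second rule's tolerance for noisy inputs is what makes probabilistic systems more flexible but harder to certify in advance.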
Should AWS technology progress unabated, violence and chaos are likely to follow. States acknowledged the potential for AWS technology to destabilize regional security arrangements.[11] The nature of AWS technology may lower the threshold for the use of force, leading to more frequent and more severe conflicts.[12] Accordingly, several states warned against nations seeking military superiority and hegemony through AWS-enabled tactics, cautioning that such pursuits could ignite arms races.[13] Additionally, AWS could be equipped with nuclear warheads or other Weapons of Mass Destruction, heightening the risk of nuclear war.[14]
With these concerns in mind, attorneys and diplomats continue to discuss how best to protect human life. The Secretary-General found general agreement on the suitability of the Convention on Certain Conventional Weapons (CCW) as a forum to address the threats posed by unrestricted AWS use in combat.[15] The CCW currently appears to be the most suitable legal framework for the task. It has attained widespread recognition in the international community, having been ratified or acceded to by 126 nations.[16] Furthermore, the CCW’s High Contracting Parties have been discussing the regulation of AWS since 2017.[17] Given its broad scope and acceptance by the international community, the CCW may serve as a promising framework for establishing and expanding international legal protocols governing AWS.
Urgency is critical at this nascent stage of AWS development and deployment. Existing international law bodies must be expanded and must adopt regulations and ethical guidelines now, before this technology, in its unbridled form, becomes a common feature of military arsenals. Evidence indicates that AWS technologies are already being introduced in conflicts. A UN panel of experts reported that AWS were deployed against human personnel in Libya in early 2020.[18] Turkish-made Kargu-2 drones and other UAVs were observed autonomously hunting down and engaging retreating Haftar Armed Forces personnel.[19] The Kargu-2 drones were deemed AWS because they were programmed to fire without requiring data connectivity; that is, the drones fired without an input signal from human operators.[20] Further analysis of the Kargu-2 platform shows that it is capable of fully autonomous operation, employing facial recognition software to identify human targets to engage.[21] The drones’ lethality and autonomous function have earned the Kargu-2 platform a colloquial moniker: “Slaughterbots.”[22]
Despite these concerning developments, the future of AWS regulation appears uncertain at best. In March 2024, the CCW’s Group of Governmental Experts on AWS convened to discuss state sentiments and intentions regarding regulation of the weapons.[23] Dimming prospects for immediate progress, delegates from Russia, Israel, and the United States voiced their opposition to AWS regulations.[24] Russia stated that it views AWS no differently from any other weapons platform and asserted its sovereign right to build “any weapons systems [Russia] wants without justifying or explaining itself.”[25] Israel made a similar declaration of sovereignty, arguing for its ability to deploy AWS freely when doing so is in its self-interest.[26] Finally, the United States suggested that fully autonomous weapons systems are already a component of the US military arsenal and argued along lines similar to those voiced by Russia.[27] Collectively, the refusal of these militarily powerful nations to support CCW amendments regulating AWS will cripple efforts to provide safeguards against the unchecked development of this lethal technology.
The future use of AWS in military operations remains likely. Less clear, however, is how the development and deployment of these technologies will be tolerated or regulated by the international community. The Austrian delegation summarized the current state of AWS aptly:
“Humanity is at a crossroads and must come together to address the challenge of regulating these weapons. This could be the ‘Oppenheimer moment’ of our generation. Experts from various fields have been warning about the profound risks and severe consequences for humanity of an unregulated autonomous weapons systems race. International efforts must rise to the challenge of regulating those systems. So far, they have not been commensurate with the speed and significance of this development.”[28]
The future of warfare is now in the hands of the world’s international lawyers and diplomats. How they approach the issue of AWS—or fail to do so—will determine the extent to which these weapons can be developed, proliferated, and used to extinguish human life. Are we ready for killer robots?
[1] Mehmet Kaman, Photograph of autonomous rotary-wing attack drones at the campus of OSTIM Technopark in Ankara, Turkey, in Stuart Russell, Anthony Aguirre, Emilia Javorsky & Max Tegmark, Lethal Autonomous Weapons Exist; They Must Be Banned, IEEE Spectrum (June 16, 2021), https://spectrum.ieee.org/lethal-autonomous-weapons-exist-they-must-be-banned.
[2] Lethal Autonomous Weapon Systems (LAWS), U.N. Off. for Disarmament Affs., https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/background-on-laws-in-the-ccw/ (last visited Mar. 11, 2025).
[3] Jody Williams, Landmines: A Global Socioeconomic Crisis, 22 Soc. Just. 97, 98 (1995) (describing history of landmines); What You Need to Know About Autonomous Weapons, Int’l Comm. of the Red Cross (July 26, 2022), https://www.icrc.org/en/document/what-you-need-know-about-autonomous-weapons.
[4] U.N. Secretary-General, Lethal Autonomous Weapons Systems, ¶ 8, U.N. Doc. A/79/88 (July 1, 2024).
[5] Robert F. Trager & Laura M. Luca, Killer Robots Are Here—and We Need to Regulate Them, Foreign Policy (May 11, 2022, 1:46 PM), https://foreignpolicy.com/2022/05/11/killer-robots-lethal-autonomous-weapons-systems-ukraine-libya-regulation/.
[6] U.N. Secretary-General, supra note 4, at ¶ 10.
[7] Id. at 114.
[8] Id. at ¶ 10.
[9] Id.
[10] Autonomous Weapons Systems, Technical, Military, Legal and Humanitarian Aspects, Int’l Comm. of the Red Cross (Jan. 11, 2014), https://www.icrc.org/en/document/report-icrc-meeting-autonomous-weapon-systems-26-28-march-2014; Sebastian Thrun, Probabilistic Algorithms in Robotics, 21 AI Mag. 93, 93 (2000).
[11] U.N. Secretary-General, supra note 4, at ¶ 40.
[12] Id.
[13] Id.
[14] Id. at ¶ 41.
[15] Id. at 22, 30, 36, 49, 52, 55, 62–63, 67, 73, 77, 84, 86, 90, 94, 106 (acknowledging approval explicitly expressed by Argentina, Bulgaria, China, France, Germany, India, Israel, Italy, Japan, Luxembourg, Kingdom of the Netherlands, Norway, Pakistan, Republic of Korea, Russian Federation, and Sweden, respectively).
[16] The Convention on Certain Conventional Weapons, U.N. Off. for Disarmament Affs., https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/ (last visited Mar. 11, 2025).
[17] Id.
[18] Panel of Experts on Libya, Letter dated 8 March 2021 from the Panel of Experts on Libya established pursuant to resolution 1973 (2011) addressed to the President of the Security Council, ¶ 63, U.N. Doc. S/2021/229 (Mar. 8, 2021).
[19] Id.
[20] Id.
[21] Stuart Russell, Anthony Aguirre, Emilia Javorsky & Max Tegmark, Lethal Autonomous Weapons Exist; They Must Be Banned, IEEE Spectrum (June 16, 2021), https://spectrum.ieee.org/lethal-autonomous-weapons-exist-they-must-be-banned.
[22] Id.
[23] Reaching Critical Will, Women’s Int’l League for Peace & Freedom, Civil Society Perspectives on the Group of Governmental Experts of the Convention on Certain Conventional Weapons on Lethal Autonomous Weapon Systems 4–8 March 2024 (Ray Acheson, ed., 2024), https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2024/gge/reports/CCWR12.1.pdf.
[24] Id.
[25] Id.
[26] Id.
[27] Id.
[28] U.N. Secretary-General, supra note 4, at 25 (emphasis added).