Keep Our Soldiers Home,
Use Lethal Autonomous Weapons Systems
By Isabel Wilson,
The Gavel, Associate Editor
J.D. Candidate, Class of 2025
INTRODUCTION
As weapons technology advances, Lethal Autonomous Weapons Systems (hereinafter LAWS) have moved to the “forefront of recent Department of Defense directives.”1 This emerging issue continues to generate tension on a global scale and will likely demand resolution sooner rather than later given the tumultuous state of international affairs. If LAWS are not ethically regulated, the current policy gap will likely create conflict for the United States on the international front.
WHAT ARE LETHAL AUTONOMOUS WEAPONS SYSTEMS (LAWS)?
Definitions of what constitutes LAWS vary among nations. The United States has adopted the following: “A weapon system that, once activated, can select and engage targets without further intervention by an operator. This includes, but is not limited to, operator-supervised autonomous weapon systems designed to allow operators to override operation of the weapon system but can select and engage targets without further operator input after activation.”2 A less bureaucratic definition from the International Committee of the Red Cross describes LAWS as “[a]ny weapon system with autonomy in its critical functions. That is, a weapons system that can select and attack targets without human intervention.”3 Another critical consideration is that these weapons systems are used only in military conflict and are not available in civilian settings. Their primary use would therefore alleviate the need to deploy U.S. soldiers to carry out similar objectives.
SHOULD LETHAL AUTONOMOUS WEAPONS SYSTEMS BE REGULATED AND, IF SO, ON WHAT LEVEL?
Currently, there is little to no regulation of the development and use of LAWS. The Department of Defense directive issued on January 25, 2023, outlines policies and guidelines to minimize the risk of unintended engagements and other consequences of utilizing LAWS.4 Standard operating procedures thus exist within the United States regarding the proliferation of autonomous weapons systems, but no international regulation has been established.
Two reasons LAWS should be regulated are the cybersecurity risks and compliance with international weapons treaties. First, cybersecurity risks pose one of the greatest hurdles in developing and using LAWS because the consequences of a technological breach would be deadly. The DoD currently has cyber safety measures in place, aligned with military standards, to prevent risks that could ultimately turn a weapon system back on the deploying nation.
Second, compliance with international weapons laws and the push for similar regulation among nations will likely necessitate formal negotiations. The UN has taken a stand against the proliferation of LAWS and continues to advocate for a ban.5 Meanwhile, the U.S. and several other nations continue to encourage their development and eventual use. This divide will likely lead either to treaty negotiations or to discord within the international sphere.
WHAT CONSIDERATIONS DO LAWS RAISE?
Antonio Guterres, Secretary-General of the United Nations, made a statement regarding the advancement of technology and weapons systems following a recent AI summit. He stated that “… negotiations [are] to be concluded by 2026 on a legally binding instrument to prohibit lethal autonomous weapons systems that function without human control or oversight, which cannot be used in compliance with international humanitarian law.” Numerous nations, including the United States, have asserted that they will continue to develop LAWS. This creates tension with the twenty-six nations and other independent organizations that support a ban on both the development and use of such weapons systems.
One ethical issue is the absence of humanity within the systems, even though removing the human operator is the central purpose of autonomous weapons systems. The DoD has released AI Ethical Principles that must be upheld during the “design, development, deployment, and use of AI capabilities in autonomous weapons systems.”6 These principles comprise five pillars (responsible, equitable, traceable, reliable, and governable) intended to ensure that the DoD minimizes any ethical pitfalls that may arise.7 The United States has therefore already taken steps to ensure that LAWS are used in an ethical manner, and any further ethical implications can be addressed at the international level.
CONCLUSION
The development and future use of Lethal Autonomous Weapons Systems will eventually come to fruition, and the policy gap surrounding these weapons systems must be closed. The current policy guidelines implemented by the Department of Defense are just the start of what will likely become lengthy negotiations at the international level to ensure that the inevitable use of LAWS aligns with international humanitarian law and the treaties governing war. Weighing the preventable risks against the substantial benefits, such as keeping United States soldiers off enemy soil, should guide the development of LAWS as advancing technology transforms modern warfare.
References:
1 Shane R. Reeves, Challenges in Regulating Lethal Autonomous Weapons Under International Law, 27 Sw. J. Int’l L. 101 (2021).
2 Dep’t of Def. Directive 3000.09, Autonomy in Weapon Systems, at 21 (Jan. 25, 2023).
3 Views of the International Committee of the Red Cross (ICRC) on Autonomous Weapons System, Int’l Comm. of the Red Cross (Apr. 11, 2016), https://www.icrc.org/en/document/views-icrc-autonomous-weapon-system.
4 Dep’t of Def. Directive 3000.09, supra note 2.
5 United Nations Office for Disarmament Affairs, Lethal Autonomous Weapons Systems (LAWS), https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/background-on-laws-in-the-ccw/.
6 Dep’t of Def. Directive 3000.09, supra note 2.
7 Id.