About the Author: The author is a final year student at the National Law University, Delhi. She has previously been associated with CCG as part of its summer school in March 2020 and has also worked with the Centre as a Research Assistant between September 2020 and March 2021.
Editor’s note: This post is part of the Reflection Series showcasing exceptional student essays from CCG-NLUD’s Seminar Course on Technology & National Security Law.
Introduction
Artificial Intelligence (AI) refers to the ability of a machine to perform tasks that typically require human intelligence. AI is currently used in a variety of fields and disciplines. One such field is the military, where AI is viewed as a means to reduce human casualties.
One such use case is the development and use of Lethal Autonomous Weapons Systems (LAWS), or “killer robots”, which can make life-and-death decisions without human intervention. Though the technology behind LAWS and its application remains foggy, LAWS have become a central point of debate globally. Several countries seek a complete pre-emptive ban on their use and development, highlighting that the technology to achieve such outcomes already exists. Other countries prefer a moratorium on their development until there are universal standards governing their production and usage.
This piece examines whether LAWS are lawful under International Humanitarian Law (IHL) as per the principles of distinction, proportionality and precautions. LAWS are understood here as fully autonomous weapon systems that, once activated, can select and engage targets without any human involvement. The author argues that it is premature to pronounce LAWS legal or illegal by hypothetically determining their compliance with extant humanitarian principles, and highlights the ethical considerations and legal reviews under IHL that must be satisfied before their legality can be determined.
What are LAWS?
There is presently no universal definition of LAWS since the term ‘autonomous’ is ambiguous. ‘Autonomous’ in AI refers to the ability of a machine to make decisions without human intervention. A 2012 directive of the US Department of Defense defines LAWS as weapon systems that, once activated, can autonomously or independently “select and engage targets without any human intervention”. This means LAWS leave humans “out of the loop”. The “lack of human intervention” element is also present in definitions proposed by Human Rights Watch, the International Committee of the Red Cross (ICRC) and the UK Defence Ministry.
While completely autonomous weapon systems do not currently exist, the technology to develop them does. There are near-autonomous weapon systems, like Israel’s Iron Dome and the American Terminal High Altitude Area Defense, that can identify and engage incoming rockets. These are defensive in nature and protect sovereign nations from external attacks. By contrast, LAWS are weapon systems with offensive capabilities for pursuing targets. Some scholars recommend incentivising defensive autonomous systems within the international humanitarian framework.
Even though there is no single definition, LAWS can be identified as machines or weapon systems which, once activated or switched on by humans, have the autonomy to search for, select and attack targets without any human involvement in the selection and attack process. The offensive nature of LAWS, as opposed to the use of automated systems for defensive purposes, is an important factor in identifying them. An open letter by the Future of Life Institute calls for a ban on “offensive autonomous weapons beyond meaningful human control” instead of a complete ban on AI in the military sector. This distinction between offensive and defensive weapons in the definition of LAWS was also raised at the 2017 meeting of the Group of Governmental Experts on LAWS.
Autonomy and offensive capability are the primary grounds behind demands for a complete ban on LAWS. Countries like Zimbabwe are uncomfortable with a machine making life-and-death decisions, while others like Pakistan worry that military disparities with technologically superior nations will lead to an unfair balance of power.
Considerable uncertainty remains surrounding LAWS and their legality as weapons for use in armed conflict. Governance of these weapons, accountability, criminal liability and product liability are specific avenues of concern.

Legal Issues under IHL
The legality of LAWS under IHL must be assessed at two levels: (a) development, and (b) deployment/use.
Legal Review of New Weapons
Article 36 of Additional Protocol I (“AP I”) to the Geneva Conventions provides for the legal review of any new weapon or means of warfare, to determine whether its development complies with the Geneva Conventions and customary international law. The weapon must not have an “indiscriminate effect” or cause “superfluous injury” or “unnecessary suffering”, as chemical weapons do.
The conduct of LAWS must be ‘predictable’ and ‘reliable’ for them to be legally deployed in armed conflict. Absent such predictability and reliability, the conduct of LAWS in the midst of conflict may lead to an “indiscriminate effect”, “superfluous injury” or “unnecessary suffering”.
Principles of Distinction, Proportionality & Precautions
First, LAWS must uphold the basic rule of distinction. LAWS should differentiate between civilian and military objects, and between those injured and those active in combat. Often even deployed troops are unable to make this determination successfully; programming LAWS to uphold the principle of distinction therefore remains a challenge.
Second, LAWS must uphold the principle of proportionality. Civilian casualties, injury and destruction must not be excessive in relation to the military advantage gained by the attack. Making such value judgments in the middle of intense battle is difficult, and the programmers who develop LAWS may struggle to comprehend the complexities of these circumstances. Even where deep learning is deployed, machines learn by recognising patterns; a system may encounter situations for which it must first gain experience, and those growing pains in technological refinement may lead to violations of the proportionality principle.
Finally, LAWS must adhere to the principle of precaution: the ability to recall or suspend an attack when it is not proportionate or harms civilians rather than military adversaries. The ability to deactivate or recall a weapon once deployed is tricky. There is a widely shared view that LAWS will fail to comply with these principles and violate the laws of armed conflict.
Conversely, others argue that autonomy alone is not enough to prove that LAWS violate IHL. Existing principles suffice to restrict the use of LAWS to situations where IHL is not violated. Furthermore, autonomous weapons might be able to wait until they are fired upon to determine whether a person is a civilian or a combatant, since their sense of ‘self-preservation’ will not be as strong as that of human troops, thereby complying with the principle of distinction. Moreover, they might be employed in naval theatres or other areas not open to civilians, giving LAWS a lower threshold to meet for compliance with IHL principles. Supporters contend that LAWS might calculate and make last-minute decisions without subjective human emotions, allowing them to choose the best possible course of action and thereby respect the principles of proportionality and precautions.
Martens Clause
Article 1 of AP I to the Geneva Conventions states that in cases not covered by the Conventions, civilians and combatants remain protected by “Customary International Law, principles of Humanity and Dictates of Public Conscience”. This is reiterated in the preamble to AP II. Known as the Martens Clause, it provides the basis for the ethical and moral dimensions of the law of armed conflict. Since LAWS are not directly covered by the Geneva Conventions, their development and use must be guided by the Martens Clause. LAWS may therefore be prohibited for noncompliance with customary international law, the principles of humanity or the dictates of public conscience.
LAWS cannot be declared illegal under customary international law since there is no defined state practice, as they are still being developed. The principles of humanity require us to examine whether machines should have the ability to make life-and-death decisions regarding humans. Moreover, recent data suggests that the dictates of public conscience may weigh against the use of LAWS.
It may be premature to term LAWS, which do not currently exist, legal or illegal on the basis of compliance with the Geneva Conventions. However, any discussion on the subject must keep these legal and ethical IHL considerations in mind.
Present Policy Situation
The legal issues relating to LAWS are recognised by the UN Office for Disarmament Affairs. Under the Convention on Certain Conventional Weapons (CCW), a Group of Governmental Experts was asked to address the issues regarding LAWS. The group is yet to provide a single definition of the term. However, it has recommended 11 guiding principles, which were adopted by the High Contracting Parties to the CCW in 2019.
The first principle states that IHL shall apply to all autonomous weapon systems, including LAWS. The second addresses accountability through “human responsibility” in decision-making on the use of these systems. Further, any degree of human-machine interaction at any stage of development or activation must comply with IHL. Accountability for the development, deployment and use of these weapons must be ensured under IHL through a “chain of human command and control”. States’ obligation to conduct a legal review of any new weapon is also reiterated.
The guidelines also state that cyber and physical risks, and the risk of proliferation and acquisition by terrorists, must be considered while developing and acquiring such weapons. Risk assessment and mitigation must be part of the design and development of such weapons. Consideration must be given to compliance with IHL and other international obligations when using LAWS. When crafting policy measures, States must not “anthropomorphize” emerging technologies in the area of LAWS, and discussions on LAWS should not hinder peaceful civilian innovation. Finally, the principles highlight the importance of balancing military needs and human factors under the CCW framework.
The CCW also highlights the need to ensure “meaningful human control” over weapon systems, but does not define criteria for such control. Autonomous weapons, moreover, pass through different stages: development, activation and deployment. Only a human can develop and activate an autonomous system; deployment, however, is determined by the weapon itself, as per its human programming.
The question therefore arises: will that level of human control over the LAWS’ programming be enough to qualify as meaningful human control? If not, will a human override command, which may or may not be exercised, allow for “meaningful human control”? These questions require further deliberation on what qualifies as “meaningful human control” and whether such control will be enough given how rapidly AI is developing. There is also a need to ensure that no bias is programmed into these weapons.
While these guiding principles are a first step towards an international framework, there is still no universal, comprehensive legal framework to ensure accountability for LAWS.
Conclusion
The legal, ethical and international concerns regarding LAWS must be addressed at a global level. A pre-emptive and premature ban might stifle helpful civilian innovation. Moreover, a ban will not be possible without the support of leading States like the US, Russia and the UK. Conversely, if the development of LAWS is left unregulated, it will become easier for countries possessing LAWS to go to war. Development and deployment of LAWS will also create a significant imbalance between technologically advanced and technologically disadvantaged nations. Furthermore, the absence of regulation may lead to the proliferation and acquisition of LAWS by bad actors for malicious, immoral and/or illegal purposes.
Since LAWS disarmament is not an option, control over LAWS is recommended. The issues with LAWS must be addressed at the international level by creating a binding treaty that incorporates a comprehensive definition of LAWS. The limits of autonomy must also be clearly demarcated, along with other legal and ethical considerations. The principles of IHL, including legal reviews, must be implemented. Until then, defence research centres around the world should incorporate AI in more “defensive” and “non-lethal” military machinery. Such applications could include bomb disposal, surveillance drones or smart borders, rather than offensive and lethal autonomous weapon systems lacking any overriding human control.
*Views expressed in the blog are personal and should not be attributed to the institution.