An Analytical Study Of Legal And Operational Gaps In Regulating Lethal Autonomous Weapon Systems (LAWS) Under International Humanitarian Law
- IJLLR Journal
- Nov 8, 2025
Avin Anto, LLM, School of Law, Lovely Professional University, Phagwara
ABSTRACT
Lethal Autonomous Weapon Systems ("LAWS"), weapon platforms designed to use artificial intelligence to select and engage targets with precision and without human intervention, have emerged as a contentious topic in arms control and international law. This study critically examines whether existing International Humanitarian Law (IHL) provides sufficient regulation of LAWS and identifies any legal or operational gaps. Drawing on doctrinal analysis of IHL (the Geneva Conventions, the Additional Protocols and the Convention on Certain Conventional Weapons (CCW)) and recent policy debates, it highlights core IHL obligations such as distinction, proportionality and precaution, which traditionally depend on human judgment. The paper supplements this with empirical data from an online survey (conducted via Google Forms) of 45 participants, mostly Indian professionals and students, querying their views on LAWS: its definition, support, oversight, accountability and regulatory preferences. The survey shows mixed support for LAWS (47% supportive versus 11% opposed) and widespread concern that autonomous targeting could increase civilian risk. Only 49% felt that current IHL is somewhat or fully adequate for LAWS, while 38% considered it inadequate. Notably, 51% insisted on meaningful human control over attacks. Respondents assigned primary responsibility for misuse of Autonomous Weapon Systems (AWS) to the deploying state (29%), to programmers (24%), or to shared responsibility (31%), reflecting uncertainty over the "responsibility gap." In the literature reviewed, the ICRC and other experts emphasise that IHL obligations cannot be transferred to machines and that AWS pose new compliance challenges. The analysis finds that although IHL formally applies to AWS, the key question is how to ensure compliance and enforcement. On that basis, the paper suggests the need for clear rules (possibly new protocols or a declaration) defining permissible levels of autonomy, human-oversight requirements in line with the concept of meaningful human control, and accountability mechanisms. The manuscript concludes that without such clarifications or additional binding measures, future AWS deployment could undermine IHL's protective mandates.
Keywords: Lethal Autonomous Weapon Systems (LAWS), International Humanitarian Law (IHL), Autonomous Weapon Systems (AWS).
