Defining Autonomous Weapon Systems: A Conceptual Overview of Existing Definitory Attempts and the Effectiveness of Human Oversight

Author: Mark Tsagas

Abstract: It is only “natural that proponents and opponents of Autonomous Weapon Systems (AWS) will seek to establish a definition that serves their aims and interests. The definitional discussion will not be a value-neutral discussion of facts, but ultimately one driven by political and strategic motivations” (UNIDIR, 2017). Though somewhat of a sullen statement, conceptually echoing the self-serving, outcome-oriented view of ‘conflict’ espoused by the proverb ‘all is fair in love and war’ (attributed to the poet John Lyly), this stipulation by the United Nations Institute for Disarmament Research (UNIDIR), coupled with the belief that some States are reluctant to engage in a broader definitional exercise, has recently proven rather prophetic.

In effect, the concepts above have accurately described the existing state of the discourse surrounding the technology in question. Specifically, the divided stance on definitions is evident in a wide array of publications. These include the 2024 ‘collation of responses’ report compiled by the Implementation Support Unit of the Convention on Certain Conventional Weapons (UNODA), the UNODA 2023 report, which focused specifically on individual definitions and characterisations from multiple countries, and the results of domestic inquiries (such as those arising from the House of Lords AI in Weapon Systems Committee Report and the subsequent UK Government response). The latter two documents provide an informative snapshot of the potential reasoning behind this discourse, underscoring the initial quotation’s poignancy. Specifically, the House of Lords (2023) Committee acknowledged the United Kingdom’s lack of an operational definition for Autonomous Weapon Systems, alongside the challenges such an absence may pose for regulation. Yet the UK Government’s response stipulated that, while it respected the general arguments put forward by the Committee, it did not intend to adopt an official or operative definition for AWS. The reasoning? Definitions are typically an aid to policy making and may serve as a starting point for a new legal instrument prohibiting certain types of systems, which would represent a threat to UK defence interests (Ministry of Defence, 2024). This stance, albeit not unique to the UK, aptly demonstrates that the suggested apprehension regarding the adverse effects of adopting a wider definition, insofar as it brings the legitimacy or legality of encompassed technologies into question (UNIDIR, 2017), has retained its prevalence.

Irrespective of the explicit or implicit recognition of the extent to which International Humanitarian Law (IHL) should apply to the potential use of weapon systems based on emerging technologies (UNIDIR, 2023), namely adherence to the principles of Distinction, Proportionality and Precaution in attack, it remains necessary to conduct an inquiry and ascertain what a value-neutral definition of AWS might be. Consequently, the effectiveness of human oversight must also be reviewed: not only does the human element operate as an integral thematic component of the existing definitions, but its value as a prerequisite for adherence also needs to be ascertained.

Keywords: Autonomous Weapon Systems (AWS), Lethal Autonomous Weapon Systems (LAWS), Automatic, Automated, Generative Artificial Intelligence, Human Oversight.

DOI: 10.54941/ahfe1005950
