Vanguard Magazine

Vanguard August/September 2023




This perspective becomes particularly relevant when considering the potential repercussions of using AI in the military domain. Lethal autonomous weapons systems (LAWS), sometimes powered by AI, come in various forms, such as guided missiles, militarized robots, and drones of all sizes. Those drones have the potential to disrupt traditional patterns of warfare by redefining the dynamics of armed conflict through the introduction of new defence tactics and strategies. This use concerns many states and non-governmental organizations since LAWS are distinguished by their ability to select and attack targets without human intervention, as presented by the International Committee of the Red Cross (ICRC).

However, the definition of LAWS is not universal. Each state adopts its own definition, which may vary in restrictiveness. Professors from the University of Oxford and the Alan Turing Institute have identified 12 definitions of LAWS provided by states or key international actors, based on official documents. Some countries, such as France and the United States, define LAWS as machines that, once activated, are fully autonomous from humans. The British government specifies in its definition that autonomous systems can understand and interpret human intentions. Thus, the United Kingdom distinguishes itself by emphasizing the intrinsic capabilities of autonomous systems. Finally, some countries, such as Switzerland and the ICRC, focus on the nature of the tasks performed by the machine and the legal implications of autonomous action. According to them, the machine should, at all times, be capable of complying with international humanitarian law (IHL), especially during the targeting process. Therefore, states and key international actors take divergent approaches when it comes to defining lethal autonomous weapons systems.
The rationale developed below focuses on two key aspects of the issues posed by LAWS: the ambiguity of their definition and the inherent challenges in establishing a normative framework. These elements shed light on preconceptions and realities concerning the use of these autonomous weapons, which are often described in the press as "game changers" (i.e., weapons that can alter the course and dynamics of armed conflict). By extension, as we will see, these concerns raise questions about the role of the human being on the battlefield and the potential for its diminishment.

Degree of autonomy and human control

The multiple definitions of LAWS are explained by the inherent difficulty of precisely qualifying what autonomy means. Indeed, it is essential to distinguish between functions automated by humans and the independence of artificial intelligence systems from human control. Achieving full autonomy, however, is not the current reality of LAWS.

The degree of autonomy is determined by the ability of these weapons to make decisions based on their own (pre-programmed by humans) analysis of the situation. There are several levels of autonomy. For example, the Russian Navy stands out with its P-800 Oniks missiles, smaller than the Granit missiles but still equipped with an artificial intelligence system. Thanks to its autonomous "fire and forget" system, this missile could, using satellite guidance, track its target in real time and adapt its trajectory. According to Russian state media, it may even work in tandem with other missiles to identify and classify targets before choosing an appropriate attack strategy. Once the main target is destroyed, the remaining missiles could be redirected toward other ships to avoid any duplication of attack. The Oniks missiles are not the only autonomous weapons capable of selecting and engaging a target without human intervention, thanks to their programming.
Suicide drones like the Turkish Kargu-2 and the Russian KUB are reportedly capable of operating in complete autonomy and independently targeting without human assistance. This asset, perceived as significant by several states, such as the United States and China, has led them to make considerable investments in the development of such autonomous weapons. However, defence companies tend to exaggerate the capabilities of their products, and only "confidential" sources suggest autonomous use of these weapons in the United Nations report on Libya in 2021.

In addition to these mobile autonomous weapons, there are also fixed LAWS, such as the South Korean robotic sentinel SGR-A1. This weapon is deployed in the demilitarized zone between the two Koreas. Although it can spot intrusions and fire autonomously, it always sends a firing authorization request to the command post. This choice to retain human control primarily reflects South Korea's ethical considerations, as the technology does have a function that allows it to fire without human authorization.

A Royal Canadian Navy member aboard HMCS HARRY DEWOLF launches a PUMA UAV in order to conduct drug interdiction surveillance during Operation CARIBBE. Photo: Canadian Armed Forces
