The question of whether war is inherently linked to human presence on the battlefield, or whether that presence could be replaced by new technologies, finds its ultimate answer in the enlightening words of Colonel Ardant du Picq: "Man is the primary instrument of combat."

…to open fire without human supervision. Therefore, the primary constraint does not lie in the technology itself but rather in a government's willingness to develop, or to acknowledge the existence of, such politically critical technology. It is crucial to differentiate between autonomy and control so as not to treat automation and human control as mutually exclusive concepts: automation can eliminate human intervention, but it does not render human control impossible.

The absence of a normative framework

These degrees of autonomy in weapons systems capable of acting without direct human supervision have led certain states, such as Austria, Brazil and Costa Rica, to seek regulation of their use.

Autonomous weapons have been under development for more than 30 years, but it is only recently that the first UN commission called for a ban on LAWS until an appropriate normative framework is established. The question therefore arises whether today's autonomous weapons represent a significant departure from previous developments.

It is essential to note that autonomous weapons are not an entirely new concept, as landmines and automated missile defence systems illustrate. Autonomy in weapons has thus been present in various forms for some time, and some of these systems are already subject to regulation. The Ottawa Convention, for example, has regulated anti-personnel mines internationally since 1997. With that convention, more than 25 years ago, there was already an awareness of the need to limit weapons that operate independently of human control. This perspective has recently been reiterated before the European Parliament, the United Nations and the ICRC.

The problem with the evolution of autonomous weapons and the lack of regulation lies in two points. First, the diversity of autonomous weapons and their varying degrees of autonomy make it extremely complex to establish a coherent and effective definition, let alone a normative framework. This is a major challenge that the international community has not yet managed to overcome. Second, the current strategic context and geopolitical motivations play a significant role. Some states, notably the United States and Russia, which are developing these systems, have no interest in supporting regulation, as they believe they can gain a competitive advantage over their adversaries. That advantage includes more precise strike capabilities, a reduction in human casualties, and improved responsiveness on the battlefield. In this context, these states are reluctant to encourage strict regulation that could restrict their freedom of action and technological edge.

LAWS: between myths and realities

However, will these autonomous weapons systems revolutionize the conduct of warfare? Or will they suffer the same fate as hypersonic weapons, which were expected to bring about a major revolution but have so far turned out to be mainly "press release" weapons?

Indeed, one should not believe in the creation of the perfect weapon. LAWS, no matter how revolutionary, have limitations and vulnerabilities.
Like any electronic technology, they are susceptible to disruption or neutralization. They will be exposed to adversary attacks: an opponent that takes control of their communication channels could jam or intercept the machine's information and intelligence. The unpredictable and volatile nature of modern conflicts requires a variety of weapons and strategies to achieve military objectives. Autonomous drones are, therefore, just one more tool in the arsenal of "air power" available to states.

While technological advancements improve the collection and sharing of information on the battlefield, artificial intelligence also contributes to complicating matters. Among other things, autonomous systems introduce new variables and require constant adaptation. AI, when combined with hypersonic weapons, leads to ultra-fast decision-making, which can produce sudden and risky tactical situations. War, as a human activity, undeniably contains unpredictable factors and ambiguity. Despite these innovations, there will therefore always be limits to the ability to fully see and understand the complex realities of a military scenario. Clausewitz's fog of war merely shifts with the emergence of these new technologies. Strategies overly reliant on rationality and certainty are simply dangerous in the dynamic and variable environment of war.

Should we believe in the reduction of human presence on the battlefield?

The increasing use of drones and autonomous weapons raises questions about the future of military operations. Will we witness massive offensives by expeditionary forces, similar to the Normandy landings in 1944, or will we instead see countries invaded by swarms of drones?

In May 2022, China launched its first autonomous-navigation drone carrier, the "Zhu Hai Yun." Officially presented as an oceanographic research vessel, it nevertheless has evident military functions. The vessel, which can accommodate up to 50 aerial, surface and underwater drones, gives China a strategic instrument for its actions in the Indo-Pacific region. In parallel, Turkey also stands out with its development of "drone carrier vessels" equipped with Bayraktar TB2 drones. Admittedly, these drone carriers would not be able to participate in high-intensity air conflicts, where they could not, for example, compete with Japanese or Italian light aircraft carriers equipped with F-35Bs.