Military Robotics and Emotion: Challenges to Just War Theory

Jai Galliott (The University of New South Wales, Australia)
DOI: 10.4018/978-1-4666-7278-9.ch019


In this chapter the author considers the complex moral interplay between unmanned systems, emotion, and just war theory. The first section examines technologically mediated fighting and suggests that, through a process of moral-emotional disengagement and emotional desensitisation, any pre-existing barriers to immoral conduct in war may be lowered. Having considered the impact on the long-distance warrior's capacity or willingness to adhere to jus in bello norms, the author then examines the impact on the personal wellbeing of the operators themselves. Here, among other things, the author considers the effect of being simultaneously present in two contrasting environments and argues that this, if nothing else, may lead to serious transgressions of just war principles. The fourth and final section asks whether we can eliminate or relieve some of these technologically mediated but distinctly human moral problems by further automating elements of the decision-making process.
Chapter Preview

A Brief Background To The Problem

While it is high-level decision makers who are central to the initial decision to engage in warfare, it is the individual soldier defending his state and society who must be most unconditional in exercising moral restraint and adhering to just war theory. Michael Ignatieff (1998) writes that, more than any other war-making agential group, it is the soldiers who actually conduct war that have the most influence on its outcomes and the greatest ability to introduce the moral component. In his words, ‘the decisive restraint on inhuman practice on the battlefield lies within the warrior himself – in his conception of what is honourable or dishonourable for a man to do with weapons’ (Ignatieff 1998, p. 118). Ironically, soldiers are the primary agents of physical violence in war and, at the same time, of compassion and moral arbitration. As Darren Bowyer (1998) remarks, they deliver ‘death and destruction one moment…[and deal] out succour to the wounded (of both sides) and assistance to the unwittingly involved civilian population, the next’ (p. 276). The specific concern examined here is whether, by removing soldiers from the battlefield and training them to fight via a technologically mediated proxy, we may, through a process of psycho-moral disengagement and emotional desensitisation, lower their ability or willingness to exercise restraint and compassion in warfare and to adhere to the jus in bello principles of discrimination and proportionality. It will be argued that the employment of unmanned systems encourages unethical decision-making and/or lowers barriers to killing, endangering the moral conduct of warfare.

Key Terms in this Chapter

Moral Desensitisation: A phenomenon which can reduce or even eliminate the stress of remote weapons operators by altering their comprehension or processing of sensory inputs.

Close Range Killing: Involves any easily attributable kill at ‘point-blank’ range, whether with one’s bare hands, an edged weapon or even a projectile weapon.

Long Range Killing: Involves the use of some sort of mechanical or electrical assistance to view potential victims (e.g., binoculars, cameras or radar).

Maximum Range Killing: Involves the remote operation of weaponry and a significant reduction in the number of, or the complete removal of, troops on the ground.

Ethical Governor: A mechanism that transforms or suppresses automatically generated lethal action of the type wielded by unmanned systems.

Midrange Killing: Involves being able to see and engage the enemy with hand grenades, sniper rifles and so on, but usually without being able to gauge the extent of the wounds inflicted.

Unmanned Systems: Electro-mechanical military robots that operate across land, sea and air and remove human war fighters from the battle space.