Robots and the Ethics of Care

Linda Johansson (KTH Royal Institute of Technology, Stockholm, Sweden)
Copyright: © 2013 | Pages: 16
DOI: 10.4018/jte.2013010106


In this paper, the moral theory ethics of care (EoC) is investigated and connected to care robots. The aim is twofold: first, to provide a plausible and ethically relevant interpretation of the key term "care" in EoC (which, it is argued, differs slightly from the everyday use of the term), indicating that we should distinguish between "natural care" and "ethical care". The second aim is to discuss whether EoC may be a suitable theory to implement in care robots. The conclusion is that EoC may be a suitable theory for robots in health care settings.
Article Preview


With the emergence of more advanced and autonomous robots, there is a need to ensure that future robots behave "ethically", or morally right: that when they face a decision with moral impact, some sort of internal guidelines or restrictions ensures that they do not engage in immoral behavior. There is no clear answer to what ethical – or immoral – behavior is; several moral theories attempt to provide criteria for the morally right action: utilitarianism, according to which actions should maximize well-being or utility; deontology; or virtue ethics, to mention a few. Which theory should be implemented in an advanced robot? And should a moral theory even be programmed into it?

These questions are examples of topics in technoethics. Technoethics is an interdisciplinary research area concerned with ethical aspects of technology in society, "dealing with the human processes and practices connected to technology which are embedded within social, political, and moral spheres of life" (Luppicini & Adell, 2008). It is highly important to discuss ethical issues connected to technology with autonomous elements, and this paper is an effort to do so.

In the discussion surrounding robots and ethics, there are two different approaches: the top-down method and the bottom-up method – where the robot learns the desired behavior as it goes along, much like a child learns morality – as well as hybrid variants. Programming a moral theory such as utilitarianism – or Asimov's laws for robots – would be an example of the top-down method. The theories usually discussed for programming ethical behavior into robots have been utilitarianism (Grau, 2011) and deontology (Beavers, forthcoming), and the focus has been more general, on some generic "advanced robot" rather than on robots in a certain area, such as war, health care, or airport security.

In this paper, the normative theory ethics of care (EoC) will be investigated and connected to robots. This theory has not previously been given much attention in connection with robots. Aimee van Wynsberghe (2012) may be the first to connect EoC to the design of robots. Her focus regarding robots, as well as the focus in this paper, is on robots in health care (care robots), which pose different ethical questions compared to robots in war, for instance: "[care robots] require rigorous ethical reflection to ensure their design and introduction do not impede the promotion of values and the dignity of patients at such a vulnerable and sensitive time in their lives… [there] are no universal guidelines or standards for the design of robots outside the factory" (van Wynsberghe, 2012).

Van Wynsberghe suggests that an ethics of care, combined with care-sensitive value sensitive design (CVSD), should determine how care robots are designed: "The care centered framework and CVSD methodology both pay tribute to the central thesis in care ethics, namely that the care perspective provides an orientation from which to begin theorizing as opposed to a pre-packaged ethical theory" (van Wynsberghe, 2012).

Van Wynsberghe discusses the robots of today or the near future, while my discussion is intended to be relevant also for future robots in possession of more autonomy. One reason for this is an idea about how the key concept "care" should be interpreted when used in an ethical theory such as EoC. That is, it is important to distinguish between "care" in a wider sense, as the word is used in daily parlance, and "care" as a value for guiding moral action. An important challenge will arise when we need to design robots that can make decisions with ethical implications, such as choosing which patient to help first – a scenario that might not be too far off in the future. It is therefore important to be precise when discussing and defining the key concept of care as it is, and should be, used in a normative theory such as EoC.

EoC comes in different versions, which makes it difficult to pin down, and this paper is a contribution to how "care" should be understood in an ethical, rather than a natural, sense. The conclusion is that EoC may be a suitable theory to implement in advanced care robots.
