Are Persuasive Technologies Really Able to Communicate?: Some Remarks to the Application of Discourse Ethics

Christian Linder (Department of Research Management, Institute for Employment Research, Nuremberg, Germany)
Copyright: © 2014 |Pages: 15
DOI: 10.4018/ijt.2014010104

Abstract

The ethics of persuasive technology (PT) has been discussed for some time. One interesting approach, proposed recently, is the assessment of PT in the light of discourse ethics and speech-act theory. While some see such an approach as promising, the author will illustrate that the application of discourse ethics is appropriate only for a few limited persuasive strategies. It is argued that PT most often does not provide the essentials of a discourse: reasons or arguments to convince the counterpart. In line with discourse ethics, the elements of speech-act theory refer to the preconditions every debater has to subscribe to in order to reach mutual understanding, which is the ultimate goal of a discourse. It is evident that PT has serious problems fulfilling preconditions such as comprehensibility, truth, truthfulness, and legitimacy. If discourse ethics is the theoretical framework that reflects the moral content of PT, the intention of the designer and his arguments or reasons have to be taken into account. It is argued that this often contradicts the purpose of persuasion or manipulation when PT is applied. This paper provides propositions that should ensure that the design of PT fulfills the basic requirements of discourse ethics.

Introduction

Since information technology is constantly advancing, technical artifacts are continuously equipped with new functional dimensions: the ability to influence users by providing information in a way that affects their behavior. So-called persuasive technologies (PT) offer a great number of opportunities to influence the user, ranging from conviction to persuasion and manipulation. From an ethical point of view, convincing users is typically seen as uncritical, since it relies on arguments and the free choice of the interlocutor to replace or adjust his outlook in the face of better arguments. Manipulation, in contrast, has a negative connotation: an intention is enforced against the will of the counterpart. Here, associations with asymmetric power, advantage-taking, or outsmarting spring to mind.1 In PT the ability to influence users ranges between these two extremes.

To understand the persuasive power of technology, scholars often refer to Fogg (2003), who defines PT as “an attempt to shape, reinforce, or change behavior, feelings, or thoughts about an issue, object, or action” (p. 225). Fogg (1999) further highlights the intentionality of PT as a purposely designed tool with the goal of influencing the user. Because machines have no intention at all, PT must be understood as a medium for the engineer, the designer, or the applying authority to ‘speak’ to the user, sending his message to the counterpart in order to achieve his intended goals.2

Persuasive artifacts thus raise techno-ethical questions. Since techno-ethics, as an interdisciplinary field, provides theories and methods for analysing all normative aspects of the relationship between technology and society, it can offer a solid basis for evaluating and conceptualising the impact of persuasive technologies from an ethical point of view (Luppicini, 2010). Moral issues arise here from the range of responsibilities of the engineer (Bunge, 1977; Jonas, 1979). Persuasion through technology has countless facets, and the engineer’s responsibility is therefore never static but changes its shape according to the particular context in which the technology is applied. With emerging technological possibilities, ever more sophisticated means of persuasion will appear, for both good and bad. Persuasive technologies have the ability to affect society in many respects. From a paternalistic perspective, they may be used to educate people in ways that are not possible today. From an economic perspective, they provide an instrument for competition, and of course they may also play a role in political opinion-making. Persuasive technologies thus raise critical issues. The call for techno-ethics is understandable, because today there is no consensus on how, and to what extent, persuasive technology ought to serve as a means for reaching certain aims. The character of the modern world is shaped by technology, and technology in turn shapes our needs and desires for new technical solutions (Borgmann, 1984). To handle these dynamics, techno-ethical reflection on the question of what kind of world we want to live in is needed.

One argumentative starting point for such reflections can be the analysis of what persuasive technologies actually do. First of all, this is communication: these technologies attempt to create an exchange of information with the respective user. If PT establishes such a communicative interaction, it is not far-fetched to apply the basic ideas of discourse ethics and the basic elements of analytic speech-act theory to PT. In this sense, Spahn (2011a; 2011b) treats PT as an act of communication, which opens it to contemporary rationalistic or deontological ethics. Such an approach is interesting because it adds a new mode of reflection to a mainly utilitarian or consequentialist discussion. This makes it a very promising way to analyse the ethical issues arising from the growing development and application of PT and its increasing penetration into our daily life.
