Technoethics and Public Reason

Govert Valkenburg
Copyright © 2013 | Pages: 13
DOI: 10.4018/jte.2013070106

Abstract

Public reason specifies the rules under which a political community collectively conducts ethical reasoning. Technoethics needs to incorporate an account of how the technologies it aspires to govern bear on these rules. As the case of biobanks shows, technologies have the capacity to change the exact meanings of concepts that play central roles in ethical reasoning. Consequently, a revision of the rules to which ethical reasoning about those very biobanks is subject becomes inevitable. Reasoning in technoethics thus becomes essentially reflexive, as it must discuss its own rules at the same level at which it discusses its primary object, namely new technologies. Technoethics is therefore not only about how human values are to be incorporated into technology design, but also about what kind of political world is constructed through technology.

The Self-Referentiality of Technoethics

Technoethics, understood as reflection on how social, moral, and political concerns can and should be incorporated in the development of technology, is profoundly situated in societies of perpetual sociotechnical change. It not only aims to inform the technological enterprise, open it up to moral deliberation, and thereby emancipate human values; it is also, as a practice, inextricably intertwined with the technological infrastructure and the essentially technological culture of which it is part. The way we discuss moral issues of technology development is thus inseparably connected to that development.

Because our technological culture and our moral discursive order are so tightly connected, technoethics is an essentially reflexive or self-referential affair. This reflexivity requires it to develop a permanent awareness of how its own rules and procedures co-evolve with the sociotechnical developments it tries to accommodate. This paper identifies the challenge posed by this reflexivity and investigates how moral discussion can be accommodated if the rules of that very discussion are perpetually challenged.

Programmatic approaches to technoethics have emphasized the need to address and resolve moral issues in the early phases of technology design, rather than trying to repair ethical issues post hoc (Jelsma, 1992; Schomberg, 2011; Stahl et al., 2010; Verbeek, 2006). The bottom line of these approaches is the recognition that human values are entangled with technological arrangements, rather than the two constituting separate spheres. A straightforward early extension of this recognition from ethics into the political realm is provided by Sclove (1995), who argues that the design of technologies must indeed be aligned with democratic values from an early phase.

An underdeveloped consequence of these visions, however, is that not only are technologies to be adapted to democratic and other ethical ideals, but democracy and ethics are also to be adapted to technological change. Such reflexivity in the ethics of technology has indeed been observed. For example, Swierstra and Rip (2007) identify reflexivity in the ethics of technology at the level of new issues and new repertoires for discussing them. They argue that disagreement is not just the consequence of the moral pluralism that characterizes modern societies, or of any unwillingness to arrive at consensus, but of the fact that new technologies are ambivalent: what they ‘are’ and ‘do’ has not yet been sufficiently stabilized. Hence, moral opinions about them will vary widely. Similarly, Keulartz et al. (2004) argue that the permanent state of change in which technological societies find themselves entails that any ethics of technology must be anti-foundationalist and pragmatic.

From this vantage point, it is only logical to move forward and focus even more explicitly on the ethical discussion itself and on how it evolves with technological change. As Anderson and Peterson (2010) show, early ethical assessment is subject to uncertainty because the effects of a given development are hard to predict. They show that this uncertainty is used strategically, with parties framing technologies in particular ways and trying to establish particular vocabularies as dominant. The present paper continues this train of thought, bringing under scrutiny not only those uncertain consequences but also the uncertainty engendered in the moral order itself.
