The Cultural Influence of Control Sharing in Autonomous Driving

Yun Wan (University of Houston, USA), Emem Akpan (University of Houston, USA) and Hongyu Guo (University of Houston, USA)
Copyright: © 2022 | Pages: 13
DOI: 10.4018/IJT.302629

Abstract

This research investigated cross-cultural perspectives on control sharing in ethical decision-making when both humans and AI-enabled autonomous vehicles are involved. We reviewed current practices and then reported on a survey we conducted on this topic with a total of 771 subjects from three nations: the U.S., India, and Nigeria. We found that participants from individualistic cultures tend to emphasize personal choice and human control. We also found that, although most subjects preferred human drivers to take full control, subjects from India were more ambivalent in their attitudes due to lower uncertainty avoidance. In addition, subjects with higher incomes were more likely to cede control. The distribution of control-sharing configurations was consistently proportional across nations, with two-thirds choosing full customization and one-third choosing limited customization. Car owners were more likely to prefer more control and full customization. Our findings offer important insights for both researchers in this domain and industry practitioners.
Article Preview

Introduction

With the advancement of artificial intelligence and computing technology, especially deep learning, AI-driven robotic machines, such as autonomous vehicles, are increasingly taking more, if not all, control from human beings. When this happens, the AI-driven robot becomes an agent of the human being, making decisions for the latter, including decisions related to ethics; hence, it becomes a so-called ethical robot. In this study, we define artificial intelligence as the ability of an intelligent agent to do what is appropriate for its circumstances and its goals, be flexible to changing environments and goals, learn from experience, and make appropriate choices given perceptual limitations and finite computation (Poole, Mackworth, & Goebel, 1998).

According to Moor (2009), there are four kinds of ethical robots, distinguished by whether ethical rules are implicit or explicit and by the agents' ethical or unethical motivations. An autonomous vehicle could be either an implicit ethical agent, which has fixed ethical rules (such as safety controls) built into its design and acts according to ethics, or an explicit ethical agent, which can identify and process ethical information or rules and make sensitive determinations about what should be done, hence acting from ethics.

When AI-driven machines begin to share control with human beings, either implicitly or explicitly, certain ethical dilemmas, such as the trolley problem, will inevitably arise. In such scenarios, who should take control, or how control should be shared in ethical decision-making, becomes relevant and important for human drivers, robot manufacturers, policymakers, and other stakeholders to understand and decide.

Control sharing between humans and autonomous AI has technical, social/legal, and ethical implications. To design an AI-enabled vehicle, an auto manufacturer has to address control sharing and either enable the AI to make certain types of ethical decisions or transfer them to the human. A proper design requires input from, and an understanding of, human needs and expectations. The stakeholders involved in the technical design are not only the auto manufacturer and consumers (human drivers) but also the public and legislatures, because the decision could redistribute risk among passengers, pedestrians, and other drivers. Explicit clarification from legislatures is needed regarding responsibility and liability when ethical decision-making leads to accidents. On top of the design and legal implications lies another level of complexity: consumers with varying social backgrounds may hold contrasting or even conflicting ethical and moral perspectives on control sharing (Bonnefon, Shariff, & Rahwan, 2016), which means one configuration or design may not fit all consumers. Thus, there is a research gap concerning how subjects' social backgrounds relate to their views on the ethical behavior of autonomous vehicles and whether, or to what extent, human intervention or control sharing is necessary.

This study investigated this topic from a social and ethical perspective. We explored the issue mainly through a national cultural lens, because culture plays an influential role in an individual's social background, and existing studies indicate that users' cultural backgrounds can lead to distinctive preferences in decision-making when facing ethical dilemmas in autonomous driving (Awad et al., 2018).

This research question has significant implications for both the auto industry and consumers. Consumers with varying demographic backgrounds need to understand one another's standing regarding sharing control with AI, not just in vehicles but also in other commonly used household items. Meanwhile, the auto industry needs to understand the expectations and readiness of human stakeholders, including their sometimes conflicting perspectives, and consider them when either adopting mandatory ethics settings in AI-enabled auto design (Gogoll & Müller, 2017) or allowing human drivers to personalize settings, provide oversight, or conditionally override pre-existing settings, based on technical feasibility (Lütge et al., 2021).

Considering the global trend toward adopting AI-driven vehicles and the intensity of competition among the few auto manufacturers involved, understanding and managing these varying expectations could be either a baseline requirement for staying in the market or a major dimension of competitive advantage.
