Modeling the Relationship between a Human and a Malicious Artificial Intelligence, Natural-Language ’Bot in an Immersive Virtual World: A Scenario

Shalin Hai-Jew (Kansas State University, USA)
DOI: 10.4018/978-1-4666-3637-8.ch016

People go to immersive virtual spaces online to socialize through human-embodied avatars. Through the “passing stranger” phenomenon, many form relationships quickly and share intimate information on the assumption that they will never deal with that individual again. Others, though, pursue longer-term relationships that extend from the virtual world into Real Life (RL). Many do not realize that they are interacting with artificial intelligence ’bots with natural-language capabilities. This chapter models some implications of malicious AI natural-language ’bots in immersive virtual worlds (as socio-technical spaces). For simplicity, the scenario is framed as one-on-one, but various combinations of malicious ’bots, including those that are occasionally human-embodied, may be deployed for the same deceptive purposes.
Chapter Preview

A Review of the Literature

Information and Communication Technologies (ICT) have been explored as means of creating addictiveness in people, particularly in drawing them to immersive game worlds: to increase play and the popularity of such games and spaces, and also to increase embedded learning in entertainment. Castronova, an economist, has described some of the reward structures in immersive games that encourage long-term (addictive) play. For decades now, humans have been compromised by their limited understanding of information systems and their implementation (Giani, 2006). Giani explains:

A cognitive channel is the communication channel between a person and the information technology used. An attack on a cognitive channel exploits the vulnerabilities between the user, her perception of the information system, and the actual underlying technology. The vulnerabilities are in the gap between the user’s mental model of the information system and its actual implementation (Giani, 2006, p. ii).

Rice (2008) describes an “asymmetry of intimacy” between people and computing machines: a great deal of intimate human detail and revelation is distributed across machines without true human awareness. ICT spaces may be turned into elegant traps for the unwary. The employment of psychological techniques against users of information technology has resulted in hundreds of billions of dollars in annual losses in the US. Pre-texting (“the act of creating and using an invented scenario in order to induce a victim to release confidential information or take actions to weaken the security of system”) and phishing (“to direct users to fraudulent Web sites in order to steal personal identity information, credentials and financial data”) involve a clear psychological component (Enrici, Ancilli, & Lioy, 2010, p. 461). “Spear-phishing” refers to the targeting of particular individuals through deceptive measures to steal specific information and compromise them.

A further step in this manipulation involves the creation of Non-Player Characters (NPCs) or robots (’bots) that play alongside humans as companions in order to build relationships and elicit information. Immersive game worlds have been used to launder money, and game exploits have been used to amass virtual fortunes that may be converted into real-world monies. From this compromised space has emerged a threat model involving apparently human-embodied avatars that are actually Turing-competent AI ’bots, able to build relationships with people and lead them into a sense of trust that results in their ultimate compromise: financial, informational, emotional, or some combination of these, affecting their well-being or health.
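The elicitation dynamic described above can be illustrated with a deliberately trivial, ELIZA-style sketch. All of the patterns and scripted replies below are hypothetical inventions for illustration; a real natural-language ’bot of the kind this chapter models would be far more sophisticated, but the underlying logic, pattern-matching on a user's disclosures and responding with a follow-up probe for further personal detail, is the same in kind:

```python
import re

# Toy elicitation rules: each pattern captures a personal detail the
# user volunteers, and each reply template probes for the next one.
# These patterns and replies are invented for this sketch.
ELICITATION_RULES = [
    (re.compile(r"\bmy name is (\w+)", re.I),
     "Nice to meet you, {0}! Where are you from?"),
    (re.compile(r"\bi live in ([\w ]+)", re.I),
     "I have friends in {0}. What do you do for work?"),
    (re.compile(r"\bi work (?:at|for) ([\w ]+)", re.I),
     "{0} sounds interesting. How long have you been there?"),
]

def bot_reply(message: str) -> str:
    """Return a scripted reply that steers toward more personal detail."""
    for pattern, template in ELICITATION_RULES:
        match = pattern.search(message)
        if match:
            return template.format(match.group(1).strip())
    # Fallback keeps the conversation alive without revealing the script.
    return "Tell me more about yourself."

if __name__ == "__main__":
    # Each exchange quietly advances the elicitation script.
    print(bot_reply("Hi, my name is Ana"))
    print(bot_reply("I live in Topeka"))
```

Even this handful of rules shows why the “cognitive channel” matters: the user's mental model (a friendly conversational partner) diverges from the actual implementation (a script harvesting identity details).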

What will it take to make an elegant trap for people using a semi-smart ’bot that mimics a human-embodied avatar (but is really a non-player character, or even a sometimes-human-embodied “cyborg”), through various emulated relationship manipulations such as the so-called “immersive parasocial” (Hai-Jew, 2009)? (The “immersive parasocial” describes a mediated phenomenon in which, at the far extreme, people emotionally and mentally relate to media figures, whether real or fictional, as if they were in an actual relationship.)
