Emotional Computer: Design Challenges and Opportunities

Shikha Jain, Krishna Asawa
Copyright: © 2015 |Pages: 22
DOI: 10.4018/IJSE.2015070103

Abstract

Extensive studies have established a close interaction between emotion and cognition, with emotion exerting a remarkable influence on all kinds of cognitive processes. Consequently, technologies that emulate intelligent human behavior cannot be considered fully intelligent without incorporating the interplay of an emotional component in their rational reasoning processes. Recently, several researchers have started working in the field of emotion modeling to meet the needs of interactive computer applications that demand human-like interaction with the computer. However, in the absence of structured guidelines, the most challenging task for a researcher is to understand and select the most appropriate definitions, theories, and processes governing human psychology in order to design the intended model. The objective of the present article is to review the background and the studies necessary for designing an emotion model for a computer so that it can generate appropriate synthetic emotions while interacting with external environmental factors.
Article Preview

1. Introduction

Brain regions were earlier thought to be either purely emotional (e.g., the amygdala) or purely cognitive (e.g., the frontal cortex). However, neuroimaging research (Damasio, 1994; Bechara, 2005) has demonstrated that the two are closely related and work together to produce complex human behavior. Various psychological experiments have already shown how cognitive processes such as perception, attention, and memory are altered by emotional events occurring in the same environment. This observation inspires efforts to make computers more human-like and raises the requirement of incorporating emotions into machines as well (MacLennan, 2014).

Recall Clippit, a software agent that talks, dances, smiles, and acts as the Microsoft Office Assistant. According to Microsoft, Clippit is very intelligent and knows most things about MS Office. It can recognize the user's actions and is always at their service. However, it cannot comprehend the user's problem or intention, and it sometimes keeps offering useless solutions despite the user's irritation. Now suppose Clippit were emotionally intelligent: it would understand the user's problem and act accordingly. Obviously, it could not solve every problem, but when the user is annoyed, if it could empathize instead of dancing with a little smile, the user would certainly feel more satisfied and their anger would subside. The literature also confirms that user experience with an emotionally intelligent machine is more satisfactory (Prendinger and Ishizuka, 2005; Brave et al., 2005; Picard and Liu, 2007) and that such machines improve overall task performance (Partala and Surakka, 2004; Moral et al., 2014).

A computer can become more realistic, more credible, and more agreeable, and can make more human-like decisions, if it is able to recognize, comprehend, and express emotions while interacting rationally with a user.

The opportunity to use an emotional computer spans a wide range of domains that serve society, including psychology, artificial intelligence, and human-computer interaction. The most fundamental application of an emotional computer is to enable advanced human interfaces that can perceive and react to the emotional states of users. A user who becomes frustrated or irritated while working with a product would convey signals to the computer, and the application could then react in a way the user perceives as natural. For instance, a computer tutor may adjust its teaching style and presentation based on its perception that the user is interested, exhausted, or confused; it may quickly switch to a slide that is more engaging.
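The tutoring scenario above can be sketched as a simple state-to-action mapping. This is a hypothetical illustration only: the state names and tutoring actions are assumptions chosen for the example, not part of any system described in the article.

```python
# Hypothetical sketch: a tutor agent adapting its presentation to a
# detected user emotional state. States and actions are illustrative.

def adapt_presentation(emotional_state: str) -> str:
    """Map a detected user state to a tutoring action."""
    actions = {
        "interested": "continue with the current slide",
        "exhausted": "pause and suggest a short break",
        "confused": "replay the explanation with a simpler example",
        "bored": "switch to a more engaging slide",
    }
    # Fall back to the default flow for unrecognized states.
    return actions.get(emotional_state, "continue with the current slide")

print(adapt_presentation("confused"))
```

In a real system the input state would come from an emotion recognition component rather than being passed in directly.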

Further, these opportunities can be exploited using emotion recognition tools that are already available for practical use, such as Emotient, EmoVu, nVISO, Affectiva, and many more. The Kinect sensor tracks players' heartbeats and physical movements, and its makers plan to use that information to gain insight into how people feel while playing games. Furthermore, Amelia, a cognitive virtual agent developed by IPsoft, is capable of understanding human emotions and responding accordingly: she understands not only what people ask but also what they feel when they call for service; she has both IQ and EQ. Most emotion recognition tools use a statistical approach to label emotions in accordance with a limited available dataset. For example, in 2013, Lawrence & Nabi developed a dataset for facial expression recognition comprising more than 400 colored images from three classes: emoticons, cartoons, and human faces. However, Mortillaro et al. (2012) suggested applying appraisal theories before the emotion labeling process; such a model can achieve better precision because it is grounded in psychological theories of emotion.
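The statistical labeling approach mentioned above can be illustrated with a minimal nearest-centroid classifier: each emotion class is represented by the mean of its training feature vectors, and a new sample receives the label of the closest centroid. The feature names and values below are toy assumptions for illustration, not drawn from the Lawrence & Nabi dataset or any tool named in the article.

```python
# Minimal sketch of statistical emotion labeling via nearest centroid.
# Toy 2-D features (brow_raise, mouth_curve); values are illustrative.
import math

CENTROIDS = {
    "happy": (0.2, 0.9),
    "angry": (0.8, 0.1),
    "neutral": (0.4, 0.5),
}

def label_emotion(features):
    """Return the emotion label whose centroid is nearest to `features`."""
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

print(label_emotion((0.25, 0.85)))  # nearest to the "happy" centroid
```

In practice the feature space would be high-dimensional (e.g., extracted facial landmarks) and the centroids learned from labeled training images, but the labeling step has the same shape.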
