1. Introduction
Brain regions were earlier thought to be either purely emotional (e.g., the amygdala) or purely cognitive (e.g., the frontal cortex). However, neuroimaging research (Damasio, 1994; Bechara, 2005) has shown that the two are closely interrelated and work together to produce complex human behavior. Psychological experiments have already demonstrated how cognitive processes such as perception, attention, and memory are altered by emotional events occurring in the same environment. This observation inspires efforts to make computers more human-like and raises the requirement of incorporating emotions into machines as well (MacLennan, 2014).
Recall Clippit, a software agent that talks, dances, smiles, and acts as the Microsoft Office Assistant. According to Microsoft, Clippit is very intelligent and knows most things about MS Office. It can recognize users' actions and is always at their service. However, it cannot comprehend the user's problem or intention; sometimes it keeps offering useless solutions despite the user's irritation. Now suppose Clippit were emotionally intelligent: it would start understanding the user's problem and act accordingly. Obviously, it could not provide a solution to every problem, but when the user is annoyed, if it could empathize instead of dancing with a little smile, the user would certainly feel more satisfied and their anger would subside. The literature also reinforces that user experience with an emotionally intelligent machine is more satisfactory (Prendinger and Ishizuka, 2005; Brave et al., 2005; Picard and Liu, 2007) and that such machines improve overall task performance (Partala and Surakka, 2004; Moral et al., 2014).
A computer can become more realistic, more credible, and more agreeable, and can make more human-like decisions, if it is able to recognize, comprehend, and express emotions while interacting rationally with a user.
The opportunities for an emotional computer span a wide range of domains that serve society, including psychology, artificial intelligence, and human-computer interaction. Its most fundamental application is to enable advanced human interfaces that can perceive and react to the emotional states of users. A user who becomes frustrated or irritated while working with a product would convey signals to the computer, and the application could then respond in a manner the user perceives as natural. For instance, a computer tutor might adjust its teaching style and presentation based on its perception that the user is interested, exhausted, or confused, and quickly switch to a slide that is more captivating.
Further, these opportunities can be exploited through emotion recognition tools that are already available for practical use, such as Emotient, EmoVu, nVISO, Affectiva, and many more. The Kinect sensor tracks players' heartbeats and physical movements, with plans to use that information to gain insight into how people feel while playing games. Furthermore, Amelia, a cognitive virtual agent developed by IPsoft, is capable of understanding human emotions and responding accordingly: she understands not only what people ask but also what they feel when they call for service; she has both IQ and EQ. Most emotion recognition tools use a statistical approach to label emotions, in accordance with the limited available datasets. For example, in 2013, Lawrence and Nabi developed a dataset for facial expression recognition comprising more than 400 colored images from three classes: emoticons, cartoons, and human faces. However, Mortillaro et al. (2012) suggested applying appraisal theories before the emotion-labeling process; such a model can yield better precision, as it is grounded in psychological theories of emotion.
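To illustrate the statistical labeling approach mentioned above, the sketch below implements a nearest-centroid classifier over toy feature vectors. This is not taken from any of the cited tools; the feature values, class labels, and function names are all hypothetical, chosen only to show how an emotion label can be assigned statistically from a small training set.

```python
import math

# Hypothetical training data (illustrative only): each emotion class is
# represented by a few two-dimensional feature vectors, which might stand in
# for quantities such as facial landmark distances or physiological readings.
TRAINING = {
    "happy":   [[0.9, 0.1], [0.8, 0.2]],
    "angry":   [[0.1, 0.9], [0.2, 0.8]],
    "neutral": [[0.5, 0.5], [0.6, 0.4]],
}

def centroid(vectors):
    """Mean vector of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# Precompute one centroid per emotion class.
CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def label_emotion(features):
    """Return the emotion whose class centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))
```

A new observation such as `label_emotion([0.85, 0.15])` is labeled by its closest class centroid; appraisal-theory approaches such as the one Mortillaro et al. (2012) advocate would instead ground the label in how the situation is evaluated, rather than in feature-space proximity alone.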