Automatic Emotion Recognition Based on Non-Contact Gaits Information

Jingying Wang (University of Chinese Academy of Sciences, China), Baobin Li (University of Chinese Academy of Sciences, China), Changye Zhu (University of Chinese Academy of Sciences, China), Shun Li (University of Chinese Academy of Sciences, China) and Tingshao Zhu (University of Chinese Academy of Sciences, China)
Copyright: © 2018 |Pages: 12
DOI: 10.4018/978-1-5225-2255-3.ch012

Abstract

Automatic emotion recognition is of great value in many applications; however, to fully realize this value, more portable, non-intrusive, and inexpensive technologies need to be developed. Besides facial expressions and voice, human gait can also reflect a walker's emotional state. Using gait data with emotion labels from 59 participants, we train machine learning models that are able to "sense" individual emotion. Experimental results show that these models perform well, demonstrating that gait features are effective in characterizing and recognizing emotions.
Chapter Preview

Background: Gait And Emotion

Walking is one of the basic and important components of body posture and movement, and psychological research has found that affective states can be identified from gait (Montepare, Goldstein & Clausen, 1987). People in different emotional states walk at different speeds and show different gait patterns (Strike, Mohiyddini & Carlisle, 2009). Humans can perceive others' emotions from gait or posture in daily life: for example, people in fear may shrink their shoulders, while sad people might lower their heads and walk slowly (Roether, Omlor & Christensen, 2009). Even when gait information was minimized by the use of point-light displays, which represent body motion with only a small number of illuminated dots, observers could still judge emotion category and intensity (Atkinson, Dittrich, Gemmell, & Young, 2004).

Since Montepare et al. (1987) first demonstrated that gait relates to emotions, much research has focused on the relationship between gait and emotion. Krieger et al. (2013) found that gait, cognition, and emotion are closely related. Applying sparse regression, Roether et al. (2009) extracted critical emotion-specific posture and movement features that depend on only a small number of joints. Gross et al. (2011) identified the movement characteristics associated with positive and negative emotions experienced during walking. Destephe et al. (2013) assessed differences in the expression of emotion with respect to its expressed intensity. Characterizing human walking patterns with kinematic cues, Hicheur et al. (2013) produced avatar animations whose emotions could be recognized by human observers. In particular, using data mining techniques, much work has been done to recognize emotions automatically by analyzing subjects' gait, with accuracies of 60–89% for classifying distinct emotional states (Janssen, Schollhorn & Lubienetzki, 2008; Hicheur, Kadone, Grezes & Berthoz, 2013; Gunes, Shan & Chen, 2015; Clark, Pua & Pua, 2012).
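The automatic-recognition pipeline described above amounts to training a classifier on labeled gait features. As a minimal illustration (not the chapter's actual method), the sketch below trains a nearest-centroid classifier on a toy set of made-up gait features; the feature names, values, and labels are all hypothetical, chosen only to mirror the kind of emotion-specific cues the cited studies report (walking speed, arm swing, head pitch).

```python
# Hypothetical sketch: emotion classification from gait features with a
# nearest-centroid rule. All feature values and labels are invented.
from statistics import mean


def train_centroids(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    by_label = {}
    for vec, label in samples:
        by_label.setdefault(label, []).append(vec)
    # Centroid = per-dimension mean of each label's feature vectors.
    return {lab: [mean(dim) for dim in zip(*vecs)]
            for lab, vecs in by_label.items()}


def predict(centroids, vec):
    """Assign the label whose centroid is closest (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], vec))


# Toy gait features: (walking speed m/s, arm-swing amplitude, head pitch).
train = [
    ([1.5, 0.8, 0.1], "happy"),
    ([1.4, 0.7, 0.0], "happy"),
    ([0.9, 0.3, -0.4], "sad"),
    ([0.8, 0.2, -0.5], "sad"),
]
model = train_centroids(train)
print(predict(model, [1.45, 0.75, 0.05]))  # a brisk, upright gait
```

In practice the studies surveyed here use richer feature sets (joint trajectories from motion capture or point-light data) and stronger learners, but the structure is the same: labeled gait features in, a trained decision rule out.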

Key Terms in this Chapter

Automatic Emotion Recognition: The recognition of emotion by means of computer technologies such as signal processing, machine learning, and computer vision.

Affective Computing: The recognition, interpretation, processing, and simulation of emotion via technologies or devices.

Emotional Arousal: An emotional reaction induced by a certain emotional stimulus.

Emotion: A manner of nonverbal expression of people's views and attitudes.
