Automatic Emotion Recognition Based on Daily Gait

Xiaoqian Liu, Tingshao Zhu
Copyright: © 2019 | Pages: 14
DOI: 10.4018/978-1-5225-7128-5.ch004

Abstract

This chapter introduces in detail an emotion recognition method based on gait, using a customized smart bracelet with a built-in acceleration sensor. The results showed that the classification accuracies for angry-neutral, happy-neutral, angry-happy, and angry-happy-neutral using wrist acceleration data were 91.3%, 88.5%, 88.5%, and 81.2%, respectively. In addition, the application of wearable devices and motion-sensing technology in psychological research is discussed further, and non-contact emotion identification and mental health monitoring based on offline behaviors are briefly reviewed.
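As a purely illustrative companion to the abstract, the Python sketch below shows one way tri-axial wrist acceleration data might be segmented into windows and summarized as simple statistical features before classification. The sampling rate, window length, and feature set are assumptions made for illustration and are not the chapter's actual preprocessing.

```python
# Minimal sketch, assuming tri-axial wrist acceleration as an (N, 3) NumPy array.
# Window length, step, and features are illustrative assumptions only.
import numpy as np

def window_features(acc: np.ndarray, window: int = 128, step: int = 64) -> np.ndarray:
    """Slide a fixed-length window over the signal and compute simple per-axis statistics."""
    feats = []
    for start in range(0, len(acc) - window + 1, step):
        seg = acc[start:start + window]
        feats.append(np.concatenate([
            seg.mean(axis=0),                            # mean acceleration per axis
            seg.std(axis=0),                             # variability per axis
            np.abs(np.diff(seg, axis=0)).mean(axis=0),   # mean absolute change (jerk proxy) per axis
        ]))
    return np.asarray(feats)

# Example with synthetic data: ~10 seconds at an assumed 50 Hz sampling rate.
acc = np.random.randn(500, 3)
print(window_features(acc).shape)   # (number of windows, 9 features)
```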
Chapter Preview

Introduction

Brief Review of the Literature About the Relevance Between Emotion and Offline Behaviors

Emotion expression plays an important role in human communication, especially in Chinese, where the same words spoken with different emotions can convey different or even opposite meanings. Emotion recognition is therefore a necessary and valuable line of research for human-computer interaction applications. For example, if a service robot could automatically and accurately perceive and respond to a user's emotion (such as anger, happiness, or sadness), it could provide correspondingly higher-quality services.

It was William James, the father of American psychology, who gave the earliest definition of emotion. In an article published in 1884, he stated that emotion is the feeling of bodily changes, which leads to emotional perception; any emotion is associated with certain physical changes, including facial expression, muscle tension, and visceral activity (James, 1884). Lazarus considered emotion to be a combination of physiological disturbance, affect, and action tendencies that need not be overtly displayed (Smith, 1990).

As early as the 19th century, Darwin suggested that different emotions correspond to particular gestures; for example, when a person is angry, the body trembles and the chest swells. Gunes et al. (2015) pointed out that emotion analysis based on bodily expression is of great significance: first, it provides a means of identifying emotion at a distance, and second, it provides a means of identifying emotions that are otherwise ambiguous. Wallbott (1998) found that postures are more closely related to the degree of emotional arousal than to specific emotions; that is, a particular emotion does not necessarily correspond to a particular gesture but is associated with the level of emotional arousal. Research by De Meijer showed that different types of emotion can be recognized from the number, type, and intensity of gestures (Meijer, 1989). Crane and Gross (2007) indicated that emotions are associated with physical movements; they identified velocity, cadence, head orientation, and the range of motion of the shoulder and elbow as significant physical parameters affected by emotion. Several emotion recognition approaches have been proposed and have made progress to varying degrees. According to the characteristics they rely on, these approaches can be classified as based on facial expressions, linguistics, physiological parameters, gestures, or body motions (Peter & Beale, 2008).

Walking is one of the most common behaviors in daily life, and many psychological studies support the view that emotions can be expressed in gait and recognized by human observers (Montepare et al., 1999; Janssen et al., 2008; Karg et al., 2010). Montepare et al. (1987) pointed out that different characteristics of walking style convey different emotions; moreover, sadness and anger are easier for human observers to identify than neutral or happy emotions. In addition, Michalak et al. (2009) showed that sadness and depression can be reflected in walking.

In recent years, machine learning approaches have been applied to assistive therapeutic equipment for patients with gait disorders, gait-based identity recognition, and human motion identification (Kale et al., 2004; Man et al., 2006). Prior to model training, dimension reduction is often applied; principal component analysis (PCA), kernel principal component analysis (KPCA), and linear discriminant analysis (LDA) are all effective dimension reduction methods. Karg et al. (2009) employed PCA and the Fourier transform for data reduction and classified emotions into four categories using Naive Bayes, 1-nearest neighbor, and SVM classifiers, respectively; the best classification accuracy was 72%, achieved with Naive Bayes. Janssen et al. (2008) developed a method for recognizing emotion from walking based on artificial neural networks.
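To make the kind of pipeline described above concrete, the following sketch applies PCA for dimension reduction and then compares Naive Bayes, 1-nearest-neighbor, and SVM classifiers, as named in the studies cited. This is not the authors' implementation: the feature matrix `X`, the emotion labels `y`, and the number of retained components are hypothetical placeholders.

```python
# Illustrative sketch only: PCA-based dimension reduction followed by the three
# classifiers mentioned above. The gait feature matrix X and emotion labels y
# are random placeholders, not the chapter's data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 60))       # 120 gait samples x 60 features (placeholder)
y = rng.integers(0, 3, size=120)     # 0 = angry, 1 = happy, 2 = neutral (placeholder)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "1-NN": KNeighborsClassifier(n_neighbors=1),
    "SVM": SVC(kernel="rbf"),
}

for name, clf in classifiers.items():
    # Standardize, reduce dimensionality with PCA, then classify.
    pipe = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean cross-validation accuracy = {scores.mean():.3f}")
```

With real gait features and labels in place of the random placeholders, the same loop would give a side-by-side comparison of the three classifiers under a shared PCA preprocessing step.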
