Recognition of Emotions in Gait Patterns Using Discrete Wavelet Transform

N. M. Khair, Hariharan Muthusamy, S. Yaacob and S. N. Basah (School of Mechatronic Engineering, Universiti Malaysia Perlis, Arau, Malaysia)
Copyright: © 2012 | Pages: 8
DOI: 10.4018/ijbce.2012010107

Abstract

Emotion is a natural, instinctive state of mind deriving from one's circumstances, mood, or relationships with others. It is characterized primarily by psycho-physiological expression, biological reactions, bodily interaction, and mental state. The emotional component is important for social interaction, serving communication, response, and the conveying of information. Problems in controlling and maintaining emotion can lead to emotional disorders: according to the National Institute of Mental Health (NIMH), approximately 10-15% of children have an emotional or behavioral disorder. In this paper, the discrete wavelet transform (DWT) is proposed for recognizing human emotions in gait patterns. Four discrete categories of emotion, fear, happy, normal, and sad, were analyzed, with data extracted from a single stride of gait. Daubechies wavelets of order 1 and order 4 were used to investigate their performance in recognizing emotional expression in gait patterns. Six statistical features, namely mean, maximum, minimum, standard deviation, skewness, and kurtosis, were derived from both the approximation and detail coefficients at every level of decomposition. The discrete emotions were classified using k-nearest neighbor (kNN) and fuzzy kNN (fkNN) classifiers. A maximum classification accuracy of 96.07% was obtained at the first level of decomposition using kNN.
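The feature-extraction step described above (single-level DWT of a one-stride signal, then six statistics per coefficient band) can be sketched as follows. This is a minimal illustration, not the authors' code: it hand-rolls one level of the Haar (Daubechies order-1) transform in NumPy rather than calling a wavelet library, and the function names, the synthetic stride signal, and the use of the excess-kurtosis convention are assumptions of the example.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar (Daubechies order-1) DWT.
    Returns approximation (low-pass) and detail (high-pass) coefficients."""
    x = np.asarray(signal, dtype=float)
    if len(x) % 2:                       # pad to even length
        x = np.append(x, x[-1])
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / np.sqrt(2)   # low-pass output
    detail = (even - odd) / np.sqrt(2)   # high-pass output
    return approx, detail

def six_features(coeffs):
    """Mean, max, min, std, skewness, excess kurtosis of a coefficient vector."""
    c = np.asarray(coeffs, dtype=float)
    mu, sigma = c.mean(), c.std()
    z = (c - mu) / sigma
    return np.array([mu, c.max(), c.min(), sigma,
                     (z ** 3).mean(),           # skewness
                     (z ** 4).mean() - 3.0])    # excess kurtosis

# Example: a synthetic one-stride joint-angle trace (64 samples)
stride = np.sin(np.linspace(0, 2 * np.pi, 64))
cA, cD = haar_dwt(stride)
feature_vector = np.concatenate([six_features(cA), six_features(cD)])
print(feature_vector.shape)   # (12,) -- 6 features x 2 coefficient bands
```

Stacking such 12-dimensional vectors over many strides yields a training matrix for the classifier; deeper decomposition levels would apply `haar_dwt` again to the approximation coefficients.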
Article Preview

Introduction

Human life involves social judgment, and human raters can make reliable judgments of emotion and personality (Andrea et al., 2004). Judging emotion and personality depends on a series of processes, including perception of the stimuli, relating observable behavior to prior knowledge, and inferring the underlying state or trait, a component that operates relatively automatically from bodily cues (Bodenhausen & Hugenberg, 2011; Adolphs, 2002). Hans Eysenck classified personality along three supertrait dimensions: extraversion–introversion, neuroticism–stability, and psychoticism. Problems with human thought and behavior arise from uncontrolled emotion, which can lead to personality disorders such as borderline, antisocial, and narcissistic personality disorder, among others (Zuckerman, 1991).

An article from the PACER Center explains that a child or teenager who experiences an emotional disorder will most likely have difficulties growing up. In addition, the DSM-IV-TR diagnostic criteria list several types of emotional disorder that affect children and youth. In their psychosocial aspects, emotional disorders are quite complex. Psychologists normally investigate a patient's emotional state through questionnaires and counseling, but these techniques are subjective. To develop objective methods for understanding the emotional state of an individual, several studies have investigated physiological signals, facial expressions, acoustic analysis of speech, and gesture and body motion (Hassan et al., 2010; Reddy et al., 2011; Russo et al., 2009; Yamada & Watanabe, 2007; Asha et al., 2005; Friberg, 2004; Kobayashi, 2007; Kleinsmith et al., 2011; Omlor et al., 2006; Karg et al., 2010; Roether et al., 2009; Janssen et al., 2008; Venture, 2010). Although several studies are available in the literature, limitations remain. Physiological signals are difficult to work with, since they are affected by a user's internal state; physiological and acoustic signals are also easily corrupted by environmental noise during experimental sessions. To overcome these limitations, researchers have turned to marker-based motion capture techniques. The data obtained are considered more accurate, since they can represent the orientations of the joints and the bone structure, as presented in previous works.

Developing machine learning models for recognizing human emotion is far more challenging; it is an active research field generally referred to as affective computing. Table 1 summarizes some of the significant research works on recognizing human emotion from various types of motion.

Table 1.
A summary of some research works on emotion recognition, detailing the number of subjects, the emotions involved, the features used, and the classifiers employed
| First Author (Year) | Target Group | Emotions | Motion | Features | Classifiers | Best Result |
|---|---|---|---|---|---|---|
| Karg, M. (2009) | 30 nonprofessional actors | Sad, happy, angry, neutral | Walking, knocking, lifting, throwing | Position, joint angle | Naïve Bayes, 1-nearest neighbor, SVM | 72% |
| Asha, K. (2005) | 3 nonprofessional and 2 professional dancers | Sadness, joy, anger, fear | Dancing | Position, velocity, acceleration | Logistic regression, Naïve Bayes, decision tree, ANN, SVM | 93% |
| Karg, M. (2010) | 13 nonprofessional actors | Neutral, sad, happy, angry | Walking | Joint angle, velocity, stride length, cadence | Nearest neighbor, Naïve Bayes, SVM | 95% |
| Bernhardt, D. (2007) | 30 individuals | Neutral, happy, angry, sad | Knocking, throwing, lifting, walking | Maximum distance, velocity, acceleration, jerk | SVM | 81% |
| Castellano, G. (2007) | 10 participants | Anger, joy, pleasure, sadness | Body movement and gesture | Expressivity: amplitude, speed and fluidity of movement | 1-nearest neighbor, decision tree, Bayesian network | 68% |
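Several of the works above, like the present paper, classify motion-derived feature vectors with nearest-neighbor methods. As a rough illustration of that classification stage, a minimal kNN majority vote over such feature vectors might look like the sketch below; the toy data, class labels, and function name are invented for the example, and the paper's fkNN variant additionally weights neighbor votes by fuzzy class membership rather than counting them equally.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Minimal k-nearest-neighbor majority vote using Euclidean distance."""
    d = np.linalg.norm(X_train - x, axis=1)      # distance to every training sample
    nearest = np.argsort(d)[:k]                  # indices of the k closest samples
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]             # most frequent label among them

# Toy 12-dimensional feature vectors for two emotion classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 12)),      # e.g. "normal" gait features
               rng.normal(3, 1, (20, 12))])     # e.g. "sad" gait features
y = np.array(["normal"] * 20 + ["sad"] * 20)
print(knn_predict(X, y, np.full(12, 3.0), k=5))  # prints "sad"
```

In practice the choice of k and the distance metric are tuned on held-out strides; odd k avoids ties in the two-class case.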
