Improving Emotion Analysis for Speech-Induced EEGs Through EEMD-HHT-Based Feature Extraction and Electrode Selection

Jing Chen, Haifeng Li, Lin Ma, Hongjian Bo
DOI: 10.4018/IJMDEM.2021040101

Abstract

Emotion detection using EEG signals has the advantage of eliminating social masking, giving a better view of underlying emotions. This paper studies the cognitive response to emotional speech and emotion recognition from EEG signals. A framework is proposed to recognize mental states from EEG signals induced by emotional speech. First, a speech-evoked emotion cognition experiment is designed and an EEG dataset is collected. Second, power-related features are extracted using EEMD-HHT, which reflects the instantaneous frequency of the signal more accurately than the STFT or WT. An extensive analysis of the relationships between frequency bands and the emotional annotations of the stimuli is presented using the MIC and statistical analysis. The strongest correlations with the EEG signals are found in the lateral and medial orbitofrontal cortex (OFC). Finally, the performance of different feature-set and classifier combinations is evaluated, and the experiments show that the proposed framework can effectively recognize emotion from EEG signals, with accuracies of 75.7% for valence and 71.4% for arousal.

Introduction

Emotion plays an important role in human mental life. It is a conscious mental experience reflecting the personal significance of internal and external events. Human speech conveys not only linguistic messages but also emotional information, and it is one of the principal channels for expressing emotion in human social communication. The ability to identify vocal expressions of emotion or attitude in speech is one of the basic cognitive functions of human beings. Although emotional speech is as ubiquitous as facial expressions, far less is known about the brain mechanisms of emotion perception in speech than in facial expressions.

Emotion detection using physiological responses has the advantage of eliminating social masking, giving a better view of underlying emotions. Electroencephalography (EEG) signals are the summation of the activity of billions of neurons in the cerebral cortex and thus directly reflect brain activity. EEG is noninvasive and provides the high temporal resolution needed to capture emotional changes in the brain. Furthermore, owing to advances in wearability, price, portability, and ease of use, EEG-based emotion recognition has received extensive attention in fields such as affective brain-computer interfaces (BCI), neuroscience, health care, emotional companionship, and e-learning.

Multi-channel EEG provides population-level measures of neural activity that allow us to uncover the complex cognitive processes of emotional information integration and processing. The fundamental goal of decoding mental states from EEG recordings is to identify, with machine learning classifiers, the emotions in the EEG data that correspond to the experimental task or stimulus. Research on emotion recognition from EEG is still progressing, and few studies address emotion detection from speech-evoked EEG signals. This paper focuses on EEG-based mental state recognition induced by emotional speech.

Our EEG-based emotion recognition framework mainly includes two modules: feature extraction and classification. Various features have been extracted from different domains (time, frequency, and time-frequency) for EEG analysis; this research limits the scope of discussion to time-frequency analysis and recognition. Power-related features from different EEG frequency bands are often used during classification, and the most commonly used analytical techniques are the short-time Fourier transform (STFT) and the wavelet transform (WT), as in the sketch below.
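As a point of comparison for the EEMD-HHT features introduced next, the following minimal sketch computes Welch (STFT-style) band powers with scipy. The sampling rate, band edges, and array layout are illustrative assumptions, not the settings used in this paper.

import numpy as np
from scipy.signal import welch

FS = 500  # assumed sampling rate in Hz (not stated in this preview)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}  # conventional EEG bands

def stft_band_powers(eeg, fs=FS):
    # eeg: (n_channels, n_samples) array -> (n_channels, n_bands) band powers
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the power spectral density over the band.
        feats.append(np.trapz(psd[:, mask], freqs[mask], axis=-1))
    return np.column_stack(feats)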

In this work, Ensemble Empirical Mode Decomposition and the Hilbert-Huang Transform (EEMD-HHT) are first applied to analyze the time-frequency distribution of the EEG signals, and the corresponding features are extracted for classification. Compared with the traditional Pwelch frequency feature extraction based on the STFT, the HHT can process non-stationary signals and obtain practical instantaneous frequencies, reflecting the actual frequency content of the signal more accurately. EEMD is an adaptive decomposition method that avoids the need, inherent in the WT, to select a mother wavelet and set the number of decomposition levels; a sketch of this step follows. Then, the maximal information coefficient (MIC) is applied to measure the relationship between frequency-band power and the level of arousal or valence. Next, the authors reduce the feature dimensions by using a statistical test to determine whether the energy at each electrode varies significantly under different conditions. Finally, three classifiers are used for classification.
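Below is a minimal sketch of the EEMD-HHT step for a single EEG channel, using the third-party PyEMD package (pip install EMD-signal) and scipy's Hilbert transform; the sampling rate, band edges, and marginal-power binning are illustrative assumptions rather than the authors' exact settings.

import numpy as np
from PyEMD import EEMD          # from the EMD-signal package
from scipy.signal import hilbert

def eemd_hht_band_powers(x, fs=500, bands=None):
    # x: 1-D EEG channel -> dict of Hilbert marginal band powers
    bands = bands or {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    imfs = EEMD()(x)                              # adaptive decomposition into IMFs
    analytic = hilbert(imfs, axis=-1)             # analytic signal of each IMF
    amp = np.abs(analytic)[:, 1:]                 # instantaneous amplitude, aligned with diff below
    phase = np.unwrap(np.angle(analytic), axis=-1)
    inst_freq = np.diff(phase, axis=-1) * fs / (2.0 * np.pi)  # instantaneous frequency in Hz
    # Hilbert marginal power: sum the squared amplitude wherever any
    # IMF's instantaneous frequency falls inside the band.
    return {name: float(np.sum((amp ** 2) * ((inst_freq >= lo) & (inst_freq < hi))))
            for name, (lo, hi) in bands.items()}

For the subsequent band-power-versus-rating analysis, the MIC can be estimated with, for example, the minepy package (MINE().compute_score(x, y) followed by mic()); this, too, is an implementation choice rather than the authors' stated toolchain.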

The remainder of this paper is organized as follows. First, the authors briefly introduce relevant research on emotion recognition based on EEG signals. Second, the authors describe the speech-evoked emotion cognition experiment protocol and the EEG dataset collection, followed by the feature extraction method based on EEMD-HHT. Third, the relationship between band power and the level of emotion ratings is presented. Fourth, the classification results are provided and discussed in the Results and Discussion section. Finally, the authors conclude with the findings and discuss future directions.
