Emotion Detection and Classification Using Machine Learning Techniques

Amita Umesh Dessai, Hassanali G. Virani
DOI: 10.4018/978-1-6684-5673-6.ch002

Abstract

This chapter analyzes 57 articles published from 2012 onwards on emotion classification using biosignals such as ECG and GSR. The study offers future researchers insight into the emotion models, emotion elicitation and self-assessment techniques, physiological signals, pre-processing methods, feature extraction, and machine learning techniques used by different researchers. Most investigators have used openly available databases, while some have created their own datasets. The studies considered participants from a healthy age group and of similar cultural backgrounds. Fusion of ECG and GSR parameters can help improve classification accuracy, and handcrafted features fused with automatically extracted deep learning features can increase it further. Overall, deep learning techniques and feature fusion techniques have improved classification accuracy.

Introduction

Emotions indicate the way people express their feelings and communicate with the external world. Paul Ekman’s theory recognizes seven basic discrete emotions experienced by human beings: fear, anger, disgust, sadness, happiness, surprise, and contempt. These emotions can be distinguished by their biological processes and characteristics and usually do not persist for long. James Russell’s circumplex model of emotion arranges emotions in a two-dimensional circular space (Emotion classification, 2022): valence indicates the pleasantness of an emotion, and arousal indicates its intensity. Happiness, for example, has high valence and high arousal. Physiological signals such as the electroencephalogram (EEG), electromyogram (EMG), respiration rate (RT), galvanic skin response (GSR) or electrodermal activity (EDA), skin temperature (ST), and heart-related measures such as the electrocardiogram (ECG) and blood volume pulse (BVP) enable the detection of emotions.

Emotion detection has many applications. In a virtual classroom, suitable teaching plans can be formulated based on students’ emotions. Passengers can be alerted if the driver is angry or under stress. Companies can assess the emotions of their customers and then decide on a strategy for marketing their products. In healthcare, robots can judge patients’ emotional states and adapt their actions accordingly, while human-computer interaction systems can monitor the discomfort experienced by patients who are unable to express their emotions verbally. Emotions are detected in geriatric patients to provide them with assistance, and the health of the elderly can be monitored remotely based on the emotions they experience. Emotions can also be detected using smart devices for real-time applications such as healthcare, online classroom environments, and product advertising.

In this survey, the inclusion criteria are ECG, BVP, and GSR signals for emotion detection, and the exclusion criteria are facial and voice signals, EEG, EMG, ST, and respiration rate. Research papers built on openly available databases of ECG and GSR signals began to appear in 2012, so this survey focuses on papers published from 2012 to 2020. A total of 47 research papers are from the years 2016 to 2020: ten papers published in 2016, nine papers each in 2017 and 2018, twelve papers in 2019, and seven papers in 2020. Nine papers are from Elsevier, eight from IEEE Transactions, and eight from MDPI Sensors.

Factors such as the number and age of participants, gender proportion, cultural and social background, and the intellectual, physical, and mental well-being of the subjects were considered. Emotions experienced by the participants are mapped onto the valence-arousal scale. Pictures, audio, audiovisual clips, virtual reality, and real-time environments are used for emotion elicitation, while Self-Assessment Manikins (SAM), questionnaires, participants’ ratings, participants’ feedback, and Android-based applications are used to assess the emotions experienced by the participants. The fusion of ECG and GSR parameters leads to an enhancement in classification accuracy.
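To illustrate how discrete emotion labels relate to Russell’s circumplex model, the minimal Python sketch below maps self-assessment ratings of valence and arousal onto the four quadrants of the two-dimensional space. The midpoint of 5 (assuming a 1-9 SAM-style rating scale) and the example emotion labels are illustrative assumptions, not values prescribed by the surveyed studies.

# Minimal sketch: mapping valence-arousal ratings (e.g., a 1-9 SAM scale)
# onto the four quadrants of Russell's circumplex model.
# The midpoint of 5 and the example labels are illustrative assumptions.

def circumplex_quadrant(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Return the circumplex quadrant for a (valence, arousal) rating pair."""
    if valence >= midpoint and arousal >= midpoint:
        return "high valence / high arousal (e.g., happiness, excitement)"
    if valence < midpoint and arousal >= midpoint:
        return "low valence / high arousal (e.g., anger, fear)"
    if valence < midpoint and arousal < midpoint:
        return "low valence / low arousal (e.g., sadness, boredom)"
    return "high valence / low arousal (e.g., calmness, contentment)"

# Example: a participant rates a stimulus as pleasant and intense.
print(circumplex_quadrant(valence=7, arousal=8))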

The chapter is structured as follows. The Background section reviews the emotion models, emotion elicitation methods, and methods of self-assessment by the participants. The next section, “Emotional Intelligence,” reviews the publicly available databases for research, the devices used for acquiring these signals, and the participants’ details. The “Emotion Detection” and “ECG, GSR Signal Preprocessing” sections describe the pre-processing methods, feature extraction, and feature selection techniques; the feature selection techniques assist in selecting the optimum features needed for classification. The next section, “Classifications and Fusion,” explains the classification techniques and the fusion techniques for classifying the emotions; a minimal illustrative sketch is given after Figure 1. The final sections are Future Directions and Conclusion. Figure 1 presents an overview of the various attributes for emotion recognition and classification using the ECG and GSR signals.

Figure 1. Overview of emotion classification
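To make the classification and fusion steps concrete, the sketch below shows feature-level fusion of ECG and GSR features followed by an SVM classifier, assuming scikit-learn and NumPy are available. The feature matrices, labels, and feature names are random placeholders for illustration; actual studies compute hand-crafted HRV and skin-conductance features, or deep-learning features, from the recorded signals.

# Minimal sketch of feature-level fusion of ECG and GSR features
# followed by SVM classification (assumes scikit-learn and NumPy).
# Feature values and labels below are random placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 200

# Hypothetical hand-crafted features per signal window:
# ECG: e.g., mean heart rate, SDNN, RMSSD; GSR: e.g., mean SCL, SCR count.
ecg_features = rng.normal(size=(n_samples, 3))
gsr_features = rng.normal(size=(n_samples, 2))
labels = rng.integers(0, 2, size=n_samples)  # e.g., low vs. high arousal

# Feature-level fusion: concatenate the ECG and GSR feature vectors.
fused = np.hstack([ecg_features, gsr_features])

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))

Decision-level fusion is a common alternative: separate classifiers are trained on the ECG and GSR features, and their outputs are combined (for example by voting or averaging probabilities) instead of concatenating the feature vectors.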

Key Terms in this Chapter

Signal Processing: Removing unwanted signals from the data.

Electrodermography: Measurement of electrical conductance of the skin.

Feature Extraction: Extraction of the relevant features from the data.

Electrocardiography: Measurement of electrical activity of the heart.

Deep Learning: A machine learning technique that uses multi-layer neural networks to learn feature representations automatically from the data.

Fusion of Signals: Combining information from multiple physiological signals, at the feature or decision level, for classification.

Emotion Elicitation: Provoking emotions in the participants.

Russell’s Model: Two-dimensional (valence-arousal) model for emotion recognition.

AMIGOS: Publicly available database for emotion classification.

Emotion Self-Assessment: Participants’ own reporting of the emotions they experienced.
