Automatic Facial Expression Recognition by Facial Parts Location with Boosted-LBP

Yi Ji, Khalid Idrissi
Copyright: © 2011 | Pages: 14
DOI: 10.4018/ijcvip.2011010104

Abstract

This paper proposes an automatic facial expression recognition system that uses new methods for both face detection and feature extraction. Considering that facial expressions are produced by a small set of muscles with limited ranges of motion, the system recognizes expressions from these changes in video sequences. First, the differences between neutral and emotional states are detected, and faces are located automatically from the changing facial organs. Then, LBP features are extracted and AdaBoost is used to select the most discriminative features for each expression on the essential facial parts. Finally, an SVM with a polynomial kernel classifies the expressions. The method is evaluated on the JAFFE and MMI databases, and its performance exceeds that of other automatic or manually annotated systems.
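
To make the pipeline concrete, the following minimal Python sketch shows one plausible realization of the Boosted-LBP idea: block-wise uniform LBP histograms, AdaBoost-based selection of discriminative histogram bins, and a polynomial-kernel SVM. It relies on scikit-image and scikit-learn; the 7x6 grid, LBP radius, and number of selected features are illustrative assumptions, not the authors' exact configuration, and the uniform grid stands in for the paper's automatically located facial parts.

    # Sketch of a Boosted-LBP + SVM expression classifier (illustrative only).
    # Assumes grayscale face crops of equal size; parameters are placeholders.
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.svm import SVC

    def lbp_histogram_features(face, grid=(7, 6), n_points=8, radius=1):
        """Concatenate uniform-LBP histograms computed over a grid of face blocks."""
        lbp = local_binary_pattern(face, n_points, radius, method="uniform")
        n_bins = n_points + 2  # uniform patterns plus one "non-uniform" bin
        h, w = face.shape
        feats = []
        for i in range(grid[0]):
            for j in range(grid[1]):
                block = lbp[i * h // grid[0]:(i + 1) * h // grid[0],
                            j * w // grid[1]:(j + 1) * w // grid[1]]
                hist, _ = np.histogram(block, bins=n_bins, range=(0, n_bins), density=True)
                feats.append(hist)
        return np.concatenate(feats)

    def train_boosted_lbp_svm(faces, labels, n_selected=200):
        """Select discriminative LBP bins with AdaBoost, then train a polynomial SVM."""
        X = np.array([lbp_histogram_features(f) for f in faces])
        booster = AdaBoostClassifier(n_estimators=n_selected).fit(X, labels)
        selected = np.argsort(booster.feature_importances_)[::-1][:n_selected]
        svm = SVC(kernel="poly", degree=2).fit(X[:, selected], labels)
        return selected, svm

    def predict_expression(face, selected, svm):
        """Classify a single face crop using the selected LBP bins."""
        x = lbp_histogram_features(face)[selected]
        return svm.predict(x.reshape(1, -1))[0]

In this sketch, AdaBoost serves only as a feature selector (via its per-feature importances); the final decision is made by the polynomial-kernel SVM, mirroring the two-stage description in the abstract.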

Introduction

In everyday life, interpreting expressions on human faces plays an important role in interpersonal and non-verbal communication. For human beings, changes in internal emotional states influence facial muscle movements and generate facial expressions; in psychology, facial expressions therefore occupy an important position in interpersonal and non-verbal communication. Over the last two decades, numerous researchers have worked on the automatic analysis and recognition of these facial motions and the related emotions. Accurately interpreting facial expressions and the associated human emotions is a challenging problem in both psychology and computer vision. Ekman and Friesen (1978) proposed FACS (Facial Action Coding System) as a standard for systematically categorizing facial expressions of emotion, and defined six basic universal emotions: Anger, Disgust, Fear, Happiness, Sadness, and Surprise. Even for humans, accurate interpretation of faces is difficult because of individual differences and cultural background, even without considering lighting, gesture, or facial hair. Since then, much progress on automatic facial expression recognition (FER) systems has been made in computer vision and pattern recognition by academic institutions and commercial companies. Full understanding, which would enable computers to interact with humans, is still a challenging problem. Since the 1990s, Ekman et al. (2002) have added more positive and negative emotions, such as Amusement, Guilt, and Satisfaction, to an expanded list; these expanded emotions remain mostly unexplored.
