Automatic Facial Expression Recognition by Facial Parts Location with Boosted-LBP

Yi Ji (Universite de Lyon, France) and Khalid Idrissi (Universite de Lyon, France)
DOI: 10.4018/978-1-4666-3906-5.ch004


This paper proposes an automatic facial expression recognition system that uses new methods for both face detection and feature extraction. Because facial expressions involve only a small set of muscles and a limited range of motions, the system recognizes expressions from these changes in video sequences. First, the differences between neutral and emotional states are detected, so that faces can be located automatically from the changing facial organs. Then, LBP features are extracted, and AdaBoost selects the most discriminative features for each expression on the essential facial parts. Finally, an SVM with a polynomial kernel classifies the expressions. The method is evaluated on the JAFFE and MMI databases, where it outperforms other systems based on automatic or manual annotation.
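The pipeline described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the grid size, LBP variant, number of selected features, and polynomial degree are assumptions, and AdaBoost is used here only to rank features by importance before the SVM stage.

```python
# Hypothetical sketch of a Boosted-LBP expression pipeline:
# per-region LBP histograms -> AdaBoost feature ranking -> polynomial SVM.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

def lbp_histogram(patch, bins=256):
    """Basic 8-neighbour LBP histogram for one grayscale patch."""
    p = patch.astype(np.int32)
    c = p[1:-1, 1:-1]                      # interior pixels (centers)
    codes = np.zeros_like(c)
    shifts = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    for bit, (dy, dx) in enumerate(shifts):
        neigh = p[1+dy:p.shape[0]-1+dy, 1+dx:p.shape[1]-1+dx]
        codes |= (neigh >= c).astype(np.int32) << bit   # threshold vs. center
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)       # normalize per region

def extract_features(face, grid=(4, 4)):
    """Concatenate LBP histograms over a grid of facial sub-regions."""
    h, w = face.shape
    gh, gw = h // grid[0], w // grid[1]
    feats = [lbp_histogram(face[i*gh:(i+1)*gh, j*gw:(j+1)*gw])
             for i in range(grid[0]) for j in range(grid[1])]
    return np.concatenate(feats)

def train_boosted_lbp_svm(X, y, n_selected=200):
    """AdaBoost (decision stumps) ranks features; a polynomial SVM classifies."""
    booster = AdaBoostClassifier(n_estimators=50).fit(X, y)
    selected = np.argsort(booster.feature_importances_)[::-1][:n_selected]
    svm = SVC(kernel="poly", degree=2).fit(X[:, selected], y)
    return selected, svm
```

In practice the grid would be aligned to the essential facial parts (eyes, brows, mouth) located by the detection stage, rather than a uniform 4x4 partition.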
Chapter Preview


In everyday life, the interpretation of expressions on human faces plays an important role in interpersonal and non-verbal communication. For human beings, changes of internal emotional state influence facial muscle movements and generate facial expressions. Over the last two decades, much research has been devoted to the automatic analysis and recognition of these facial motions and the related emotions. Accurate interpretation of facial expressions and the underlying human emotions is a challenging issue in both psychology and computer vision. Ekman and Friesen (1978) proposed FACS (Facial Action Coding System) as a standard for systematically categorizing the facial expressions of emotions, and defined six basic universal emotions: Anger, Disgust, Fear, Happiness, Sadness, and Surprise. Even for humans, accurate interpretation of faces is difficult because of individual differences and cultural background, not to mention lighting, pose, or facial hair. Since then, considerable progress on automatic facial expression recognition (FER) systems has been made in computer vision and pattern recognition, by both academic institutions and commercial companies. Nevertheless, a full understanding that would allow computers to interact naturally with humans remains a challenging problem. Since the 1990s, Ekman et al. (2002) have added further positive and negative emotions, such as Amusement, Guilt, and Satisfaction, to an expanded list; these expanded emotions remain mostly unexplored.
