Facial and Body Feature Extraction for Emotionally-Rich HCI

Kostas Karpouzis (National Technical University of Athens, Greece), Athanasios Drosopoulos (National Technical University of Athens, Greece), Spiros Ioannou (National Technical University of Athens, Greece), Amaryllis Raouzaiou (National Technical University of Athens, Greece), Nicolas Tsapatsoulis (National Technical University of Athens, Greece) and Stefanos Kollias (National Technical University of Athens, Greece)
DOI: 10.4018/978-1-59904-941-0.ch126

Abstract

Emotionally-aware Man-Machine Interaction (MMI) systems are currently at the forefront of interest in the computer vision and artificial intelligence communities, since they enable less technology-aware people to use computers more efficiently, overcoming fears and preconceptions. Most emotion-related facial and body gestures are considered universal, in the sense that they are recognized across different cultures; therefore, introducing an “emotional dictionary” that includes descriptions and perceived meanings of facial expressions and body gestures, so as to help infer the likely emotional state of a specific user, can enhance the affective nature of MMI applications (Picard, 2000).
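The “emotional dictionary” described above can be pictured as a simple lookup from observed facial and body cues to a likely emotional state. The sketch below is purely illustrative: the cue labels, the emotion categories, and the `infer_emotion` function are hypothetical examples of the idea, not the chapter's actual feature set or method.

```python
# Hypothetical sketch of an "emotional dictionary": a mapping from
# (facial expression, body gesture) pairs to a likely emotional state.
# All labels below are invented for illustration.

EMOTIONAL_DICTIONARY = {
    ("smile", "open_palms"): "joy",
    ("frown", "crossed_arms"): "anger",
    ("raised_brows", "hands_on_cheeks"): "surprise",
}

def infer_emotion(facial_cue: str, gesture: str) -> str:
    """Return the likely emotional state for a (facial cue, gesture)
    pair, falling back to 'neutral' for unknown combinations."""
    return EMOTIONAL_DICTIONARY.get((facial_cue, gesture), "neutral")

print(infer_emotion("smile", "open_palms"))    # joy
print(infer_emotion("smile", "crossed_arms"))  # neutral
```

In a real system, the keys would come from automated facial and body feature extraction rather than hand-labeled strings, and the mapping would typically be probabilistic rather than a hard lookup.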
