A Human Affect Recognition System for Socially Interactive Robots

Derek McColl (University of Toronto, Canada) and Goldie Nejat (University of Toronto, Canada)
DOI: 10.4018/978-1-4666-2211-1.ch029

Abstract

This chapter presents a real-time, robust affect classification methodology for socially interactive robots engaging in one-on-one human-robot interactions (HRI). The methodology is based on identifying a person’s body language in order to determine how accessible he/she is to a robot during the interactions. Static human body poses are determined by first identifying individual body parts and then utilizing an indirect 3D human body model that is invariant to different body shapes and sizes. The authors implemented and tested their technique using two different sensory systems in social HRI scenarios to demonstrate its robustness for the proposed application. In particular, the experiments consisted of integrating the proposed body language recognition and affect classification methodology with imaging-based sensory systems onto the human-like socially interactive robot Brian 2.0 in order for the robot to recognize affective body language during one-on-one interactions.

Introduction

Socially interactive robots are currently being designed to engage in convincing and natural social interactions with people for a wide variety of everyday applications. Emerging applications for these robots include assistants in health/elderly care (Chan et al., 2011; Tapus et al., 2009); helpers in the home/workplace (Hashimoto & Kobayashi, 2009); tour guides and greeters in museums, hospitals and shopping malls (Haasch et al., 2004); and aids in security and defense (Belkhouche et al., 2006). In order for socially interactive robots to be effectively integrated and accepted within society, they must be able to communicate, function and interact with people. Namely, effective human-robot interaction (HRI) is highly dependent on a robot’s ability to recognize various social spaces and social cues. Thus, an important design issue that needs to be addressed for social robots operating in person-centered environments is their ability to recognize and identify a person in an environment, and judge that person’s intent and behavior in order to respond appropriately during interactions. By detecting a person’s mannerisms and actions during HRI, a social robot can aim to obtain the person’s acceptance in order to create a long-term relationship between a user and itself. Observing this relationship can provide insight into how humans adapt to and interact with social robots.

While both verbal and non-verbal (i.e., facial expressions, body gestures and tone of voice) communication between humans can be used to convey affect, it has been found that non-verbal communication is more meaningful than verbal content, particularly in Western cultures, in demonstrating affective qualities during one-on-one interactions (Mehrabian & Ferris, 1968; Argyle et al., 1971; Haase & Tepper, 1972; Tepper & Haase, 1978; Davis & Hadiks, 1994). To date, a great deal of work has been conducted in the development of automated affect recognition techniques utilized in determining human affect through paralanguage (pitch and volume of voice) (Sundberg et al., 2011; Hyun et al., 2007) and facial expressions (Mingli et al., 2010; Tian et al., 2005). Little attention has been placed on the development of automated affect recognition systems that utilize body language, mainly due to the complexity and high number of degrees of freedom of the human body. However, body language has been found to play a vital role in conveying human intent, moods, attitudes and affect (Gong et al., 2007). Thus, it is important that during social HRI, a robot have the ability to recognize human body language in order to better engage a person in an interaction through its own appropriate display of behaviors. In our work, we focus on developing a body language affect recognition technique for social robots in order to promote natural one-on-one interactions.

Key Terms in this Chapter

Real-Time: A system that responds at a rate fast enough that its processing delay is virtually imperceptible to a human.

Automated: Controlled by computers in such a manner that human input is not required.

Accessibility: The degree of psychological openness and rapport, and emotional involvement one person feels towards another during a one-on-one interaction.

Body Language: The expression of information and emotions through human body dynamics or static poses.

Socially Interactive Robots: Robots that are capable of communicating with humans through social communication modes such as speech, facial expressions and/or body language.

Affect: The emotion, mood, and/or attitude a person is feeling.

Static Pose: An arrangement of a person’s body that is held for four or more seconds.

Human-Robot Interaction: Communication between a human and a robot.

Delaunay Triangulation: A triangulation of a point set in which the circumcircle of each triangle (the circle passing through the triangle’s three vertices) contains no other point of the set.
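The empty-circumcircle property in this definition can be illustrated with a short sketch. This is not the chapter’s implementation (practical systems use efficient algorithms or library routines); it is a brute-force O(n⁴) illustration that simply keeps every triangle whose circumcircle contains no other point:

```python
from itertools import combinations

def circumcircle(a, b, c):
    """Return (center, radius_squared) of the circle through a, b, c,
    or None if the three points are collinear."""
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None  # collinear: no circumcircle
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return (ux, uy), (ax - ux) ** 2 + (ay - uy) ** 2

def delaunay_triangles(points):
    """Brute-force Delaunay triangulation of 2D points (tuples):
    a triangle is kept iff its circumcircle is empty of other points."""
    tris = []
    for a, b, c in combinations(points, 3):
        cc = circumcircle(a, b, c)
        if cc is None:
            continue
        (ux, uy), r2 = cc
        if all((px - ux) ** 2 + (py - uy) ** 2 >= r2 - 1e-9
               for (px, py) in points if (px, py) not in (a, b, c)):
            tris.append((a, b, c))
    return tris
```

For example, for three outer points with one point inside them, the single large outer triangle is rejected (the interior point lies inside its circumcircle) and the three smaller triangles around the interior point are kept.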
