Human-Friendly Robots for Entertainment and Education

Jorge Solis, Atsuo Takanishi
Copyright: © 2012 | Pages: 24
DOI: 10.4018/978-1-4666-0291-5.ch008

Abstract

Even though the market size is still small at this moment, applications of robots are gradually spreading beyond the industrial manufacturing environment to face other important challenges, such as supporting an aging society and educating new generations. The development of human-friendly robots drives research aimed at autonomous or semi-autonomous robots that are natural and intuitive for the average consumer to interact with, communicate with, and work with as partners, and that can learn new capabilities. This chapter presents an overview of research on mechanism design and the implementation of intelligent control strategies on different platforms, and on their application to the entertainment and education domains. In particular, the development of an anthropomorphic saxophonist robot (designed to mechanically reproduce the organs involved in saxophone playing) and of a two-wheeled inverted pendulum (designed to introduce the principles of mechanics, electronics, control, and programming at different education levels) will be presented.
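
As a rough, purely illustrative sketch (not taken from the chapter) of the kind of control and programming exercise such a two-wheeled inverted pendulum platform can introduce, the Python fragment below balances the body with a simple PD feedback loop on the measured tilt angle; the sensor and motor interfaces, loop rate, and gain values are hypothetical placeholders.

    import time

    KP = 25.0   # proportional gain on tilt angle (assumed value)
    KD = 1.5    # derivative gain on tilt rate (assumed value)
    DT = 0.01   # control period in seconds (100 Hz loop, assumed)

    def read_tilt_angle():
        """Placeholder for an IMU reading of the body tilt, in radians."""
        return 0.0

    def set_wheel_torque(torque):
        """Placeholder for the motor command driving both wheels."""
        pass

    previous_angle = read_tilt_angle()
    while True:
        angle = read_tilt_angle()
        angle_rate = (angle - previous_angle) / DT
        # Command a torque that drives the wheels under the falling body,
        # opposing the tilt and its rate of change.
        set_wheel_torque(KP * angle + KD * angle_rate)
        previous_angle = angle
        time.sleep(DT)

In practice the gains would be tuned for the platform's mass and wheel geometry, which is exactly the kind of hands-on exercise in mechanics, control, and programming the chapter describes.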
Chapter Preview

Introduction

The development of anthropomorphic robots is inspired by the ancient dream of humans replicating themselves. However, human behaviors are difficult to explain and model. Recent advances in robot technology, artificial intelligence, computational power, etc. have enabled humanoid robots to roughly emulate the physical dynamics and motor dexterity of humans. Nowadays, humanoid robots are capable of displaying motor dexterity for dancing, playing musical instruments, talking, etc. Although the long-term goal of truly autonomous humanoid robots has yet to be accomplished, the feasibility of integrating them into people’s daily lives is becoming closer.

Towards developing humanoid robots capable of interacting more naturally with human partners, robots are required to process and display human-like emotions. The way a person interacts with a humanoid robot is quite different from interacting with the majority of industrial robots today. Modern robots are generally viewed as tools that human specialists use to perform hazardous tasks in remote environments. In contrast, human-like personal robots are often designed to engage people in order to achieve social or emotional goals. The development of socially intelligent and socially skillful robots drives research to develop autonomous or semi-autonomous robots that are natural and intuitive for the average consumer to interact with, communicate with, work with as partners, and teach new capabilities. In addition, this domain motivates new questions for robotics researchers, such as how to design for a successful long-term relationship in which the robot remains appealing and provides consistent benefit to people over weeks, months, and even years. The benefit that social robots provide extends far beyond strict task-performing utility to include educational, health and therapeutic, domestic, and social and emotional goals (e.g., entertainment, companionship, communication, etc.), and more.

However, these mechanical devices are still far from understanding and processing emotional states as humans do. Research on musical performance robots seems like a particularly promising path toward overcoming this limitation, because music is a universal communication medium, at least within a given cultural context. Furthermore, research into robotic musical performance can shed light on aspects of expression that have traditionally been hidden behind the rubric of “musical intuition.” The late Prof. Ichiro Kato argued that an artistic activity such as playing a keyboard instrument would require human-like intelligence and dexterity (Kato, et al., 1973). In 1984, at Waseda University, the WABOT-2 was the first attempt at developing an anthropomorphic music robot capable of playing a concert organ (Sugano & Kato, 1987). Then, in 1985, the WASUBOT, also built at Waseda, could read a musical score and play a repertoire of 16 tunes on a keyboard instrument. More recently, thanks to technological advances in computational power, Musical Information Retrieval (MIR), and robot technology, several researchers have been focusing on developing anthropomorphic robots and interactive automated instruments capable of interacting with musical partners. As a result, different kinds of automated machines and humanoid robots have been developed for playing wind instruments (Doyon & Liaigre, 1966; Klaedefabrik, 2005; Solis, et al., 2008; Takashima & Miyawaki, 2006; Solis, et al., 2009a; Dannenberg, 2005; Toyota Motor Corporation, 2011; Degallier, 2006; etc.). Other researchers have been focusing on analyzing wind-instrument playing from a musical engineering approach, by performing experiments with simplified mechanisms (Ando, 1970; Guillemain, et al., 2010; etc.), and from a physiological point of view, by analyzing medical imaging data of professional players (Mukai, 1992; Fletcher, 2001; etc.). In this research, we particularly deal with the development of an anthropomorphic saxophone-playing robot designed to mechanically emulate the organs involved in saxophone playing. Due to the interdisciplinary nature of this research, our collaboration with musicians, musical engineers, and medical doctors will certainly contribute to better reproducing and understanding human motor control from an engineering point of view.
