Camera-Based Motion Tracking and Performing Arts for Persons with Motor Disabilities and Autism

Alexandros Kontogeorgakopoulos, Robert Wechsler, Wendy Keay-Bright
DOI: 10.4018/978-1-5225-0034-6.ch021

Abstract

The aim of this chapter is to discuss a range of computer applications designed to enable people with disabilities to interact through music, dance, and the visual arts. A review of the main motion-tracking algorithms and software environments is included, as well as an overview of theoretical positions on mapping motion features, extracted in real time, to sound, interactive music, and computer-generated or modified visual content. The chapter concludes with descriptions of how these concepts have been applied in research projects undertaken with different groups of young people with motor limitations and autism spectrum disorders.

Introduction

The World Health Organization estimates that some form of disability affects one out of every seven people. Of these, as many as 110-190 million adults experience significant difficulties in functioning, and many others are indirectly affected through their responsibilities as carers (SCRPD, 2012). People with disabilities disproportionately face social isolation and reduced physical activity compared with their non-disabled counterparts (SGUN, 2010). These factors are major contributors to secondary health problems such as those associated with obesity and depression.

This chapter focuses on two distinct groups of people: those with autism and those with motor disabilities. Although each is described as a group because its members meet shared diagnostic criteria, they are nonetheless diverse individuals. Working with these populations gives this chapter a distinct context for exploring camera-based applications. Furthermore, their inclusion in implementation and evaluation has provided a helpful framework for understanding the benefits of these technologies and for speculating on future work. In the body of the chapter we describe the characteristics of these disabilities as they bear on the design of the technologies and their relevance for music, dance, and arts-based programmes.

For most people, dance and music are social or artistic activities. For people with disabilities they can additionally be employed as therapeutic interventions, used to promote physical and emotional well-being (Muller & Warwick, 1993; Wimpory et al., 1995). Advances in technology, and particularly in Human-Computer Interaction, have led to a number of projects that offer novel conditions for enabling people with a range of physical and developmental disabilities to engage in expressive performances, and for investigating the wider impact of such interactions on social communication. Camera-based motion tracking is one such technology, and it has been extensively explored over the last decade (Dixon & Smith, 2007). Its most notable benefit is that it supports full, unencumbered body movement, offering a more inclusive, open-ended, and accessible mode of interaction and performance.
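
To make the idea concrete, the following is a minimal illustrative sketch of frame differencing, the basic pixel-comparison technique that underlies early camera-based systems of the kind discussed in this chapter; it is not code from any of those systems. It assumes Python with OpenCV and a default webcam, and it reduces each pair of frames to a single "activity" value of the sort that can be mapped to sound or visual parameters.

```python
# Illustrative sketch only: frame differencing for camera-based motion tracking.
# Assumes Python with OpenCV (pip install opencv-python) and a default webcam.
import cv2

cap = cv2.VideoCapture(0)                  # open the default camera
ok, prev = cap.read()
if not ok:
    raise RuntimeError("camera not available")
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)         # pixel-wise change between frames
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # A simple "quantity of motion": the fraction of pixels that changed.
    # In an interactive performance this scalar could drive, e.g., the
    # volume or tempo of a sound engine, or the intensity of projected visuals.
    activity = cv2.countNonZero(mask) / mask.size
    print(f"motion activity: {activity:.3f}")
    cv2.imshow("motion", mask)
    prev = gray
    if cv2.waitKey(1) & 0xFF == 27:        # ESC to quit
        break

cap.release()
cv2.destroyAllWindows()
```

Because it tracks change rather than body parts, this approach requires no markers or worn sensors, which is precisely what makes camera-based interaction unencumbered and accessible.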

Camera-based interactive dance has had a powerful resonance among experimental artists and their audiences over the last three decades. In 1995, Palindrome, then based in Nürnberg, Germany, began working with the computer engineer Frieder Weiss (the inventor of the EyeCon motion-tracking system, which is discussed later in this chapter) and quickly gained a reputation as a pioneering interactive dance company. Other early players in this burgeoning artistic trend included Troika Ranch, Company in Space, Die Audio Gruppe, John D. Mitchell, Armando Menicacci, and Alien Nation. There were sporadic dance and technology conferences (The University of Wisconsin, Madison, 1992; Simon Fraser University, British Columbia, Canada, 1993; York University, Toronto, Canada, 1995; Arizona State University, Tempe, Arizona, 1999), but it was not until the Monaco Dance Forum began sponsoring biennial dance and technology conferences (2002, 2004, and 2006) that the players began to share their notes more widely and an understanding grew that a larger world of interactive performance was arising.

Similarly in music, artists such as David Rokeby in 1989, Bruno Spoerri in 1991, and Todd Winkler in 1997 began using computer vision techniques, implemented in the Very Nervous System (developed by Rokeby), in their interactive compositions and installations (Chadabe, 1997; Winkler, 1997). Luciano Berio used camera-based interaction, via the EyesWeb software, in his 1999 opera Cronaca del Luogo (Camurri et al., 2004). Since the Digital Dance Seminar held in Copenhagen in 1996, various artists, including Wayne Siegel, have used computer vision software such as BigEye (developed by STEIM in Amsterdam) in music studies and compositions (Siegel & Jacobsen, 1998; Siegel, 2009).
