A Virtual Reality Drumkit Simulator System with a Kinect Device

Alejandro Rosa-Pujazón (ETSI Telecomunicacion, Universidad de Málaga, Málaga, Spain), Isabel Barbancho (ETSI Telecomunicacion, Universidad de Málaga, Málaga, Spain), Lorenzo J. Tardón (ETSI Telecomunicacion, Universidad de Málaga, Málaga, Spain) and Ana M. Barbancho (ETSI Telecomunicacion, Universidad de Málaga, Málaga, Spain)
DOI: 10.4018/IJCICG.2015010105

In this paper, an implementation of a virtual-reality-based application for drumkit simulation is presented. The system tracks user motion with a Kinect camera sensor and recognizes user-generated drum-hitting gestures in real time. To compensate for the effects of latency in the sensing stage and provide real-time interaction, the system uses a gesture detection model to predict user movements. The paper discusses two machine-learning-based solutions to this problem: one based on the analysis of velocity and acceleration peaks, the other on Wiener filtering. The gesture detector was tested and integrated into a full implementation of a drumkit simulator, configurable to discriminate between 3, 5 or 7 different drum sounds. An experiment with 14 participants was conducted to assess the system's viability and its impact on user experience and satisfaction.
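The peak-analysis approach described above can be sketched as follows. This is an illustrative example only, not the authors' implementation: the function name, the 30 fps sampling rate, and the 1.5 m/s velocity threshold are all assumptions. The idea is to trigger a drum sound as soon as a downward-velocity peak is detected in the tracked hand trajectory, rather than waiting for the hand to reach the virtual drum surface, so that the perceived latency is reduced.

```python
import numpy as np

def detect_hits(positions, fps=30.0, vel_threshold=1.5):
    """Detect drum-hit gestures from a sequence of vertical hand positions.

    positions: 1-D array of hand y-coordinates (metres) sampled by the
    depth sensor at `fps` frames per second (30 fps assumed for Kinect).
    Returns the frame indices where a downward-velocity peak suggests an
    imminent hit, so the sound can be triggered early to mask latency.
    """
    dt = 1.0 / fps
    vel = np.gradient(positions, dt)   # finite-difference velocity (m/s)
    acc = np.gradient(vel, dt)         # finite-difference acceleration (m/s^2)
    hits = []
    for i in range(1, len(vel) - 1):
        # Hit candidate: downward velocity beyond the threshold, at a
        # local velocity extremum (acceleration crosses from - to +).
        if vel[i] < -vel_threshold and acc[i] <= 0 <= acc[i + 1]:
            hits.append(i)
    return hits
```

For example, a trajectory that holds steady, strikes downward and rebounds would yield a single detection at the frame of peak downward velocity. A Wiener-filter variant would instead predict the next positions from a linear filter fitted to past samples; the thresholding logic could remain the same.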

1. Introduction

Advances in information and communication technologies in recent years have modified the way in which we interact with computing devices, evolving from the standard interaction metaphor of desktop computers, built around the keyboard and mouse, to touch-based interaction on smartphones and tablets. On top of that, the success of motion-capture controllers such as Nintendo's Wii Remote or Microsoft's Kinect has made it possible to redefine the standard interaction models for many videogame-based applications. Nowadays, we are quite familiar with these devices, and more and more applications are being released that make use of these interfaces to provide new ways of interaction and user experiences that improve on previously available schemes. In particular, bringing these technologies to the field of music provides new ground to explore through innovative interaction paradigms.

Music interaction interfaces are usually confined to traditional musical instruments. This is so in spite of the fact that the mechanics and abstract concepts of music are unfamiliar to most lay people; moreover, learning or understanding the different aspects of music theory demands a considerable investment of time. However, the evolution of sensing and motion-tracking technologies has allowed for the development of new human-computer interfaces that have changed the way users interact with computer applications, offering a more 'natural' experience than a conventional setting and thereby helping to lower the barriers raised by the inherently abstract nature of musical concepts.

Advanced human-computer interfaces to implement a more natural or immersive interaction with music have been proposed and/or studied in previous works (Barbancho, Rosa-Pujazón, Tardón, & Barbancho, 2013) for a wide array of applications: gaming (Gower & McDowall, 2012; Wang & Lai, 2011), new instrument creation/simulation (Jordà, 2010), medical rehabilitation (De Dreu, Van der Wilk, Poppe, Kwakkel, & Van Wegen, 2012), modification of visual patterns by using sung or spoken voice (Levin & Lieberman, 2004), body-motion-to-sound mapping (Antle, Droumeva, & Corness, 2008; Castellano, Bresin, Camurri, & Volpe, 2007; Halpern et al., 2011; Khoo et al., 2008), orchestra conductor simulation (Morita, Hashimoto, & Ohteru, 1991; Parton & Edwards, 2009; Todoroff, Leroy, & Picard-Limpens, 2011), tangible and haptic instrument simulation (Bakker, van den Hoven, & Antle, 2011; Holland, Bouwer, Dalgelish, & Hurtig, 2010), drum-hitting simulation (Höfer, Hadjakos, & Mühlhäuser, 2009; Ng, 2004; Trail et al., 2012; Odowichuk, Trail, Driessen, Nie, & Page, 2011), etc.

Some of the problems commonly identified with advanced human-computer interfaces are that they are usually expensive, intrusive and/or bulky, and prone to raising ergonomic issues. Fortunately, the emergence of devices like the Wiimote and Kinect has helped to mitigate such issues. Off-the-shelf devices have also been studied for the creation of new interaction models for music performance, such as the previously mentioned Wiimote (Qin, n.d.) and Kinect (Mandanici & Sapir, 2012; Todoroff et al., 2011; Odowichuk et al., 2011; Yoo, Beak, & Lee, 2011; Stierman, 2012; Rosa-Pujazón, Barbancho, Tardón, & Barbancho, 2013), and even mobile phones and smartphones (Essl & Rohs, 2009).
