1. Introduction
Advances in information and communication technologies have changed the way we interact with computing devices, evolving from the standard desktop interaction metaphor built around the keyboard and mouse to the touch-based interaction of smartphones and tablets. On top of that, the success of motion-capture controllers such as Nintendo's Wii Remote and Microsoft's Kinect in recent years has made it possible to redefine the standard interaction models of many videogame-based applications. Nowadays, these devices are familiar to most users, and a growing number of applications make use of the capabilities behind these interfaces to provide new forms of interaction and user experiences that improve on previously available schemes. In particular, bringing these technologies to the field of music opens new ground to explore through innovative interaction paradigms.
Music interaction interfaces are usually confined to traditional musical instruments, even though the mechanics and abstract concepts of music are unfamiliar to most lay people. Moreover, learning or understanding the different aspects of music theory requires devoting a considerable amount of time to the task. However, the evolution of sensing and motion-tracking technologies has allowed the development of new and innovative human-computer interfaces that have changed the way users interact with computer applications, offering a more 'natural' experience than conventional settings; this, in turn, can help lower the barriers raised by the inherently abstract nature of musical concepts.
Advanced human-computer interfaces that implement a more natural or immersive interaction with music have been proposed and/or studied in previous works (Barbancho, Rosa-Pujazón, Tardón, & Barbancho, 2013) for a wide array of applications: gaming (Gower & McDowall, 2012; Wang & Lai, 2011), creation/simulation of new instruments (Jordà, 2010), medical rehabilitation (De Dreu, Van der Wilk, Poppe, Kwakkel, & Van Wegen, 2012), modification of visual patterns by means of sung or spoken voice (Levin & Lieberman, 2004), body-motion-to-sound mapping (Antle, Droumeva, & Corness, 2008; Castellano, Bresin, Camurri, & Volpe, 2007; Halpern et al., 2011; Khoo et al., 2008), orchestra conductor simulation (Morita, Hashimoto, & Ohteru, 1991; Parton & Edwards, 2009; Todoroff, Leroy, & Picard-Limpens, 2011), tangible and haptic instrument simulation (Bakker, van den Hoven, & Antle, 2011; Holland, Bouwer, Dalgleish, & Hurtig, 2010), and drum-hitting simulation (Höfer, Hadjakos, & Mühlhäuser, 2009; Ng, 2004; Trail et al., 2012; Odowichuk, Trail, Driessen, Nie, & Page, 2011), among others.
Some of the problems commonly identified with advanced human-computer interfaces are that they tend to be expensive, intrusive and/or bulky, and prone to raising ergonomic issues. Fortunately, the emergence of devices like the Wiimote and the Kinect has helped to mitigate such issues. Off-the-shelf devices have also been studied for the creation of new interaction models for music performance, including the previously mentioned Wiimote (Qin, n.d.) and Kinect (Mandanici & Sapir, 2012; Todoroff et al., 2011; Odowichuk et al., 2011; Yoo, Beak, & Lee, 2011; Stierman, 2012; Rosa-Pujazón, Barbancho, Tardón, & Barbancho, 2013), and even mobile phones and smartphones (Essl & Rohs, 2009).