Volume Control by Adjusting Wrist Moment of Violin-Playing Robot

Koji Shibuya (Ryukoku University, Japan), Hironori Ideguchi (Ryukoku University, Japan) and Katsunari Ikushima (Ryukoku University, Japan)
Copyright: © 2012 | Pages: 17
DOI: 10.4018/jse.2012070102

This paper introduces the details of the anthropomorphic violin-playing robot built in the authors' laboratory and an algorithm for controlling sound volume by adjusting its wrist moment. Investigating the relationship between sound parameters, such as sound volume, and human impressions is an important topic in Kansei Engineering, a growing research field in Japan. The authors focused on the violin and built a violin-playing robot with two 7-DOF arms, one for bowing and one for fingering. They then constructed an algorithm that adjusts the wrist moment to control the sound volume. Based on the experimental results, the authors concluded that the moment-based algorithm successfully controls the sound volume. Finally, they analyzed the spectra of sounds produced with different wrist moments and concluded that it may also be possible to control the sound spectrum, which affects human impressions.
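The abstract describes a moment-based volume controller only at a high level; the paper's actual control law is not reproduced here. As an illustrative sketch of the general idea, the loop below uses simple proportional feedback to nudge a commanded wrist moment toward a target volume. All names, gains, and the toy bow-pressure-to-volume model are hypothetical, not the authors' published algorithm.

```python
# Illustrative sketch (NOT the authors' published controller): proportional
# feedback that adjusts the commanded wrist moment from the volume error.
# Gains, limits, and the toy plant model below are hypothetical.

def update_wrist_moment(moment, measured_db, target_db, gain=0.02,
                        m_min=0.0, m_max=1.5):
    """Return the next wrist-moment command (N*m) from the volume error."""
    error = target_db - measured_db          # positive: sound is too quiet
    moment += gain * error                   # press the bow harder to play louder
    return max(m_min, min(m_max, moment))    # clamp to a safe moment range

# Example: converge toward a 70 dB target from a quiet start, using a
# made-up linear model of how bow pressure maps to measured volume.
moment, measured = 0.5, 55.0
for _ in range(50):
    moment = update_wrist_moment(moment, measured, 70.0)
    measured = 55.0 + 12.0 * moment          # toy moment -> volume model
```

In practice the mapping from wrist moment to bow force and then to volume is nonlinear and string-dependent, so a real controller would need calibration per string and bowing speed; the sketch only shows the feedback structure.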


Kansei and Music

Kansei is a Japanese word whose meanings include sensibility, sentiment, and feeling. Since the 1980s, many Japanese researchers have focused on Kansei because it is an interdisciplinary field spanning informatics, ergonomics, psychology, brain science, and related areas (Inokuchi, 2010). In particular, Japanese companies have developed techniques for designing commercial products based on data obtained from user surveys. This research field is called Kansei Engineering, and the Japanese Society of Kansei Engineering was established in 1998 (Japanese Society of Kansei Engineering, http://www.jske.org/).

There is also a strong relation between Kansei and music, because we have to deal with the emotional aspects of music when we focus on musical performances. Thus, much Kansei research has been conducted on music. For example, Inokuchi et al. realized a system in which a computer outputs the impressions conveyed by musical sounds, using a Kansei database that contained the relationships between impression-expressing adjectives and musical primitives (Katayose, Imai, & Inokuchi, 1988).

Kansei in Robotics and Musical Performance Robot

In the field of robotics, researchers are also focusing on Kansei (Hashimoto, 2009). They have tackled gesture and facial recognition, emotional recognition, and emotional expression (Sakamoto & Ishiguro, 2009; Cho et al., 2009). Some researchers are focusing on musical performance robots, and many musician robots have been developed. Sugano and Kato developed an organ-playing robot, WABOT-2, which can play an electric organ with its two arms and legs and can read musical scores with a camera mounted on its head (1987). More recently, Solis and Takanishi built a flute-playing robot that can accompany a human (Solis et al., 2009); the robot communicates with a human flutist through the acoustic sounds both of them produce. They also developed a saxophone-playing robot (Solis et al., 2010) whose pitch and volume closely resemble those of a human performance.

Among the various instruments, the violin seems to be one of the most difficult for a human to play. To become a good violinist, one must practice for many years and acquire extensive experience and knowledge of violin performance and music (Konczak, Velden, & Jaeger, 2009). We focused on two aspects of violin playing: physical skills and musical expression. To make a robot play the violin like a human, we must advance the technologies of both aspects: the hardware and control techniques of robots, and Kansei information processing.

Of course, previous researchers have built violin-playing robots or machines. For example, Kajitani built musician robots called "MUBOTs," which included robots that played a violin, a cello, and a recorder (1989). The violin-playing robot of the MUBOT system had an arm for bowing and a hand for fingering. One finger was prepared for each fingering position and each finger was driven by a solenoid, so the robot had many fingers. Jordà built a one-string violin-playing machine whose fingering fingers were each driven by a solenoid, like MUBOT (2002). Sobh and Wang built an interesting violin-playing robot that had two arms for bowing (2003). Kuwabara et al. built a violin musician robot using an industrial robot (2006). The above violin-playing machines and robots were not completely anthropomorphic, because their fingering fingers were driven by solenoids or they had no fingering system at all. In 2005 and 2007, Toyota Motor Corporation presented a trumpet-playing robot and a violin-playing robot to the media (Kusuda, 2008; Toyota Motor Corporation, 2012). The violin-playing robot, in particular, was completely anthropomorphic and had right and left arms that could perform bowing, fingering, and vibrato. However, the details of its mechanism and control have not been officially published. Toyota's robots were built to develop technologies for supporting human life and did not focus on musical expression or Kansei Engineering. Currently, a great deal of effort continues to be devoted to building musician robots and systems (Solis & Ng, 2011).
