Almost ten years ago, computer and software giant Microsoft introduced Kinect, a movement-detecting controller for its gaming platforms. Beyond gaming, the device generated much excitement among musicians, scientists, and even healthcare professionals, who saw its potential as a virtual controller in their own fields.
A user could play a computer-based musical instrument through gestures and movements picked up by the 3D sensing device. However, the system required large movements, of the arms and legs for instance, which limited the nuance of the music one might play with such an “instrument”. Now, writing in the International Journal of Computational Science and Engineering, a team from Taiwan describes an algorithm for real-time finger-movement tracking using a Kinect device that would allow a performer to simulate playing the guitar while the software generates the appropriate sound.
One might envisage the wannabe rock guitarist playing “air guitar” and generating guitar or other sounds synchronised to their finger movements with the melody, chords, and sounds pre-programmed. But, more seriously, the system could be used to genuinely play music through finger movements alone without the need for an actual guitar.
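To illustrate the idea, here is a minimal sketch of how pre-programmed chords might be triggered from tracked fingertip positions. This is not the paper's algorithm: the Kinect capture and the authors' tracking method are not reproduced, and the zone-based chord selection, the chord progression, and the strum threshold are all assumptions for illustration. Fingertips are assumed to arrive as (x, y) tuples in normalised [0, 1] coordinates.

```python
# Hypothetical sketch only: maps tracked fingertip positions to
# pre-programmed chord events. The chord list, zone scheme, and
# strum threshold are illustrative assumptions, not the paper's method.

PRE_PROGRAMMED_CHORDS = ["C", "G", "Am", "F"]  # assumed song progression


def chord_for_hand(fingertips):
    """Pick a chord from the horizontal zone the hand's centroid falls in.

    fingertips: list of (x, y) tuples in normalised [0, 1] coordinates.
    Returns a chord name, or None if no fingertips were detected.
    """
    if not fingertips:
        return None
    cx = sum(x for x, _ in fingertips) / len(fingertips)
    zone = min(int(cx * len(PRE_PROGRAMMED_CHORDS)),
               len(PRE_PROGRAMMED_CHORDS) - 1)
    return PRE_PROGRAMMED_CHORDS[zone]


def strum_detected(prev_y, curr_y, threshold=0.05):
    """Treat a quick downward hand movement between frames as a strum."""
    return (curr_y - prev_y) > threshold
```

In a real pipeline these functions would be called once per camera frame, with the selected chord sent to a synthesizer whenever a strum is detected.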
The team’s experiments with the system show that the proposed method can be used to play music of different genres with acceptable quality. They add that the application might benefit a novice who has little or no experience of playing real musical instruments, as well as experimental musicians seeking an alternative to the conventional instruments available to them.
Hakim, N.L., Sun, S-W., Hsu, M-H., Shih, T.K. and Wu, S-J. (2019) ‘Virtual guitar: using real-time finger tracking for musical instruments’, Int. J. Computational Science and Engineering, Vol. 18, No. 4, pp.438–450.