In “Continuous realtime gesture following and recognition,” Frederic Bevilacqua, Bruno Zamborlin, Anthony Sypniewski, Norbert Schnell, Fabrice Guedy, and Nicolas Rasamimanana of the Real Time Musical Interactions Team at IRCAM, CNRS – STMS in Paris, France, describe a system they built for real-time gesture analysis.
The system continuously outputs a set of parameters describing the time progression of the gesture and its likelihood of occurring. These parameters are computed by comparing the performed gesture with stored reference gestures. The method relies on a detailed modeling of multidimensional temporal curves.
Compared to standard HMM systems, the learning procedure is simplified by using prior knowledge, allowing the system to learn from a single example per class. Several applications have been built on this system in the contexts of music education, music and dance performance, and interactive installations.
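To make the idea concrete, here is a minimal sketch of this kind of template-based gesture follower. It assumes the general approach described above (each sample of a single recorded template becomes one state of a left-to-right HMM, with a fixed-variance Gaussian observation model standing in for the prior knowledge that replaces per-class training); the class name, parameter values, and transition scheme are illustrative choices, not IRCAM's actual implementation.

```python
import numpy as np

class GestureFollower:
    """Toy left-to-right HMM follower built from one template gesture.

    Each template frame is one state; emissions are isotropic Gaussians
    centred on that frame with a fixed sigma (the "prior knowledge" that
    lets a single example suffice). Illustrative sketch only.
    """

    def __init__(self, template, sigma=0.2, p_stay=0.3, p_next=0.6, p_skip=0.1):
        self.template = np.asarray(template, dtype=float)  # shape (T, D)
        self.sigma = sigma
        self.trans = (p_stay, p_next, p_skip)
        self.alpha = None  # forward probabilities over states

    def start(self):
        # Assume the gesture starts at the first template frame.
        self.alpha = np.zeros(len(self.template))
        self.alpha[0] = 1.0

    def step(self, obs):
        """Consume one multidimensional sample; return (progress, likelihood)."""
        p_stay, p_next, p_skip = self.trans
        # Prediction: stay, advance one state, or skip one state.
        pred = p_stay * self.alpha
        pred[1:] += p_next * self.alpha[:-1]
        pred[2:] += p_skip * self.alpha[:-2]
        # Gaussian emission probability of the observation for each state.
        d2 = np.sum((self.template - obs) ** 2, axis=1)
        emit = np.exp(-d2 / (2 * self.sigma ** 2))
        self.alpha = pred * emit
        likelihood = self.alpha.sum()      # match quality of this gesture
        if likelihood > 0:
            self.alpha /= likelihood       # normalise to avoid underflow
        # Time progression: expected state index, scaled to [0, 1].
        progress = float(np.dot(self.alpha, np.arange(len(self.alpha)))
                         / max(len(self.alpha) - 1, 1))
        return progress, likelihood
```

Running one follower per stored reference gesture and comparing their likelihoods gives recognition, while each follower's `progress` value tracks where the performer is within that gesture, which is what makes continuous following (rather than end-of-gesture classification) possible.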