KONTRAKTION – Sonification of Metagestures with electromyographic Signals (PDF)
Reference: ACM Digital Library


This repository contains only the wrapper that connects the Myo armband to Max/MSP via OSC.
It does not include the sonification algorithms or the multichannel spatial-positioning patch, but it does include a rough machine-learning example using ml.lib.
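Since the wrapper communicates over OSC, the byte layout of the messages is what any receiver has to handle. A minimal sketch of OSC message encoding and decoding in pure Python, using only the standard library: the `/myo/emg` address and the use of float32 arguments are assumptions for illustration, not the wrapper's documented format.

```python
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message with float32 arguments.

    OSC strings are null-terminated and padded to a 4-byte boundary
    (always at least one null); floats are big-endian 32-bit values.
    """
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    type_tags = "," + "f" * len(floats)
    msg = pad(address.encode()) + pad(type_tags.encode())
    for v in floats:
        msg += struct.pack(">f", v)
    return msg

def parse_message(data):
    """Decode an OSC message containing only float32 arguments."""
    addr_end = data.index(b"\x00")
    address = data[:addr_end].decode()
    pos = (addr_end // 4 + 1) * 4          # skip address padding
    tags_end = data.index(b"\x00", pos)
    tags = data[pos:tags_end].decode()
    pos = (tags_end // 4 + 1) * 4          # skip type-tag padding
    args = []
    for t in tags[1:]:                     # tags[0] is the leading ','
        if t == "f":
            args.append(struct.unpack(">f", data[pos:pos + 4])[0])
            pos += 4
    return address, args
```

A round trip with exactly representable values (`0.5`, `-0.25`) returns them unchanged; arbitrary floats come back rounded to float32 precision.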


Today's advanced sensor technologies open up unprecedented opportunities to pick up on very subtle gestures and energetic traces that generally go unnoticed by ordinary human perception. These gestures can be processed and used to trace our mood, behavior, and bodily mechanics. Being able to perceive this abstract information in real time lets us learn more about our own bodies and minds. This can enhance everyday interactions and expand our perception of the world around and within us.
Exploring these subtle expressions provides an intimate, finely grained interface beyond observable movement, and therefore offers more detailed control for human-machine interaction.

‘Kontraktion’ is an embodied musical interface that uses biosignals to create an immersive sonic performance setup. The user wears a sensor armband that captures multiple streams of data about the arm's position and muscle activity. The captured data is turned into sound and routed to an object-based surround setup.
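One plausible first step in turning the captured muscle data into sound is to track each EMG channel's short-time energy and map it to an oscillator gain. A minimal sketch, assuming per-channel RMS as the envelope measure and hypothetical calibration bounds `floor`/`ceil` for the raw EMG range (the actual mapping in the patch is not documented here):

```python
import math

def rms(frame):
    """Root-mean-square level of one analysis frame of EMG samples."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

def emg_to_gains(channels, floor=0.01, ceil=0.59):
    """Map per-channel RMS levels to normalized gains in [0, 1].

    `floor` and `ceil` are hypothetical calibration bounds: levels at or
    below the floor are silent, levels at or above the ceiling are full gain.
    """
    gains = []
    for frame in channels:
        level = rms(frame)
        g = (level - floor) / (ceil - floor)
        gains.append(min(1.0, max(0.0, g)))
    return gains
```

Keeping one gain per channel preserves the discrete treatment of the signals that the spatialization stage relies on.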

By using the setup as a biofeedback system, the user explores their own subconscious gestures with heightened sensitivity. Even subtle neural impulses that usually escape awareness are brought to conscious attention by sensing muscle contractions and projecting them outward into space in real time. The user's gestural expressions are embodied in sound, allowing an expressive energetic coupling between the user's body and a virtual agent.
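Surfacing contractions that would otherwise escape awareness amounts to detecting when a smoothed muscle-activity envelope crosses a sensitivity threshold. A sketch of one common approach, a one-pole envelope follower over the rectified signal plus a simple crossing detector; the smoothing coefficient `alpha` and the `threshold` are illustrative values, not parameters taken from the piece:

```python
def smooth(samples, alpha=0.1):
    """One-pole low-pass over the rectified signal: env += alpha * (|x| - env)."""
    env, out = 0.0, []
    for x in samples:
        env += alpha * (abs(x) - env)
        out.append(env)
    return out

def detect_onsets(envelope, threshold=0.04):
    """Indices where the envelope first crosses above the threshold."""
    onsets, above = [], False
    for i, v in enumerate(envelope):
        if v >= threshold and not above:
            onsets.append(i)
            above = True
        elif v < threshold:
            above = False
    return onsets
```

Lowering the threshold raises the sensitivity, which is how even very faint contractions can be made audible to the performer.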

Drawing on this newly acquired bodily awareness, the user can take control of the sound and perform with it, using the body's metagestures as an embodied interface. The body itself is transformed into a musical instrument, controlled by neurological impulses and sonified by a virtual interpreter.

Since all the captured biosignals are treated discretely in the sonification process, the signals can be routed to an object-based surround setup, making it possible to move the sounds through space during the performance. This adds a spatial correlation between the arm and the sound sources and enhances the outward projection of the muscle contractions.
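In an object-based setup, moving a sound through space reduces to computing per-speaker gains from a source azimuth, which could for instance be derived from the armband's orientation. A minimal sketch of equal-power pairwise panning on a ring of loudspeakers (a simplified stand-in for the patch's actual spatialization, which is not included in the repository; the eight-speaker ring is an assumption):

```python
import math

def ring_gains(azimuth_deg, n_speakers=8):
    """Equal-power gains for a ring of equally spaced speakers.

    The source azimuth is panned between the two nearest speakers with a
    sine/cosine crossfade, so the summed power stays constant; all other
    speakers receive zero gain.
    """
    spacing = 360.0 / n_speakers
    pos = (azimuth_deg % 360.0) / spacing
    lo = int(pos) % n_speakers
    hi = (lo + 1) % n_speakers
    frac = pos - int(pos)
    gains = [0.0] * n_speakers
    gains[lo] = math.cos(frac * math.pi / 2)
    gains[hi] = math.sin(frac * math.pi / 2)
    return gains
```

Because each biosignal keeps its own discrete gain vector, every sound object can sit at its own position on the ring and follow the arm independently.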


I need more information on this...