• Legos project: Sensori-motor learning in gesture-sound interactive systems: International Workshop on Movement Sonification and Learning
  • March 2, 2015
  • Ircam, Paris
  • Program note: Legos
Participants
  • Eric Boyer (speaker)

We have acquired, through our sensorimotor system, a strong relationship with the auditory space that surrounds us: we have implicitly learned to integrate the sounds of our actions, and we use them every day. The development of motion-sensing and audio technologies makes it possible to design auditory feedback specifically, through the interactive sonification of movement features. We propose several experimental frameworks to assess the contribution of auditory feedback to sensorimotor control and learning in interactive systems. First, we studied the processing of spatialized auditory feedback for online motor control. Second, in a visuo-manual tracking task, we found that both error sonification and task sonification can improve performance while involving learning mechanisms. We also show that sonification of the user's movement tends to increase motion energy.
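
To make the distinction between error sonification and task sonification concrete, here is a minimal Python sketch that maps a simulated tracking signal onto a continuous pitch glide. It is not the setup used in the experiments; the simulated signals, pitch range, and function names are illustrative assumptions only.

```python
# Minimal sketch (not the project's implementation) of the two feedback
# strategies compared in the tracking task; signals and mappings are
# illustrative assumptions only.
import numpy as np

SR = 44100                      # audio sample rate (Hz)
DUR = 5.0                       # duration of the simulated trial (s)
t = np.linspace(0.0, DUR, int(SR * DUR), endpoint=False)

# Simulated 1-D visuo-manual tracking trial.
target = np.sin(2 * np.pi * 0.5 * t)                    # target position
cursor = (np.sin(2 * np.pi * 0.5 * (t - 0.3))           # lagging, noisy response
          + 0.05 * np.random.randn(t.size))
error = np.abs(target - cursor)                         # tracking error

def sonify(feature, f_min=220.0, f_max=880.0):
    """Map a normalized movement feature onto a continuous pitch glide."""
    span = feature.max() - feature.min()
    norm = (feature - feature.min()) / (span + 1e-12)
    freq = f_min + norm * (f_max - f_min)        # feature value -> frequency (Hz)
    phase = 2.0 * np.pi * np.cumsum(freq) / SR   # integrate frequency for a smooth glissando
    return 0.2 * np.sin(phase)                   # audio signal sampled at SR

error_feedback = sonify(error)    # "error sonification": pitch follows the tracking error
task_feedback = sonify(target)    # "task sonification": pitch follows the target trajectory
```

Streaming either `error_feedback` or `task_feedback` to the sound card would turn the mapping into the continuous auditory feedback a participant hears during the trial.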

Finally, we present the concept of a sound-oriented task, in which the target is expressed as acoustic features to be produced through movement, demonstrating that motor adaptation can be driven by interactive audio cues alone. Overall, we argue that continuous movement sonification deserves further investigation in auditory-motor coupling research, and we highlight important applications through original setups we developed, such as perceptual and physical training and playful scenarios for rehabilitation.
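
The sound-oriented task can be summarized with the same kind of sketch: the target lives in the acoustic domain, and the error is measured on the produced sound rather than on the movement. The mapping, target pitch, and update rule below are hypothetical illustrations, not the experimental protocol.

```python
# Hypothetical sketch of a sound-oriented task: the target is an acoustic
# feature (a pitch), reached by adapting the movement through a fixed
# motion-to-sound mapping. Names and values are illustrative assumptions.

TARGET_PITCH = 440.0                          # target expressed in the acoustic domain (Hz)

def motion_to_pitch(angle_deg):
    """Fixed mapping from a movement feature (e.g., a joint angle) to pitch."""
    return 220.0 + 5.0 * angle_deg

def acoustic_error(angle_deg):
    """Error is defined on the produced sound, not on the movement itself."""
    return motion_to_pitch(angle_deg) - TARGET_PITCH

# Naive adaptation loop: the movement is corrected using audio feedback alone,
# until the produced pitch matches the target.
angle = 0.0
for _ in range(50):
    angle -= 0.1 * acoustic_error(angle)      # gradient-like correction
print(round(motion_to_pitch(angle), 1))       # converges toward 440.0 Hz
```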
