• Set SĂ©minaires Recherche & Technologie
  • Saison 2016-2017 > Grab-and-play mapping: Creative machine learning approaches for musical inclusion and exploration
  • Jan. 4, 2017
  • Ircam, Paris

We present first implementations and evaluations of a new tool for prototyping digital musical instruments, which allows a user to literally grab a controller and turn it into a new, playable musical instrument almost instantaneously.

The tool briefly observes a user interacting with a controller or sensors (without making any sound), then automatically generates a mapping from this observed input space to the control of an arbitrary sound synthesis program. The sound is immediately manipulable using the controller, and this new instrument invites the user to begin an embodied exploration of the newly created relationships between human movement and sound.
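As a rough illustration of this idea, one could derive a mapping from unlabeled observation data alone, for example by projecting the observed input space onto its dominant directions of variation and rescaling those onto synthesis parameter ranges. The sketch below assumes this PCA-style approach and invents the parameter names (`pitch_hz`, `cutoff_hz`); the actual tool's method is not specified in this abstract.

```python
import numpy as np

def learn_mapping(observations):
    """Hypothetical sketch: fit a linear projection (PCA via SVD) onto
    a brief, silent recording of controller data, then map the two
    dominant movement directions onto synthesis parameter ranges."""
    mean = observations.mean(axis=0)
    centered = observations - mean
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:2]                      # two dominant directions
    projected = centered @ components.T
    lo, hi = projected.min(axis=0), projected.max(axis=0)

    def mapping(frame):
        z = (np.asarray(frame) - mean) @ components.T
        z = np.clip((z - lo) / (hi - lo + 1e-9), 0.0, 1.0)  # normalize
        # Illustrative parameter ranges, not the tool's actual targets.
        return {"pitch_hz": 110.0 + z[0] * (880.0 - 110.0),
                "cutoff_hz": 200.0 + z[1] * (8000.0 - 200.0)}

    return mapping

# Brief observation phase: a few seconds of 6-axis controller frames.
rng = np.random.default_rng(0)
observed = rng.normal(size=(200, 6))
map_fn = learn_mapping(observed)
params = map_fn(observed[0])    # ready to drive a synthesizer at once
```

Because the mapping is built from the user's own range of movement, whatever gestures the controller naturally affords become the instrument's playable space, which matches the "grab-and-play" framing above.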

We hypothesize that this approach offers a useful alternative both to creating mappings by programming and to existing supervised learning approaches that create mappings from labeled training data. We have explored the potential value and trade-offs of this approach with two user groups. In a workshop and a case study with a music therapist working with disadvantaged young people who are otherwise unlikely to learn instrumental music, we observed advantages of the rapid adaptation afforded by this tool. In a second workshop and three interviews with computer musicians, we learned how this "grab-and-play" interaction paradigm might fit into professional compositional practices.