The development of new digital musical instruments has grown significantly over the last decade. Many of these instruments are based on tangible interfaces and/or the performer's body engagement. This talk presents several examples, from augmented instruments built on existing acoustic instruments (e.g. the augmented violin) to alternate controllers based on body motion and touch. The different types of relationships that can be designed between gesture and sound will be explained and illustrated with concrete examples. Recent applications drawn from artistic practice, music pedagogy, and gaming will be discussed in the broader context of Human Computer Confluence.
Frédéric Bevilacqua is the head of the Real Time Musical Interactions team at IRCAM, the Institute for Research and Coordination in Acoustics/Music, in Paris. His research concerns the understanding of the interaction between gesture and sound processes, and the development of gesture-based musical interaction systems. He holds a master's degree in physics and a Ph.D. in biomedical optics from the École Polytechnique Fédérale de Lausanne (Swiss Federal Institute of Technology in Lausanne). He also studied music at the Berklee College of Music in Boston and has participated in various music and media arts projects. From 1999 to 2003 he was a researcher at the University of California, Irvine. He joined IRCAM in October 2003 to develop gesture analysis tools for music and the performing arts.
July 28, 2022 01:14:24