The field of new digital musical instruments has grown significantly over the last decade. A significant number of these instruments are based on tangible interfaces and/or the performer's body engagement. This talk will present several examples, from augmented instruments built on existing acoustic instruments (e.g. the augmented violin) to alternate controllers based on body motion and touch. The different types of relationships that can be designed between gesture and sound will be explained and illustrated with concrete examples. Recent applications drawn from artistic practice, music pedagogy, and gaming will be discussed in the broader context of Human Computer Confluence.
Frederic Bevilacqua is the head of the Real Time Musical Interactions team at IRCAM - Institute for Music/Acoustic Research and Coordination in Paris. His research concerns the understanding of the interaction between gesture and sound processes, and the development of gesture-based musical interaction systems. He holds a master's degree in physics and a Ph.D. in biomedical optics from the Ecole Polytechnique Fédérale de Lausanne (Swiss Federal Institute of Technology in Lausanne). He also studied music at the Berklee College of Music in Boston and has participated in various music and media arts projects. From 1999 to 2003, he was a researcher at the University of California, Irvine. He joined IRCAM in October 2003 to develop gesture analysis tools for music and the performing arts.
28 July 2022
1, place Igor-Stravinsky
75004 Paris
Institut de Recherche et de Coordination Acoustique/Musique