Abstract: In recent years, music information research has developed powerful tools for creating a new generation of applications that redefine the boundaries of music listening and music making. The recent availability of affordable motion capture technology has not only enabled the creation of novel musical instruments but also the integration of the study of bodily motion and gesture into the mainstream of music information research. We will present a variety of playful real-time interactive applications based on analysis techniques and models combining digitized sounds, movements, and symbolic representations.
Bio: Norbert Schnell is a researcher and developer in the Real-Time Musical Interactions team at IRCAM, focusing on real-time digital audio processing techniques for interactive music applications. He studied Telecommunications and Music in Graz, Austria, and worked as a studio assistant at the IEM. At IRCAM he has initiated and participated in numerous international research and development projects, as well as artistic works in the fields of interactive audio-visual installations, music pedagogy, and sound simulation. He chaired the 6th International Conference on New Interfaces for Musical Expression (NIME) in 2006 and held the DAAD Edgard Varèse Guest Professorship for Electronic Music at the Technische Universität Berlin in 2007. He is currently working on his PhD on the animation of digitized sounds and their re-embodiment through bodily movements and gestures.
2 June 2012, 16 min
Abstract: One of the most forward-looking aspects of the instrument/machine relationship lies in the development of tools for the real-time acoustic analysis of instrumental and vocal sounds, known as audio descriptors. The number of the
2 June 2012, 56 min
Abstract: This talk introduces a new music interaction system based on style modeling, in a MIR-oriented perspective called VirtualBand. VirtualBand aims at combining musical realism and quality with real-time interaction, by capturing esse
2 June 2012, 54 min
2 June 2012, 44 min
Abstract: Interactive Improvisation Systems involve at least three cooperating and concurrent expert agents: machine listening, machine learning, model based generation. Machine listening may occur during the initial learning stage (off-lin
2 June 2012, 59 min
Abstract: For the past 60 years, machines have been involved in all aspects of music: playing, recording, processing, editing, mixing, composing, analyzing, and synthesizing. However, in software terms, music is nothing but a sequence of nu
2 June 2012, 59 min
Abstract: The wealth of tools developed in music information retrieval (MIR) for the description, indexation, and retrieval of music and sound can be easily (ab)used for the creation of new musical material and sound design. Based on automa
2 June 2012, 44 min
Abstract: Although MIR did arguably not start as a research discipline for promoting creativity and music performance, this trend has begun to gain importance in recent years. The possibilities of MIR for supporting musical creation and mus
2 June 2012, 57 min