Abstract: The wealth of tools developed in music information retrieval (MIR) for the description, indexation, and retrieval of music and sound can easily be (ab)used for the creation of new musical material and sound design. Based on automated audio description and selection, corpus-based concatenative synthesis makes it possible to exploit large collections of sound to compose novel timbral and harmonic structures. Here, the metaphor for musical creation is an explorative navigation through the sonic landscape of the corpus. We will present examples and applications of real-time interactive corpus-based concatenative synthesis for music composition, sound design, installations, and interactive performance.
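To illustrate the principle of unit selection underlying corpus-based concatenative synthesis, the following minimal Python sketch picks the corpus segments whose audio descriptors lie nearest to a target point in descriptor space. It is an illustrative assumption of how such a selection step might look, not Ircam's or CataRT's actual implementation; the descriptor names and data are hypothetical.

```python
# Illustrative sketch: nearest-neighbour unit selection in descriptor space,
# the core idea behind corpus-based concatenative synthesis.
import numpy as np

def select_units(corpus_descriptors, target, k=5, weights=None):
    """Return indices of the k corpus units closest to the target
    in (optionally weighted) descriptor space."""
    if weights is None:
        weights = np.ones(target.shape[0])
    diff = (corpus_descriptors - target) * weights
    dist = np.sqrt((diff ** 2).sum(axis=1))
    return np.argsort(dist)[:k]

# Hypothetical corpus: 1000 sound segments, each described by
# [pitch (MIDI), loudness (dB), spectral centroid (kHz)].
rng = np.random.default_rng(0)
corpus = rng.uniform([36, -60, 0.2], [96, 0, 8.0], size=(1000, 3))

# A navigation position in descriptor space acts as the synthesis target;
# the selected units would then be concatenated and played back.
target = np.array([60.0, -12.0, 2.0])
print(select_units(corpus, target, k=5))
```

In an interactive setting, the target would be updated continuously (e.g. from a controller or a live audio analysis), turning selection into the "explorative navigation" of the corpus described above.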
Bio: Diemo Schwarz is a researcher-developer in the Real-Time Music Interaction (IMTR) team at Ircam, working on sound analysis and interactive corpus-based concatenative synthesis in multiple research and musical projects at the intersection of computer science, music technology, and audio-visual creation. He holds a PhD in computer science applied to music from the University of Paris, awarded in 2004 for the development of a new method of concatenative musical sound synthesis by unit selection from a large database.