Abstract: The wealth of tools developed in music information retrieval (MIR) for the description, indexing, and retrieval of music and sound can be easily (ab)used for the creation of new musical material and sound design. Based on automated audio description and selection, corpus-based concatenative synthesis makes it possible to exploit large collections of sound to compose novel timbral and harmonic structures. The metaphor for musical creation here is an explorative navigation through the sonic landscape of the corpus. We will present examples and applications of real-time interactive corpus-based concatenative synthesis for music composition, sound design, installations, and interactive performance.
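For readers unfamiliar with the technique, the following is a minimal sketch of the core idea the abstract describes: grains of sound are characterized by audio descriptors, and synthesis proceeds by navigating that descriptor space and concatenating the nearest units. The descriptors, grain size, and synthetic corpus below are illustrative assumptions, not the actual Ircam/CataRT implementation.

```python
# Minimal sketch of descriptor-based unit selection and concatenation.
# Assumptions: a toy corpus of synthetic grains and two simple descriptors
# (RMS loudness, zero-crossing rate); real systems use richer descriptor sets.
import numpy as np

SR = 44100      # sample rate in Hz
GRAIN = 2048    # grain length in samples

def descriptors(grain):
    """Compute two descriptors for a grain: RMS loudness and zero-crossing rate."""
    rms = np.sqrt(np.mean(grain ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(grain)))) / 2.0
    return np.array([rms, zcr])

# Build a toy corpus: noisy and tonal grains at several amplitudes.
rng = np.random.default_rng(0)
corpus = []
t = np.arange(GRAIN) / SR
for amp in (0.1, 0.4, 0.8):
    corpus.append(amp * rng.standard_normal(GRAIN))      # noisy grain
    corpus.append(amp * np.sin(2 * np.pi * 440 * t))      # tonal grain
corpus = np.array(corpus)
corpus_desc = np.array([descriptors(g) for g in corpus])

def select_unit(target):
    """Return the corpus grain whose descriptors are closest to the target point."""
    dist = np.linalg.norm(corpus_desc - target, axis=1)
    return corpus[np.argmin(dist)]

# "Navigate" the descriptor space along a trajectory from quiet/tonal to loud/noisy,
# concatenating the selected grains into an output signal.
trajectory = zip(np.linspace(0.05, 0.6, 20), np.linspace(0.01, 0.5, 20))
output = np.concatenate([select_unit(np.array(p)) for p in trajectory])
print(output.shape)  # 20 grains of GRAIN samples each
```

In a real-time interactive setting, the target trajectory would come from a controller or gestural input rather than a precomputed ramp, and grains are typically overlapped and crossfaded rather than simply concatenated.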
Bio: Diemo Schwarz is a researcher and developer in the Real-Time Music Interaction (IMTR) team at Ircam, working on sound analysis and interactive corpus-based concatenative synthesis in multiple research and musical projects at the intersection of computer science, music technology, and audio-visual creation. He holds a PhD in computer science applied to music from the University of Paris, awarded in 2004 for the development of a new method of concatenative musical sound synthesis by unit selection from a large database.