Information

Type: Scientific and/or technical conference
Venue: Ircam, Salle Igor-Stravinsky (Paris)
Duration: 54 min
Date: June 2, 2012

Abstract: This talk introduces VirtualBand, a new music interaction system based on style modeling and developed from an MIR-oriented perspective. VirtualBand aims to combine musical realism and quality with real-time interaction by capturing essential elements of a musician's style and reusing them during the user's improvisation, so that an interactive, real-time musical engagement takes place just as it would with a real band of responsive musicians. To enable this, we address style modeling from the new perspective of combinatorial statistical modeling. Markov chains provide a definition of style, albeit a rudimentary one, as the set of local patterns of a given fixed length. However, Markov chain approaches suffer from a latent "control problem": control constraints are not compatible with Markov models, as they induce long-range dependencies that violate the Markov hypothesis of limited memory. To overcome this problem, we have reformulated Markov generation in the framework of constraint satisfaction, and have shown that this approach solves the control problem and opens the door to fully malleable representations of style. VirtualBand uses this technology to provide interactive jazz accompaniment. It proceeds in two phases: recording and playing. First, recordings of professional musicians are analyzed to extract musical metadata (such as harmony, energy, or rhythm) to build a style database. When the musician plays, VirtualBand explores the style database, for each virtual musician, to produce music that matches the player's own performance features (e.g., volume, note density, pitch). Thanks to this adaptive behavior, the playing experience is unique: every time the user plays with the system, the rhythm section adapts to the performance and generates a new accompaniment.
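The reformulation of Markov generation as constraint satisfaction can be sketched in a few lines. The sketch below is illustrative only, not the VirtualBand implementation: it learns an order-1 transition model from a toy chord sequence, then imposes a control constraint (the walk must end on a given chord) by filtering the per-position domains backwards before sampling forwards, so that generation can never wander into a dead end with respect to the constraint. All function and variable names are invented for this example.

```python
import random
from collections import defaultdict

def learn_markov(sequence):
    """Order-1 Markov model: map each symbol to its observed successors."""
    succ = defaultdict(set)
    for a, b in zip(sequence, sequence[1:]):
        succ[a].add(b)
    return succ

def constrained_generate(succ, length, start, end, rng=random):
    """Generate a walk of `length` symbols that starts at `start`, ends at
    `end`, and uses only transitions seen in the corpus.

    A plain Markov random walk cannot honour the end constraint: it may
    reach states from which `end` is unreachable in the remaining steps.
    Instead we propagate the constraint backwards, computing for each
    position the set of states that can still reach `end`, then sample
    forwards inside those filtered domains.
    """
    # domains[i] = states still allowed at position i after filtering
    domains = [None] * length
    domains[-1] = {end}
    for i in range(length - 2, -1, -1):
        domains[i] = {s for s in succ if succ[s] & domains[i + 1]}
    if start not in domains[0]:
        raise ValueError("no corpus walk satisfies the constraints")
    walk = [start]
    for i in range(1, length):
        walk.append(rng.choice(sorted(succ[walk[-1]] & domains[i])))
    return walk

# Toy corpus of chord symbols; the constraint forces a cadence back to C.
corpus = ["C", "Am", "F", "G", "C", "F", "G", "C"]
print(constrained_generate(learn_markov(corpus), 5, "C", "C"))
# -> ['C', 'Am', 'F', 'G', 'C']
```

On this toy corpus the only admissible five-chord walk from C back to C is C, Am, F, G, C; a plain random walk would satisfy the ending constraint only by luck, and rejection sampling becomes hopeless as constraints accumulate, which is the motivation for the constraint-satisfaction reformulation.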

Bio: François Pachet received his Ph.D. and Habilitation degrees from Paris 6 University (UPMC). He is a Civil Engineer (École des Ponts et Chaussées) and was Assistant Professor in Artificial Intelligence and Computer Science at Paris 6 University until 1997. He then set up the music research team at SONY Computer Science Laboratory Paris, where he developed the vision that metadata can greatly enhance the musical experience in all its dimensions, from listening to performance. His team conducts research in interactive music listening, performance, and musical metadata, and has developed several innovative technologies (constraint-based spatialization, intelligent music scheduling using metadata) and award-winning systems (MusicSpace, PathBuilder, The Continuator for interactive music improvisation, etc.). He is the author of over 80 scientific publications in the fields of musical metadata and interactive instruments. His current research focuses on creativity and content generation: he was recently awarded an ERC Advanced Grant to develop the concepts and technologies of "flow machines", a new generation of content generation tools that help users find and develop their own "style".

Speakers

Media related to this event

Introduction - Geoffroy Peeters

June 2, 2012 · 16 min

Video

Audio descriptors: a major challenge for real-time composition

Abstract: One of the most forward-looking aspects of the instrument/machine relationship lies in the development of means for the real-time acoustic analysis of instrumental and vocal sounds, known as audio descriptors. The number of these…

June 2, 2012 · 56 min

Video

Round table

June 2, 2012 · 44 min

Video

Information retrieval and deployment in interactive improvisation systems - Gérard Assayag

Abstract: Interactive improvisation systems involve at least three cooperating and concurrent expert agents: machine listening, machine learning, and model-based generation. Machine listening may occur during the initial learning stage (off-line…

June 2, 2012 · 59 min

Video

Gestural Re-Embodiment of Digitized Sound and Music - Norbert Schnell

Abstract: Over the past years, music information research has developed powerful tools for creating a new generation of applications that redefine the boundaries of music listening and music making. The recent availability of affordable mobile…

June 2, 2012 · 28 min

Video

Playing with Music - Tristan Jehan

Abstract: For the past 60 years, machines have been involved in all aspects of music: playing, recording, processing, editing, mixing, composing, analyzing, and synthesizing. However, in software terms, music is nothing but a sequence of numbers…

June 2, 2012 · 59 min

Video

Interactive Exploration of Sound Corpora for Music Performance and Composition - Diemo Schwarz

Abstract: The wealth of tools developed in music information retrieval (MIR) for the description, indexing, and retrieval of music and sound can easily be (ab)used for the creation of new musical material and for sound design. Based on automatic…

June 2, 2012 · 44 min

Video

MIR beyond retrieval: Music Performance, Multimodality and Education - Sergi Jorda

Abstract: Although MIR arguably did not start as a research discipline for promoting creativity and music performance, this trend has gained importance in recent years. The possibilities of MIR for supporting musical creation and music…

June 2, 2012 · 57 min

Video

