Media related to this event

Gestural-Based Sound Spatialization & Synthesis Strategies in 3D Virtual Environment in Interactive Audiovisual Composition

31 March 2023 28 min

Media Specific Performance: the screen mediated production during the Pandemic

31 March 2023 34 min

AI, networked performance and aesthetic judgment

31 March 2023 27 min

The World of Freedom

31 March 2023 22 min

ISMM team (IRCAM) - Presentation of the latest projects for Max: Mubu, CataRT, SkataRT, Gesture&Sound Toolkit.

31 March 2023 30 min

Family Life - recomposed

31 March 2023 31 min

Conclusions

31 March 2023 24 min

Shallow Steps - Spatial Cognitive Sonification of Generative Visuals

31 March 2023 27 min

Max/MSP Spat library, sensors, and Unreal Engine: a workflow for a real-time generative VR project

31 March 2023 32 min

Update on MacIntel and the Forum software - Carlos Amado Agon, Riccardo Borghesi, Karim Haddad, Nicholas Ellis

29 November 2006 20 min

New features in AudioSculpt 2.7 and SuperVP 2.91 - Xavier Rodet, Alain Lithaud, Niels Bogaards, Axel Roebel

29 November 2006 1 h 07 min

New features in OpenMusic - Gérard Assayag, Jean Bresson, Carlos Amado Agon, Karim Haddad

29 November 2006 59 min

Update on the Spatialisateur - Olivier Warusfel, Rémy Muller, Terence Caulkins

29 November 2006 12 min

New features in Modalys - Joël Bensoam, Nicholas Ellis, Jean Lochard

29 November 2006 50 min

Mlys: a control interface for Modalys in Max/MSP - Manuel Poletti

29 November 2006 47 min

Welcome - Andrew Gerzso

29 November 2006 18 min

Recent developments from the Real-Time Applications team - Diemo Schwarz, Riccardo Borghesi, Norbert Schnell

29 November 2006 51 min

Ada's Song: Making machine-learning processes visible and tangible

Ada’s Song is a ca. 10-minute work for mezzo-soprano, ensemble and an interactive Piano Machine system, commissioned in 2019 as an homage to Ada Lovelace. It was created using AI-assisted composition processes and employs real-time machine learning in the performance of the Piano Machine. Designed at Goldsmiths College in collaboration with Konstantin Leonenko in 2017, the Piano Machine plays the strings of the piano directly through mechanical, sustained vibration created by a set of motors and finger-like appendages controlled by microprocessors, allowing dynamic control of notes over time, driven by wireless OSC messaging. The material performed by the Piano Machine was generated by a concatenation of recordings of a work by Henry Purcell, Hosanna to the Highest, such that the repetitive ground bass of the original creates a foundation for the expressive intervention of real-time machine-learning processes.
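The wireless OSC messaging that pilots the Piano Machine can be sketched with nothing but the standard library, since the OSC 1.0 wire format is a simple binary layout. The address pattern `/piano/note`, its arguments, and the target IP/port below are hypothetical illustrations, not details of the actual system; a real project would more likely use a library such as python-osc:

```python
import socket
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC 1.0 requires."""
    data += b"\x00"
    return data + b"\x00" * ((4 - len(data) % 4) % 4)

def osc_message(address: str, *args) -> bytes:
    """Pack one OSC message: padded address, padded type-tag string, big-endian args."""
    typetags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return osc_pad(address.encode()) + osc_pad(typetags.encode()) + payload

# Hypothetical command: drive the appendage on MIDI note 60 at 0.8 amplitude.
msg = osc_message("/piano/note", 60, 0.8)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("192.168.1.50", 9000))  # placeholder address for a microcontroller
```

UDP is the usual transport for OSC in performance contexts: fire-and-forget delivery keeps latency low, which matters more here than guaranteed arrival.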
In an attempt to render the Piano Machine more expressive and responsive to the ‘human’ musicians’ performance, the repeating harmonic patterns performed by the Piano Machine are shaped by machine-learning processes that ‘listen’ to the instrumentalists during the rehearsals and performance. These processes filter the reservoir of notes and amplitudes produced from the concatenated recordings, not only in relation to the notes that have been played, but also to how they have been performed. This is achieved by building up training sets of timbral data over the course of rehearsals. Thus, the Piano Machine inscribes itself into the expressive sonic world of the ensemble.
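The filtering step described above can be sketched as a nearest-neighbour test against the rehearsal training set: keep a reservoir event only if its timbral features lie close to something the machine has already ‘heard’ the ensemble play. Everything here — the event fields, the 2-D feature vectors, and the distance threshold — is a made-up illustration of the general idea, not the actual descriptors or model used in the piece:

```python
import math

def nearest_distance(vec, training_set):
    """Euclidean distance from one timbral feature vector to its closest training example."""
    return min(math.dist(vec, t) for t in training_set)

def filter_reservoir(reservoir, training_set, threshold=1.0):
    """Keep only reservoir events whose timbral features fall within
    `threshold` of some vector accumulated during rehearsals."""
    return [event for event in reservoir
            if nearest_distance(event["features"], training_set) <= threshold]

# Toy data: each event carries a (note, amplitude) payload plus a hypothetical
# 2-D timbral descriptor (say, spectral centroid and brightness, normalized).
reservoir = [
    {"note": 60, "amp": 0.8, "features": (0.2, 0.4)},
    {"note": 64, "amp": 0.5, "features": (3.0, 3.5)},  # far from anything rehearsed
]
training_set = [(0.1, 0.5), (0.3, 0.3)]  # vectors gathered over rehearsals
playable = filter_reservoir(reservoir, training_set, threshold=0.5)  # keeps only note 60
```

Because the training set grows across rehearsals, the same threshold admits progressively more of the reservoir as the system accumulates examples of how the ensemble actually plays.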


information

Type
Conference series, symposium, congress
Performance venue
Ircam, Salle Igor-Stravinsky (Paris)
Duration
24 min
Date
31 March 2023
