CoMo Vox is a web application created in partnership with Radio France, aimed at beginners who want to learn the basic elements of conducting gestures (no prior musical knowledge is required). With a phone in hand, the application lets users shape the dynamics and tempo of various musical pieces by performing a conductor's gestures. The application was produced with the support of the French Ministry of National Education.
Here we demonstrate newly composed music, a real-time controller, and data acquisition system for the project EAR Stretch. EAR Stretch aims to improve Contemporary Music reception in non-expert audiences by enhancing embodied temporal expec
25 March 2022 29 min
This presentation outlines a new project that offers a novel methodology for the development of new styles of acoustically optimised auditoria and room shapes based on biometric sensing and evolutionary computation. The work is based on e
25 March 2022 29 min
The composition is a commentary on Karim Haddad's String Trio "And I have tried to keep them from falling" (2001). "I would prefer a small mountain temple" was written between 2018 and 2021. Both titles refer to Ezra Pound's (1885-1972) poem Canto
25 March 2022 26 min
Corpos sonoros is a long-term research project proposed by Thembi Rosa and João Tragtenberg to work with sound-movement interactions using Giromin, an IMU-based wearable Digital Dance and Music Instrument. It is an open space for the develop
25 March 2022 30 min
Since the re-launch of the IRCAM Forum platform in 2019, many improvements have been made to ensure that users can manage their own projects, content, and discussions. The current features will be presented as well as the roadmap foc
25 March 2022 27 min
My research focuses on listening to electromagnetic energy in everyday urban environments through sound walks, whilst capturing these experiences through the use of multichannel sensing and recording devices or ‘assemblages’ and 360 v
25 March 2022 24 min
During the past year, our industry has witnessed an accelerated democratization of several technologies that enable the creation and distribution of audio and music content in new immersive audio formats - now supported in popular streaming
25 March 2022 34 min
Asterisms is a form of participatory concert, without a traditional frontal stage, making it possible to develop the active listening of the public by placing them at the center of the experience. The project is based on a distributed sound
25 March 2022 32 min
The presentation will discuss ongoing work implementing various AI-based techniques in om# and OpenMusic, for use in musical composition workflows. Human creativity is ill-defined by nature. Achieving "those right kinds of errors" may p
25 March 2022 34 min
Sasha Wilde, Nicole Bettencourt Coelho and Yuki Nakayama are a trio of sound nerds who met last year and began bringing their various skills and practices together to form an experimental audio band. They employ wearable gestural technologi
25 March 2022 28 min
A System for the Synchronous Emergence of Music Derived from Movement is an immersive audio and visual work whose purpose is to define and explore a relationship between the movement of an artist’s hand (brush or pen, etc.) and a generative
25 March 2022 26 min
I propose an AI system based on Neural Ordinary Differential Equations (NODEs) for sound synthesis. My method provides a simple and intuitive way to construct new sound objects and new types of sound synthesis by manipulating matrices of maps. Dif
25 March 2022 18 min
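The core idea in the entry above, synthesizing audio by integrating a parameterized differential equation, can be sketched in a few lines. This is an illustrative toy, not the presenter's system: the weight matrix `W` and bias `b` stand in for learned parameters (here they are random and untrained), and a plain explicit Euler step replaces a proper ODE solver.

```python
import numpy as np

def neural_ode_tone(n_samples=4800, sr=48000, dim=3, seed=0):
    """Generate a waveform by integrating z'(t) = tanh(W z + b).

    W and b are hypothetical stand-ins for learned NODE parameters;
    with random values the output is only illustrative.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.5, size=(dim, dim))   # stand-in for learned weights
    b = rng.normal(scale=0.1, size=dim)
    z = np.array([1.0] + [0.0] * (dim - 1))      # initial state
    dt = 1.0 / sr                                # one Euler step per sample
    out = np.empty(n_samples)
    for i in range(n_samples):
        z = z + dt * np.tanh(W @ z + b)          # explicit Euler step
        out[i] = z[0]                            # read one state dim as audio
    return out

signal = neural_ode_tone()
```

In a trained system the solver would typically be adaptive and differentiable so that `W` and `b` can be fitted to target sounds; editing those matrices then reshapes the resulting sound object.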
The Autocoder package is a tool based around a variational autoencoder, a neural network capable of learning a spectral representation of a soundfile and synthesizing a novel output based on the trained model. A spectral representation i
25 March 2022 27 min
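The encode/decode idea behind an autoencoder over spectral frames can be sketched schematically. This is a rough, hypothetical illustration with plain linear maps and random weights, not the Autocoder package's actual model or API: a magnitude spectrum is compressed to a small latent vector, and decoding (or perturbing) that latent yields a new spectral frame.

```python
import numpy as np

# Hypothetical dimensions: a 513-bin magnitude spectrum, an 8-dim latent space.
N_BINS, N_LATENT = 513, 8

rng = np.random.default_rng(1)
W_enc = rng.normal(scale=0.05, size=(N_LATENT, N_BINS))  # stand-in encoder weights
W_dec = rng.normal(scale=0.05, size=(N_BINS, N_LATENT))  # stand-in decoder weights

def encode(spectrum):
    """Compress a magnitude-spectrum frame to a low-dimensional latent vector."""
    return np.tanh(W_enc @ spectrum)

def decode(latent):
    """Expand a latent vector back to a non-negative magnitude spectrum."""
    return np.abs(W_dec @ latent)

# Toy input: one spectral frame with a single peak at bin 40.
frame = np.zeros(N_BINS)
frame[40] = 1.0

z = encode(frame)        # compress
recon = decode(z)        # resynthesize a spectral frame
novel = decode(z + 0.1)  # perturb the latent to obtain a novel output
```

A variational autoencoder additionally models the latent as a distribution and is trained so that decoded frames resemble the training sound, which is what lets sampling or interpolating latents produce coherent novel spectra.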
How can rich spatial data from acoustic instruments be applied to situate synthesized sound spatially with dynamic three-dimensional forms? How can machine learning be used to re-embody the spatial presence of live instruments and performer
25 March 2022 29 min
Institut de Recherche et de Coordination Acoustique/Musique
Copyright © 2022 Ircam. All rights reserved.