During the past decade, new object-based immersive audio content formats and creation tools have been developed for cinematic and musical production. These technologies free the music creator from the constraints of standardized loudspeaker configurations. They also support head rotations along three degrees of freedom (3-DoF), unlocking a natural immersive listening experience through headphones or wearable audio devices. Meanwhile, interactive audio experiences in video games and virtual or augmented reality require a scene representation supporting 6-DoF listener navigation, where an audio object models a natural sound source with controllable distance, orientation, and directivity properties. Additionally, acoustic environment properties must be explicitly included and decoupled from the sound source description. We examine and compare these two conceptions of object-based spatial audio and seek to unify them, with a view to connecting previously disparate digital media applications and industries.
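To make the 6-DoF scene representation described above concrete, the following sketch models an audio object with the source properties named in the text (position, orientation, directivity) and keeps the acoustic environment as a separate description. All type names, field names, and the distance helper are illustrative assumptions, not drawn from any particular format or standard.

```python
import math
from dataclasses import dataclass

@dataclass
class AudioObject:
    """A sound source in a 6-DoF scene.

    Field names are hypothetical; the point is that distance,
    orientation, and directivity are per-source properties,
    independent of any loudspeaker layout.
    """
    position: tuple        # (x, y, z) in metres, scene coordinates
    orientation: tuple     # (yaw, pitch, roll) in radians
    directivity: str       # radiation pattern label, e.g. "omni", "cardioid"

@dataclass
class AcousticEnvironment:
    """Room properties, decoupled from the source description."""
    reverb_time_s: float   # decay time (RT60-style)
    room_size_m: tuple     # (width, depth, height) in metres

def source_distance(obj: AudioObject, listener_pos: tuple) -> float:
    """Euclidean distance from a source to a freely navigating listener."""
    return math.dist(obj.position, listener_pos)

# A listener at the origin, a source 3 m to the side and 4 m ahead:
src = AudioObject(position=(3.0, 0.0, 4.0),
                  orientation=(0.0, 0.0, 0.0),
                  directivity="omni")
room = AcousticEnvironment(reverb_time_s=0.6, room_size_m=(8.0, 10.0, 3.5))
print(source_distance(src, (0.0, 0.0, 0.0)))  # → 5.0
```

Because the listener position is an argument rather than baked into the object, the same scene description serves both a fixed 3-DoF listening point and free 6-DoF navigation.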
Institut de Recherche et de Coordination Acoustique/Musique
Copyright © 2022 Ircam. All rights reserved.