Over the past decade, new object-based immersive audio content formats and creation tools have been developed for cinematic and musical production. These technologies free the music creator from the constraints of standardized loudspeaker configurations. They also support head rotation along three degrees of freedom (3-DoF), unlocking a natural immersive listening experience through headphones or wearable audio devices. Meanwhile, interactive audio experiences in video games and virtual or augmented reality require a scene representation supporting 6-DoF listener navigation, where an audio object models a natural sound source with controllable distance, orientation, and directivity properties. Additionally, acoustic environment properties must be explicitly included and decoupled from the sound source description. We examine and compare these two conceptions of object-based spatial audio and seek to unify them, with a view to connecting previously disparate digital media applications and industries.
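As a rough illustration of the contrast drawn above, the sketch below models the two scene representations as data structures: a 3-DoF object panned around a fixed listening position, versus a 6-DoF source with explicit position, orientation, and directivity, alongside an acoustic environment description kept separate from the sources. All class and field names are hypothetical and do not correspond to any existing format (ADM, MPEG-H, or a particular game audio engine).

```python
from dataclasses import dataclass, field

# Hypothetical illustration only: names and fields are not drawn from any standard.

@dataclass
class Vec3:
    x: float
    y: float
    z: float

@dataclass
class AudioObject3DoF:
    """Object as used in cinematic/music production: a mono stem
    panned to a direction around a fixed listening position."""
    stem: str                 # reference to the audio essence
    azimuth_deg: float        # direction relative to the listener
    elevation_deg: float
    gain_db: float = 0.0

@dataclass
class Directivity:
    """Simplified, frequency-independent radiation pattern of a source."""
    pattern: str = "cardioid"        # e.g. "omni", "cardioid", "measured"
    front_to_back_db: float = 10.0

@dataclass
class AudioObject6DoF:
    """Source description for navigable (game/VR/AR) scenes: position,
    orientation, and directivity are explicit, so distance and angle to a
    moving listener can be computed at render time."""
    stem: str
    position: Vec3
    orientation: Vec3                # e.g. Euler angles in degrees
    directivity: Directivity = field(default_factory=Directivity)

@dataclass
class AcousticEnvironment:
    """Environment properties held separately from the sources,
    as argued in the abstract."""
    reverb_time_s: float
    room_dimensions_m: Vec3
```

In this sketch, unifying the two conceptions amounts to treating the 3-DoF object as a special case of the 6-DoF one (fixed listener, direction-only placement, environment baked into the mix), which is one way to read the convergence the abstract proposes.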