From the same archive

Setting a Musical Structure in Time: The Activity of Composing Voi(rex) by Philippe Leroux - Nicolas Donin, Jacques Theureau

April 14, 2005 01 h 01 min

Setting a Musical Structure in Time: The Activity of Composing Voi(rex) by Philippe Leroux - Nicolas Donin, Jacques Theureau

April 14, 2005 24 min

Estimation of Multiple Fundamental Frequencies

May 12, 2005 52 min

The Electroacoustic Harp

February 4, 2005 01 h 18 min

Using Modalys for the VoxStruments Project: Intuitive and Expressive Digital Lutherie - Nicholas Ellis, Joël Bensoam

October 17, 2007 49 min

Presentation of the PDS Team's Work within the European Project CLOSED: "Closing the Loop of Sound Evaluation and Design" - Olivier Houix

June 27, 2007 01 h 12 min

Sparse overcomplete methods, matching pursuit and basis pursuit - Bob L. Sturm

July 11, 2007 48 min

Transformations of Voice Type and Nature - Snorre Farner, Axel Roebel, Xavier Rodet

September 12, 2007 01 h 07 min

Automatic Segmentation and Recognition of Voice Phonemes, Offline and in Real Time - Pierre Lanchantin, Julien Bloit, Xavier Rodet

September 19, 2007 01 h 13 min

Text-to-Speech Synthesis and Construction of a Database of Voice Units - Christophe Veaux, Grégory Beller, Xavier Rodet

September 26, 2007 01 h 00 min

ECOUTE Project - Jerome Barthelemy, Nicolas Donin, Geoffroy Peeters, Samuel Goldszmidt

October 3, 2007 01 h 12 min

MusicDiscover Project - David Fenech Saint Genieys

October 10, 2007 01 h 10 min

CASPAR Project - Jerome Barthelemy, Alain Bonardi

October 24, 2007 50 min

CONSONNES Project, Part 1 - René Caussé, Vincent Freour, David Roze

November 21, 2007 57 min

Interactive Machine Learning as Musical Design Tool


Supervised learning algorithms can be understood not only as a set of techniques for building accurate models of data, but also as design tools that can enable rapid prototyping, iterative refinement, and embodied engagement, all activities that are crucial in the design of new musical instruments and other embodied interactions. Realising the creative potential of these algorithms requires a rethinking of the interfaces through which people provide data and build models, providing for tight interaction-feedback loops and efficient mechanisms for people to steer and explore algorithm behaviours.
I created the Wekinator software in 2009 to enable composers, game designers, and other creative practitioners to apply such an interactive approach to machine learning to their work. In this talk, I’ll discuss some of my findings from 6 years of observing this software in use in creative contexts, and my thoughts on the future of data and machine learning as design tools. I’ll also give a live demo of a new version of the software that will be released this summer.
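The workflow the abstract describes — demonstrate a few input/output examples, retrain immediately, try the model, then refine it with more examples — can be sketched in a few lines. The following is an illustrative toy (a pure-Python nearest-neighbour mapper with invented names like `InteractiveModel`), not Wekinator's actual implementation, which builds on far richer learning algorithms:

```python
import math

class InteractiveModel:
    """Toy interactive-ML model: retrains instantly on each new example."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, label) demonstrations

    def add_example(self, features, label):
        # Each demonstration is usable immediately -- there is no separate
        # offline training phase, which keeps the feedback loop tight.
        self.examples.append((list(features), label))

    def predict(self, features):
        # Map a new input (e.g. sensor readings from a gesture) to the
        # label of the closest demonstration seen so far.
        if not self.examples:
            raise ValueError("no training examples yet")
        _, label = min(self.examples,
                       key=lambda ex: math.dist(ex[0], features))
        return label

# The interactive loop: demonstrate, try, refine.
model = InteractiveModel()
model.add_example([0.1, 0.2], "soft")   # a gentle gesture
model.add_example([0.9, 0.8], "loud")   # an energetic gesture
print(model.predict([0.15, 0.25]))      # close to the "soft" demonstration
```

The point of the sketch is the shape of the loop, not the algorithm: because every call to `add_example` changes the model at once, a performer can steer its behaviour by demonstration rather than by programming.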

Rebecca Fiebrink is a Lecturer at Goldsmiths, University of London. Her research lies at the intersection of computer science, human-computer interaction, and the digital arts. Fiebrink is the developer of the Wekinator system for real-time interactive machine learning (with a new version out this summer!), a co-creator of the Digital Fauvel platform for interactive musicology, and a Co-I on the Horizon 2020-funded RAPID-MIX project on real-time adaptive prototyping for industrial design of multimodal expressive technology. She was previously an Assistant Professor at Princeton University, where she co-directed the Princeton Laptop Orchestra. She has worked with companies including Microsoft Research, Sun Microsystems Research Labs, Imagine Research, and Smule, where she helped to build the #1 iTunes app "I am T-Pain." She holds a PhD in Computer Science from Princeton University.

speakers

information

Type
Seminar / Conference
performance location
Ircam, Salle Igor-Stravinsky (Paris)
duration
01 h 00 min
date
May 20, 2015

IRCAM

1, place Igor-Stravinsky
75004 Paris
+33 1 44 78 48 43

opening times

Monday through Friday 9:30am-7pm
Closed Saturday and Sunday

subway access

Hôtel de Ville, Rambuteau, Châtelet, Les Halles

Institut de Recherche et de Coordination Acoustique/Musique

Copyright © 2022 Ircam. All rights reserved.