Building E14, 6th Floor Multipurpose Room
$5 MIT community, $10 general public

Experience a work-in-progress performance of new technology that probes the boundary where machine learning meets musical control and expression, created by Jordan Rudess and the MIT Media Lab Responsive Environments Group. Rudess performs live with a machine learning model trained on his playing style and technique, joined on select pieces by guest violinist-vocalist Camilla Bäckman. Sometimes leading, sometimes following, the model and humans together create new music that interacts in real time with a kinetic sculpture, which both responds to and influences the behavior of the model.


Keyboardist/Technologist: Jordan Rudess, CAST Visiting Artist

AI/Music System Designer: Lancelot Blanchard, Research Assistant, Responsive Environments Group, MIT Media Lab

Installation Artist/Designer: Perry Naseck, Research Assistant, Responsive Environments Group, MIT Media Lab

Faculty Advisor: Joe Paradiso, Alexander W. Dreyfoos (1954) Professor and Director of the Responsive Environments Group, MIT Media Lab


With special guest Camilla Bäckman, violin/vocals.


Funded by the MIT Center for Art, Science & Technology (CAST), with support from the MIT Media Lab.
