Title: Combining Audio And Gestures For A Real-Time Improviser
Publication Type: Conference Paper
Year of Publication: 2005
Authors: Morales-Mazanares, R; Morales, EF; Wessel, D
Conference Name: International Computer Music Conference
Publisher: International Computer Music Association
Conference Location: Barcelona, Spain
Abstract

Skilled improvisers can shape a musical discourse in real time, continuously modulating pitch, rhythm, tempo, and loudness to communicate high-level information such as musical structure and emotion. Interaction between musicians reflects their cultural background, their subjective reaction to the generated material, and their ability to resolve, on their own terms, the aesthetics of the resulting pieces. In this paper we introduce GRI, an environment that incorporates music and movement gestures from an improviser to acquire precise data and react much as an improviser would. GRI takes music samples from a particular improviser and learns classifiers to identify different improvisation styles. For each style it then learns a probabilistic transition automaton that uses gestures to predict the musician's most probable next state. The current musical note, the predicted next state, and gesture information are combined to produce adequate responses in real time. The system is demonstrated with a flutist wearing accelerometers and gyroscopes to detect gestures, with very promising results.
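A minimal sketch of the style-conditioned transition step the abstract describes: a probabilistic automaton whose transitions are conditioned on the current state and a gesture label, predicting the most probable next state. The state and gesture names, and the count-based estimator, are illustrative assumptions; the paper's actual model and features are not specified here.

```python
from collections import defaultdict

class TransitionAutomaton:
    """Probabilistic transition automaton over improvisation states,
    conditioned on movement gestures (illustrative sketch, not GRI itself)."""

    def __init__(self):
        # counts[(state, gesture)][next_state] -> number of observed transitions
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, state, gesture, next_state):
        """Accumulate one transition observed in the training samples."""
        self.counts[(state, gesture)][next_state] += 1

    def predict(self, state, gesture):
        """Return (most probable next state, its probability), or None
        if this (state, gesture) pair was never observed."""
        successors = self.counts.get((state, gesture))
        if not successors:
            return None
        total = sum(successors.values())
        next_state, count = max(successors.items(), key=lambda kv: kv[1])
        return next_state, count / total

# Toy usage with hypothetical state/gesture labels:
automaton = TransitionAutomaton()
automaton.observe("calm", "slow_tilt", "calm")
automaton.observe("calm", "slow_tilt", "calm")
automaton.observe("calm", "fast_shake", "agitated")
print(automaton.predict("calm", "slow_tilt"))   # ("calm", 1.0)
print(automaton.predict("calm", "fast_shake"))  # ("agitated", 1.0)
```

In the full system, the predicted next state would be combined with the current note and live gesture data to select the response played back to the musician.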

URL: http://cnmat.berkeley.edu/publications/combining_audio_and_gestures_real_time_improviser