
Tuesday, November 23, 2010, 9:00pm to 11:00pm

Per Anders Nilsson is a Ph.D. student and senior lecturer at the Academy of Music and Drama at the University of Gothenburg, Sweden:

Over the years, I have developed a hyper instrument, that is, a collection of interconnected digital music instruments aimed at improvisation. The entire system consists of a number of modules, which are assembled and interconnected into a hyper instrument. The modules vary widely with respect to pitch behavior, degree of control, sound generation, etc. It is possible to classify the modules according to various criteria; however, I choose to categorize them with respect to function and performance practice. The categories of choice are main instruments, ancillary instruments, ancillary effects, and a miscellaneous fourth group, GRM Tools. In categorizing an instrument, the decisive factor is playing behavior and audible response with regard to bodily gestural input, and therefore I make a distinction between played instruments and controlled instruments.

Some examples of my instruments:

The Walking Machine. A virtual jazz rhythm section mainly developed here at CNMAT. Basically, it generates new events based on probability distributions of intervals and durations.
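A minimal sketch of the idea of generating events from probability distributions of intervals and durations; the specific interval tables and weights below are illustrative assumptions, not the instrument's actual data:

```python
import random

# Assumed example distributions; the real Walking Machine uses its own.
INTERVALS = [-2, -1, 1, 2, 3]          # semitone steps relative to last note
INTERVAL_WEIGHTS = [1, 3, 3, 2, 1]
DURATIONS = [0.25, 0.5, 1.0]           # note lengths in beats
DURATION_WEIGHTS = [2, 3, 1]

def walking_line(start_pitch=48, length=8, seed=None):
    """Generate (pitch, duration) events by drawing each new interval
    and duration from weighted probability distributions."""
    rng = random.Random(seed)
    pitch = start_pitch
    events = []
    for _ in range(length):
        pitch += rng.choices(INTERVALS, INTERVAL_WEIGHTS)[0]
        dur = rng.choices(DURATIONS, DURATION_WEIGHTS)[0]
        events.append((pitch, dur))
    return events
```

Each call produces a new bass-line-like sequence; reusing a seed reproduces the same line.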

The SyncLooper. A 4-track synchronous looper, which slices a sample into 8, 16, or 32 segments, with arbitrary control of order, pitch, and duration.
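The slicing and reordering step can be sketched as follows; the function names and the plain list-of-samples buffer are assumptions for illustration only:

```python
def slice_sample(buffer, n_segments):
    """Cut a sample buffer into 8, 16, or 32 equal slices."""
    if n_segments not in (8, 16, 32):
        raise ValueError("SyncLooper slices into 8, 16, or 32 segments")
    seg_len = len(buffer) // n_segments
    return [buffer[i * seg_len:(i + 1) * seg_len] for i in range(n_segments)]

def resequence(segments, order):
    """Replay segments in an arbitrary order, concatenated into one loop."""
    out = []
    for i in order:
        out.extend(segments[i])
    return out
```

Per-segment pitch and duration changes would then be applied to each slice before concatenation.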

The Expressure Pad. A 14-dimensional vector synthesis engine, controlled by a Trigger Finger. The ExpressurePad family is a series of instruments co-developed by Palle Dahlstedt and myself. To begin with, we asked ourselves the following question: How can we explore and control complex electronic sound spaces in improvisation, retaining the millisecond interaction that is taken for granted in acoustic improvisation but has somehow been lost in electronic music?
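One common reading of vector synthesis is weighted interpolation among preset parameter vectors, with the control vector coming from the pad pressures. This is a generic sketch of that idea, not the ExpressurePad's actual engine; all names and dimensions are assumptions:

```python
def vector_mix(presets, weights):
    """Blend preset parameter vectors by normalized control weights.

    presets: list of equal-length parameter vectors (lists of floats)
    weights: one non-negative control value per preset (e.g. pad pressure)
    """
    total = sum(weights)
    if total == 0:
        raise ValueError("at least one weight must be non-zero")
    norm = [w / total for w in weights]
    n = len(presets[0])
    return [sum(w * p[i] for w, p in zip(norm, presets)) for i in range(n)]
```

Continuously varying the weights moves the synthesis parameters smoothly through the space spanned by the presets, which is one way to keep fast gestural input audibly responsive.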

-per anders