Regenerative Music, developed by James Fung at the University of Toronto, explores new physiological interfaces for musical instruments. Instead of taking only active cues from the musician, the computer reads physiological signals (heartbeat, respiration, brain waves, etc.) from the musician.
These signals are used to alter the behaviour of the instrument; for instance, filter settings can be applied to the sound, to which the musician responds by changing the way s/he plays. The music in turn generates an emotional response in the musician/performer, which the computer detects and uses to modify the behaviour of the instrument further.
The musician and the instrument can thus each be regarded as an “instrument” playing off the other.
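The feedback loop described above can be illustrated with a minimal sketch. The function and parameter names here (`map_signal_to_cutoff`, the heart-rate range, the smoothing factor) are illustrative assumptions, not part of any published Regenerative Music implementation; the sketch only shows the general shape of a physiological signal steering a filter parameter over time.

```python
def map_signal_to_cutoff(heart_rate_bpm, lo=200.0, hi=2000.0):
    """Map a heart rate (assumed 40-180 bpm range) to a filter cutoff in Hz.

    In a real system the cutoff would be applied to the instrument's sound;
    here it is simply returned as a number.
    """
    t = (heart_rate_bpm - 40.0) / (180.0 - 40.0)
    t = max(0.0, min(1.0, t))  # clamp to [0, 1]
    return lo + t * (hi - lo)


def feedback_loop(signal_stream, smoothing=0.3):
    """Run the loop over a stream of readings.

    Each physiological reading nudges the filter cutoff toward a new target
    (exponential smoothing), modelling the gradual mutual adaptation of
    musician and instrument.
    """
    cutoff = map_signal_to_cutoff(signal_stream[0])
    history = [cutoff]
    for bpm in signal_stream[1:]:
        target = map_signal_to_cutoff(bpm)
        cutoff += smoothing * (target - cutoff)
        history.append(cutoff)
    return history


# Simulated heart-rate readings rising as the music intensifies
trace = feedback_loop([60, 65, 72, 80, 90])
```

The smoothing step is one simple choice for keeping the sound from jumping abruptly with every reading; an actual instrument could use any mapping from physiology to sound parameters.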
The concept was extended during DECONcert 1, where the electroencephalogram (EEG) signals of 48 people were connected so as to collectively affect the audio environment. The EEG sensors detected the electrical activity produced in the participants' brains, and these signals were used to alter a computationally controlled soundscape.
The resulting soundscape triggered a response from the participants; the collective response of the group was sensed by the computer, which then altered the music based on this response, closing the feedback loop.
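The collective step can be sketched in the same spirit. How DECONcert actually combined 48 EEG streams is not stated here, so this is a hypothetical sketch assuming the simplest aggregation: averaging a normalized per-participant measure (e.g. alpha-band power) into one control value for the soundscape.

```python
def collective_signal(participant_powers):
    """Average a per-participant EEG measure (assumed normalized to [0, 1])
    into a single collective control value for the soundscape."""
    return sum(participant_powers) / len(participant_powers)


def update_soundscape(control, quiet=0.1, intense=0.9):
    """Illustrative mapping: label the soundscape state from the collective
    value. A real system would drive synthesis parameters instead."""
    if control < quiet:
        return "ambient"
    if control > intense:
        return "intense"
    return "evolving"


# Simulated normalized alpha power for a small group of participants
group = [0.2, 0.4, 0.6]
state = update_soundscape(collective_signal(group))
```

Averaging is only one possible aggregation; a real installation might weight participants, track variance across the group, or respond to synchrony between brains rather than to the mean.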