Starfish is a network of delay lines that tend to expand when sound is present in them. The performer inputs sounds using a microphone, and has some limited global controls over the network. The result is an audio-visual performance.
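As a rough illustration of that behaviour, the following minimal Python sketch models a single feedback delay line whose length grows whenever the incoming sound carries enough energy; the class name, thresholds and growth rate are assumptions made for the example, not part of the actual Starfish implementation.

    import numpy as np

    class ExpandingDelayLine:
        """Feedback delay line whose length grows while input energy is high.
        Hypothetical illustration of the 'expand when sound is present' idea;
        thresholds and growth rates are invented for the example."""
        def __init__(self, length=2205, max_length=44100, feedback=0.6,
                     energy_threshold=1e-3, growth=8):
            self.buffer = np.zeros(max_length)
            self.length = length
            self.max_length = max_length
            self.feedback = feedback
            self.energy_threshold = energy_threshold
            self.growth = growth          # samples of expansion per processed block
            self.write_pos = 0

        def process(self, block):
            out = np.empty_like(block)
            for i, x in enumerate(block):
                read_pos = (self.write_pos - self.length) % self.max_length
                delayed = self.buffer[read_pos]
                self.buffer[self.write_pos] = x + self.feedback * delayed
                out[i] = delayed
                self.write_pos = (self.write_pos + 1) % self.max_length
            # Expand the line a little whenever sound is present in it.
            if np.mean(block ** 2) > self.energy_threshold:
                self.length = min(self.length + self.growth, self.max_length - 1)
            return out

    line = ExpandingDelayLine()
    line.process(np.zeros(512))               # silence: length stays put
    line.process(0.1 * np.random.randn(512))  # sound: length grows
    print(line.length)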
m@ze°2 is a computer-based electronic instrument that also serves as a realtime environment for composition and improvisation.
Essl writes —
“The software is written entirely in Max/MSP and takes advantage of my own “Real Time Composition Library” (RTC-lib). It is based on an ever-growing database of samples which are processed by so-called “structure generators” which can be triggered and controlled by the computer keyboard, the mouse and several MIDI controllers. An elaborated section of DSP routines (including reverb, filters, transposers, pitch shifter, equalizer, delay lines, LFO, ring modulator, panning) enables the player to process the generated sound structures in the end.”
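The real system is a Max/MSP patch built on the RTC-lib, but the division of labour it describes (a sample database, structure generators fired by keystrokes or MIDI, a downstream DSP chain) can be illustrated with a small hypothetical Python sketch; every name and parameter below is invented for the illustration.

    import random

    class StructureGenerator:
        """Toy analogue of a 'structure generator': picks samples from a pool
        and schedules them as a short gesture. Purely illustrative; the real
        m@ze°2 generators are Max/MSP patches built on the RTC-lib."""
        def __init__(self, sample_pool, density=4, spread=0.5):
            self.sample_pool = sample_pool   # e.g. list of sample file names
            self.density = density           # events per trigger
            self.spread = spread             # time window of the gesture, in seconds

        def trigger(self):
            """Return (onset_time, sample) pairs for one keystroke/MIDI trigger."""
            events = [(random.uniform(0.0, self.spread),
                       random.choice(self.sample_pool))
                      for _ in range(self.density)]
            return sorted(events)

    gen = StructureGenerator(["bell.wav", "voice.wav", "noise.wav"])
    print(gen.trigger())   # a small cloud of sample events, ready for the DSP chain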
“Many Hands” is a series of software interfaces that allow a pianist to shape the unfolding of a range of electronic music resources (sounds, processes, events, and electronic instruments and devices) in response to that pianist’s playing. The software interprets data about the real-time performance: how fast the playing is, how dense the chords or clusters are, how long the melodic phrases are, relative rhythmic steadiness, and basic harmonic analysis. The basic data from which this more complex information is calculated (which note is being played, at what volume level) is generated by a piano data-tracking device (a Yamaha Disklavier, or a conventional acoustic piano fitted with a Moog PianoBar). It is this data that the “Many Hands” software interface uses to calculate the more complex information that can expand the nature of piano performance.
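A minimal sketch of the kind of analysis described, assuming the raw input is a stream of (time, pitch, velocity) note events; the specific feature definitions (notes per second, mean chord size) and window lengths are illustrative choices, not the actual “Many Hands” algorithms.

    from collections import deque

    class PerformanceAnalyzer:
        """Derives playing speed and chord density from raw (time, pitch,
        velocity) note events. The feature definitions are assumptions
        made for illustration."""
        def __init__(self, window_seconds=2.0, chord_window=0.05):
            self.window_seconds = window_seconds
            self.chord_window = chord_window   # notes this close together count as one chord
            self.events = deque()              # (time, pitch, velocity)

        def note_on(self, time, pitch, velocity):
            self.events.append((time, pitch, velocity))
            while self.events and time - self.events[0][0] > self.window_seconds:
                self.events.popleft()

        def notes_per_second(self):
            return len(self.events) / self.window_seconds

        def mean_chord_size(self):
            """Group near-simultaneous notes and average the group sizes."""
            sizes, current, last_t = [], 0, None
            for t, _, _ in self.events:
                if last_t is not None and t - last_t > self.chord_window:
                    sizes.append(current)
                    current = 0
                current += 1
                last_t = t
            if current:
                sizes.append(current)
            return sum(sizes) / len(sizes) if sizes else 0.0

    a = PerformanceAnalyzer()
    for i, t in enumerate([0.0, 0.01, 0.02, 0.5, 1.0, 1.01]):
        a.note_on(t, 60 + i, 80)
    print(a.notes_per_second(), a.mean_chord_size())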
Bruce Gremo: concept, mouthpiece and ergonomic design, MSP programming, performer
Jeff Feddersen: body and circuit design, name, Basic programming
Eric Singer: C++ programming, ongoing tech support
This instrument has two parts: a physical ‘flute controller’ and a software application (Max/MSP). Only its performance gestures are modeled after the Japanese Shakuhachi; its sound world employs complex synthesis, sample manipulation and re-synthesis techniques. It aspires to be both a nuanced ‘event’ controller (like an acoustic instrument) and a ‘process’ controller. Its mouthpiece splits an air column produced with an open embouchure, and control data is derived from an analysis of the split air column’s dynamics. Rather than finger holes, five high-resolution three-dimensional track pads enable the numerous finger techniques that can be used on non-keyed flutes.
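As a hypothetical illustration of how such control data might be derived, the sketch below runs a simple envelope follower on the air-column signal and maps a track pad position to pitch; both mappings are invented for the example and do not describe the instrument’s actual analysis.

    def breath_envelope(samples, prev=0.0, attack=0.2, release=0.02):
        """Very simple envelope follower on the air-column signal (an assumed
        stand-in for the dynamics analysis mentioned above)."""
        env = prev
        for x in samples:
            level = abs(x)
            coeff = attack if level > env else release
            env += coeff * (level - env)
        return env

    def pad_to_pitch(pad_index, x, base_midi=60):
        """Map one of the five track pads plus its x position (0..1) to a pitch,
        loosely imitating sliding finger techniques on a non-keyed flute.
        The mapping itself is invented for illustration."""
        return base_midi + 2 * pad_index + 2 * x   # x bends up to a whole tone

    print(breath_envelope([0.0, 0.2, -0.3, 0.25]))
    print(pad_to_pitch(2, 0.25))   # pad 3, a quarter of the way along -> 64.5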
Per Anders Nilsson
Magnus Eldénius: advisor
Palle Dahlstedt: advisor
The Walking Machine is a realtime-controlled generative rhythm section developed for live performances in a free jazz context. The instrument provides a stable rhythmic platform while being dynamic enough to allow its player to interact with other musicians in a direct musical way. A gamepad is used as the controller; with realtime access to high-level parameters that control the probability distributions of intervals and durations, the performer controls the overall behaviour of the instrument. The instrument is allowed to sound and act “artificial” while at the same time retaining the traditional role of a rhythm section.
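The core mechanism, choosing intervals and durations from probability distributions that a gamepad axis can reweight in real time, can be sketched as follows; the interval set, duration set and reweighting function are assumptions made for the illustration.

    import random

    INTERVALS = [-5, -2, 0, 2, 3, 5, 7]   # semitone steps for a walking line
    DURATIONS = [0.25, 0.5, 1.0]          # note lengths in beats

    def reweight(weights, control):
        """Skew a weight list towards its upper end as 'control' (0..1, e.g. a
        gamepad axis) increases. The exact mapping is an assumption."""
        return [w * (1.0 + control * i) for i, w in enumerate(weights)]

    def next_note(prev_pitch, interval_control, duration_control):
        iv = random.choices(INTERVALS,
                            weights=reweight([1] * len(INTERVALS), interval_control))[0]
        dur = random.choices(DURATIONS,
                             weights=reweight([1] * len(DURATIONS), duration_control))[0]
        return prev_pitch + iv, dur

    pitch = 40                            # start on a low E
    for _ in range(8):
        pitch, dur = next_note(pitch, 0.3, 0.7)
        print(pitch, dur)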
The Voicer is a singing voice synthesizer instrument, controlled in real time with a digitizing graphic tablet (the kind of tablet used for handwriting recognition by computers) and a joystick. Pitch and vowel articulation are driven in a very expressive way, giving musical possibilities never reached before with a voice synthesizer.
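One plausible way such a mapping could look, purely as an illustration, is to let the tablet’s x position drive pitch and its y position interpolate between vowel formant sets; the formant values below are textbook approximations and the mapping itself is assumed, not the Voicer’s documented behaviour.

    # Rough formant frequencies (Hz) for two vowels; values are approximate.
    VOWEL_A = (800, 1150, 2900)
    VOWEL_I = (270, 2140, 2950)

    def tablet_to_voice(x, y, pen_pressure):
        """Map tablet position to pitch and vowel, as an illustration of the
        kind of control a tablet could provide (mapping is assumed).
        x, y and pen_pressure are normalised to 0..1."""
        midi_pitch = 48 + 24 * x            # two octaves across the tablet
        formants = tuple(fa + y * (fi - fa) for fa, fi in zip(VOWEL_A, VOWEL_I))
        amplitude = pen_pressure
        return midi_pitch, formants, amplitude

    print(tablet_to_voice(0.5, 0.25, 0.8))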
Ben Neill: designer
Terry Pierce: brass fabricator
James Lo: electronic fabrication
Frank Balde/Jorgen Brinkman: Junxion design/construction
The mutantrumpet is a hybrid electro-acoustic instrument with three bells, two sets of valves, and a trombone slide. The sound is captured by a pickup in the mouthpiece and converted to MIDI information by a pitch-to-MIDI converter. The instrument has eight momentary MIDI controllers in the form of switches and eight continuous controllers in the form of potentiometers, joysticks and a fader. These controllers are powered by the Steim Junxion board mounted on the body of the mutantrumpet. The dynamics of the acoustically played instrument are also used for MIDI control.
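As an illustration of using acoustic dynamics for MIDI control, the sketch below converts the RMS level of a pickup signal into a 0-127 continuous controller value; the dB range and scaling are assumptions, not the mutantrumpet’s actual mapping.

    import math

    def dynamics_to_cc(samples, floor_db=-60.0, ceil_db=-6.0):
        """Convert the RMS level of the pickup signal into a 0-127 MIDI CC value.
        Illustrative only; the range and curve are assumptions."""
        rms = math.sqrt(sum(x * x for x in samples) / len(samples)) or 1e-9
        db = 20.0 * math.log10(rms)
        norm = (db - floor_db) / (ceil_db - floor_db)
        return max(0, min(127, round(127 * norm)))

    print(dynamics_to_cc([0.05, -0.04, 0.06, -0.05]))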
Blue Air is an infrared MIDI controller that measures distance. Continuous controller information is produced by moving an object or a performer’s hand vertically above the infrared eye located on the top panel of Blue Air. The MIDI data produced by Blue Air are fast and tightly packed together: the typical time between blocks of MIDI data is 5 milliseconds. This volume of data is often best managed by MIDI manipulation software such as MAX by Cycling ’74. Blue Air is fast, accurate, stable, rugged, and simple to use.
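Handling a controller stream that dense typically means thinning and smoothing it; the sketch below shows one way this could be done, as a Python stand-in for what would normally be patched in MAX, with the rate limit and smoothing factor chosen arbitrarily for the example.

    class ControllerSmoother:
        """Thin and smooth a dense controller stream such as Blue Air's
        (roughly one value every 5 ms). A toy stand-in for a MAX patch;
        parameters are assumptions."""
        def __init__(self, min_interval_ms=20, smoothing=0.3):
            self.min_interval_ms = min_interval_ms
            self.smoothing = smoothing
            self.last_time = None
            self.value = None

        def feed(self, time_ms, cc_value):
            """Return a smoothed value at most every min_interval_ms, else None."""
            if self.value is None:
                self.value = cc_value
            else:
                self.value += self.smoothing * (cc_value - self.value)
            if self.last_time is None or time_ms - self.last_time >= self.min_interval_ms:
                self.last_time = time_ms
                return round(self.value)
            return None

    s = ControllerSmoother()
    for t, v in [(0, 10), (5, 40), (10, 42), (15, 45), (20, 90)]:
        print(t, s.feed(t, v))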
The Wireless MIDI Glove consists of flex sensors and switches communicating with MAX/MSP. It is based on the DIEM digi-dancer unit, customized with a set of switches to increase the number of parameters; the software interface was developed during artist residencies at RPI and Harvestworks.
The T-Stick was developed and built in the IDMIL by Joseph Malloch, in collaboration with composer D. Andrew Stewart and performers Fernando Rocha and Xenia Pestova. The physical input device can sense where and how much of its surface is touched by the performer, as well as tilting, shaking, squeezing and twisting gestures. Unlike most digital musical instruments, the T-Sticks exist as a family, with soprano, alto, tenor, and bass members.
The T-Stick is intended to be an “expert” musical interface: engaging to new users, allowing virtuosic playing, and “worth practicing” in that practice time results in increased skill. The T-Stick has been performed and demonstrated many times in Canada, Brazil, Italy, and the USA.
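As a hypothetical sketch of how “where and how much of its surface is touched” might be reduced to control data, the function below condenses a strip of touch-sensor readings into a coverage amount and a centroid position; this is an assumed reduction for illustration, not the T-Stick’s documented mapping.

    def touch_summary(sensor_values, threshold=0.1):
        """Summarise a strip of touch sensors into coverage (how much of the
        surface is touched) and a normalised centroid (where).
        sensor_values: list of per-pad readings normalised to 0..1."""
        touched = [v for v in sensor_values if v > threshold]
        coverage = len(touched) / len(sensor_values)
        total = sum(sensor_values)
        centroid = (sum(i * v for i, v in enumerate(sensor_values)) / total
                    if total else 0.0)
        return coverage, centroid / (len(sensor_values) - 1)

    print(touch_summary([0.0, 0.0, 0.8, 0.9, 0.2, 0.0]))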