m@ze°2 is a computer-based electronic instrument that serves as a real-time environment for both composition and improvisation.
Essl writes —
“The software is written entirely in Max/MSP and takes advantage of my own “Real Time Composition Library” (RTC-lib). It is based on an ever-growing database of samples which are processed by so-called “structure generators” which can be triggered and controlled by the computer keyboard, the mouse and several MIDI controllers. An elaborated section of DSP routines (including reverb, filters, transposers, pitch shifter, equalizer, delay lines, LFO, ring modulator, panning) enables the player to process the generated sound structures in the end.”
“Many Hands” is a series of software interfaces that allow a pianist to shape the unfolding of a range of electronic music resources (sounds, processes, events, and electronic instruments and devices) in response to that pianist’s playing. The software interprets data about the real-time performance: how fast the playing is, how dense the chords or clusters are, how long the melodic phrases last, relative rhythmic steadiness, and basic harmonic analysis. The basic data from which this more complex information is calculated (which note is being played, at what volume level) is generated by a piano data-tracking device (a Yamaha Disklavier, or a conventional acoustic piano fitted with a Moog PianoBar). The “Many Hands” software interface uses this data to calculate higher-level information that expands the nature of piano performance.
Piano performance analysis in the “Many Hands” interface, designed in Max/MSP
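The kind of analysis described above can be illustrated with a short sketch. This is not the actual “Many Hands” code (which runs in Max/MSP); it is a hypothetical Python illustration of deriving two higher-level features, playing speed and chord density, from raw (onset time, pitch, velocity) note events such as a Disklavier or PianoBar would supply. The function name and the 50 ms chord window are assumptions.

```python
# Illustrative sketch (not the actual "Many Hands" software): deriving
# higher-level performance features from raw (time, pitch, velocity)
# note events, as a piano data-tracking device might supply them.

def analyze_performance(events, chord_window=0.05):
    """events: list of (onset_seconds, midi_pitch, velocity), sorted by onset."""
    if not events:
        return {"notes_per_second": 0.0, "mean_chord_size": 0.0}
    span = events[-1][0] - events[0][0]
    speed = len(events) / span if span > 0 else float(len(events))
    # Group near-simultaneous onsets into "chords" to estimate density.
    chords, current = [], [events[0]]
    for ev in events[1:]:
        if ev[0] - current[-1][0] <= chord_window:
            current.append(ev)
        else:
            chords.append(current)
            current = [ev]
    chords.append(current)
    mean_chord = sum(len(c) for c in chords) / len(chords)
    return {"notes_per_second": round(speed, 2),
            "mean_chord_size": round(mean_chord, 2)}
```

Features like these, computed over a sliding window, are the sort of data an interface can map onto electronic processes in response to the pianist’s playing.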
Bruce Gremo: concept, mouthpiece and ergonomic design, MSP programming, performer
Jeff Feddersen: body and circuit design, name, Basic programming
Eric Singer: C++ programming, ongoing tech support
This instrument has two parts: a physical ‘flute controller’ and a software application (Max/MSP). Only its performance gestures are modeled after the Japanese Shakuhachi; its sound world employs complex synthesis, sample manipulation, and re-synthesis techniques. It aspires to be both a nuanced ‘event’ controller (like an acoustic instrument) and a ‘process’ controller. Its mouthpiece splits an air column produced with an open embouchure, and control data is derived from an analysis of the split air column’s dynamics. In place of finger holes, five high-resolution three-dimensional track pads support the many finger techniques used on non-keyed flutes.
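One common way to derive control data from the dynamics of an air stream is an envelope follower; the sketch below shows that general technique in Python, purely as an illustration. It is an assumption on my part, not the instrument’s actual analysis, and the coefficient values are invented.

```python
# Hypothetical sketch of deriving control data from breath dynamics:
# a one-pole envelope follower that rises quickly and falls slowly.
# Coefficients are illustrative, not taken from the actual instrument.

def envelope_follow(samples, attack=0.5, release=0.05):
    """Track the amplitude envelope of a signal (fast rise, slow fall)."""
    env, out = 0.0, []
    for s in samples:
        level = abs(s)
        coeff = attack if level > env else release
        env += coeff * (level - env)
        out.append(env)
    return out
```

The resulting envelope is a slowly varying control signal that can drive synthesis or processing parameters, much as a breath controller does.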
Remapping of MIDI instruments to modify predefined sound parameters on any subtractive “analog” VST synthesizer. Requires Max/MSP, VST plugins, my clavisound-0.1 patch (for Max), a connection to my database at korgman.is-a-geek.net, and a MIDI instrument.
The “clavisound” Max program routes MIDI notes to VST parameters. A MySQL database stores the mapping of MIDI notes to VST parameters and is publicly accessible via a MySQL connection at korgman.is-a-geek.net.
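The core idea is a lookup table from incoming MIDI notes to synthesizer parameters. Here is a minimal sketch of that routing in Python, with an in-memory dict standing in for the MySQL table; the note numbers, parameter names, and indices are invented for illustration, not taken from the actual database.

```python
# Sketch of the routing described above: a table (standing in for the
# MySQL database) mapping incoming MIDI notes to VST parameter changes.
# Table contents and parameter indices are invented for illustration.

NOTE_TO_PARAM = {
    60: ("filter_cutoff", 0),   # middle C adjusts the filter cutoff
    62: ("resonance", 1),
    64: ("lfo_rate", 2),
}

def route_note(note, velocity):
    """Translate a MIDI note-on into a (param_index, normalized_value) pair."""
    if note not in NOTE_TO_PARAM:
        return None
    _, param_index = NOTE_TO_PARAM[note]
    # MIDI velocity (0-127) is normalized to the 0.0-1.0 range most
    # VST parameters expect.
    return (param_index, velocity / 127.0)
```

In the real patch, Max/MSP would perform this lookup against the remote database and forward the result to the VST plugin’s parameter input.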
Stefan Tiedje describes Les Ondes Memorielles as —
“It’s a software (Max/MSP) controlled by a normal fader box and keyboard, and other assignable controllers, which allows one to play a memory of the sounds heard before. It will play back the memory modified, though (pitched, stretched, with rhythms and melodies).
The real-time control is the essential part, as it’s intended to improvise the playback of the memory…
A memory of sound, which consists of a delay line with 8 taps and 8 buffers, all with a maximum duration of 90 seconds. Each tap has a pitch shift, a feedback and an additional send. Pitches can be controlled with a keyboard and/or an algorithmic process…”
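The memory Tiedje describes is built on a multi-tap delay line. The sketch below shows that basic structure in Python, reduced for clarity: per-tap pitch shifting and the additional sends are omitted, the tap count and settings are invented, and the buffer here is a short sample array rather than 90 seconds of audio.

```python
# Minimal sketch of the memory described in the quote: a delay line read
# by several taps, each with its own delay time and feedback amount.
# (Per-tap pitch shifting and sends are omitted; settings are invented.)

class MultiTapDelay:
    def __init__(self, max_len, taps):
        """taps: list of (delay_in_samples, feedback_gain) pairs."""
        self.buf = [0.0] * max_len  # circular buffer: the sound "memory"
        self.pos = 0
        self.taps = taps

    def process(self, x):
        """Process one input sample; return the summed tap outputs."""
        n = len(self.buf)
        out = 0.0
        for delay, fb in self.taps:
            tapped = self.buf[(self.pos - delay) % n]
            out += tapped
            x += fb * tapped          # feed each tap back into the line
        self.buf[self.pos] = x
        self.pos = (self.pos + 1) % n
        return out
```

With a single two-sample tap and no feedback, an impulse fed in reappears at the output two samples later; the real instrument extends this pattern to eight taps over up to 90 seconds of buffered sound.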
The Composers’ Playpen is a Max/MSP program which enables very sophisticated MIDI control of existing synthesizers, sound modules, and samplers.
Barton McLean writes of his software —
“Originally inspired by “M” and “Jam Factory”, it builds on these seminal programs by offering many more and different approaches to manipulating note and chordal material. It uses a collection of single-track sequences and MIDI loops as its basic units. Once these are recorded, they can be manipulated in many ways, some never before seen in programs of this nature. There is opportunity to keep everything completely live, or to save some or all of the material for future performance.
The Composers’ Playpen, therefore, has the advantage of using an elegant Max/MSP program for unique and comprehensive areas of control while maintaining utilization of any existing MIDI synthesizers or samplers. As such, it will not make existing setups obsolete, but rather will enhance them.”
Intelligent rhythm generator in which individual agents interact with one another to create complex evolutionary rhythms. Can send data out as MIDI, or use internal samples. Responds to high-level user input: density, volume. Individual agents assume “personalities” that shape their response; these personalities can be influenced. Written in Max/MSP.
Agent-based robotic drumming, Arne Eigenfeldt and Ajay Kapur.
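The agent model described above can be sketched briefly. This is an illustration of the general agent-based idea, not Eigenfeldt’s actual software: here each agent carries an invented “personality” weight that scales the shared high-level density input, and decides independently whether to sound on each pulse.

```python
# Illustrative sketch of agent-based rhythm generation: agents with a
# "personality" weight that shapes their response to a shared density
# control. Not the actual software; all parameters are invented.

import random

class RhythmAgent:
    def __init__(self, personality, seed=None):
        """personality: 0..1 eagerness that scales the global density input."""
        self.personality = personality
        self.rng = random.Random(seed)

    def step(self, density):
        """Return True if this agent plays on the current pulse."""
        return self.rng.random() < density * self.personality

def pattern(agent, density, pulses=8):
    """One bar of the agent's output at a given global density."""
    return [agent.step(density) for _ in range(pulses)]
```

In a fuller system, agents would also listen to one another’s output and adjust their personalities over time, producing the evolving interplay the description mentions; the resulting triggers could be sent out as MIDI or used to fire internal samples.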
Bob Gluck: designer
Incorporated I-Cube glove and digitizer by Infusion Systems
eShofar1 (2001 – 2005) is held in the right hand, enclosed within an I-Cube sensor glove. Finger movements upon the body of the shofar control the filtering and other sound processing, which is achieved by means of a custom-designed Max/MSP software interface. The sounds played on the shofar are mic’d, enhanced and transformed by means of digital sound processing. eShofar2 (2005) uses complex algorithms to create a chaotic improvisational system based on the live performed sounds of the shofar.
SPI is a real-time computer sound-processing instrument which makes no sounds itself; it only processes sounds fed into it. It has been designed specifically, but not exclusively, for improvised music. It uses a number of human-interface sensors to supply gestural information to the processing.
Mark Bokowiec: software, hardware design and composition
Julie Wilson-Bokowiec: software design
The Bodycoder interface is a flexible sensor array worn on the body of a performer that sends data generated by movement to a Max/MSP environment via radio. Movement data is mapped in a variety of ways to live processing and manipulation. The primary expressive functionality of the Bodycoder System is Kinaesonic. The term Kinaesonic is a compound of two words: kinaesthetic, meaning the movement principles of the body, and sonic, meaning sound. In terms of interactive technology, the term Kinaesonic refers to the one-to-one mapping of sonic effects to bodily movements. In our practice this is usually executed in real time.
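A one-to-one Kinaesonic mapping, as described above, assigns each body sensor to exactly one effect parameter. The sketch below illustrates that idea in Python; the sensor names, parameter names, ranges, and the 10-bit raw scale are all invented for illustration and are not taken from the Bodycoder System itself.

```python
# Minimal sketch of a one-to-one "Kinaesonic" mapping: each named body
# sensor drives exactly one effect parameter. All names and ranges here
# are invented for illustration.

MAPPING = {
    "right_elbow_bend": ("granular_density", 0.0, 1.0),
    "left_wrist_flex":  ("reverb_mix",       0.0, 0.8),
}

def map_sensor(sensor, raw, raw_max=1023):
    """Scale a raw sensor reading (e.g. 10-bit) into its parameter's range."""
    if sensor not in MAPPING:
        return None
    param, lo, hi = MAPPING[sensor]
    value = lo + (hi - lo) * (raw / raw_max)
    return (param, round(value, 3))
```

Because each movement maps to a single, fixed effect, the audience can perceive a direct causal link between gesture and sound, which is the expressive point of the one-to-one design.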