I am currently looking at ways to combine a piano with elements of different interfaces, enabling fully interactive control of electronics while playing the instrument normally as well as on the strings, soundboard, and body. I currently use iCube sensors and MIDI, together with serial-driven and FFT-responsive motors. The system is in constant development.
“Many Hands” is a series of software interfaces that allow a pianist to shape the unfolding of a range of electronic music resources (sounds, processes, events, and electronic instruments and devices) in response to that pianist’s playing. The software interprets data about the real-time performance: how fast the playing is, how dense the chords or clusters are, how long the melodic phrases are, the relative rhythmic steadiness, and basic harmonic content. The raw data from which this more complex information is calculated (which note is being played, and at what volume level) is generated by a piano data-tracking device: a Yamaha Disklavier, or a conventional acoustic piano fitted with a Moog PianoBar. It is this data that the “Many Hands” software interface uses to derive higher-level information that can expand the nature of piano performance.
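As a rough illustration (not the actual “Many Hands” code), the kind of higher-level features described above could be derived from the raw note stream along these lines. Each event is assumed to be a tuple of (onset time in seconds, MIDI pitch, velocity 0–127), as a Disklavier or PianoBar would supply over MIDI; the analysis window and chord-grouping threshold are illustrative assumptions.

```python
def playing_speed(events, window=2.0):
    """Notes per second over the last `window` seconds of the stream."""
    if not events:
        return 0.0
    now = events[-1][0]
    return sum(1 for t, _, _ in events if now - t <= window) / window

def chord_sizes(events, simultaneity=0.05):
    """Group onsets closer than `simultaneity` seconds into chords or
    clusters and return the size of each group (1 = a single melodic note)."""
    sizes = []
    count, last_t = 0, None
    for t, _, _ in sorted(events):
        if last_t is not None and t - last_t <= simultaneity:
            count += 1          # still within the same chord/cluster
        else:
            if count:
                sizes.append(count)
            count = 1           # start a new group
        last_t = t
    if count:
        sizes.append(count)
    return sizes

def mean_velocity(events):
    """Average MIDI velocity: a rough dynamic level for the passage."""
    return sum(v for _, _, v in events) / len(events) if events else 0.0

# A three-note chord followed by two melodic notes:
stream = [(0.00, 60, 80), (0.01, 64, 78), (0.02, 67, 82),
          (0.50, 72, 90), (1.00, 74, 70)]
print(chord_sizes(stream))    # [3, 1, 1]
print(mean_velocity(stream))  # 80.0
```

Features like these, updated continuously as notes arrive, are what would let the software map qualities of the playing onto electronic processes in real time.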