The T-Stick was developed and built in the IDMIL by Joseph Malloch, in collaboration with composer D. Andrew Stewart and performers Fernando Rocha and Xenia Pestova. The physical input device can sense where and how much of its surface is touched by the performer, as well as tilting, shaking, squeezing and twisting gestures. Unlike most digital musical instruments, the T-Sticks exist as a family, with soprano, alto, tenor, and bass members.
The T-Stick is intended to be an “expert” musical interface: engaging to new users, allowing virtuosic playing, and “worth practicing” in that practice time results in increased skill. The T-Stick has been performed and demonstrated many times in Canada, Brazil, Italy, and the USA.
The paper and slides from the panel session “The Need of Formats for Streaming and Storing Music-Related Movement and Gesture Data” (2007 International Computer Music Conference, Copenhagen) provide a good overview of the following topics:
Motion Capture Formats
Movement-related Markup Languages
Gesture Motion Signal (GMS) format
Gesture Description Interchange File Format (GDIF)
Performance Markup Language (PML)
Sound Description Interchange Format (SDIF)
Motion capture systems
Custom made instruments and devices
Jensenius, A. R., A. Camurri, N. Castagne, E. Maestre, J. Malloch, D. McGilvray, D. Schwarz and M. Wright. “Panel Session: The Need of Formats for Streaming and Storing Music-Related Movement and Gesture Data.” In Proceedings of the 2007 International Computer Music Conference, Copenhagen.
The Clarinet Gestural Analysis project, based in the Music Technology Area, Schulich School of Music, McGill University is investigating the correlation between physical and musical gestures.
The research has a specific focus on expressive movements (also called ancillary or non-obvious gestures), movements that do not have a direct link to the generation of sound, but are an integral part of the performance.
The Musical Gestures Project, University of Oslo, Department of Musicology, is a broadly conceived research project on music-related gestures, based on the conviction that there are intimate links between music, understood as sonic art, and gestures, understood as human bodily movement.
The Input Devices and Music Interaction Laboratory (IDMIL) deals with projects related to the topic of human-computer interaction, design of musical instruments and interfaces for musical expression, movement data collection and analysis, sensor development, and gestural control.
It is affiliated with the Music Technology Area of the Schulich School of Music at McGill University, Montreal, Quebec, Canada and directed by Prof. Marcelo M. Wanderley.
A set of sensors sewn into a wearable cloth substrate that can be comfortably worn by a conductor or performer on stage.
The Conductor’s Jacket is a unique wearable device that measures physiological and gestural signals. Together with the Gesture Construction, a musical software system, it interprets these signals and applies them expressively in a musical context. Sixteen sensors have been incorporated into the Conductor’s Jacket in such a way as not to encumber or interfere with the gestures of a working orchestra conductor. The system gathers up to sixteen data channels reliably at rates of 3 kHz per channel, and also provides real-time graphical feedback. Unlike many gesture-sensing systems, it gathers not only position and acceleration data but also muscle tension from several locations on each arm.

We will demonstrate the Gesture Construction, a musical software system that analyzes and performs music in real time based on the performer’s gestures and breathing signals. A bank of software filters extracts several of the features found in the conductor study, including beat intensities and the alternation between arms. These features are then used to generate expressive effects in real time by shaping the beats, tempos, articulations, dynamics, and note lengths in a musical score.
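As an illustration of the kind of filtering such a system might apply, the sketch below shows a rectify-and-smooth envelope follower, a common way to turn a raw muscle-tension channel into a slowly varying intensity curve from which beat intensities could be read. This is a hypothetical example, not the Conductor’s Jacket implementation; the function name, the 3 kHz sample rate, and the synthetic test signal are all assumptions for the sake of the demonstration.

```python
import math
import random

def beat_intensity(signal, rate=3000, window_ms=100):
    """Rectify-and-smooth envelope follower: remove the DC offset,
    full-wave rectify, then apply a moving average (illustrative only)."""
    win = max(1, int(rate * window_ms / 1000))
    mean = sum(signal) / len(signal)
    rectified = [abs(s - mean) for s in signal]   # remove DC, rectify
    env, acc = [], 0.0
    for i, r in enumerate(rectified):             # sliding-window average
        acc += r
        if i >= win:
            acc -= rectified[i - win]
        env.append(acc / min(i + 1, win))
    return env

# Synthetic channel sampled at 3 kHz: low-level noise with one burst
# of muscle activity (a "beat") between 0.4 s and 0.6 s.
random.seed(0)
rate = 3000
channel = [
    0.02 * random.gauss(0, 1)
    + (math.sin(2 * math.pi * 60 * i / rate) if 0.4 < i / rate < 0.6 else 0.0)
    for i in range(rate)
]

env = beat_intensity(channel, rate)
print(f"quiet: {env[int(0.1 * rate)]:.3f}  beat: {env[int(0.5 * rate)]:.3f}")
```

The envelope rises sharply during the burst and stays near zero elsewhere, so a simple threshold on it would mark the beat.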
Bill writes —
“Timbral and gestural characteristics are extracted from real-time audio input from a single instrument (usually saxophone). A virtual ensemble of improvising agents references these characteristics in their performance. A user may also control the agents at various levels, such as selecting high-level behavioral tendencies, or fine-grain shaping of timbrally rich gestures.”