Wright, Matthew matt@cnmat.berkeley.edu
Wessel, David wessel@cnmat.berkeley.edu
Freed, Adrian adrian@cnmat.berkeley.edu

Center for New Music and Audio Technologies
1750 Arch Street
Berkeley, CA 94709
USA
phone (510) 643-9990, fax (510) 642-7918

ICMC 1997 Demonstration Proposal

New Musical Control Structures from Standard Gestural Controllers

Keywords: Controllers, Gestures, Digitizing Tablet, MAX, Joystick

Content Area: Performance Interfaces

Resources Required: Power Macintosh, VHS

Throughout history, people have adapted whatever objects were in their environment into musical instruments. The computer industry has invested significant resources in creating broadly available, low-cost gestural controllers without any musical application in mind; thoughtful adaptation of these controllers for music is a fruitful yet overlooked route and is the subject of this presentation.

We find the latest incarnations of the venerable digitizing tablet (a.k.a. "artist's tablet") very interesting for musical control. Tablets offer accurate and fast absolute position sensing of cordless devices in three dimensions. Additionally, pressure, orientation, tilt and rotation estimates are available. The tablet we use allows for simultaneous sensing of two devices, usually one in each hand.

This rich, multidimensional control information can be mapped to musical parameters in a variety of interesting ways. The most direct kind of mapping associates a single synthesis parameter with each control dimension, for example, vertical position controlling loudness, horizontal position controlling pitch, distance from the tablet determining articulation, tilt controlling brightness, and pressure controlling vibrato depth. Other interfaces define regions of the tablet associated with particular behaviors, e.g., dividing the tablet surface into four quadrants corresponding to four voices in a chord. Another possibility involves the interval from when the pen touches the surface until it is lifted. This period can be viewed as a single gesture whose shape can be used to produce both event and parameter information.
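As a concrete illustration of the direct one-to-one mapping, the following sketch (written in Python purely for exposition; our control structures are actually realized as MAX patches) rescales one pen's normalized control dimensions to plausible synthesis parameter ranges. The field names, parameter ranges, and the scale helper are illustrative assumptions, not an actual tablet driver API.

    # Hypothetical sketch of the direct one-to-one mapping described above.
    # Parameter names and ranges are illustrative assumptions; the real
    # control structures are implemented as MAX patches.

    def scale(value, in_lo, in_hi, out_lo, out_hi):
        """Linearly rescale a raw sensor value to a synthesis parameter range."""
        t = (value - in_lo) / (in_hi - in_lo)
        return out_lo + t * (out_hi - out_lo)

    def map_pen_state(pen):
        """Map one pen's control dimensions to synthesis parameters."""
        return {
            # vertical position -> loudness (dB)
            "loudness_db": scale(pen["y"], 0.0, 1.0, -60.0, 0.0),
            # horizontal position -> pitch (MIDI note number)
            "pitch": scale(pen["x"], 0.0, 1.0, 36, 96),
            # height above the tablet -> articulation (0 = legato, 1 = staccato)
            "articulation": scale(pen["z"], 0.0, 1.0, 0.0, 1.0),
            # tilt -> brightness (filter cutoff, Hz)
            "brightness_hz": scale(pen["tilt"], -1.0, 1.0, 500.0, 8000.0),
            # pressure -> vibrato depth (semitones)
            "vibrato_depth": scale(pen["pressure"], 0.0, 1.0, 0.0, 0.5),
        }

    # Example: one snapshot of normalized pen data, polled at control rate.
    print(map_pen_state({"x": 0.5, "y": 0.8, "z": 0.1, "tilt": 0.2, "pressure": 0.6}))

In the same spirit, region-based interfaces simply test the pen's position against the boundaries of each region before dispatching to the behavior associated with it.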

We will demonstrate software, created in the MAX environment, that we use to develop control structures for the two-handed digitizing tablet, and show how some of these control structures have been used in musical performance. We will also demonstrate an interactive musical installation that uses two joystick controllers. Included will be structures for navigation in timbre space, multidimensional synthesis control, note stream synthesis, and emulations of the gestures of strumming, plucking and bowing strings.
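To suggest how a string-strumming emulation can be driven from the tablet, here is a minimal sketch, again in Python rather than MAX, assuming a row of virtual strings laid out across the tablet's horizontal axis: whenever the pen sweeps across a string, a pluck event is emitted whose amplitude follows the crossing speed. The string layout, tuning, and function names are hypothetical, chosen only to make the idea concrete.

    # Hypothetical strumming emulation: virtual strings at fixed horizontal
    # positions on the tablet surface; crossing one triggers a pluck whose
    # amplitude scales with the crossing speed.

    NUM_STRINGS = 6
    STRING_X = [i / (NUM_STRINGS + 1) for i in range(1, NUM_STRINGS + 1)]
    OPEN_PITCHES = [40, 45, 50, 55, 59, 64]  # MIDI notes of an open guitar tuning

    def strum(prev_x, curr_x, dt):
        """Return pluck events for every virtual string crossed between two
        successive pen samples (normalized x in [0, 1], dt in seconds)."""
        events = []
        lo, hi = sorted((prev_x, curr_x))
        speed = abs(curr_x - prev_x) / dt if dt > 0 else 0.0
        for string, x in enumerate(STRING_X):
            if lo <= x < hi:  # the pen swept across this string
                events.append({
                    "pitch": OPEN_PITCHES[string],
                    "amplitude": min(1.0, speed),  # faster strum -> louder pluck
                })
        return events

    # Example: a quick left-to-right sweep across the middle of the tablet,
    # sampled 20 ms apart, plucks the three strings it crosses.
    print(strum(prev_x=0.30, curr_x=0.75, dt=0.02))

Plucking and bowing follow the same pattern, with single-string contact and sustained pressure or velocity, respectively, standing in for the sweep.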