New Musical Control Structures from Standard Gestural Controllers

Matthew Wright, David Wessel, Adrian Freed

Center for New Music and Audio Technologies, U.C. Berkeley

matt,wessel,adrian@cnmat.berkeley.edu, http://www.cnmat.berkeley.edu/People

Abstract

We have adapted a digitizing tablet as a musical gestural controller. We characterize the device in terms of the data it outputs, the resolution of that data, and its temporal behavior. We define a powerful, high-level model for mapping tablet data to musical control information and conclude with a list of example applications.

1 Introduction

Throughout history, people have adapted objects in their environment into musical instruments. The computer industry has developed the digitizing tablet (a.k.a. "artist's tablet") primarily for drawing shapes in computer graphics illustration programs. These tablets are widely available and inexpensive, and they sense many control dimensions such as pen or pointing-device position, pressure, and tilt. We find that these controllers can be used for musical control in a variety of interesting and musically expressive ways.

2 Characteristics of the Wacom Tablet

We use a Wacom [9] ArtZ II 1212 digitizing tablet, a 12 inch by 12 inch model with both a stylus (pen) and a puck (mouse-like device), allowing two-handed input [7]. The stylus and puck are cordless, batteryless, and lightweight. The tablet has a single DB-9 connector that carries both power and control information; on the other side of the cable is a mini-DIN connector suitable for the serial port on a Macintosh, PC, or SGI workstation. We've had no trouble running the cable at distances of about 30 feet.

The Wacom tablet accurately outputs the absolute X and Y dimensions of both the puck and stylus as integers in the range 0 to 32480. The stylus has a pressure-sensitive tip (and a pressure-sensitive "eraser" tip on the other end) that produces pressure readings in the range 0 to 255. When only the stylus is used, the Wacom tablet also outputs tilt values in two dimensions, in the range -60 to 60 degrees. Position and tilt are reported whenever the stylus or puck is in "proximity" of the tablet, within about a centimeter of the surface.

In addition to these continuous control variables, there are many ways to trigger events. The puck has 4 buttons, and the stylus has two buttons as well. Each tip of the stylus can be considered a button; pressing the stylus into the tablet will cause the same kind of button event. Another kind of event is generated when either device enters or leaves the proximity of the tablet. For the stylus, this event indicates which side (stylus tip side or eraser side) enters or leaves proximity.
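
To make these data concrete, here is a minimal sketch (in Python) of one way an application might represent the tablet's reports; the type names and field layout are our own illustration, not part of Wacom's driver interface.

    # Illustrative representation of a single tablet report, using the ranges
    # described above. Not Wacom's actual driver API.
    from dataclasses import dataclass
    from enum import Enum, auto

    class Device(Enum):
        STYLUS_TIP = auto()
        STYLUS_ERASER = auto()   # "eraser" end of the stylus
        PUCK = auto()

    @dataclass
    class TabletSample:
        device: Device
        x: int               # absolute position, 0 to 32480
        y: int               # absolute position, 0 to 32480
        pressure: int        # 0 to 255 (stylus only)
        tilt_x: int          # -60 to 60 degrees (stylus only, when puck unused)
        tilt_y: int          # -60 to 60 degrees
        buttons: int         # bitmask of the 4 puck or 2 stylus buttons
        in_proximity: bool   # within about a centimeter of the surface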

It is important to evaluate the temporal behavior of computer systems for music [6]. We performed timing experiments on SGI machines using their fast UST clock, measuring the elapsed time between events reporting values of position, tilt, and pressure while continually moving the stylus so as to change each parameter constantly. The good news is that about 75% of parameter updates arrived within 1 ms of the previous update. The bad news is that the remaining inter-update times averaged about 28 ms and were distributed more or less in a bell curve around that mean. The source of this erratic temporal performance seems to be context switches between the X server and our application program, a consequence of Wacom's decision to make tablet data available on Unix as X11 extension device valuator events. We expect that direct access to the serial stream from the tablet will address this difficulty.
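
The analysis behind these figures amounts to computing statistics over inter-event intervals. The sketch below shows the kind of computation involved, given a list of event timestamps in seconds (e.g., read from a high-resolution clock such as the UST); the 1 ms threshold matches the discussion above, and the timestamps would of course come from an actual event trace.

    # Summarize inter-event intervals: fraction arriving within 1 ms of the
    # previous update, and the mean of the slower gaps.
    def interval_stats(timestamps):
        gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
        fast = [g for g in gaps if g <= 0.001]
        slow = [g for g in gaps if g > 0.001]
        frac_fast = len(fast) / len(gaps) if gaps else 0.0
        mean_slow = sum(slow) / len(slow) if slow else 0.0
        return frac_fast, mean_slow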

3 A Model for Mapping Tablet Data to Musical Control

We have developed a model for mapping tablet data into musical control information (either MIDI or Open SoundControl [11]) that lets performers easily customize the musical behavior of the tablet. In our model, the two-dimensional surface of the tablet is populated by any number of arbitrarily shaped polygonal regions. These regions can overlap, and they have a vertical stacking order that determines which of two overlapping regions is above the other. Each region has a user-assigned symbolic name.

The following events may occur in a region:

· Puck or stylus enters (or leaves) the proximity of the tablet in the region

· Either end of the stylus makes (or breaks) contact with the tablet

· Puck or stylus moves into (or out of) the region

· Button press (or release) while puck or stylus is in the region

For each region the user can define a list of actions to take when any of these events occur, e.g., the stylus tip touching the tablet in a region might cause a pair of MIDI note-on events.
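
As a rough illustration of this part of the model, the sketch below shows polygonal regions with symbolic names, a stacking order, and per-event action lists; the event names, the dispatch function, and the idea of actions as callables are our own assumptions about one possible implementation, not a description of our actual code.

    # Named polygonal regions with per-event action lists. The list of regions
    # is ordered from top of the stacking order to bottom.
    class Region:
        def __init__(self, name, polygon):
            self.name = name          # user-assigned symbolic name
            self.polygon = polygon    # list of (x, y) vertices
            self.actions = {}         # event name -> list of callables

        def contains(self, x, y):
            # standard ray-casting point-in-polygon test
            inside = False
            pts = self.polygon
            for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
                if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                    inside = not inside
            return inside

    def dispatch(regions, event, x, y, **data):
        # The topmost region containing the point handles the event, e.g. a
        # "tip_down" event whose actions send a pair of MIDI note-ons.
        for region in regions:
            if region.contains(x, y):
                for action in region.actions.get(event, []):
                    action(region, **data)
                return region
        return None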

There are also several continuous parameters that are updated constantly by the tablet:

· X and Y coordinates for puck and stylus

· X and Y axis tilt for stylus (when not using puck)

· Stylus pressure

Each of these continuous parameters can also be mapped to a list of actions, e.g., the X axis tilt might correspond to pitch bend.
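
A corresponding sketch for the continuous parameters might keep a table of action lists keyed by parameter name; the tilt-to-pitch-bend example and the send_pitch_bend routine below are hypothetical illustrations.

    # Action lists for the continuously updated parameters listed above.
    continuous_actions = {
        "stylus_x": [], "stylus_y": [], "puck_x": [], "puck_y": [],
        "tilt_x": [], "tilt_y": [], "pressure": [],
    }

    def on_tilt_x(value):                        # value in -60..60 degrees
        bend = int((value + 60) / 120 * 16383)   # scale to 14-bit MIDI pitch bend
        send_pitch_bend(bend)                    # hypothetical MIDI output routine

    continuous_actions["tilt_x"].append(on_tilt_x)

    def update(param, value):
        for action in continuous_actions[param]:
            action(value)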

What makes our model dynamic is that actions can be added to and deleted from these lists in response to events that occur in a region. For example, pressing a button while the puck is in a region could add a mapping from puck Y position to overall volume, and having the puck enter a different region could remove this mapping. This feature facilitates complex musical behaviors in response to various gestures.
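
Continuing the same illustrative sketch, the volume example above could be expressed as region actions that install and remove an entry in the continuous-parameter table; set_volume is again a hypothetical output routine.

    def puck_y_to_volume(value):
        set_volume(value / 32480.0)          # normalize 0..32480 position to 0..1

    def on_button_press(region, **data):     # action attached to one region
        if puck_y_to_volume not in continuous_actions["puck_y"]:
            continuous_actions["puck_y"].append(puck_y_to_volume)

    def on_region_enter(region, **data):     # action attached to another region
        if puck_y_to_volume in continuous_actions["puck_y"]:
            continuous_actions["puck_y"].remove(puck_y_to_volume)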

To allow the mapping of data values from one range to another we provide two primitives. The first is a linear scaling operator that implements functions of the form f(x)=ax+b where a and b are constant. The desired output range is specified; the system knows the range of possible input values and can compute a and b automatically. The second is stored function evaluation. We compute user-specified memoryless nonlinear functions with an abstraction that subsumes table lookup, expression evaluation, neural network forward-pass, and other techniques. Of course the tablet itself is an excellent tool for drawing function curves.
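
A minimal sketch of these two primitives: the linear scaler derives a and b from the known input range and the requested output range, and the stored function is shown here as a breakpoint table with linear interpolation, one of the techniques the abstraction subsumes.

    def make_linear(in_lo, in_hi, out_lo, out_hi):
        # f(x) = ax + b, with a and b computed from the two ranges
        a = (out_hi - out_lo) / (in_hi - in_lo)
        b = out_lo - a * in_lo
        return lambda x: a * x + b

    def make_table(breakpoints):
        # breakpoints: sorted list of (x, y) pairs; interpolate linearly between them
        def f(x):
            for (x1, y1), (x2, y2) in zip(breakpoints, breakpoints[1:]):
                if x1 <= x <= x2:
                    return y1 + (y2 - y1) * (x - x1) / (x2 - x1)
            return breakpoints[0][1] if x < breakpoints[0][0] else breakpoints[-1][1]
        return f

    pressure_to_velocity = make_linear(0, 255, 0, 127)   # stylus pressure -> MIDI velocity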

One important feature of the tablet is that its position sensing is absolute rather than relative. This means that the performer does not need any visual feedback from a computer monitor to see which region the stylus or puck is in. The Wacom tablet has a clear plastic overlay under which paper and small objects may be placed. We can print a "map" of the user's defined regions to the scale of the tablet and place it under the overlay; this lets the user see exactly where each region is on the tablet. Removable adhesive drafting strips may be used to create surface irregularities offering tactile feedback.

4 Examples

4.1 Digital Tambura Interface

Our first use of the digitizing tablet for musical control was to create an interface for a drone instrument similar in function to the Indian tambura. A real tambura has 4 strings with no fretboard or fingerboard; the player plucks each open string and lets it ring. Tambura players typically pluck the 4 strings one at a time and then start over at the first string, providing a continuous drone.

We designed a tablet interface that retains features of the playing style of a real tambura. We defined six regions on the tablet corresponding to six virtual strings. Each of these virtual strings had a corresponding monophonic synthesizer voice set to a particular drone timbre and pitch. The action of touching the stylus to the tablet in a particular region represented touching a finger to a real string: it caused the currently sounding note on that string to decay fairly rapidly. Releasing the stylus from the tablet represented the second half of the pluck, where the finger releases the string and sets it vibrating. This caused the voice to start a new note.

The loudness of each note was determined by the total amount of horizontal distance traveled between the time the stylus touched the region and the time the stylus left the tablet. Timbral control came from a mapping from the X axis position at the time the stylus left the tablet to the relative balance of even and odd harmonics in the synthesizer [5].
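
The following sketch restates this mapping in code form; the damp_note and trigger_note calls are hypothetical stand-ins for the synthesizer interface, and the normalization constants are illustrative.

    class TamburaString:
        def __init__(self, pitch):
            self.pitch = pitch
            self.travel = 0.0
            self.last_x = None

        def on_touch(self, x):      # stylus tip touches the region: damp the note
            damp_note(self.pitch)
            self.travel, self.last_x = 0.0, x

        def on_move(self, x):       # accumulate horizontal distance traveled
            if self.last_x is not None:
                self.travel += abs(x - self.last_x)
                self.last_x = x

        def on_release(self, x):    # stylus leaves the tablet: the "pluck"
            loudness = min(1.0, self.travel / 32480.0)
            balance = x / 32480.0   # even/odd harmonic balance from X at release
            trigger_note(self.pitch, loudness, balance)
            self.last_x = None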

4.2 Strumming

We use the tablet to emulate the gesture of strumming a stringed instrument. We define four to ten thin rectangular horizontal regions which represent virtual strings much as in the tambura example. Again, each of these regions has a corresponding synthesizer voice which is virtually plucked in two steps by the stylus entering and then leaving the region. The speed of the pluck (i.e., the reciprocal of the time between when the stylus enters and leaves the region) combines with the pen pressure to determine the loudness of each note.
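
One plausible form of this loudness calculation is sketched below; the exact weighting of speed against pressure and the normalization constants are illustrative assumptions.

    def strum_loudness(t_enter, t_leave, pressure):
        dt = max(t_leave - t_enter, 1e-3)      # seconds the stylus spends in the region
        speed = 1.0 / dt                       # pluck speed
        norm_speed = min(speed / 100.0, 1.0)   # assumed normalization
        norm_pressure = pressure / 255.0
        return norm_speed * norm_pressure      # combine speed and pressure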

When the upper stylus button is depressed, the stylus' function changes from exciting the virtual strings to muting them. In this case, the note for each string still decays when the stylus enters the region, but a new note does not start when the stylus leaves the region. The middle button corresponds to a "half mute" mode in which the synthesized sound is more inharmonic and each note decays faster, corresponding to the playing technique on real strummed instruments of partially muting strings with the palm.

On real stringed instruments, the position of the picking (or plucking or bowing) along the axis of the string is an important determinant of timbre. We emulate this control by mapping the X position at the time the stylus leaves the string region to a timbral control such as brightness.

We have several techniques for controlling pitch with the puck in the left hand. The first is modeled loosely on the autoharp. There are regions of the tablet that correspond to different chord roots, and each of the 4 puck buttons corresponds to a different chord quality (e.g., major, minor, dominant seventh, and diminished). Pressing a button with the puck in a particular region determines a chord, which in turn determines the pitches of each virtual string. The layout of chord regions and chord qualities, and the voicings of each chord are all easily configurable for different styles, pieces, or performers.
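
A sketch of this autoharp-style mapping appears below; the particular voicings and the assignment of buttons to qualities are simply examples of the kind of configuration the text describes as adjustable.

    CHORD_QUALITIES = {1: "major", 2: "minor", 3: "dom7", 4: "dim"}
    INTERVALS = {"major": [0, 4, 7], "minor": [0, 3, 7],
                 "dom7": [0, 4, 7, 10], "dim": [0, 3, 6]}

    def chord_pitches(root, button, n_strings):
        # Assign chord tones to the virtual strings, wrapping upward by octaves.
        intervals = INTERVALS[CHORD_QUALITIES[button]]
        return [root + intervals[i % len(intervals)] + 12 * (i // len(intervals))
                for i in range(n_strings)]

    # e.g. button 1 (major) pressed with the puck in a region whose root is
    # MIDI note 48, with six virtual strings:
    # chord_pitches(48, 1, 6) -> [48, 52, 55, 60, 64, 67]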

Another technique for pitch control is modeled on the slide guitar, with the puck taking the role of the slide. In this case the pitches of the virtual strings are in fixed intervals relative to each other, and the puck's horizontal position determines a continuous pitch offset that affects all the strings. When the puck is not in proximity of the tablet the strings are at their lowest pitches, corresponding to strumming open strings on a slide guitar. The buttons on the puck select from among a set of different tunings.
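
A sketch of the slide-style control: the puck's horizontal position becomes a continuous pitch offset added to every string, and no offset is applied when the puck is out of proximity. The open tuning and the two-octave offset range are illustrative assumptions.

    OPEN_TUNING = [40, 45, 50, 55, 59, 64]   # example open-string MIDI pitches

    def string_pitches(puck_x, in_proximity, max_offset=24.0):
        # puck_x in 0..32480; offset in semitones, applied to all strings
        offset = (puck_x / 32480.0) * max_offset if in_proximity else 0.0
        return [p + offset for p in OPEN_TUNING]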

4.3 Timbre Space Navigation

CNMAT's additive synthesis system provides timbral interpolation via the timbre space model [10]. The tablet is a natural way to control position in a two-dimensional timbre space.

The simplest application uses the tablet in conjunction with another controller like a MIDI keyboard. In this case the job of the tablet is just to determine position in the timbre space, so we map the two position dimensions to the two timbre space dimensions and use the other controller to determine pitch, loudness, and articulation.

Another approach is to use both hands on the tablet. The stylus position determines timbre space position. When the stylus touches the tablet a note is articulated, and when it leaves the tablet the note ends. Stylus pressure maps to volume. The puck, held in the left hand, determines pitch via a variety of possible mappings.
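
A sketch of this two-handed mapping follows; the set_timbre_position, set_volume, note_on, and note_off calls are hypothetical placeholders for the synthesizer's control interface.

    def on_stylus_move(x, y, pressure):
        set_timbre_position(x / 32480.0, y / 32480.0)   # navigate the 2-D timbre space
        set_volume(pressure / 255.0)                    # stylus pressure -> volume

    def on_stylus_touch(pitch_from_puck):
        note_on(pitch_from_puck)    # articulate a note when the tip touches the tablet

    def on_stylus_release():
        note_off()                  # end the note when the stylus leaves the tablet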

A third approach is to implement a process that continually produces a stream of notes according to parameterized models of rhythm, harmony, and melody. The left hand can manipulate the parameters of this model with the puck while the right hand navigates in timbre space with the stylus.

5 Future Work

We would like to be able to recognize various kinds of strokes and gestures made with the stylus. Previous work on stroke recognition in the context of computer conducting [2] seems applicable, as does work on the integration of segmentation, recognition, and quantitative evaluation of expressive cursive gestures [8].

6 Conclusion

The tablet interface provides a musically potent and general two-dimensional interface for the control of sound synthesis and compositional algorithms. With the addition of stylus pressure and tilt, two additional dimensions are available for each hand. Our experiments have demonstrated that irregularities can be added to the surface to provide a tactile reference. The tablet interface is basically a spatial coordinate sensor system like the Mathews Radio Drum [1], the Buchla Lightning [3], the Theremin [4], joysticks, and other sensor systems such as ultrasound ranging. But unlike some of these systems, it offers the possibility of tactile reference, wide availability, flexible adaptation, precision, and reliability. The tablet interface and its spatial coordinate sensor cousins offer the possibility of long-lived alternative musical control structures that use mathematical abstractions in a reliable manner as the basis for a live-performance musical repertoire.

7 Acknowledgments

CNMAT gratefully acknowledges the support of Silicon Graphics, Inc., and Gibson Guitar Corporation for generous contributions that supported this work.

References

[1] Boie, B., M. Mathews, and A. Schloss. 1989. "The Radio Drum as a Synthesizer Controller," Proc. ICMC, Columbus, Ohio, pp. 42-45.

[2] Brecht, B., and G. Garnett. 1995. "Conductor Follower," Proc. ICMC, Banff, Canada, pp. 185-186.

[3] The Buchla and Associates web site describes Lightning II: http://www.buchla.com/

[4] Chadabe, J. 1997. Electric Sound: The Past and Promise of Electronic Music, Englewood Cliffs, New Jersey: Prentice Hall.

[5] Freed, A. 1995. "Bring Your Own Control to Additive Synthesis," Proc. ICMC, Banff, Canada, pp. 303-306.

[6] Freed, A. 1997. "Operating Systems Latency Measurement and Analysis for Sound Synthesis and Processing Applications," Proc. ICMC, Thessaloniki.

[7] Kabbash, P., W. Buxton, and A. Sellen. 1995. "Two-handed Input in a Compound Task," Proc. SigCHI, Boston, Massachusetts, pp. 417-423. http://www.dgp.utoronto.ca/OTP/papers/bill.buxton/tg.html

[8] Keeler, J., D. Rumelhart, and W. Loew. 1991. "Integrated segmentation and recognition of hand-printed numerals," in R. Lippmann, J. Moody, and D. Touretzky, eds., Neural Information Processing Systems Volume 3, San Mateo, CA: Morgan Kaufmann.

[9] The Wacom WWW site is www.wacom.com

[10] Wessel, D. 1979. "Timbre Space as a Musical Control Structure," Computer Music Journal, Volume 3, Number 2, pp. 45-52.

[11] Wright, M., and A. Freed. 1997. "Open SoundControl Protocol," Proc. ICMC, Thessaloniki.