A Digital Signal Multiprocessor and its Musical Application

Jean-Baptiste Barrière

IRCAM

Adrian Freed

CNMAT

Pierre-François Baisnée

Marie-Dominique Baudot

Introduction

We are interested in unifying different sound synthesis and processing techniques in an interactive real-time environment. The availability of increasingly general and cheaper DSP chips and the increased performance of desktop computers are bringing the dream of such an environment within reach. We are using the digital signal multiprocessor system described here to provide the signal processing for research in synthesis and processing algorithms and real-time control software. This research is for musical application, so its use by musicians throughout the development stage is essential.

The digital signal multiprocessing system has a peak performance of 108 million multiply/accumulate/delay operations per second (eight DSP56001 processors, each sustaining 13.5 million such operations per second). For audio processing and music synthesis applications, this is roughly 100 times the performance of a medium-priced computer workstation and 10 times the speed of single-chip DSP coprocessors [Lowe 1989]. We present, in bottom-up fashion, the machine, its software, and its musical application in the piece Aïon.

The Machine

Physically, the machine is a stack of 8 processor boards interconnected with flat cable busses. Each board contains a Motorola DSP56001 [Motorola 1989], up to 4K x 24-bit words of fast dual-ported static RAM, a DB15 connector for serial I/O, and glue logic for an 8-bit bussed connection to a controlling host processor.

System design, circuit simulation, schematic capture, PCB layout, software design, prototyping, implementation and testing were done by two people on a single computing platform (Macintosh) on a modest budget in a few months. Eight identical double-sided printed circuit boards are used with no backplane. All the parts used are readily available (only the DSP chip is not second-sourced). The system is physically small: 9" x 9" x 6". Low-power CMOS parts were used wherever possible.

One of the eight processor boards is customized to be a master "move engine" as shown below. The move engine is the only processor that has access to all the external memory of the remaining seven processors. It is used to route signals between the processors and to monitor and control the activities of each processor.
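As a rough illustration of the idea only (the actual move engine runs DSP56001 assembly code with a memory-mapped view of the other boards' RAM; the table layout below is an assumption for this sketch), the routing step can be pictured in C as a table-driven block copy performed once per audio block:

    /* Conceptual sketch of the move engine's routing pass.  The Route table
       and its fields are illustrative assumptions, not the real memory map. */
    typedef struct {
        const int *src;   /* block of samples in the source processor's RAM  */
        int       *dst;   /* destination block in another processor's RAM    */
        int        len;   /* number of 24-bit words to move per audio block  */
    } Route;

    /* Once per audio block, walk the routing table and copy each signal
       block from its producer to its consumer. */
    static void move_engine_pass(const Route *table, int nroutes)
    {
        for (int r = 0; r < nroutes; r++)
            for (int i = 0; i < table[r].len; i++)
                table[r].dst[i] = table[r].src[i];
    }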

The move engine does not include a dual-ported RAM; we use the freed board space for a small "bulk" memory of 32K 24-bit words. There is no provision to add a large dynamic RAM array, as this adds considerably to the cost and complexity of the design [WaveFrame 1988]. The most important signal processing task precluded by this decision is reverberation. Sampling "synthesis" and waveguide models [Smith 1985] requiring long delay lines may also be inappropriate for this architecture.

Each processor is booted and controlled over an 8-bit host bus. A standard 8-bit NuBus board is used for connection to Macintosh workstations. It is easy to completely saturate the host processor with MIDI and control tasks.

The 56001's two serial ports are available externally through a protection and RFI network on a DB15 connector, using the same pinout as the NeXT computer's DSP port. Audio peripherals, such as A/D and D/A convertors, R-DAT recorders and CD players can be connected with this port. It is also possible to use the same connector to attach the NeXT machine's 56001 to the multiprocessor.

One of the two serial ports (SCI) can be used for standard RS232 or RS422 connections or MIDI. The other one (SSI) is designed for high speed connections to convertors. Each port is capable of transmitting and receiving 5 channels of 24-bit digital audio at 48kHz. The total channel capacity of the multiprocessor is therefore 40 inputs and 40 outputs (eight boards, five channels in each direction per board).

The Software

Signal processing code was developed using Motorola's assembler and simulator under Apple's MPW environment. A low-level driver handles communication between the host Macintosh and the digital signal processors. A HyperCard-based user interface management system called HyperDSP is used to develop interactive test harnesses for DSP code modules. It provides access to the driver's facilities in a high-level interpreted language, HyperTalk.

With its new editing facilities and real-time MIDI control of the multiprocessor, MacMix [Freed 1987] has become the primary development environment for the machine. The figure above on the right is a graphic representation of a resonance model of a piano note.

We are extending the Max language [Puckette 1988] to control the multiprocessor. We also hope that the NeXT Music Kit [Jaffe & Boynton 1989] can be adapted to the system.

Musical Application of the System to Aïon

Jean-Baptiste Barrière's piece "Aïon" is the first to make use of the system. It is scored for 2 percussion players, the multiprocessor, a DMP7 digital mixer, an AKAI S1000 sampler (for playing sampled attacks used as excitations), a TX816 (used for textures), and 2 Macintosh II computers (one to control the synthesis parameters, the other to control the whole system).

Central to the application of the multiprocessor to the piece is an analysis/resynthesis technique based on resonance models [Barrière et al. 1985; Potard et al. 1986]. This follows from earlier work on synthesis based on tuned filters to model acoustic resonances of physical systems [Wawrzynek et al. 1984; Rodet et al. 1984-2]. The basis of these methods is to extract filter parameters by analyzing recorded sounds and then to excite a bank of the resulting filters with synthesized, sampled or live excitations.
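As a minimal sketch of the synthesis side of this technique (written in C rather than DSP56001 assembly; the parameter names and tuning formulas are common textbook choices and are assumptions here, not necessarily the exact form of the analysis data), a bank of two-pole resonators driven by a shared excitation might look like this:

    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    typedef struct {
        double a1, a2;     /* feedback coefficients: 2r*cos(theta), -r*r   */
        double gain;       /* input gain derived from the partial amplitude */
        double y1, y2;     /* previous two output samples (filter state)    */
    } Resonator;

    /* Tune one resonator from analysis data: centre frequency and bandwidth
       in Hz, linear amplitude, at sampling rate fs. */
    static void resonator_set(Resonator *res, double freq, double bw,
                              double amp, double fs)
    {
        double r     = exp(-M_PI * bw / fs);       /* pole radius from bandwidth */
        double theta = 2.0 * M_PI * freq / fs;     /* pole angle from frequency  */
        res->a1   = 2.0 * r * cos(theta);
        res->a2   = -r * r;
        res->gain = amp * (1.0 - r);               /* rough amplitude scaling    */
        res->y1   = res->y2 = 0.0;
    }

    /* One output sample of the whole bank for one excitation sample:
       each resonator is a two-pole filter; their outputs are summed. */
    static double bank_tick(Resonator *bank, int n, double excitation)
    {
        double out = 0.0;
        for (int i = 0; i < n; i++) {
            double y = bank[i].gain * excitation
                     + bank[i].a1 * bank[i].y1
                     + bank[i].a2 * bank[i].y2;
            bank[i].y2 = bank[i].y1;
            bank[i].y1 = y;
            out += y;
        }
        return out;
    }

In use, resonator_set would be called once per partial from the analysis data, and bank_tick once per sample with whatever excitation (synthesized, sampled or live) is currently selected.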

This resonance modelling technique was implemented with FORMES [Rodet et al. 1984-1] on a VAX-780 controlling an implementation of CHANT [Rodet et al. 1984-2] using an FPS-100 array processor for synthesis. Later, the synthesis part alone was implemented in real time on the 4X digital synthesizer [Asta et al. 1980], and used by J.-B. Barrière in "Epigénèse" [Baisnée et al. 1986].

Compositional materials for the piece include a database of resonance models of percussion instruments and samples of their attacks. The processes of the piece are built from these materials with Esquisse [Baisnée et al. 1988].

Attacks were sampled and classified according to type: struck, plucked, bowed, strummed, breath, etc. These attacks are used as excitations for the models and permit unusual couplings: a breath sound exciting a gong or a timpani, and a plucked sound exciting a voice or a wind instrument.

The figure below illustrates how the resonance models are implemented on the digital signal multiprocessor:

Four of the DSP boards run identical code implementing a pair of two-pole digital resonator banks. The outputs of each bank pair are crossfaded to create interpolated timbres. The move engine routes and mixes the outputs of the resonator banks, spatializing them into 4 channels of 18-bit D/A conversion. An excitation engine synthesizes, routes and mixes excitation functions destined for the resonator banks. Excitations include spectrally and temporally shaped noise, jittered pulse trains and externally generated sounds from four 18-bit A/D convertors. The remaining boards can be used for further excitations (additive and sampled, for example) and for non-linear processing [McIntyre et al. 1983]. We are working on other modules based on generalizing features of physical models of instruments, such as non-linear elements, waveguides, linear filters, oscillators and FOFs [Rodet et al. 1984-2].
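The fragments below sketch, again in C and under assumed gain laws rather than as the actual DSP56001 code, the per-board crossfade between the two banks, a simple jittered pulse-train excitation, and the move engine's mix of one board's output into the four D/A channels:

    #include <stdlib.h>

    /* Crossfade the outputs of the two resonator banks on one board:
       mix = 0.0 gives bank A only, mix = 1.0 gives bank B only
       (linear crossfade law assumed for illustration). */
    static double crossfade(double bankA, double bankB, double mix)
    {
        return (1.0 - mix) * bankA + mix * bankB;
    }

    /* One sample of a jittered pulse-train excitation: emit a unit impulse
       roughly every `period` samples, with the interval randomised by up to
       `jitter` samples (assumed simple jitter law; jitter should be < period). */
    static double jittered_pulse(int *countdown, int period, int jitter)
    {
        if (--(*countdown) > 0)
            return 0.0;
        *countdown = period + (rand() % (2 * jitter + 1)) - jitter;
        return 1.0;
    }

    /* Move-engine style spatialization: accumulate one board's crossfaded
       output into the four-channel mix with per-channel gains. */
    static void spatialize(double sample, const double gains[4], double out[4])
    {
        for (int ch = 0; ch < 4; ch++)
            out[ch] += gains[ch] * sample;
    }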

Aïon

Aïon is the second principle of time in Greek philosophy, dealing with internal, psychological time, as opposed to Chronos, chronological time. The piece is concerned mainly with the relation of complex timbral structures to non-pulsed time.

Two percussionists play a variety of amplified percussion instruments, which are processed through the resonance models of other percussive and non-percussive instruments. This is an example of real-time cross synthesis. Both the trigger and the sound of the live percussion are processed. The trigger is used to start sequences of events as well as timbral interpolations within a single event, the sound itself being both amplified and mixed with its processed form and other complementary sources. The percussionists often act as generators: their gestures provoke processes that in turn either compete with or complement one another.

The musical situations evolve through intertwined forms of solo (either by each of the percussionists or by the computer performing a single complex timbral interpolation), duo (one instrument being filtered by the resonance models of others), trio (the computer acting as a more conventional soloist playing with the others), quartet (each instrument being filtered by the model of another), and finally a whole ensemble (the computer playing a full polyphony with or without the others). Note that the system described is used in two complementary ways: for processing and for orchestration of the percussion sounds.

Conclusion

We are building a small number of these DSP multiprocessor systems for computer music centers, individual composers and researchers. Though the machine has already proven useful in musical situations, we see it as transitional: a way of learning how to build components of a truly interactive computer music workstation [Wawrzynek and von Eicken 1989].

References

Asta V. et al. (1980), Il Sistema di sintesi digitale in tempo reale 4X, Automazione e Strumentazione 28(2): 119-133.

Baisnée P-F., Barrière J-B., Koechlin O., Puckette M., Rowe R. (1986), Real-time interaction between musicians and computer: performance utilizations of the 4X, Proceedings of 1986 International Computer Music Conference, La Haye, Computer Music Association, pp.237-240.

Baisnée P-F., Barrière J-B., Dalbavie M-A., Duthen J., Lindberg M., Potard Y., Saariaho K. (1988), Esquisse: a compositional environment, Proceedings of 1988 International Computer Music Conference, Köln, Computer Music Association.

Barrière J-B., Potard Y., Baisnée P-F. (1985), Models of Continuity between Synthesis and Processing for the Elaboration and Control of Timbre Structures, Proceedings of 1985 International Computer Music Conference, Vancouver, Computer Music Association, pp. 193-198.

Freed A. (1987), Recording, Mixing, and Signal Processing on a Personal Computer, Proceedings of the AES 5th International Conference on Music and Digital Technology, pp. 158-162.

Jaffe D., Boynton L. (1989), An Overview of the Sound and Music Kits for the NeXT Computer, Computer Music Journal 13(2) pp. 48-59.

Lowe W., Currie R. (1989), Digidesign's Sound Accelerator: Lessons Lived and Learned, Computer Music Journal 13(1), pp. 36-46.

McIntyre M. E., Schumacher R. T., Woodhouse J. (1983), On the Oscillations of Musical Instruments, Journal of the Acoustical Society of America 74(5).

Motorola (1989), DSP56000/DSP56001 Digital Signal Processor User's Manual, Motorola Literature Distribution, PO Box 20912, Phoenix, Arizona 85036.

Potard Y., Baisnée P-F., Barrière J-B. (1986), Experimenting with Models of Resonance Produced by a New Technique for the Analysis of Impulsive Sounds, Proceedings of 1986 International Computer Music Conference, La Haye, Computer Music Association, pp. 269-274.

Puckette M. (1988), The Patcher, Proceedings of the 14th International Computer Music Conference, Köln, 1988, Feedback Studio Verlag, available from Computer Music Association.

Rodet X., Cointe P. (1984), Formes: Composition and Scheduling of Processes, Computer Music Journal 8(3): 32-50.

Rodet X., Potard Y., Barrière J-B. (1984), The CHANT Project: From Synthesis of the Singing Voice to Synthesis in General, Computer Music Journal 8(3): 15-31.

Smith J.O. (1985) Waveguide Digital Filters, Internal Report, CCRMA, Dept of Music, Stanford University.

WaveFrame Corporation, (1988), The AudioFrame Digital Audio Workstation: DSP Developers Toolkit, WaveFrame, 2511 55th St, Boulder CO 80302.

Wawrzynek J.C., Tzu-Min Lin, Mead C. A., Liu H., Dyer L. (1984), A VLSI Approach to Sound Synthesis, Proceedings 1984 International Computer Music Conference, Paris, Computer Music Association.

Wawrzynek J. C., von Eicken T. (1989), Mimic, a Custom VLSI Parallel Processor for Musical Sound Synthesis, Proceedings of IFIP VLSI 89, Munich, FRG.