Mapping Strategies in real-time computer music

Marcelo M. Wanderley
Schulich School of Music - McGill University
Montreal, July 2002.

Traditionally, the main research directions in gesturally controlled real-time computer music have been the design of novel input devices (Paradiso 1997) and the development of new sound synthesis algorithms (Borin, De Poli and Sarti 1997).

New input devices - also known as (gestural or musical) controllers, control surfaces, or (hardware) interfaces - currently allow the acquisition of virtually all performer gestures and movements (Mulder 1994). At the same time, existing synthesis algorithms can create an essentially unlimited range of sounds in real time on affordable hardware.

But once gestural (or performance) variables - the signals resulting from performer movements - are available in digital form, one needs to devise methods to relate them to the available synthesis variables - the inputs of the sound-generating system. This relationship is commonly known in computer music as (parameter) mapping.

The three parts mentioned above - the gestural controller, the synthesis algorithm and the mapping strategies between them - constitute what can be called a digital musical instrument (DMI) (Wanderley 2001). But with research focused on devices and synthesis algorithms, DMIs with simple one-to-one mappings between gestural variables and synthesis variables have been the rule.
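To make this distinction concrete, the short Python sketch below contrasts a simple one-to-one mapping with a many-to-one mapping of the same gestural variables onto synthesis inputs. It is only a minimal illustration: the variable names (pressure, x_position, breath, amplitude, pitch, brightness) and the weighting factors are hypothetical and are not taken from any of the systems or works cited here.

    # Illustrative sketch: relating gestural variables to synthesis variables.
    # All names and weights are hypothetical; a real DMI would read the
    # gestural variables from a controller and send the synthesis variables
    # to a sound-generating engine (e.g. over MIDI or OSC).

    def one_to_one(gesture):
        # Each gestural variable drives exactly one synthesis variable.
        return {
            "amplitude":  gesture["pressure"],
            "pitch":      gesture["x_position"],
            "brightness": gesture["breath"],
        }

    def many_to_one(gesture):
        # Several gestural variables are combined to drive each synthesis
        # variable, one of the more complex strategies discussed in this issue.
        return {
            "amplitude":  0.7 * gesture["pressure"] + 0.3 * gesture["breath"],
            "pitch":      gesture["x_position"],
            "brightness": 0.5 * (gesture["breath"] + gesture["pressure"]),
        }

    example_gesture = {"pressure": 0.8, "x_position": 0.25, "breath": 0.6}
    print(one_to_one(example_gesture))
    print(many_to_one(example_gesture))

In an actual instrument the gesture values would of course arrive as a continuous stream of sensor data, and the mapping layer would be evaluated at control rate before the results reach the synthesis engine.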

Recently, a trend has emerged to broaden this scope and to consider the intrinsic role of different mapping strategies, including their influence on instrument design (Bowler, Purvis, Manning and Bailey 1990; Winkler 1995; Garnett and Goudeseune 1999; Hunt, Wanderley and Kirk 2000).1)

In fact, mapping is often viewed from two different perspectives: (a) as a constituent part of a DMI, as explained above, or (b) as part of a composition. In both cases gestural variables are mapped to sound synthesis variables, but in the first case mapping strategies are determinants of instrument expressivity (Rovan, Wanderley, Dubnov and Depalle 1997; Favilla 1997; Hunt and Kirk 2000), whereas in the second they are the essence of the composition itself (Doornbusch 2000). Efforts are also being made to bridge these two aspects into a higher-level view of mapping as the key to system design (Oppenheim 2001).

In this issue of Organised Sound, our goal is to analyse in detail the various approaches to devising mapping strategies, both in the design of new digital musical instruments and as part of interactive music systems.

Questions addressed in this issue include:

  • Is mapping part of a composition, part of an instrument, or both?
  • How can one devise mapping strategies for these different systems? Are there models of mapping strategies available?
  • Should mapping be explicitly defined or devised using methods such as neural networks? Should it be static or dynamic? Simple or complex? Intuitive or learned?
  • What is the influence of the choice of mapping strategies on the expressive capabilities of new instruments? Is it simply an aesthetic choice?

The ten original contributions that follow focus on the role of mapping and on the design of mapping strategies, providing an overview of several of the main existing developments concerning mapping in computer music:

  • Four articles provide a detailed review of existing work in which the definition of mapping is analysed in depth: Goudeseune; Hunt and Wanderley; Fels, Gadd and Mulder; and Arfib, Couturier, Kessous and Verfaille. It is interesting to note the richness of the points of view - sometimes in opposition to one another - which reflects the substantial interest in this subject.
  • Four other articles, by Myatt, Ng, Nichols and Burtner, describe design, implementation and performance issues related to novel interactive systems and digital musical instruments in which mapping was carefully devised as an essential feature.
  • Doornbusch presents an interesting discussion of mapping as a compositional feature, in which several composers describe their approaches to mapping in their own works.
  • Levitin, McAdams and Adams present a conceptual framework for describing musical tones and propose a new scheme for characterising a musical control space. Applications of this research include laying foundations for the design of mappings and instruments that increase creativity and expression in computer music performance.

This edition of Organised Sound constitutes, to the best of our knowledge, the first editorial attempt to explicitly address the several questions related to mapping strategies in real-time computer music. Due to space constraints, however, it cannot settle the discussion on parameter mapping.

We therefore invite the reader to participate further in the ICMA/EMF Working Group on Interactive Systems and Instrument Design in Music (ISIDM), where one interest group discusses the different aspects of mapping (Hunt and Wanderley 2000-2002). There the reader will find links to many of the papers referenced in the contributions that follow, as well as texts and on-line discussions of several topics related to parameter mapping, including its importance (Hunt, Wanderley and Paradis 2002) and its limitations (Chadabe 2002).

References:

  • Bowler, I., Purvis, A., Manning, P., and Bailey, N. 1990. On Mapping N Articulation onto M Synthesiser-Control Parameters. In Proc. of the 1990 International Computer Music Conference. San Francisco: International Computer Music Association, pp. 181-184.
  • Borin, G., De Poli, G., and Sarti, A. 1997. Musical Signal Synthesis. In C. Roads, S. Travis Pope, A. Piccialli, and G. De Poli (eds.) Musical Signal Processing. Lisse: Swets & Zeitlinger, pp. 5-30.
  • Chadabe, J. 2002. The Limitations of Mapping as a Structural Descriptive in Electronic Instruments. In Proc. of the 2002 International Conference on New Interfaces for Musical Expression (NIME02). Dublin, Ireland. Keynote address.
  • Doornbusch, P. 2000. Personal Communication, 25 January 2000.
  • Favilla, S. 1997. Real-time Control of Synthesis Parameters for LightHarp MIDI. In Proc. of the 1997 ACMA Conference. Auckland, New Zealand.
  • Garnett, G., and Goudeseune, C. 1999. Performance Factors in Control of High-Dimensional Spaces. In Proc. of the 1999 International Computer Music Conference. San Francisco: International Computer Music Association, pp. 268-271.
  • Goudeseune, C. 2001. Composing with Parameters for Synthetic Instruments. PhD thesis, University of Illinois at Urbana-Champaign.
  • Hunt, A. 1999. Radical User Interfaces for Real-time Musical Control. DPhil thesis, University of York, UK.
  • Hunt, A., and Wanderley, M. (eds.) 2000-2002. Mapping of Control Variables to Synthesis Variables. Interactive Systems and Instrument Design in Music Working Group. Website: http://www.igmusic.org/
  • Hunt, A., and Kirk, R. 2000. Mapping Strategies for Musical Performance. In M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: Ircam - Centre Pompidou.
  • Hunt, A., Wanderley, M., and Kirk, R. 2000. Towards a Model for Instrumental Mapping in Expert Musical Interaction. In Proc. of the 2000 International Computer Music Conference. San Francisco: International Computer Music Association, pp. 209-212.
  • Hunt, A., Wanderley, M., and Paradis, M. 2002. The Importance of Parameter Mapping in Electronic Instrument Design. In Proc. of the 2002 International Conference on New Interfaces for Musical Expression (NIME02). Dublin, Ireland, pp. 149-154.
  • Mulder, A. 1994. Virtual Musical Instruments: Accessing the Sound Synthesis Universe as a Performer. In Proc. of the First Brazilian Symposium on Computer Music, pp. 243-250.
  • Oppenheim, D. 2001. Personal Communication, 28 December 2001.
  • Paradiso, J. 1997. Electronic Music: New Ways to Play. IEEE Spectrum 34(12), pp. 18-30.
  • Rovan, J., Wanderley, M., Dubnov, S., and Depalle, P. 1997. Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance. In Proc. of the AIMI International Workshop Kansei, The Technology of Emotion. Genoa: Associazione di Informatica Musicale Italiana, October 3-4, 1997, pp. 68-73.
  • Wanderley, M. 2001. Performer-Instrument Interaction: Applications to Gestural Control of Music. PhD thesis. Paris, France: University Pierre & Marie Curie - Paris VI.
  • Winkler, T. 1995. Making Motion Musical: Gestural Mapping Strategies for Interactive Computer Music. In Proc. of the 1995 International Computer Music Conference. San Francisco: International Computer Music Association, pp. 261-264.
1) At least three PhD theses have dealt substantially with mapping strategies: Hunt 1999, Goudeseune 2001 and Wanderley 2001.