Research Data as Music: The Climate Symphony

Research Data as Music:
The Climate Symphony and Sonification of Radar, Seismic and Solar Wind

The Climate Symphony:
Rhythmic Techniques Applied to the Sonification
of Ice Core Data

Martin Quinn

Paper presented at the 4th Annual Conference of the International Environment Forum
organized jointly with the Social and Economic Development Seminar for the Americas
12-14 December 2000, Orlando, Florida, USA

[This paper is as presented at the Conference, and has not been subject to editorial review by the IEF]


[See the cited web site for an audio clip of the Climate Symphony; the clip is no longer available.]

Abstract

This paper describes the rhythmic generation of musical components as a core element in the sonification of data representing 110,000 years of earth's climate history. Ten data files from the Climate Change Research Center at the University of New Hampshire interact in musically interesting ways to create a compelling five-minute presentation of the data. The sonification is created by Design Rhythmics, software developed by the author, which is described in detail.

Keywords: Sonification, Design Rhythmics, ice core, rhythm, and climate.


1. Introduction

This paper presents a software system called Design Rhythmics and its application of rhythmic techniques to sonify the GISP2 glaciochemical ice core series [2,3]. The data spans 110,000 years and gives significant insight into earth's climate history over that period. Major naturally occurring astronomical and biosphere cycles were identified as a result of mathematical analysis. These cyclic files were sonified over a two-year period, with initial, interim and final results presented.

The initial sonification was presented at the 1995 Annual Science Honors dinner at the University of New Hampshire to about 100 people. Paul Mayewski, the guest speaker and director of the Climate Change Research Center, invited the presentation and was instrumental in obtaining the data, in helping the author understand it, and in encouraging the project. Interim results were presented a number of months later to Paul individually, and a year later to a graduate class in modelling at the University of New Hampshire, along with a questionnaire for feedback. A final version was played at the 1998 NASA Goddard Space Flight Center Technology Showcase and for a number of researchers there. It is also posted in RealAudio form on the web at http://www.nh.ultranet.com/~mwcquinn/icecore.html along with a brief description of the sonification. The project is an independent work by the author.

The author has a background as a professional musician specializing in drumming, composition and voice. He plays tablas, drums, electronic and other percussion, and keyboards and has recorded for many artists including Pat Martino, Darius Brubeck and Doah World Music Ensemble. He also currently performs throughout the world with his wife and daughter in two original theatrical music and dance productions. Other sonification research includes turning text, as in log files, directly into music for immediate symbolic comprehension. Future work in sight through sound is planned.

2. Design Rhythmics - Sonification Based on Rhythmic Principles

Design Rhythmics is a form of combinatoric software [1], designed especially to create sequences and patterns of things. While the initial application has focused on generating patterns that are musical in nature, the architecture is designed to be extensible to include graphic or other forms of output in the future.

The goal of Design Rhythmics was to create software which models the processes we use to create pleasing rhythmic sequences. It is designed to produce complex musical patterns that would be difficult, if not impossible, to play in a conventional manner. Individual events arising from these pattern processes can be triggered by data points and by other events (MIDI for instance).

The architecture of Design Rhythmics is based on the architecture of drumming. In drumming, a drummer plays a set of drums in a rhythmic way and the drums make sounds. In Design Rhythmics, computer input plays, in a rhythmic way, a set of drums that do things in the computer. In this system, the idea of the sound of a drum is expanded to be the meaning of a drum. The meaning of a drum determines what the drum will do when it is played. In musical terms this includes setting note values, dynamics (velocity), and timbre (program changes and channel assignments). In control terms, a drum might read from a file, change particular variables, and so on. Any task that can be accomplished in the computer is a potential meaning for a drum. When a drum is played, its task or action, i.e. its meaning, is performed.

Design Rhythmics is defined as a collection of named drumsets, each of which can contain one or many drums. Every object derives from the drum object, so drumsets are also drums. Each named drumset is analogous to a real drum having a certain depth to it. One drumset is considered the 'active' set at any one time.
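
To make this composite structure concrete, the following is a minimal sketch in Python. The original system was not written in Python, so every name here (Drum, Drumset, meaning, play) is an illustrative stand-in rather than the actual implementation:

    class Drum:
        """A drum whose 'meaning' is any action to perform when played."""
        def __init__(self, meaning):
            self.meaning = meaning            # a callable: the drum's task

        def play(self):
            return self.meaning()             # playing performs the meaning

    class Drumset(Drum):
        """A drumset is itself a drum, so sets can nest inside sets."""
        def __init__(self, name, drums):
            self.name = name
            self.drums = list(drums)          # ordered, as in a list

        def play(self):
            # Default behaviour here: play every member drum once.
            return [d.play() for d in self.drums]

    # Example meanings: emit a note, or change a control variable.
    note = Drum(lambda: "note C4")
    louder = Drum(lambda: "velocity up")
    kit = Drumset("kit", [note, Drumset("inner", [louder, note])])
    print(kit.play())           # ['note C4', ['velocity up', 'note C4']]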

There are three ways to play a drumset: sequentially, dynamically and concurrently. To play a drumset sequentially is simply to step through its drums (they are ordered as in a list), playing one with each access, in either a forward or reverse direction, with wrap-around or a change of direction allowed when the end of the set is reached. To play a drumset dynamically means to use some input value (perhaps from a data point) as an offset into the ordered set of drums, playing the drum selected. To play concurrently means to play all the drums in a drumset, or in specified drumsets, and have all the output manifest at one time, as in a chord of music.
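
The three access modes might be sketched as follows, under the same caveat that the names are hypothetical and drums are reduced to plain callables for brevity:

    class Drumset:
        """Hypothetical drumset with the three access modes."""
        def __init__(self, drums):
            self.drums = list(drums)
            self.pos = 0
            self.step = 1                     # +1 forward, -1 reverse

        def play_sequential(self, wrap=True):
            """Play the next drum in order; wrap or reverse at the ends."""
            result = self.drums[self.pos]()
            nxt = self.pos + self.step
            if 0 <= nxt < len(self.drums):
                self.pos = nxt
            elif wrap:
                self.pos = nxt % len(self.drums)
            else:
                self.step = -self.step        # change direction at the end
                self.pos += self.step
            return result

        def play_dynamic(self, value, lo, hi):
            """Offset into the ordered set by an input value; play that drum."""
            idx = int((value - lo) / (hi - lo) * (len(self.drums) - 1))
            return self.drums[idx]()

        def play_concurrent(self):
            """Play every drum, with all output manifesting at one time."""
            return [d() for d in self.drums]

    scale = Drumset([lambda n=n: f"pitch {n}" for n in (60, 62, 64, 65, 67)])
    print(scale.play_dynamic(0.9, 0.0, 1.0))  # high input -> "pitch 65"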

One practical use of dynamically accessing a drum list is to emulate a range of pitches on an instrument, selected based on some input. The pitches can be stated specifically or based on scales of various kinds. For example, using an input source that records the intensity of hitting an electronic sensor, pitches out of the 'scale' are selected and played based on the intensity of the hit: the softer the hit, the lower the pitch selected; the harder the hit, the higher the pitch. If the drums are notes, then pitches will sound; if they are dynamics or channel assignments, then presumably they will be attached to pitches that some other process has been set up to generate.
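
As a worked example of this intensity-to-pitch mapping, here is a small sketch; the scale contents and the function name are assumptions for illustration:

    # One octave of a C major scale as MIDI note numbers (the actual
    # scales used in the system are presumably configurable).
    C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]

    def pitch_for_hit(velocity, scale=C_MAJOR):
        """Map a hit intensity (MIDI velocity 1-127) onto the scale:
        soft hits select low pitches, hard hits select high pitches."""
        idx = (velocity - 1) * len(scale) // 127
        return scale[min(idx, len(scale) - 1)]

    print(pitch_for_hit(10))    # soft hit -> 60 (low C)
    print(pitch_for_hit(127))   # hard hit -> 72 (high C)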

Sequential patterns are created by triggering a drumset over and over again. Each drum will be stepped through one at a time, and when the end is reached the sequence starts over again. But what about rhythmic patterns, patterns that are more complicated and closer to musical rhythms? For this, a special kind of drum, which we term a Design Rhythmic drum, is needed.

A Design Rhythmic drum contains two other drums. One is the selection pattern drumset, which contains selection order drums; each selection order is an instruction to stay on the current drum for a given number of hits before moving on to the next drum. The other is the drumset being played. When the Design Rhythmic drum is played, it uses the next selection order drum to determine how many times it will stay on the next meaning drum, playing it with each hit. Complex and beautiful patterns are created from the interplay of these two types of drumsets, especially when drums are themselves drumsets.
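
A sketch of this mechanism, again with hypothetical names (DesignRhythmicDrum, play), shows how a selection pattern of [3, 2, 1] over the meaning drums C, E, G yields a repeating 12-hit pattern:

    import itertools

    class DesignRhythmicDrum:
        """Hypothetical sketch: a selection pattern dictates how many
        hits to stay on each meaning drum before moving on."""
        def __init__(self, selection_pattern, meaning_drums):
            self.selections = itertools.cycle(selection_pattern)
            self.meanings = itertools.cycle(meaning_drums)
            self.remaining = 0                # hits left on the current drum
            self.current = None

        def play(self):
            if self.remaining == 0:                     # order exhausted:
                self.remaining = next(self.selections)  # take the next order
                self.current = next(self.meanings)      # move to the next drum
            self.remaining -= 1
            return self.current()             # play the current meaning drum

    pattern = DesignRhythmicDrum([3, 2, 1],
                                 [lambda: "C", lambda: "E", lambda: "G"])
    print("".join(pattern.play() for _ in range(12)))   # -> CCCEEGCCCEEG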

Because everything is considered a drum that is played, the complexity of the system is limited and its flexibility increased. A drum is a drum, a drumset is a drum and a Design Rhythmic is a drum. By using this approach the user can define a set of drums to be as simple as a normal drumset, or as complicated as drumsets within Design Rhythmics within Design Rhythmics. Because of the permutations that evolve during the computation of a Design Rhythmic, extremely long and interesting musical patterns can be created. Of course, the resulting qualities of the patterns will depend heavily on the choices of selection patterns and drum lists. Whether they are musical or not will depend on their composition.

The final output of Design Rhythmics is a MIDI datastream [4] used to produce music in external or internal synthesisers. The datastream can be built up by independent rhythmic processes, allowing integrated polyrhythms of pitch, timbre, and dynamics to be created.
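
As an illustration of independent processes merging into one stream, the sketch below cycles a 3-step pitch process against a 4-step dynamics process, producing note-on messages whose combined pattern repeats only every 12 steps. The note-on byte layout (status 0x90) follows the MIDI specification; everything else is an assumption for illustration:

    import itertools

    pitches = itertools.cycle([60, 64, 67])          # 3-step pitch process
    velocities = itertools.cycle([100, 70, 40, 70])  # 4-step dynamics process

    stream = []
    for _ in range(12):                # pattern repeats after lcm(3,4) steps
        note, vel = next(pitches), next(velocities)
        stream.append(bytes([0x90, note, vel]))      # MIDI note-on, channel 1

    print([msg.hex() for msg in stream])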

Design Rhythmics was developed on a PC under DOS and Windows 3.1 and NT. It is currently being ported to Java. A commercial implementation is expected to be available at some point.

3. The Domains

Before attempting any conversion or mapping from the domain of data to the domain of music, it is helpful to examine each domain, identifying the elements that describe it and comparing the possible translation mechanisms available to us. We will examine the domain of data and the intermediate domain of Design Rhythmics. (In the final paper we also discuss MIDI and the domain of sound.) Each plays a part in the problem, and each is important to the decisions made during translation.

3.1 Mapping the Data Domain into Design Rhythmics

In the domain of data we gather data by some means such as sensors, chemical analysis, mechanical measurement or the like. This data relates to some phenomenon that we desire to measure or quantify. The measurement allows us to conclude various facts about the behaviours or qualities of things. There are times when certain measurements are related and other times when they are not. Measurements tend to have low and high values and thus exhibit a range of values that can be viewed and manipulated to produce statistics about the set of numbers. We can compare our results to other sets of measurements and thereby better understand our current measurements in the light of historical data. We commonly see our data in visual forms, either as numbers composed of individual numerals or as graphs produced using widely available and generally understood techniques.

In order to present data in a musical form, we must make decisions about how to map the data values into the domain of Design Rhythmics. What are the drumsets that will be triggered by what input? Some of the ways of mapping data values into Design Rhythmics are listed below. Many of these techniques were used to produce the musical translations.
* Break a data file's values up into equal ranges and have each range trigger a different Design Rhythmic musical process.
* Map data values into a scale of note value drums - the higher the data value, the higher the pitch.
* Map data values into a scale of Design Rhythmics - the higher the data value, the higher the process triggered.
* Use one data file's values as the transposition amount for all note values of other files.
* Have a data file simply trigger events from a Design Rhythmic using the value in the data file as the volume for the notes generated.
* Read a number of files, using their values to trigger independent Design Rhythmics, but gather all the results together and output them at the same time.
* Read a file, triggering a drum for each data value.
* Read two files, using one file to determine the sound of the notes and the other to determine the notes.

These can be used to specify sound effects, the pitch of a sound, the timbre of a sound, the harmony of a set of sounds, the transposition of a sound or group of sounds, the timing of the music, the dynamics of the music, the style of the music, the patterns in the music, the selection of patterns in the music, the selection of sounds, and the rhythmic sequencing and intertwining of themes in the music. The first technique in the list above is sketched below.
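
As a sketch of that first technique, breaking a data file's values into equal ranges with each range triggering a different musical process; the function and process names here are hypothetical:

    def range_trigger(values, processes):
        """Split the value range into equal bands and trigger the
        process whose band each data point falls in."""
        lo, hi = min(values), max(values)
        width = (hi - lo) / len(processes) or 1.0    # guard: flat data
        for v in values:
            band = min(int((v - lo) / width), len(processes) - 1)
            processes[band](v)

    # Hypothetical musical processes, reduced to prints for illustration.
    calm  = lambda v: print(f"{v:5.2f} -> quiet marimba pattern")
    dense = lambda v: print(f"{v:5.2f} -> dense drum pattern")
    range_trigger([0.10, 0.90, 0.40, 0.75], [calm, dense])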

4. Goals of Sound Design in Sonification

In the musical domain, many aspects of music can be targets or goals of the musical visualization. What effect do we want the music to have? Does it remind one of a place? Does it fill us with joy, with warmth, or sadness? Does it sound pleasant or dissonant? Is it playful? Are the notes high pitched or low pitched? Do we wish to have certain sounds relate to certain actions? How these effects relate to the presentation of data is open to much experimentation and discussion.

One goal in the sound representation of data should be to create music in which the listener can understand the data in new ways. If desired, the listener should be able to extract a particular data file out of the sound mix to listen to, or focus on the overall effect, and it is important that they be able to switch quickly between these ways of listening. In any case, they should not have to work very hard to hear the different files being played, nor have to be reminded periodically which file is which sound. An introduction (aural key) or a visual representation of the mapping of data to sound is probably necessary to give the audience a clue as to what they are listening for, unless the sounds have obvious meanings.

A sound acquires meaning and recognition in a number of ways. Natural sounds like those of animals or the sound of fire, running water, rain and the like are recognizable and have obvious meanings. They are linked to physical objects and conditions of those objects. Sound effects fall into this category. They convey meaning directly. Place is yet another way sounds become recognizable. Certain sounds happen only in certain places - an auction, a rodeo, an amusement park, the school bell, a courtroom, church bells, and so forth. Familiar kinds of music also remind us of certain places or emotions. Newly designed sounds rely on their musical effect to create the desired meaning, effect or emotion.

The following list summarizes the goals of an effective music sonification:

Elements of Effective Musical Visualization
1. The sound of a data file can be easily distinguished from the sounds of other data files.
2. The data is easily extracted out of the sound mix by the attention of the listener.
3. The changes in the data are perceptible.
4. Only a brief explanation is required before listening.
5. The desired emotional effect is created.
6. The music is interesting and keeps the listener's attention but not at the expense of data correctness.

5. The Sonification and Presentation

The sonifications are briefly described in the following table. In the final paper submission, diagrams and a discussion evaluating the merits of each sonification will accompany this table. The sonification may be heard by accessing the following web page: http://www.nh.ultranet.com/~mwcquinn/icecore.html. In the presentation, part of the explanation of the sonification can be demonstrated live, with the author playing the role of the data and triggering the various sonifications separately from one another. This may be done by playing electronic drum pads or small triggers, or directly from the dynamics of the author's voice.

The sonifications are built around a 5-octave C scale starting about one octave below middle C. This scale was chosen so that the music would be pleasant and so that pitches triggered by different files would make harmonic sense together.
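
Assuming the convention that middle C is MIDI note 60, this pitch palette can be sketched as follows (names are illustrative):

    # A five-octave C major scale from C3 (MIDI 48, one octave below
    # middle C) up to C8 (MIDI 108).
    MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets of a major scale

    scale = [48 + 12 * octave + step
             for octave in range(5)
             for step in MAJOR_STEPS] + [48 + 60]    # cap with the top C

    print(len(scale), scale[0], scale[-1])           # 36 notes, MIDI 48..108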

Table 1: The Ice Core Files and Their Corresponding Sonification

Data File Periodicity (years; K = thousand) | Data Meaning | Music
70K | Elliptical orbit of the earth | Transposes all music from 0 to 7 semitones.
41K | Obliquity, or change in the tilt of the earth | Five sets of 3 notes in 4th and 5th intervals. Each set is played as an arpeggio up and down. The sets are treated as a scale: lower sets play when the earth has less tilt, higher sets when it has more tilt.
1460 | Ocean circulation | The ocean effect selects one of a scale of three instruments that the 41K arpeggios play through.
23K | Precession, or wobble (which hemisphere is closest to the sun) | Lower two octaves of a C scale; lower values select lower notes. Played as a pipe organ.
11K | Sub-precession | Higher two octaves of a C scale; lower values play lower notes. Same tone as 23K.
6300 | Ice sheet movement | A complex pattern of tom-toms and congas plays when it is warmer and values are lower; a simpler pattern of agogo bells plays when it is cooler and values are higher.
3200 | Subharmonic of 6300 | Closed hi-hat whose volume is controlled by the data values.
2200 | Solar variability | Marimba patterns: higher note patterns when warmer, lower note patterns when colder, separated by an octave or more.
Volcanic | Volcanic activity | Timpani express the explosive nature of volcanoes: the lower and louder the drum, the bigger the eruption. Crash cymbals accompany the drum hits; their volume is controlled by the data values, so louder crashes mean more activity.
Time (110,000 BC to 1975) | Timeline | String section rising from the very lowest note of the 5-octave C scale up to the very highest note of that scale.

Marty Quinn
Design Rhythmics Sonification Research Lab
92 High Rd
Lee, NH USA 03824
(603) 659-5239
mwcquinn@nh.ultranet.com
http://www.nh.ultranet.com/~mwcquinn/icecore.html


6. References

[1] Jackson B, Thoro D. Applied Combinatorics with Problem Solving. Addison-Wesley, 1990.
[2] Mayewski P. Glacial Research Group Newsletter. Durham, NH: University of New Hampshire, 1995.
[3] Mayewski P, Meeker L, Twickler M, et al. Major Features and Forcings of High Latitude Northern Hemisphere Atmospheric Circulation Over the Last 110,000 Years. Glacier Research Group, Climate Change Research Center, University of New Hampshire, 1996.
[4] Norman J. An Introduction to MIDI. New York, NY: MIDI Press, 1984.


