Currents: Movement 1 (2020)

Website Image.jpg

Currents is a musical work that depicts NOAA’s database of the last forty years of billion-dollar weather disasters in the United States. The data is translated into music through sonification, which can be described as “the process of mapping data with some other meaning into sound” (John Luther Adams). This is achieved using JavaScript programming (Node.js) inside the software Max/MSP to access and parse large data structures (JSON files and APIs).

Below is a breakdown of how Movement 1 works. For a more complete look at all six movements of Currents, click here.





Initial Programming Experiments

Before creating Movement 1, I spent some time experimenting with an API (Application Programming Interface) from the website OpenWeather to acquire live weather data from any city in the world. The sonification starts with the data shown below, which is an example of what the OpenWeather API returns:

Weather API 2.jpg
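In plain text, the relevant part of that JSON response looks roughly like this (abridged, with made-up values; temperature, pressure, and wind speed are the fields the sonification uses):

{
  "name": "Chicago",
  "main": { "temp": 21.4, "pressure": 1013, "humidity": 62 },
  "wind": { "speed": 4.6, "deg": 250 }
}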


This data is accessed using JavaScript and the Node.js runtime. The code can be seen below:

Weather API 3.jpg
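In simplified form, the request boils down to something like this (the API key and city are placeholders):

const https = require('https');

const API_KEY = 'YOUR_API_KEY'; // placeholder
const city = 'Chicago';
const url = `https://api.openweathermap.org/data/2.5/weather?q=${city}&units=metric&appid=${API_KEY}`;

// Request the current weather and parse the JSON response.
https.get(url, (res) => {
  let body = '';
  res.on('data', (chunk) => (body += chunk));
  res.on('end', () => {
    const weather = JSON.parse(body);
    // The three values that drive the sonification:
    console.log(weather.main.temp, weather.main.pressure, weather.wind.speed);
  });
});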

This data is sent into Max/MSP, a visual programming language designed for creating music and live signal processing. Max/MSP uses Node.js to parse the data:

Weather API 4.jpg
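Inside Max/MSP, the script runs in a [node.script] object, and the max-api module carries values between the patch and Node.js. A stripped-down version of that bridge looks something like this (the handler and outlet names are illustrative):

// Runs inside a [node.script] object in a Max patch.
const maxApi = require('max-api');
const https = require('https');

const API_KEY = 'YOUR_API_KEY'; // placeholder

// When the patch sends "city <name>", fetch that city's weather and
// send temperature, pressure, and wind speed out as a Max list.
maxApi.addHandler('city', (city) => {
  const url = `https://api.openweathermap.org/data/2.5/weather?q=${encodeURIComponent(city)}&units=metric&appid=${API_KEY}`;
  https.get(url, (res) => {
    let body = '';
    res.on('data', (chunk) => (body += chunk));
    res.on('end', () => {
      const w = JSON.parse(body);
      maxApi.outlet('weather', w.main.temp, w.main.pressure, w.wind.speed);
    });
  });
});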

Now that the live weather data has been accessed and parsed, Max/MSP is used to sonify the data (i.e., turn the data into sound and music). It sends the data into a series of FM (frequency modulation) synthesizers to create the musical sounds, as well as into graphs that display the data visually.
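In simplified terms, each incoming value is rescaled into the ranges an FM voice expects, something like this (the ranges and parameter names here are placeholders, not the values used in the actual patch):

// Rescale one incoming value into the ranges an FM voice expects.
// Ranges and parameter names are placeholders.
function scale(value, inMin, inMax, outMin, outMax) {
  const t = (value - inMin) / (inMax - inMin);
  return outMin + t * (outMax - outMin);
}

// e.g. temperature in °C → carrier frequency and modulation index
function fmParamsFromTemperature(tempC) {
  return {
    carrierHz: scale(tempC, -30, 45, 110, 880), // pitch rises with temperature
    modRatio: 2,                                // fixed carrier:modulator ratio
    modIndex: scale(tempC, -30, 45, 0.5, 6),    // brighter timbre when hotter
  };
}

console.log(fmParamsFromTemperature(21.4));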

Below is the final product of the live weather data sonification in Max/MSP. It is a crowd-sourced musical example: I asked people on Facebook and Instagram to name a city anywhere in the world for the program to use. Notice that you can hear three streams of data in the musical sounds: temperature (rising and falling pitch), pressure (a low, rumbling pitch best heard with headphones), and wind speed (a pulsing rhythm that gets faster in higher winds).








Movement 1

Movement 1 uses the same basic process as above, but with a few key differences. Instead of accessing an API for live weather data, I chose to work with a much larger block of data: a forty-year record of billion-dollar weather events that have occurred in the United States. This data was taken from a JSON file on the NOAA (National Oceanic and Atmospheric Administration) website; a sample can be seen below.

Sonification 3.jpg
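Reducing those records to yearly streams works roughly like this (the file name and field names below are simplified stand-ins, not the exact NOAA schema):

const fs = require('fs');

// Field names (year, costBillions, deaths) are hypothetical stand-ins.
const events = JSON.parse(fs.readFileSync('billion_dollar_events.json', 'utf8'));

// Collapse individual events into three yearly streams:
// number of disasters, total cost, and total deaths.
const byYear = {};
for (const e of events) {
  const y = e.year;
  byYear[y] = byYear[y] || { count: 0, cost: 0, deaths: 0 };
  byYear[y].count += 1;
  byYear[y].cost += e.costBillions;
  byYear[y].deaths += e.deaths;
}

console.log(byYear);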

Using Node.js, I was able to parse this JSON file within Max/MSP just as before. Sonifying the data proved to be more challenging, however, and I had to come up with a few creative solutions so the music could successfully translate the data. The first issue was caused by outliers in the data set, which made some of the sonified musical elements either too uniform (undetectable changes in musical pitch) or too extreme (pitches played outside the range of human hearing). Using Max/MSP, I created “parameter bands” that scale the data before translating it into sound frequencies. A model is shown below.

Sonification 1.jpg
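In code terms, a parameter band is essentially a scale-and-clamp: values are mapped into a fixed frequency range, and outliers are pinned to the edges of that range instead of drifting out of hearing. The band limits below are placeholder values:

// A "parameter band": map a data value into a fixed frequency range and
// clamp outliers to the band's edges so every value stays audible.
// The band limits below are placeholders.
function band(value, dataMin, dataMax, freqMin, freqMax) {
  const t = (value - dataMin) / (dataMax - dataMin);
  const clamped = Math.min(Math.max(t, 0), 1); // pin outliers to the edges
  return freqMin + clamped * (freqMax - freqMin);
}

// e.g. a yearly damage total (in billions) → a pitch between 100 Hz and 2000 Hz
console.log(band(45, 0, 60, 100, 2000));  // in range: maps proportionally
console.log(band(150, 0, 60, 100, 2000)); // outlier year: pinned to 2000 Hz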

The second issue was purely an artistic/creative one. Although the sounds faithfully represented the data, they were not compelling as a musical piece. In other words, the music was boring! Using random number generators to alter the rhythm and timing of the musical notes, I was able to create a system that produces interesting musical phrases while still communicating the nature of the data itself. A model below shows the evolution through three different versions.

Sonification 2.jpg
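The idea, roughly sketched (with placeholder timing ranges): each data point still determines the pitch, but the spacing and duration of the notes are drawn from a random range, so the phrases breathe instead of ticking along evenly.

// Turn a stream of pitches into note events whose spacing and duration are
// randomized within a band, so the phrasing varies while the pitches still
// follow the data. Timing ranges are placeholders.
function phrase(pitchesHz, minGapMs = 150, maxGapMs = 600) {
  let timeMs = 0;
  return pitchesHz.map((hz) => {
    const gap = minGapMs + Math.random() * (maxGapMs - minGapMs);
    const note = { startMs: Math.round(timeMs), durationMs: Math.round(gap * 0.8), pitchHz: hz };
    timeMs += gap;
    return note;
  });
}

console.log(phrase([220, 247, 262, 294]));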


Once these components were in place, the final version of Movement 1 came together. The eight-minute work parses three streams of data (number of disasters per year, cost, and deaths) from the billion-dollar weather disasters that have occurred in the United States over the last forty years. See the final work below.