Audio Portfolio
My name is Andy Jarema and I am a musician, educator, and composer in the metro Detroit area. I am thrilled you are considering me for a position. Below is a small portfolio of my work that demonstrates my skills as a writer, producer, and musician.
As part of my Master’s Degree in Music Composition (Wayne State University, 2017-2020), I spent time researching the music of composer Andrew Norman and how it utilizes concepts from video game narratives. Below is an excerpt from my research paper (the full paper can be found here) that I later adapted into a presentation I gave at several graduate music conferences, including the University of Toronto (2018) and the University of Kansas-Lawrence (2019). This project demonstrates my commitment to detailed academic/scholarly research in the music field, especially with regard to twenty-first century composers and the future of classical music.
“Symphony orchestras and video games are two mediums seeking to find their footing in the twenty-first century. The orchestra is an artifact carrying the weight of hundreds of years of tradition, trying to find relevance as a performance medium amongst the musical innovations of the twenty-first century. This presents difficulties to twenty-first century composers such as Andrew Norman, who remarks of his personal struggle to find “how my voice and my creative interests might map onto the orchestra world. That side of our field can be very conservative, very much tied to the 19th century canon” (Norman). Video games, on the other hand, are a relatively new medium attempting to position itself as meaningful, or perhaps even artful, despite its associations with “empty spectacle and cynical attempts at cross-platform marketing, both of which are presumed to take precedence over character and traditional storytelling” (Brooker). Norman’s Play (2013), a three-movement, forty-five-minute work for symphony orchestra, embodies both of these mediums. An analysis of Play illustrates that elements from video games, such as various narrative structures and agents of control, are integral to Norman’s creative process as a composer. By utilizing these video game elements within the construct of the symphony orchestra, Norman demonstrates how a twenty-first century composer can place the orchestra in a fresh context to provide insightful musical and social commentary.”
When COVID-19 shut down schools in the spring of 2020, I had to think quickly on my feet to reinvent my music classroom for virtual lessons with my 500+ elementary music students in grades K-5. I settled on a podcast format, using the only audio equipment I had available in my house (an SM58 microphone with, unfortunately, no pop screen to catch my plosives) to record everything in my living room. The excerpt below is from a podcast episode designed to teach my young students about the composer J.S. Bach. It functions as an audio “listening guide” that outlines the structure of the piece, including a movement activity that teaches my students the concept of counterpoint through the movement of their bodies. The beginning of the excerpt features a tag I composed/recorded as a nod to the NPR shows I love, and much of the sound editing/design is heavily indebted to Nadia Sirota’s work on the Meet the Composer podcast. This project highlights my ability to be an engaging storyteller with diverse kinds of audiences, including the young, future listeners of classical music. More of these podcast episodes can be found here.
When COVID shut down the theater world in 2020, my wife Danielle produced a podcast version of Shakespeare’s Macbeth with her high school students instead of a stage production. I was responsible for the sound design of the podcast, which included editing all of the remotely recorded audio from 10 different cast members, adding subtle effects to the audio (like the L/R panning on the three witches you hear in the beginning), and processing my own recorded samples to establish a ghostly sound design. The opening sound of a bubbling cauldron was created by recording the tea kettle on my kitchen counter and then layering copies of the sound a few times. The sudden sound at 0:06 is a propane tank being struck with a hammer, with severe pitch/time shifting effects applied. The eerie “possessed” sound at 0:16 was made using a custom piece of Max/MSP software that can randomly transform and process any vocals or speaking parts. This project showcases how I often utilize my abilities as a musician to communicate narrative through sound.
My most recent work as a music composer seeks to tell the narrative of weather/climate data through sound and music. This is accomplished through the process of sonification, which John Luther Adams describes as “the process of mapping data with some other meaning into sound.” Using the software Max/MSP, I designed a computer system that can pull live weather data from any city in the world and transform it into music (temperature = rising/falling pitch, wind speed = tempo, pressure = low drone pitch). I then asked people on social media (Facebook, Instagram) to help me crowdsource a melodic line of music by choosing their favorite city to pull weather data from. The results can be seen in the video below. This project demonstrates my organizing power on social media to involve everyday people in conversations surrounding music and real world issues.
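To give a sense of how this kind of mapping works, here is a minimal sketch in Python (the actual system was built in Max/MSP; the value ranges, the linear scaling, and the MIDI note numbers below are illustrative assumptions, not the parameters of the original patch):

```python
# Illustrative sketch of a weather-to-music sonification mapping.
# The real system was built in Max/MSP; all ranges below are
# assumptions chosen for demonstration only.

def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi],
    clamping to the input range first."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def weather_to_music(temp_c, wind_kph, pressure_hpa):
    """Map one weather reading to musical parameters."""
    return {
        # temperature drives the rising/falling melodic pitch (MIDI notes)
        "pitch": round(scale(temp_c, -20, 40, 48, 84)),
        # wind speed drives the tempo, in beats per minute
        "tempo_bpm": round(scale(wind_kph, 0, 100, 60, 180)),
        # barometric pressure drives a low drone pitch (MIDI notes)
        "drone": round(scale(pressure_hpa, 980, 1040, 24, 36)),
    }

# A mild spring day: 20 °C, light wind, average pressure
print(weather_to_music(20, 15, 1013))
```

Each crowdsourced city would contribute one such reading, so a sequence of cities yields a sequence of pitches over a shifting drone, which is essentially the melodic line assembled in the video.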