While working on my research project on data sonification, Sounding Numbers (2020-2022), I decided to explore a participatory use for the sonifications I was creating. I wanted this outcome to be an immersive, welcoming, and open space for the participants’ perception, allowing them to imagine realities and emotionally experience the scientific theories I worked with.

In this work, I addressed ocean contamination, deforestation, neutron stars, and spatiality, for a critical perspective on LOVE. I chose to use technology to extend our world of sensory perception, exploring motion and heartbeat sensors, MIDI controllers, CO2 sensors, and transducers. Through this technology, the participants can change and compose soundscapes by moving around a room, using MIDI controllers, or simply by sending their heartbeats to a computer. They can also hear how sonifications change while measuring the amount of CO2 in the air they breathe. Or they can listen to sounds of deforestation while hugging a tree, to later plant seeds next to it.

I feel that the physical act of participation is a very strong one – an activist act: listening to deforestation sonifications resonating from the trunk of a tree, like the heartbeat of a dying species, and afterwards planting seeds next to that tree, a simple but powerful act of redemption.

In an overall sense, the beauty of these pieces lies in the space of reflection they create, allowing the participants to draw their own conclusions and engage in positive artivist actions that bring a sense of community.

CATALOG

This piece invites the audience to listen to the health of our world’s forests and at the same time engage in a deforestation micro-activism practice. To create the sonifications I used 20 years of data from Global Forest Watch. During the installation, the participants can listen to the sonifications by hugging trees, then plant seeds, and read a text encouraging them to think about deforestation in a poetic and scientific way.

Materials. Transducers with amplifier. Plant pots. Native seeds. Soil. MP3 player. Coding. SuperCollider
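To give a feeling for how the data becomes sound, here is a minimal sketch in Python of one possible mapping from yearly tree-cover loss to tone parameters. The piece itself is coded in SuperCollider, and the data values and ranges below are invented for illustration, not taken from Global Forest Watch.

```python
# Hypothetical sketch: map yearly tree-cover-loss values to pitch and loudness.
# The real piece is coded in SuperCollider; the numbers here are made up.

def loss_to_tone(loss_ha, min_ha=0, max_ha=500_000,
                 low_hz=110.0, high_hz=880.0):
    """Map hectares of yearly forest loss to a frequency and an amplitude.

    More loss -> a deeper, louder tone, like the slowing heartbeat of a forest.
    """
    t = (min(max(loss_ha, min_ha), max_ha) - min_ha) / (max_ha - min_ha)
    freq = high_hz - t * (high_hz - low_hz)   # more loss = deeper pitch
    amp = 0.2 + 0.8 * t                       # more loss = louder
    return freq, amp

# A few hypothetical yearly data points (hectares lost), one tone per year.
yearly_loss = [12_000, 45_000, 130_000, 400_000]
tones = [loss_to_tone(x) for x in yearly_loss]
```

With a mapping like this, a worsening trend is heard directly: the years of heavy loss sink in pitch and swell in volume.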

The installation is very flexible and can be staged in different scenarios: from gardens to forests, from greenhouses to a lonely tree in a shopping center. You can read the following steps to imagine how it can develop.

1. Entering the space. People scan a QR code to access an online PDF with an introduction, a set of instructions, and a reflective and informative text about deforestation.

2. Accessing the PDF. The QR gives access to this PDF, where the first page contains an introduction and instructions for the installation, and the second page a scientific text next to a poetic reflection on deforestation to reinforce the message of the artwork.

3. Hugging a tree. The sonifications are played through transducers attached to the trees, so the tree works as an amplifier. By hugging the tree, the participants can feel the vibration of the sounds, and by pressing their ears against the trunk, they can hear the sounds of deforestation.

4. Planting seeds. Pots filled with soil and bags of seeds lie around the trees. The participants can plant the seeds, water them, place them around the space, or take the pots home to grow a new potential forest.

5. After the installation. The whole installation fits into a box, so the participants can take it home and plant a forest wherever they want.

Listen to the sonification of deforestation in Lithuania


LOVE IS was created to study how people engage with and perceive space, using their own bodies to transform the soundscape they listen to.

In this piece, different sounds are triggered by ultrasound sensors as people move around the room. Each participant wears a heartbeat monitor that sends their heart rate to a computer, which uses that number to transform the very sounds they are triggering.

When my father passed away, I remembered one of the most beautiful talks we had together about love, and what love was for him. Still today, his words echo in my heart. Reflecting on this memory, I thought that even though what love is differs for every being, there is always an element of presence and strong emotion.

Following this line of thought, I created this installation, where presence is translated into our movements around the room, and emotion is translated into the beating of our hearts.

Materials. Heartbeat monitor. Ultrasound sensor. Raspberry Pi. ESP32 Wi-Fi board. Coding. Python. SuperCollider. Arduino

To perform this installation I need a dark room with enough space for people to move around. It is very simple and very powerful. You can read the following steps to imagine how it can develop.

1. Entering the space. People scan a QR code to access an online PDF with an introduction, a set of instructions, and quotes about love from the Argentinian writer Julio Cortázar.

2. Accessing the PDF. The QR gives access to this PDF, where the first page contains an introduction and instructions for the installation, and the second page the quotes about love from Julio Cortázar, to reinforce the message of the artwork.

3. Wearing the heartbeat monitor. These monitors send the heart rate to the computer, once every second. When the rate is received, it interacts with the sounds being played at that moment, changing different qualities of the soundscape!

4. Exploring the space. Ultrasound sensors (triggers) are set around the room, and when a participant crosses in front of one of them, it activates a specific sound assigned beforehand in the computer, flooding the room with a new sounding texture.
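The interaction logic of these two steps can be sketched in Python, the language the installation itself uses for glue code. The mapping, function names, and values below are invented for illustration; the actual sound transformations happen in SuperCollider.

```python
# Hypothetical sketch of the interaction logic: ultrasound sensors trigger
# sounds, and the latest heart rate (received once per second) reshapes them.
# Names and mappings are invented; the real piece sends these values on to
# SuperCollider rather than returning a dictionary.

REST_BPM = 60.0  # assumed resting heart rate used as the neutral point

def heart_rate_to_params(bpm):
    """Map a heart rate in BPM to a playback rate and a filter cutoff."""
    playback_rate = bpm / REST_BPM      # faster heart = faster sound
    cutoff_hz = 500.0 + 20.0 * bpm      # faster heart = brighter sound
    return playback_rate, cutoff_hz

def on_sensor_trigger(sensor_id, bpm, sound_bank):
    """When a participant crosses an ultrasound sensor, pick the sound
    assigned to that sensor and shape it with the current heart rate."""
    sound = sound_bank[sensor_id]
    rate, cutoff = heart_rate_to_params(bpm)
    return {"sound": sound, "rate": rate, "cutoff": cutoff}

# An excited participant (90 BPM) crosses sensor 2:
event = on_sensor_trigger(2, 90, {2: "texture_waves.wav"})
```

The key design idea is the coupling: the trigger decides *which* sound enters the room, while the heartbeat continuously decides *how* it sounds.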

In the dark, it feels something like this.


Inspired by the scientific sonifications from Jodrell Bank Observatory, Radio Pulsars was created as an etude, to study the interactive potential of scientific sounds through MIDI controllers.

In this piece, I coded sounds mimicking the aesthetic quality of existing pulsar sonifications from the Jodrell Bank Observatory, using data sets from the European Pulsar Timing Array. I also programmed MIDI controllers to start and stop these sounds and to change the speed ratio at which the data is read, which creates diverse developments in the sound.
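The speed-ratio idea can be sketched in a few lines of Python (the piece itself reads the data in SuperCollider, and the toy pulse train below is invented, not European Pulsar Timing Array data):

```python
# Hypothetical sketch: the same data set stepped through at different rates,
# so a percussive pulse train can stretch into slower, melody-like sequences.

def read_at_ratio(data, ratio):
    """Step through the data at `ratio` times the base reading speed."""
    if ratio <= 0:
        raise ValueError("ratio must be positive")
    out, pos = [], 0.0
    while pos < len(data):
        out.append(data[int(pos)])  # floor to the nearest sample
        pos += ratio
    return out

pulses = [0, 1, 0, 0, 1, 0, 1, 1]   # toy stand-in for pulsar timing data
fast = read_at_ratio(pulses, 2.0)   # knob turned up: skim the data, short gesture
slow = read_at_ratio(pulses, 0.5)   # knob turned down: each value lingers twice
```

Reading faster compresses the pulse train toward a percussive blur; reading slower stretches it out, which is how small nuances in the data become audible as melody.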

In the installation, two MIDI controllers are positioned facing each other. The participants enter the room and, by pressing buttons on the controllers, play the sound of the stars.

As a sounding result, complex and surprising textures and sonic movements are created, capturing the attention of the participants in a playful interactive environment.

Materials. Raspberry Pi. MIDI controller. Coding. SuperCollider

To perform this installation I need a dark room with four speakers. It is a very simple and very playful installation. You can read the following steps to imagine how it can develop.

1. Entering the space. People scan a QR code to access an online PDF with an introduction, a set of instructions, and poetic and scientific texts about Neutron Stars.

2. Accessing the PDF. The QR gives access to this PDF, where the first page contains an introduction and instructions for the installation, and the second page a scientific text next to a poetic reflection on neutron stars to reinforce the message of the artwork.

3. Entering a playful starry night. In the room, there are two MIDI controllers connected to a small computer. When the participant presses a button on the controller, the computer creates a sonification assigned to that button beforehand. Data of a star is transformed into sound, and by moving the knobs, the reading speed of the data changes. This produces alterations in the resulting sound, so what sounded percussive is transformed into strange melodies, allowing small nuances to be amplified by our curious interaction!

4. (in)Between the sounds of science. The sounds of this installation are the same ones used in scientific research, but with the MIDI controllers a bridge is created to travel between the scientific and interactive potential of sonifications.

Below is an audiovisual representation of scientific sonifications, and to the right is a playful moment from participants at the installation.

The interactive potential of scientific sonifications.


The sounding result of this installation is a forceful, immersive, and mysterious experience that invites the listeners’ imagination to feel the changing and rising tides of our world.

To set up this installation I need shallow waters: a lake, the sea, or a pool. A place where the audience can immerse themselves in water and sound, walk around, and reflect on this climate emergency while feeling the sonified pulse of the rising tides.

Still a work in progress! At the moment I am trying to find funding to prepare the CO2 measuring device and to write the code that maps CO2 values to the live sonification. I am also thinking about how the machine learning algorithm should relate the new CO2 inputs to the historical data. This is a BIG question, because the input methods are not quite the same… but how does it make sense artistically? Besides, I need a sound company that is concerned about the climate emergency to lend me ten or more powerful waterproof Bluetooth speakers. WOOOOW, a lot going on! But I am very excited, aren’t you?!

Imagine the process and the installation.

1. For this installation, I want to place ten JBL waterproof speakers floating in the shallow waters of Amagerstrand. To make them float, I will attach them to plastic bottles working as buoys. The ten speakers will be connected via Bluetooth to a mini computer (Raspberry Pi), which will play a live sonification created from the measurement of CO2 in the atmosphere.

Waterproof speakers attached to buoys

2. The measurement of CO2 is captured by a sensor next to the computer. From these numbers, the computer predicts the sea level using a predictive algorithm. Essentially, we are sonifying the environment through this algorithm, which learned to predict sea level from the historical data sets. Check the video ->

3. People are invited to walk around the speakers in the water and feel the vibration of the sound. As with the previous pieces, they will have a QR code to scan, and read about the installation and the way they can interact with it.

Accessing the PDF through the QR code.

PDF page 1. The page contains an introduction and instructions, which are meant to reinforce the purpose of the installation, making sure that the participants use the tools offered to convey the experience. The participant is instructed to walk around the speakers in the water:

by walking around the speakers, people can feel the vibration of the sound in the water or listen to the sonifications from the speakers that are not fully submerged. The participants are also welcome to dive and listen to the speakers which are underwater to perceive the live-predictive sonifications in another sounding environment.

PDF page 2. On this page, the participants can read a scientific text next to a poetic reflection on ocean pollution. The two texts aim to confront the participant with contrasting perspectives on the same issue. In addition to scientific facts, the participant will find a poetic reflection on ocean contamination:

Hear the wind, that calmly whispers wordless infinities,
and feel the pulse, of the beating tides, in the rising minds.
“This is our soul shadow, the darkness we cannot own, the form we cannot name.” 1
So hear the wind, as your soul shadow and feel the form, that cannot be named. For we love and fear, in tender surrender. We love and fear, for a soulful end.

About this installation

The sounds for this installation were inspired by the book Hyperobjects by philosopher Timothy Morton, and the data for the sonifications was taken from the EU Global Surface Water joint research center, NASA Goddard Institute for Space Studies, and the U.S. Environmental Protection Agency.

In this piece, I work with a predictive algorithm that uses live CO2 input to predict rising ocean levels. How? The code first learns a relation between CO2 and the level of the ocean from historical data, so once it receives a new CO2 reading, it calculates a new value for the ocean level.
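The idea can be illustrated with a minimal Python sketch, assuming the simplest possible model: a straight line fitted to historical CO2/sea-level pairs. The figures below are invented for illustration; the actual piece draws on the EU, NASA, and EPA data sets named above and is still taking shape.

```python
# Minimal sketch of the predictive idea: fit a line to historical data,
# then turn each live CO2 reading into a predicted sea level to sonify.
# All numbers here are hypothetical, not real measurements.

def fit_line(xs, ys):
    """Ordinary least-squares fit of ys = a * xs + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical historical pairs: CO2 (ppm) vs. sea-level anomaly (mm).
co2_ppm = [340, 360, 380, 400, 420]
sea_mm = [0, 30, 60, 90, 120]

a, b = fit_line(co2_ppm, sea_mm)

def predict_sea_level(live_co2_ppm):
    """A new reading from the CO2 sensor becomes a predicted sea level,
    which the sonification then maps to sound."""
    return a * live_co2_ppm + b
```

A real model would be more elaborate, but the artistic mechanism is the same: the air the participants breathe enters the sensor, passes through the learned relation, and comes back out as sound.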

Why use a live CO2 measurement? Most of the CO2 we release into the atmosphere, e.g. from deforestation, is captured by our oceans, making it one of the most serious drivers of ocean acidification and rising sea levels. By using a live measurement, the sonification sounds different in Copenhagen (Denmark), in Recife (Brazil), in a harbor, or in the middle of the Pacific. Knowing this, we create in our minds an interrelation between the place we inhabit, the air we breathe, our biosphere, and climate change, as the CO2 we release into the air directly impacts the levels of the ocean.


Work in progress 🙂