Living Symphony


Concept 

The idea was to visualize audio-reactive, fractal-like patterns on human skin with the help of Processing and some of its libraries. The work of Kouhei Nakama was a huge inspiration during the research stage. In the end, we chose a reaction-diffusion algorithm to create the patterns and realize our goal. That is how the project Living Symphony was born.





A symphony of three parts

The word symphony derives from the Greek sýmphōnos, meaning "sounding together", and the title of our project was chosen with care. The project consists of three parts that together form a coherent picture in which the viewer can immerse themselves, as in a musical symphony. Like the sections of an orchestra, all three parts work together to produce the final image.

The dream-like visual style was achieved with the help of a customized shader. The dots are in constant flow, like water: always in motion, yet remaining the same.

The dancers move to the gentle rhythm of the piano music. They tell a story that is open to the viewer's interpretation.

The visual pattern on the floor is inspired by Rorschach inkblots and responds to the music. When the dynamics of the music swell, the pattern grows larger and takes up more space, but it never becomes dominant.




Execution

The audio-reactive pattern was created with the Sound library provided by Processing. With this library, the sound spectrum was sampled and analyzed. The resulting values were combined with the volume and amplitude of the audio input, and together they continuously transform the shape of the pattern.
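As a rough illustration (not the project's actual code), this mixing step can be sketched in plain Java: normalized spectrum band magnitudes are averaged and blended with the current amplitude into a single value that drives the pattern's shape. The `deformation` helper and its weights are assumptions made for the sketch.

```java
// Sketch (assumption): combine analyzed spectrum bands with the overall
// amplitude into one "deformation" value that transforms the pattern.
public class AudioDrive {
    // 'spectrum' holds normalized band magnitudes (0..1), 'amplitude'
    // the normalized input level (0..1); the 0.6/0.4 weights are
    // illustrative, not the project's tuned values.
    static double deformation(double[] spectrum, double amplitude) {
        double sum = 0;
        for (double band : spectrum) sum += band;
        double spectralEnergy = sum / spectrum.length;
        return 0.6 * spectralEnergy + 0.4 * amplitude;
    }

    public static void main(String[] args) {
        double[] spectrum = {0.2, 0.4, 0.6, 0.8};
        System.out.println(deformation(spectrum, 0.5)); // prints 0.5
    }
}
```

In the real sketch, such a value would be recomputed every frame from the live FFT and amplitude analysis and fed into the pattern's growth parameters.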

For the image composition and masking, the video is loaded into the application using the Video library. To extract the foreground, the dancers, from the background, each frame of the video was compared with a still image of the empty background, captured at the beginning of the video before the dancers appear. This method requires a static camera whose image does not change during the video.
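A minimal sketch of this kind of background subtraction, assuming pixels packed as 0xRRGGBB ints and a hand-picked threshold (the `ForegroundMask` class and its values are illustrative, not the project's code):

```java
// Sketch (assumption): per-pixel background subtraction against a static
// reference frame. A pixel is foreground (a dancer) when its color
// distance to the background pixel exceeds a threshold.
public class ForegroundMask {
    static boolean[] mask(int[] frame, int[] background, int threshold) {
        boolean[] fg = new boolean[frame.length];
        for (int i = 0; i < frame.length; i++) {
            int r = ((frame[i] >> 16) & 0xFF) - ((background[i] >> 16) & 0xFF);
            int g = ((frame[i] >> 8) & 0xFF) - ((background[i] >> 8) & 0xFF);
            int b = (frame[i] & 0xFF) - (background[i] & 0xFF);
            // squared Euclidean distance in RGB, compared to threshold^2
            fg[i] = r * r + g * g + b * b > threshold * threshold;
        }
        return fg;
    }

    public static void main(String[] args) {
        int[] background = {0x101010, 0x101010};
        int[] frame = {0x101010, 0x4080C0}; // second pixel: a "dancer"
        boolean[] fg = mask(frame, background, 30);
        System.out.println(fg[0] + " " + fg[1]); // prints "false true"
    }
}
```

Because the comparison is purely per-pixel against a fixed reference, it only works with a locked-off camera, which is exactly the constraint noted above.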

The shader implements a reaction-diffusion algorithm in GLSL. It simulates two virtual chemicals that react with each other: chemical A is the dark background and chemical B the blue color. At the dancers' current positions, the mask continuously injects new chemical B into the scene. This creates the patterns that make the silhouettes of the dancers visible.
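The underlying math is the Gray-Scott reaction-diffusion model. Below is a minimal CPU version of one simulation step, written in plain Java rather than GLSL; the diffusion, feed, and kill parameters are typical Gray-Scott settings, not necessarily the values tuned for the project.

```java
// Sketch (assumption): one explicit-Euler step of the Gray-Scott model,
// the class of simulation the project's GLSL shader runs per pixel.
// Chemical A is the dark background, chemical B the injected blue color.
public class GrayScott {
    static final double DA = 1.0, DB = 0.5;        // diffusion rates
    static final double FEED = 0.055, KILL = 0.062; // typical values
    static final double DT = 0.2;                   // time step (stability)

    // 5-point Laplacian with wrap-around edges.
    static double lap(double[][] c, int x, int y) {
        int w = c.length, h = c[0].length;
        return c[(x + 1) % w][y] + c[(x + w - 1) % w][y]
             + c[x][(y + 1) % h] + c[x][(y + h - 1) % h]
             - 4 * c[x][y];
    }

    static void step(double[][] a, double[][] b) {
        int w = a.length, h = a[0].length;
        double[][] na = new double[w][h], nb = new double[w][h];
        for (int x = 0; x < w; x++)
            for (int y = 0; y < h; y++) {
                double abb = a[x][y] * b[x][y] * b[x][y]; // reaction A + 2B -> 3B
                na[x][y] = a[x][y] + (DA * lap(a, x, y) - abb + FEED * (1 - a[x][y])) * DT;
                nb[x][y] = b[x][y] + (DB * lap(b, x, y) + abb - (KILL + FEED) * b[x][y]) * DT;
            }
        for (int x = 0; x < w; x++)
            for (int y = 0; y < h; y++) { a[x][y] = na[x][y]; b[x][y] = nb[x][y]; }
    }

    public static void main(String[] args) {
        int n = 8;
        double[][] a = new double[n][n], b = new double[n][n];
        for (double[] row : a) java.util.Arrays.fill(row, 1.0);
        b[4][4] = 1.0; // a "dancer" seed of chemical B
        for (int i = 0; i < 10; i++) step(a, b);
        System.out.println(a[4][4] < 1.0 && b[4][4] > 0.0); // prints true
    }
}
```

In the installation, the same update runs on the GPU for every pixel each frame, and the foreground mask takes the role of the seed, injecting B wherever a dancer is.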




Deep Space

The Ars Electronica Center in Linz (Austria) offers its visitors a 16 × 9 meter wall projection, an equally large floor projection, laser tracking, and 3D animation: the so-called Deep Space. Here, image worlds are projected in 8K resolution and raised to a completely new, unprecedented level. Our project was developed for the Deep Space and its two projection surfaces, wall and floor, to create a unique, mesmerizing and overwhelming experience.








As the basis of our project, we used a video, referenced in the link below. We do NOT own this video; all rights belong to its rightful owners. No copyright infringement intended.
Choreography: Emery LeCrone
Music: "Ritornare" by Ludovico Einaudi
Dancers: Shane Ohmer & Izabela Szylinska
Link: https://www.youtube.com/watch?v=HoVIp03REx8&frags=pl%2Cwn


