December 2013 - January 2014
berio.loadPixels(); is an interactive computer visual system that attempts to enhance and broaden the experience of classical music performance. The system was used at the Glasskin Arts Premier Concert in NYC during a live performance of Sequenza I for flute composed by Luciano Berio.
Images from a web camera controlled by the audience, sounds from a flutist, and the physical movements of a performer were passed into computer algorithms to produce a live visual portrait of the Sequenza I performance. Saved frames from these visuals were then transformed into fine-art digital prints, which make up the eighteen-work series. The series is intended as a representation and portrait of the performance. All works are one-of-a-kind and are for sale where a link is provided.
Video by Eric Torres. Edited by Kody A. Trauger.
This project continues my exploration of inviting others to influence my art-making process. The results are unique and surprising to everyone who participates, including myself. While I developed berio.loadPixels(); with specific control systems and visual outputs, the details were defined by three inputs from the performance.
Specifically, audience interactivity through a web camera was the foundation on which the other visual components were built. My algorithms altered the pixel colors of the video stream based on brightness and hue levels, producing much of the warm tones in the final frames. Additionally, the audience was eager to use the web camera, which I think reflects the “selfie” and visual-documentation trends prolific in modern society. That is to say, a web camera is inherently accessible given its age in consumer electronics and the public psyche.
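A mapping of this kind can be sketched in Python as a brightness-weighted hue shift; the function `warm_shift` and its constants are illustrative assumptions, not the exact algorithm used in performance:

```python
import colorsys

def warm_shift(pixels, hue_target=0.08, strength=0.6):
    """Push each pixel's hue toward a warm orange based on its brightness.

    pixels: list of (r, g, b) tuples with components in 0-255.
    hue_target: a warm hue (~30 degrees) on colorsys's 0-1 hue scale.
    strength: how strongly brightness pulls the hue toward the target.
    """
    out = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        # Brighter pixels (higher v) are shifted further toward the warm hue.
        h = h + (hue_target - h) * strength * v
        nr, ng, nb = colorsys.hsv_to_rgb(h, s, v)
        out.append((round(nr * 255), round(ng * 255), round(nb * 255)))
    return out
```

Applied per frame to the video stream's pixel array, a rule like this leaves dark regions untouched while pulling highlights toward orange, which would account for the warm cast of the final frames.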
The second input that influenced details in these visuals was sound from the flute. The music controlled computer-generated semi-transparent ellipses that provided the cooler tones in the frames. This element becomes most prominent during the loudest moments of the performance, such as in frame 377. The difference in volume from one second to the next determined the hue and saturation of these ellipses. Effectively, quiet moments produced frames that are usually more photorealistic, while loud moments resulted in abstracted imagery.
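The volume-difference rule can be sketched as a small Python function; `ellipse_hsb`, the cool hue band, and the loudness scale are hypothetical stand-ins for the actual mapping:

```python
def ellipse_hsb(prev_rms, rms, base_hue=210.0, hue_range=60.0):
    """Map the second-to-second volume change to ellipse hue and saturation.

    prev_rms, rms: loudness estimates (0-1) for consecutive seconds.
    Returns (hue_degrees, saturation): larger jumps in volume swing the
    hue further within a cool 210-270 degree band and saturate it more.
    """
    delta = min(abs(rms - prev_rms), 1.0)  # how sharply the dynamics changed
    hue = base_hue + hue_range * delta
    saturation = delta  # quiet, steady passages give washed-out ellipses
    return hue, saturation
```

Under a rule like this, steady dynamics yield nearly transparent ellipses (keeping the frame photorealistic), while sudden loud attacks produce the saturated blue abstraction visible in the loudest frames.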
Lastly, I used a Leap Motion controller to define details in the visuals. The device uses cameras to track finger and hand positions in 3D space. By positioning my hand in specific places and orientations, I controlled a blurring effect. This had the most drastic result, controlling the linear elements in frame 49 and the clusters of abstraction noticeable in frame 1642. My hand movement was partly a response to the music being played, much as one taps a foot or bops along to a song, and partly a conscious effort to attune the visuals to my aesthetic.
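A hand-to-blur mapping of this sort might look like the following Python sketch; `blur_params`, the height range in millimeters, and the use of palm roll to set a streak direction are my own assumptions for illustration, not the documented control scheme:

```python
import math

def blur_params(palm_y, palm_roll, y_min=50.0, y_max=400.0, max_radius=24.0):
    """Map Leap-style palm tracking to a directional blur.

    palm_y: palm height above the sensor, in millimeters.
    palm_roll: hand roll in radians.
    Returns (radius, angle): raising the hand strengthens the blur,
    and rolling the hand rotates the direction of the streaks.
    """
    t = (palm_y - y_min) / (y_max - y_min)
    t = max(0.0, min(1.0, t))        # clamp to the usable tracking range
    radius = t * max_radius          # higher hand, stronger blur
    angle = palm_roll % math.pi      # roll sets the streak orientation
    return radius, angle
```

A directional blur parameterized this way would produce both effects described above: sweeping the hand at a fixed roll smears the image into linear streaks, while rapid, varied movement yields clustered abstraction.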