Illuminated Tapestry

Summary

Illuminated Tapestry was a commissioned collaboration with BalletMet Columbus, presented as part of DanceTech and performed at the Capitol Theater in Columbus, April 20-28, 2012.

Project Team

Real-time responsive animation designer: Alan Price
Dance choreographer: Jimmy Orrante
Music composer: Sean Beeson


Project Description

The dance performance Illuminated Tapestry incorporated real-time computer animation and custom electronics. The original music score by composer Sean Beeson was played back from a separate computer, where a MIDI control track sent triggers to the graphics computer for scene changes and animation elements synchronized to the music.
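
As an illustration of how such triggers might be received on the graphics computer, the sketch below listens for MIDI note-on messages in Processing using The MidiBus library and maps each note number to a scene index. The device indices and the note-to-scene mapping are placeholders, not details taken from the production software.

import themidibus.*;  // The MidiBus library for MIDI input in Processing

MidiBus midi;
int currentScene = 0;

void setup() {
  size(400, 200);
  MidiBus.list();                   // print available MIDI devices to the console
  midi = new MidiBus(this, 0, 1);   // device indices are placeholders; see list() output
}

void draw() {
  background(0);
  text("Current scene: " + currentScene, 20, 40);
}

// Called by The MidiBus when a note-on arrives from the music computer's control track.
// Here each trigger note is mapped directly to a scene index.
void noteOn(int channel, int pitch, int velocity) {
  currentScene = pitch;
}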



An Xbox Kinect camera, suspended from the lighting grid 24 feet above the stage, generated a 3D map of the dancers as they moved through the space via a custom application written in Processing. This map data was sent over a network connection to the graphics computer using OSC (Open Sound Control) and was then used to control the motion of a variety of visual elements in the computer animation.
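
A minimal sketch of this pipeline in Processing is shown below, assuming the SimpleOpenNI library for Kinect depth access and the oscP5/netP5 libraries for OSC. The network address, port numbers, OSC address pattern, and the simple centroid reduction are illustrative assumptions rather than details of the production application, which may have transmitted a fuller map.

import SimpleOpenNI.*;   // Kinect depth access in Processing
import oscP5.*;          // OSC messaging
import netP5.*;

SimpleOpenNI kinect;
OscP5 osc;
NetAddress graphicsPC;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  osc = new OscP5(this, 9000);                       // local listening port (arbitrary)
  graphicsPC = new NetAddress("192.168.1.10", 8000); // placeholder address and port
}

void draw() {
  kinect.update();
  image(kinect.depthImage(), 0, 0);

  // Find the centroid of all pixels closer than a threshold (i.e. the dancers),
  // then send it as a normalized x/y position plus average depth over OSC.
  int[] depth = kinect.depthMap();
  long sumX = 0, sumY = 0, sumZ = 0, count = 0;
  for (int y = 0; y < 480; y++) {
    for (int x = 0; x < 640; x++) {
      int d = depth[y * 640 + x];
      if (d > 0 && d < 6000) {   // within roughly 6 m of the camera
        sumX += x; sumY += y; sumZ += d; count++;
      }
    }
  }
  if (count > 0) {
    OscMessage msg = new OscMessage("/dancers/centroid");
    msg.add((float) sumX / count / 640.0f);  // normalized x
    msg.add((float) sumY / count / 480.0f);  // normalized y
    msg.add((float) sumZ / count);           // average depth in mm
    osc.send(msg, graphicsPC);
  }
}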



The rear-projection screen behind the dancers was 35 feet wide by 20 feet high. A 4K-resolution projected image, nominally 3840 x 2160 pixels (two 1920 x 1080 frames across and two down, less the overlap consumed by blending), was created by mounting four 1080p HD projectors in a 2x2 matrix and using edge blending and geometric correction to merge the four projections into a single large image.



Dancers in front of large screen with digital projection
Dancers interact with illuminated inflatable balls

Near the middle of the performance, six illuminated inflatables were brought onto the stage for the music sequence titled “Glimpse of Heaven”. Each sphere contains an electronics box holding batteries and a high-brightness RGB LED assembly. An Arduino Pro Mini microcontroller was programmed to send PWM (pulse-width modulation) signals to the power MOSFET transistors that supply current to the LEDs, producing the fading transitions between color combinations. A tilt switch was also incorporated into the circuit and was initially used to detect vibration and trigger a color change when a sphere was touched. That feature was not used in the performance because of the final choreography, but it has been re-activated here so that the colors change each time a participant taps one of the spheres to send it higher into the air.
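
The color logic can be sketched as a small simulation. The code below is a Processing sketch rather than the actual Arduino firmware: the window color stands in for the PWM-driven RGB LED, a mouse click stands in for the tilt switch, and the fade rate and color values are illustrative.

// Simulation of the sphere's color logic, written as a Processing sketch.
// The window color stands in for the PWM-driven RGB LED; a mouse click stands in
// for the tilt switch that triggers a jump to a new target color.

float r = 0, g = 0, b = 0;   // current color (what the LEDs would show)
float tr, tg, tb;            // target color being faded toward
float fadeStep = 1.5;        // per-frame fade amount, illustrative

void setup() {
  size(400, 400);
  pickNewTarget();
}

void draw() {
  // Ease each channel toward its target, as the firmware would do by
  // incrementally adjusting the PWM duty cycle on each MOSFET gate.
  r += constrain(tr - r, -fadeStep, fadeStep);
  g += constrain(tg - g, -fadeStep, fadeStep);
  b += constrain(tb - b, -fadeStep, fadeStep);
  background(r, g, b);

  // When a fade completes, drift on to another color combination.
  if (abs(tr - r) < 1 && abs(tg - g) < 1 && abs(tb - b) < 1) {
    pickNewTarget();
  }
}

// Stand-in for the tilt switch: a tap immediately selects a new target color.
void mousePressed() {
  pickNewTarget();
}

void pickNewTarget() {
  tr = random(255);
  tg = random(255);
  tb = random(255);
}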