reversible reaction takes its inspiration from the chemical phenomenon in which reactants and products continuously convert into one another, oscillating between chemical states. Contrasting sonic and visual environments create an abstracted microscopic world in this installation: molecular bonds join and break, atoms float in suspension, and the environment changes states when “catalyst” participants disturb the system’s equilibrium.
reversible reaction contains several interactive elements. The most salient is visitor motion tracking with a Kinect infrared camera. Visitors in the space directly affect the installation, switching it from state to state depending on criteria related to their positions relative to each other and to the space itself. Depending on the number of visitors present, the criteria adjust to keep the installation oscillating at a fairly consistent rate.
Example criteria that cause the installation to switch states:
· The space has the same number of visitors for a certain amount of time.
· The space has no visitors for a certain amount of time.
· The number of visitors exceeds a certain threshold.
· A visitor steps into a randomly designated area in the space (the area changes after it is triggered).
· The distance between a pair of visitors exceeds a certain threshold.
A GUI for this component allows these parameters to be tweaked in real time, including the visitor-count thresholds, distance tolerances, and a “speed limit” on the rate of state change. The goal is for visitors to be aware that they are influencing the installation’s state without being certain of its mechanics, encouraging them to experiment to uncover the conditions of change.
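The state-switching logic above can be sketched as follows. This is an illustrative Python sketch only (the installation itself is implemented in Max/MSP), and all parameter names and default values here are hypothetical stand-ins for the quantities exposed in the GUI.

```python
import math


class StateSwitcher:
    """Sketch of the state-switching criteria described above.
    All thresholds are hypothetical GUI-tunable parameters."""

    def __init__(self, max_visitors=8, idle_seconds=30.0,
                 distance_threshold=3.0, min_switch_interval=10.0):
        self.max_visitors = max_visitors            # crowd-size threshold
        self.idle_seconds = idle_seconds            # "same count" / "empty" timer
        self.distance_threshold = distance_threshold  # pairwise distance tolerance
        self.min_switch_interval = min_switch_interval  # the "speed limit"
        self.last_switch = -math.inf
        self.state = 0

    def should_switch(self, now, visitors, stable_for, empty_for, hot_zone_hit):
        """visitors: list of (x, z) floor positions from the tracker;
        stable_for / empty_for: seconds the head count has been constant / zero;
        hot_zone_hit: True if someone entered the randomly designated area."""
        if now - self.last_switch < self.min_switch_interval:
            return False  # respect the rate "speed limit"
        if stable_for > self.idle_seconds:          # same number of visitors
            return True
        if not visitors and empty_for > self.idle_seconds:  # space is empty
            return True
        if len(visitors) > self.max_visitors:       # crowd threshold exceeded
            return True
        if hot_zone_hit:                            # random area triggered
            return True
        # Any pair of visitors farther apart than the distance tolerance:
        for i, (xa, za) in enumerate(visitors):
            for xb, zb in visitors[i + 1:]:
                if math.hypot(xa - xb, za - zb) > self.distance_threshold:
                    return True
        return False

    def switch(self, now):
        self.state = (self.state + 1) % 4  # hypothetical cycle of four states
        self.last_switch = now
```

In practice the thresholds would be rescaled as the visitor count changes, which is how the installation keeps its oscillation rate roughly constant.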
The rest of the interactivity occurs among the sonic and visual elements of the piece. The audio for each state is algorithmically layered in real time from prerendered soundfiles, creating a non-repeating aural component. Gradual volume changes, alternation and phasing of sonic events, and harmonic changes create “concentric” time scales that add variety over time to the general sonic atmosphere of each state.
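One way to picture the algorithmic layering is as a randomized retrigger schedule for each soundfile, so that entries stagger and drift out of phase with one another. The sketch below is a hypothetical Python illustration (the piece does this in Max/MSP); the file names, gap ranges, and gain range are invented for the example.

```python
import random


def schedule_layers(soundfiles, duration, seed=None,
                    min_gap=2.0, max_gap=12.0):
    """Build a non-repeating playback schedule: each prerendered
    soundfile is retriggered at irregular intervals with a varying
    level, so the layered texture never loops exactly.
    Returns a time-sorted list of (start_time, filename, gain)."""
    rng = random.Random(seed)
    events = []
    for f in soundfiles:
        t = rng.uniform(0.0, max_gap)  # staggered entry => phasing between layers
        while t < duration:
            gain = rng.uniform(0.4, 1.0)            # gradual level variety
            events.append((round(t, 2), f, round(gain, 2)))
            t += rng.uniform(min_gap, max_gap)      # irregular retrigger gap
    events.sort()
    return events
```

Running the same function with a different seed (or none) yields a different schedule each time, which is the sense in which the aural component is non-repeating.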
· 2.1-channel audio
· Max/MSP (environment control)
· Processing, After Effects (animations)
· MadMapper (projection mapping)
· Arduino (LED lanterns)
· Kinect and KVR Tracker (motion tracking)
· Logic (sound design)