Coney Island
a Virtual Environment Performance

Coney Island is the fourth in a series of virtual environment performances/installations created by the Machine Child Ensemble. Cast in a composed virtual environment, Coney Island immerses us in the linguistic play of machines, the production of language, and the automation of pleasure. Simulated mechanics in Coney Island drive the dynamics of amusement-machines, forming an arcade landscape of sounding bodies. "Productions" make meaning in media, in labor, and in context-free grammar. Performers participate over a local area network to modify the simulations. The ensemble includes audience volunteers who interact with performance tools distributed across the concert hall. Ensemble interaction is directed by a soloist/navigator/conductor to arrive at a non-improvised group performance.

Sound Synthesis software: vss, NCSA Audio Development Group, Camille Goudeseune, chief engineer. Virtual Environment software: ScoreGraph, Beckman Institute Integration Support Lab, Alex Betts, chief engineer. 3D Models by Josh Nizzi, created in Maya. Technical support provided by the NCSA Audio Development Group.

Program Technology: Coney Island is organized in a ScoreGraph, a virtual scene graph with multi-temporal properties. A ScoreGraph is an architecture designed to integrate multiple dynamic simulations with sound synthesis and 3D graphical display during interactive performance. Each temporal process and graphical object is a separate thread defined as a node in the graph. Inter-thread communications are defined as intelligent edges. Nodes include 3D graphics, equations of motion, grammars, input devices, virtual cameras and lighting, and sound synchronization. The ScoreGraph topology dynamically reconfigures as the demands of the various threads change during a performance. Composition involves the spatio-temporal design of the performer's orientation to the dynamical properties of the ScoreGraph. Performers impart energy into the simulations using motion sensors. Some sensor devices provide force feedback from the simulated dynamics. Video-based hand gesture recognition provides a symbolic system of inputs.
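The following is a minimal sketch, in Python, of the node/edge threading idea described above. Every name in it (Node, connect, process, and so on) is a hypothetical stand-in for illustration, not the actual ScoreGraph API: each node runs as its own thread, outgoing edges are message queues into other nodes' inboxes, and connect/disconnect stand in for dynamic reconfiguration of the topology while nodes are running.

import threading
import queue
import time


class Node(threading.Thread):
    """A temporal process (an equation of motion, a grammar, a sensor reader)
    running as its own thread; outgoing edges are queues into other nodes."""

    def __init__(self, name, process):
        super().__init__(daemon=True)
        self.name = name
        self.process = process      # callable: (state, message) -> new state
        self.inbox = queue.Queue()  # messages arriving on incoming edges
        self.edges = []             # outgoing edges: other nodes' inboxes
        self.running = True

    def connect(self, other):
        """Add a directed edge from this node to another node."""
        self.edges.append(other.inbox)

    def disconnect(self, other):
        """Reconfigure the topology by removing an edge."""
        if other.inbox in self.edges:
            self.edges.remove(other.inbox)

    def run(self):
        state = None
        while self.running:
            try:
                # Wait briefly for input from a performer device or another node.
                message = self.inbox.get(timeout=0.1)
            except queue.Empty:
                message = None
            state = self.process(state, message)
            for edge in self.edges:
                edge.put((self.name, state))  # propagate along outgoing edges


if __name__ == "__main__":
    # Toy topology: a sensor node drives a motion node, which drives a display node.
    sensor = Node("sensor", lambda state, msg: time.time() % 1.0)
    motion = Node("motion", lambda state, msg: msg[1] * 2.0 if msg else 0.0)
    display = Node("display", lambda state, msg: print(msg) if msg else None)

    sensor.connect(motion)
    motion.connect(display)

    for node in (sensor, motion, display):
        node.start()
    time.sleep(0.5)
    for node in (sensor, motion, display):
        node.running = False

In the actual work, nodes of this kind would carry sound synthesis, 3D graphics, and simulation state rather than a print statement; the sketch only illustrates the threading and message-passing structure.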

Created by the Machine Child Ensemble: Robin Bargar, Insook Choi, Alex Betts, Juhan Sonin

Performed by Insook Choi and members of the audience, with support from MCE.
