This interface demonstrates software used to model aspects of human performance. The "virtual performer" (which exists only as a Lisp program) alters its playing style in response to the location on the map above. The performance -- notes, phrasing, articulation, etc. -- is generated by a set of musical-style rules, and the resulting musical output is realized in real time by a physical model of a flute written in the RTcmix sound synthesis language.
Rules can be changed dynamically to investigate what makes a particular style of music sound the way it does.
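The rule-based approach described above might be sketched as follows. This is an illustrative Python analogue, not the original Lisp program; the rule names, note representation, and parameters here are all hypothetical, chosen only to show how swapping the active rule set changes the rendered style.

```python
# Hypothetical sketch of a rule-based performance model: each rule
# transforms a note's performance parameters, and the active rule set
# can be swapped at any time to change the playing style.

def legato_rule(note):
    """Lengthen notes slightly so they overlap (smooth phrasing)."""
    note = dict(note)
    note["duration"] *= 1.1
    return note

def staccato_rule(note):
    """Shorten notes sharply (detached articulation)."""
    note = dict(note)
    note["duration"] *= 0.5
    return note

def perform(score, rules):
    """Apply every active style rule to every note of the score."""
    out = []
    for note in score:
        for rule in rules:
            note = rule(note)
        out.append(note)
    return out

score = [{"pitch": 60, "duration": 1.0},
         {"pitch": 62, "duration": 1.0}]

# Changing the rule list dynamically yields a different performance
# of the same score.
smooth = perform(score, [legato_rule])
crisp = perform(score, [staccato_rule])
```

In the real system the rules govern far more than duration (phrasing, articulation, timing, and so on), and their output drives the RTcmix flute model rather than a data structure.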
The performance model software was written by Brad Garton with assistance from Matthew Suttor, and the software model of the flute was based on work done by Perry Cook.