interactive devices 2
with SPECIAL GUESTS
Luke Dubois
and
Ryan Carter!
This week we looked at some more advanced controllers to influence our
musical processes. In at least a few cases, the process being
captured by the interface/controller itself was interesting, so
connecting one complex process to another might yield AMAZING and
ASTOUNDING results. It's all up to you...
We started with my quick demo of the EEG/Brainwave controller I
use with Dave Sulzer/Soldier (see links below). I then showed some
older camera-tracking techniques. Just for fun we tried connecting
the EEG headset to a plugin in Logic using the "from MaxMSP"/"to MaxMSP"
virtual MIDI ports via the [ctlout] object. It sort-of worked. There were
some issues with the controller range-mapping. Annoying, but they
shouldn't be too difficult to work out.
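The range-mapping fix is basically a scale-and-clamp: whatever the headset spits out has to be squeezed into MIDI's 0-127 controller range before Logic sees it. Here's a minimal sketch of the idea in Python (the 0.0-1.0 input range and the clamping behavior are assumptions for illustration, not the headset's actual spec):

```python
def scale_to_cc(value, in_lo=0.0, in_hi=1.0):
    """Map a raw sensor value into the 0-127 MIDI CC range.

    in_lo/in_hi are the expected raw range (assumed here, not the
    real headset calibration). Out-of-range spikes are clamped so
    they don't wrap around or overflow the 7-bit controller value.
    """
    if in_hi == in_lo:
        return 0  # degenerate range; avoid divide-by-zero
    norm = (value - in_lo) / (in_hi - in_lo)
    norm = max(0.0, min(1.0, norm))  # clamp stray sensor spikes
    return int(round(norm * 127))
```

The same scale/clamp is what a [scale] + [clip] pair would do in a Max patch before [ctlout]; the point is just that the clamp matters, because raw sensor spikes outside the expected range are what make the plugin parameter jump around.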
Then Ryan showed off his fun GameTrak controller. Amazing sounds!
After Ryan, Luke showed us all kinds of fun to be had with a
Kinect. He sent me all the software he discussed and showed in class;
it's linked below.
Links
- Luke Dubois
-- Luke's home page
- Ryan Carter
-- Ryan's home page
- Dave and Brad's Brainwave Project
-- our EEG project home page. Videos! Links! Sound! Much, much more!
- Dave Sulzer/Soldier's home page
-- worth visiting for the Thai Elephant Orchestra alone, to say nothing
of The Kropotkins.
- GameTrak game
-- this is the PC version from Amazon. It's a little pricey here;
I think you can find it for a lot less if you google around a little.
Get the "PC" version with a USB cable so you don't have to modify it.
You'll have to look around a little for the Kinect. Luke (yay!) left us
one to fool with at the CMC, but the used models you would want for
development are not expensive. Here's the story about which version
to get (this is from Joshua Goldberg):
Microsoft has subtly altered the innards of the Kinect in a way that
breaks libfreenect, and as a result jit.freenect.grab.
Theo Watson and the libfreenect guys are working on it, but there does
not appear to be a one-size-fits-all fix as of yet:
https://github.com/OpenKinect/libfreenect/issues/316
OpenNI is unaffected by this, so OSCeleton/Synapse/jit.openni/dp.kinect
are all fine.
This affects me because I build installations using Macintoshes and
Max/MSP/Jitter that need jit.freenect.grab's functionality of pulling
data from multiple Kinects.
How to tell which sensor model you have:
Look underneath the base at the sticker, in its upper right-hand corner.
If your unit says "Model 1414" you're fine. If it says "Model 1473",
you've got a gimped Kinect.
Class Downloads
- week8-classpatches.zip
-- patches for the EEG sensor, the camera-tracking demo and
the very simple connection to Logic. The "I'm thinking" soundfile
and the recording of Nissa (my cat) meowing are included.
- bradsclass2.zip
-- Luke's software, including the Processing/java code to make
the Kinect work a Whole Lot Better.
To run Luke's patches (and to use the Kinect in general), you will need
to download and install a few other things:
- Processing
-- this is a java-based language 'for artists'. It's a good
thing to have in general. Terry covers it in his class.
- SimpleOpenNI
-- you will need both the OpenNI library and the Processing
library that goes with it.
- OpenSoundControl (OSC)
-- I think you might need this object if you don't have it already.
It's now part of the big CNMAT bundle that you download from this
page. The older CNMAT link on
this page
appears to be broken.
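All of the Kinect helper programs mentioned above (OSCeleton, Synapse, etc.) hand their tracking data to Max as OSC packets over UDP, and the wire format is simple enough to sketch by hand: a null-padded address string, a null-padded type-tag string starting with ",", then big-endian 32-bit arguments. A minimal encoder, just to show the format (the "/joint" address is an illustration, not any particular program's actual namespace):

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Build a binary OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode())                       # padded address
    msg += osc_pad(("," + "f" * len(args)).encode())      # type tags, e.g. ",fff"
    for a in args:
        msg += struct.pack(">f", a)                       # big-endian float32
    return msg

# One hypothetical skeleton-joint message, ready for socket.sendto(...)
packet = osc_message("/joint", 0.5, 1.25, -0.75)
```

In practice you never build these by hand -- the CNMAT objects (and [udpreceive] in Max) do all of this for you -- but seeing the layout makes it less mysterious when a skeleton message shows up in a Max print window.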