Week 5



For the last two weeks of the Max/MSP/Jitter unit, we plan to focus on the extended signal-processing and synthesis capabilities of MSP. In particular, there is an emerging approach to sound creation that is rooted in the modeling of physical systems -- not "physical modeling" in the Perry Cook/Julius O. Smith sense of low-level sound synthesis, but the use of digital simulations of real-world activities to control the long-term evolution of sonic environments.

This is an interesting area of contemporary computer music research, and a fair amount of work has been done using this approach. In fact, the focus of this particular class (g6610) in the past has often been on this kind of macro-musical modeling. If you'd like, check out:

for examples of what we have done. There are also some interesting links buried in the web pages, although I notice that more than a few need to be updated. We will be exploring several of the topics from these past classes in more detail during the coming weeks of this class.


As before, the class patches (in a StuffIt archive file) are available on the resources page.


The Bouncing Granular Synthesis Thing

Luke first showed the Max "pong" object, connected to a ball-visualization scheme, producing a bouncing ball with simple physics (damping, etc.). He quickly tied it to some sound so that it clicked when it bounced. He did fun things like make the ball gain energy on each bounce. This was thrilling.
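
Here's a minimal sketch (in Python, not Max -- the numbers are made up for illustration) of the physics behind a damped bouncing ball: each impact keeps only a fraction of the speed, so the bounces get quieter and closer together.

    # Rough sketch of a damped bouncing ball, the idea behind "pong":
    # each bounce keeps only part of its speed, so impacts come faster and faster.
    def bounce_times(height=1.0, damping=0.8, gravity=9.8, max_bounces=12):
        """Return the times (in seconds) at which a dropped ball hits the floor."""
        t = (2 * height / gravity) ** 0.5    # time until the first impact
        v = gravity * t                      # speed at that impact
        times, clock = [], t
        for _ in range(max_bounces):
            times.append(clock)
            v *= damping                     # lose some energy on each bounce
            clock += 2 * v / gravity         # up and back down to the floor
        return times

    print(bounce_times())   # the intervals between impacts shrink geometrically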

WHAT WE'RE GONNA DO: We will use this to make some weird bouncing things... probably to control granular synthesis in various intensely odd ways. For example, we may use the balls 'polyphonically' to control grains of sound, build interacting particle models, etc.

NOTE: Part of what we want to do in this simulation is to explore the rich synthesis technique known as granular synthesis. This technique has been explored extensively by composers/researchers such as Barry Truax and Curtis Roads, and forms the basis for much of the discussion in Roads's new book Microsound (see last week's notes for links to the book). For more information about granular synthesis, the following web sites are pretty decent:

Before we got too carried away, Luke demo'd some new! improved! Max/MSP objects, and snazzy ways to use them. (An aside: Luke is going to make things up as he goes along, because he didn't like any of the patches he made last night.)

the "clik~" object makes a 1 (impulse) in a stream of 0's, with every bang. This is handy as an impulse into a filter, or something, to hear the response characteristics of a filter or other sig-processing algo.

For example, putting the "click~" into a "reson~" gives you a cool little 'dink' tone -- you can change the width of the filter, center freq, etc. You can use this as a grain element in a gran-synth patch.
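
To make that concrete, here is a rough non-Max sketch (Python/numpy, with arbitrary parameter values) of what ringing a resonant filter with a single click does: the impulse excites the filter and you hear its decaying resonance.

    import numpy as np

    # A single-sample click exciting a two-pole resonator (roughly the
    # "click~" -> "reson~" idea): the output is a short decaying sinusoid.
    def ring_filter(freq=880.0, bandwidth=50.0, sr=44100, dur=0.25):
        theta = 2 * np.pi * freq / sr
        r = np.exp(-np.pi * bandwidth / sr)    # pole radius sets decay and width
        x = np.zeros(int(sr * dur))
        x[0] = 1.0                             # the click
        y = np.zeros_like(x)
        for n in range(len(x)):
            y[n] = x[n] + 2 * r * np.cos(theta) * y[n - 1] - r * r * y[n - 2]
        return y

    grain = ring_filter()    # one little 'dink' -- usable as a grain of sound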

By using combinations of "random" and "decide" to gate through a fast "metro", we were able to get a lot of cool granular stuff (also randomized the frequency of the "reson~" filter).

Luke replaced the "metro" with a "zerox~" object, which puts out a click every time the incoming signal crosses 0 -- this is much more efficient than a "metro" because of internal scheduling stuff in Max/MSP.

Then he created an FM patch that slowly modulates the frequency of the sine wave whose zero crossings "zerox~" is counting, which in turn generates the clicks going into the "reson~" object. Then we pushed the slow FM modulation up to audio rate, and it was truly amazing. It really was.
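
As a sketch of that trick (again Python/numpy with made-up numbers, reusing the zero-crossing idea above): sweep the frequency of the sine being watched for zero crossings, and the click rate -- and therefore the grain rate -- follows the sweep.

    import numpy as np

    sr = 44100
    t = np.arange(4 * sr) / sr
    rate = 30 + 20 * np.sin(2 * np.pi * 0.5 * t)       # slowly modulated frequency (Hz)
    wave = np.sin(2 * np.pi * np.cumsum(rate) / sr)    # the sine whose crossings we count
    clicks = np.zeros_like(wave)
    clicks[1:][np.signbit(wave[1:]) != np.signbit(wave[:-1])] = 1.0   # one click per crossing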

Next we thought it would be keen to randomly change the center frequency of the "reson~" filter. The first time we tried it, the "rand~" object was generating ramps instead of discrete values for the center freq of the "reson~" object, so we used the "sah~" object to sample and hold it.
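
Roughly what sample-and-hold does, sketched in Python with hypothetical control values: the held value only updates when a trigger arrives, so the filter frequency jumps between discrete values instead of gliding along a ramp.

    import numpy as np

    # Hold the current value of a control signal; update it only on a trigger.
    def sample_and_hold(control, triggers):
        out = np.zeros_like(control)
        held = control[0]
        for n in range(len(control)):
            if triggers[n] > 0:
                held = control[n]       # grab a new value on the trigger...
            out[n] = held               # ...and hold it until the next one
        return out

    noisy = np.random.uniform(200.0, 2000.0, 44100)   # a wandering control signal
    trig = np.zeros(44100)
    trig[::2205] = 1.0                                # 20 triggers per second
    freqs = sample_and_hold(noisy, trig)              # stepped center frequencies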


THE FILTER ASIDE:

Smilin' Paul Hogan asked about how digital filters worked, so we took a little excursion. Here are a few links to digital filter theory web pages:


BACK TO THE GRANULAR FILTER THING:

Luke showed off the "buffir~" MSP object, which allows you to specify filter coefficients for an FIR filter in a buffer. The snazzy thing about this is that you can use it to create a 'time-domain convolution', because if the coefficients trace a waveform, the outputs of the filter when given an impulse will by sort-of multiplied into the waveform that is represented by the coefficients (think about it...).

That's fine for a single impulse, but what the heck good is that? Well, if you send an audio-rate train of clicks through the "buffir~" object, each click stamps out a copy of the stored waveform -- you can effectively 'waveshape' a click train into whatever waveform (and spectrum) the coefficients describe.
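
Here's a tiny sketch of why that works (Python/numpy, with an arbitrary 440 Hz chunk standing in for the stored waveform): convolving a click train with a short chunk of waveform stamps a copy of that chunk at every click.

    import numpy as np

    sr = 44100
    chunk = np.sin(2 * np.pi * 440 * np.arange(256) / sr)   # 256 FIR coefficients tracing a waveform

    clicks = np.zeros(sr)
    clicks[::1000] = 1.0                    # an impulse every 1000 samples

    out = np.convolve(clicks, chunk)        # every click plays back the chunk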

We can then scan through a sound, loading little chunks of it into the "buffir~" filter coefficients to produce a really twisted convolution-sounding thing. Luke played happy Steve Jobs as a granulated individual; it was once again more fun than a barrel of monkeys. This is a bit like a really tightly-controlled (frame-at-a-time!) ring modulation. Oh, the things you can do -- change the soundfile input into the "buffir~" object to a "zerox~" click-generator, or try a ">~" object to put through clicks every once in a while.

One of the points of all this was to show that using audio-rate event generation (clicks in this case, using "zerox~" or ">~" to put out the clicks) is much more efficient and timing-savvy than trying to force "metro" to do it. This will be important for gran-synth applications.

Going back to the original click->reson 'grain' generator, one of the problems is that we don't have an envelope on the grain we are putting out. We are just relying on the characteristics of the excited filter ("reson~") to create each grain of sound.

One easy way to do this uses "rampsmooth~" to generate a ramp-envelope out of the clicks. With an envelope applied, we can control the pitch within the grains, the pitch of the filter processing the grains, and the pitch of the grain-train (the sequence of grains) all independently. This opens up ALL SORTS of avenues for astounding audio explorations...
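
The general idea, sketched in Python/numpy (not literally what "rampsmooth~" computes -- just the effect we're after): stamp a short attack/decay window at every click and use it to envelope the grain, rather than relying on the filter's ring-down.

    import numpy as np

    # Turn each click into a short attack/decay envelope for a grain.
    def grain_envelope(clicks, attack=64, decay=512):
        window = np.concatenate([np.linspace(0.0, 1.0, attack),
                                 np.linspace(1.0, 0.0, decay)])
        env = np.zeros(len(clicks) + len(window))
        for n in np.flatnonzero(clicks):
            env[n:n + len(window)] = np.maximum(env[n:n + len(window)], window)
        return env[:len(clicks)]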

[Check out Barry Truax's piece Riverrun for a good example of this pitch-confusion using granular synthesis techniques.]

Turns out that using the "slide~" object for envelope-generation is better, because it uses logarithmic instead of linear ramps.

We now have this granular instrument, and we hook it up to "pong" to create some bouncy-ball sounds (each generated grain sounds like a bounce). Luke did a bunch of hacking-around to figure out how to detect when the 'ball' from "pong" stopped bouncing. He set it up to dynamically shift the damping factor, so that the ball would start bouncing, stop, and start again within a certain range. Then he futzed around with which aspects of the bouncy-grain sound to control, and finally we set this up as a subpatch that could be turned on and off.

Why? Because we want to turn this into a polyphonic instrument using the "poly~" object. He added an "in~" port and two "out~" ports (with some snazzy random panning!). By passing the name of this saved subpatch to "poly~", we could instantiate many, many versions of it (well, we did 16 in class).
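
To give a flavor of the polyphonic version, here's a Python sketch reusing the bounce_times() and ring_filter() sketches from earlier (the parameter ranges are invented, not the ones used in class): each voice gets its own filter frequency and damping, and all the voices simply sum.

    import numpy as np

    # One 'voice': a bouncing ball whose impacts each fire a filtered grain.
    def bouncy_voice(sr=44100, dur=4.0):
        freq = np.random.uniform(200.0, 2000.0)    # per-voice settings (made up)
        damping = np.random.uniform(0.6, 0.9)
        out = np.zeros(int(sr * dur))
        grain = ring_filter(freq=freq, sr=sr)
        for t in bounce_times(damping=damping):
            start = int(t * sr)
            if start + len(grain) <= len(out):
                out[start:start + len(grain)] += grain
        return out

    mix = sum(bouncy_voice() for _ in range(16))   # 16 voices, like in class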

Setting up the subpatch with flexible params, we were able to spawn a whole family of bouncy-ball-grain things; it was just amazingly fun.

Luke added a bit more randomness in note-choice (weighted it slightly for lower pitches, etc.) and pretty soon we had a nifty thing going.

More next week...