Chuck process examples


This file illustrates some common usage patterns for the chucking classes. I will add more examples as time permits.


Contents:

1. Setup code

2. Simple melody player

3. Adaptive melody player

4. Chordal process -- chord follows top-note melody

5. Chordal process -- arpeggiation with macrorhythm

6. Using regular SynthDefs instead of Voicers



1. Setup code


The chucking classes are all about data storage and quick retrieval during performance. Thus, a piece consists of code that sets up all the material that will be needed, and the performance consists of using that material.


The preparation code is pretty hefty, but notice later how compact the code to initiate a process is. The idea is to take all the heavy stuff -- materials, code fragments (which require debugging), etc. -- and load it all at once, then use it piecemeal. Running complex processes is very easy once you have the materials set up.


When writing a piece, I prefer to write all the setup code into a separate file. I can then load that file and have all the materials ready for use. Another file can contain code to create the processes and play the piece.
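
For instance, the performance file might begin by loading the setup file in a single statement (the file name here is hypothetical):

"chuck-setup.scd".loadPath; // evaluate the whole setup file, building all the materials at once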


(

s.boot;


// storage for MIDI data

MIDIBufManager.new(nil, 0) => MBM.prNew(0);

MBM(0).v.gui;


// using a weird modal definition

ModalSpec.new(#[0, 1, 4, 5, 7, 8, 11], 12, 2) => Mode(\default);


TempoClock.default.tempo = 104/60;


// chord forms

MIDIRecBuf(\chords, [

#[67, 66, 62, 60, 67, 63, 62, 57, 69, 66, 62, 70, 67, 66, 60],

#[0.01, 0.01, 0.01, 1, 0.01, 0.01, 0.01, 1, 0.01, 0.01, 1, 0.01, 0.01, 0.01, 1],

1 ! 15

].flop.collect(SequenceNote(*_))) => MBM(0);


// melody -- will be used for top note of chord progression

MIDIRecBuf(\top, [

#[62, 69, 78, 70, 77, 86, 83, 78, 81, 92,

89, 86, 85, 82, 81, 78, 75, 70, 67, 61, 66, 69, 66, 69, 71],

#[1.5, 0.5, 0.5, 1, 1, 0.5, 1.5, 0.5, 0.5, 1.5,

0.75, 0.5, 0.25, 0.75, 0.5, 0.25, 0.75, 0.5, 0.25, 1.5, 0.5, 1, 0.5, 0.5, 1.5],

#[1.5, 0.2, 0.6, 0.8, 0.3, 0.5, 1.5, 0.5, 0.5, 1.5,

0.75, 0.2, 0.2, 0.75, 0.2, 0.2, 0.75, 0.2, 0.2, 1, 0.5, 0.4, 0.5, 0.5, 1.2],

0.5 // gates - maybe apply contour later

].flop.collect(SequenceNote(*_))) => MBM(0);



// define some pattern prototypes that will be used in chordal process

// arpeggiation patterns


// to play a chord as a block, I have to convert an array of notes into a single note

// with an array of note numbers -- .asChord method does this for me

#{ |notes|

notes.isArray.if({

Pn(notes.asChord, 1)

}, {

Pn(notes, 1)

});

} => ArpegPat(\block);


#{ |notes| Pxrand(notes, inf) } => ArpegPat(\xrand);

#{ |notes| Pseq(notes.sort, inf) } => ArpegPat(\up);


#{ |notes| Pseq([notes.sort, notes+7, notes+7, notes+14, notes+14].flat, 1) } => ArpegPat(\bubbleup1);


#{ |notes| Pseq([notes.sort({ |a, b| a > b }) + 14, notes+7, notes+7, notes, notes].flat, 1) } => ArpegPat(\bubbledown1);


#{ |notes| Pseq([notes.sort, notes+7, notes+7, notes+14, notes+14].flat, inf) } => ArpegPat(\bubbleup);


#{ |notes| Pseq([notes.sort({ |a, b| a > b }) + 14, notes+7, notes+7, notes, notes].flat, inf) } => ArpegPat(\bubbledown);



// microrhythms


// block chord rhythm -- produces 1 value

#{ |notes, event|

var topNote, gateIndex, gate;

topNote = event[\top];

(gateIndex = topNote.args.tryPerform(\indexOf, \gate)).isNil.if({

gate = topNote.args.tryPerform(\at, 0); // no \gate key stored: fall back to the first arg value, if any

}, {

gate = topNote.args[gateIndex+1];

});

Pn([event.delta, event[\length], gate ? 0.5], 1)

} => MicRh(\blockFollow);


// .estimateLength is how we handle finite arpeggiation -- the microrhythm produces only as

// many rhythmic values as needed for the expected number of notes

// the reason is that a bass note may cause a new note pattern to be generated, but we

// want the rhythmic gesture to keep its integrity

#{ |notepat| Pn(#[0.25, 0.2, 0.5], notepat.estimateLength) } => MicRh('16th');

#{ |notepat| Prout({

var delta;

notepat.estimateLength.do({ |i|

[delta = 0.2 - ((i * 2pi/25).sin * 0.125), delta, 0.75-delta].yield;

});

}) } => MicRh(\sine);


// macrorhythms

Pn(4, inf) => MacRh(\m1); // 1 bar

Pn(12, inf) => MacRh(\m3); // 3 bars

Prand([4, 6, 10, 15, 27], inf) => MacRh(\prand); // varying pacing


// generative melody player will use this

// it defines keys for Func()'s that will be used to generate variations

Pdefn(\adp, Pwrand(#[\intSplice, \delete], #[0.6, 0.4], inf));


// make a simple instrument to play some notes, store it in a factory class

// this is storage of the function only -- nothing gets run at this point

(make: {

~target = MixerChannel("ghostly", s, 1, 2, level:0.1);

// note: usually you will define your SynthDefs or Instrs elsewhere

SynthDef("ghostly", { arg outbus, freq, gate, attacktime, decaytime, mul;

var amp, sig;

amp = Latch.kr(gate, gate);

sig = Mix.ar(Formlet.ar(PinkNoise.ar(0.2), freq*2, attacktime, decaytime, mul))

* EnvGen.kr(Env.adsr(0.1, 0.5, 0.8, 0.1), gate, doneAction:2, levelScale:amp*2.5);

Out.ar(outbus, sig)

}).memStore;

Voicer(10, \ghostly, [\attacktime, 0.002, \decaytime, 0.9186, \vsense, 0.787, \mul, 0.535], target:~target).latency_(0.5)

},

free: {

~target.free

}, type: \voicer) => Fact(\ghost);


(make: #{

var amps, v;

amps = { |i| (i+1).reciprocal } ! 70;

8.do({ amps.put(20.rand, 1.0.rand) });

~buf = Buffer.alloc(s, 2048, 1).sine1(amps);

~target = MixerChannel("clav", s, 1, 2, level:0.1);

Instr([\osc, \choruspad], { arg freq, gate, ffreq, rq, env, fenv, fenvsense,

detune, lfospeed, bufnum, velsense;

var sig, fm1, fm2;

fm1 = SinOsc.kr(lfospeed, 0, detune, 1); // slow sine wave centered around 1.0

fm2 = SinOsc.kr(lfospeed, pi/2, detune, 1);

ffreq = ffreq * ((EnvGen.kr(fenv, gate)-1) * fenvsense + 1);

sig = Mix.ar(Osc.ar(bufnum, [freq * fm1, freq / fm2], 0,

Latch.kr(gate, gate)-1 * velsense + 1));

sig = RLPF.ar(sig, ffreq, rq) * EnvGen.kr(env, gate, doneAction:2);

}, #[\freq, \amp, \freq, \myrq, nil, nil, nil, [0, 0.1], [0, 5], nil, nil]);

v = Voicer(20, Instr.at([\osc, \choruspad]), [\env, Env.adsr(0.01, 0.3, 0.75, 0.1), \fenv, Env.adsr(0.01, 2, 0.1, 1), \bufnum, `(~buf.bufnum), \detune, `0.0055, \velsense, `0.6, \lfospeed, `0.866, \fenvsense, `1], target:~target).clock_(TempoClock.default);

v.mapGlobal(\ffreq, nil, 3912, \freq);

v.mapGlobal(\rq, nil, 1, \myrq);

v

}, free: #{ ~target.free; ~buf.free; }, type: \voicer) => Fact(\clav);

)


2. Simple melody player


This process takes in a raw melodic stream in the form of a MIDIRecBuf and plays it back as is, with no modification.


The usual pattern in using these processes is:


- create the voicer that will play the notes. Fact(\name) holds the voicer definition; => VC(\name) uses the definition to build the voicer. VC(\name) => VP(index) makes the voicer available in the GUI, if the VP is already set up. Fact(\name) => VC(\name) => VP(index) can be abbreviated to Fact(\name) => VP(index).


You only need to create the voicer once. You can assign multiple processes to the same voicer (although the maximum number of notes the voicer can play applies to all processes simultaneously). A short sketch of the whole chain appears after this list.


- instantiate the process prototype into a bound process: PR(\name) => BP(\name).


- assign the bound process to the voicer:  BP(\name) => VC(\name) or BP(\name) => VP(index). These two steps can be done in one: PR(\name) => BP(\name) => VC(\name).


- play.
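
Put together, a minimal sketch of the whole chain looks like this (commented out here, since the concrete example follows; the VP index and the BP names \melA and \melB are made up for illustration):

// Fact(\ghost) => VC(0) => VP(0); // build the voicer and expose it in an existing VoicerProxy gui
// Fact(\ghost) => VP(0); // the abbreviated form of the same chain

// two processes sharing the same voicer:
// PR(\mel1) => BP(\melA) => VC(0);
// PR(\aiMel) => BP(\melB) => VC(0);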


// instantiate the voicer in the factory

Fact(\ghost) => VC(0);


// instantiate the simple melody player process and assign it to the voicer just created

PR(\mel1) => BP(\mel1) => VC(0);

MBM(0)[1] => BP(\mel1); // provide it with melodic MIDI data


// .play and .stop are quantized, by default to the next beat that is a multiple of 4

// you can set the quant for a single process by doing quant => BP(\name),

// e.g. 16 => BP(\name), [16, -2] => BP(\name) or NilTimeSpec.new => BP(\name) for immediate start


// the global default start quantization is set by quant => BP


// you can override at play time using the quant argument:

// BP(\name).play(3)

BP(\mel1).play;


BP(\mel1).stop;

BP(\mel1).free;

VC(0).free;


3. Adaptive melody player


The adaptive melody player behaves outwardly just like the simple one, until you pass in material to be used as an adaptation source. Internally it's very different: the incoming material gets converted from MIDI note numbers into a modal representation, split up into phrases, and some rudimentary analysis is performed.


In addition to an adaptation source, you also need to supply a pattern to dictate what kinds of adaptations the process will perform. Adaptation functions are stored in the class Func. The pattern should output a stream of symbols; each symbol will be used to index the Func collection. If you store this pattern into Pdefn with a name, you can simply provide the name: \adaptPat1 =>.adapt BP(\mel2). 
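
For example, a second adaptation pattern favoring deletions could be stored under its own name and chucked in at any time (the name \adpSparse is made up for illustration; \intSplice and \delete are the Func keys already used in the setup code):

Pdefn(\adpSparse, Pwrand(#[\intSplice, \delete], #[0.1, 0.9], inf));

// later: \adpSparse =>.adapt BP(\mel2);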


You differentiate the main melodic material from the adaptation source using an adverb applied to the chuck operator: =>.mel vs. =>.adapt


// this uses the setup materials from the first code block


Fact(\clav) => VC(\clav);


PR(\aiMel) => BP(\mel2) => VC(\clav);

MBM(0)[1] =>.mel BP(\mel2);

BP(\mel2).play(doReset:true);


// start adaptation -- crossbreed melody with itself

// first say which adaptation functions to use

// \adp refers to the Pdefn declared in the setup code above

\adp =>.adapt BP(\mel2);

// the melody you use here can be different

MBM(0)[1] =>.adapt BP(\mel2);


BP(\mel2).stop;


BP(\mel2).free;



4. Chordal process -- chord follows top-note melody


Chordal processes are, as you might expect, quite a bit more complicated. I use a nested process structure: the outer process is responsible for determining the pacing between chords and how each chord will be played, and the inner process renders the input data into a stream of real notes.


Parameters are: 

- chord form MIDI data. Block or arpeggiated chords in a MIDI buffer. These are abstracted and algorithmically fit to the harmonic context at playback time.

- melodic MIDI data. Melodic notes determine the highest note in each chord.

- macrorhythm. Sets the pacing for each chord: the delta until the next chord, and the length of time this chord should play. Specify as a pattern outputting an array: [delta, length]. This pattern is optional (and will be ignored) if using the \chTop outer process. (Hypothetical additional macrorhythm, microrhythm and arpeggiation definitions are sketched just after this parameter list.)

- microrhythm. The rhythm of the arpeggiation playing the chord. Specify as a pattern outputting an array: [note delta, note length, gate (velocity)].

- arpeggiation types. A pattern returning symbols that name ArpegPats, which convert the input array of raw notes into a note pattern. Notes may be reordered, octave-transposed, or anything else you want done in the function defining the ArpegPat.

- adaptation types. Passed into the top note melodic process if the melody is to adapt. 
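
As a sketch, additional pattern prototypes following the same conventions as the setup code could be chucked in at any time; the names \m2, '8th' and \down below are made up for illustration.

Pn(8, inf) => MacRh(\m2); // 2 bars between chords

#{ |notepat| Pn(#[0.5, 0.4, 0.6], notepat.estimateLength) } => MicRh('8th'); // [note delta, note length, gate]

#{ |notes| Pseq(notes.sort({ |a, b| a > b }), inf) } => ArpegPat(\down); // descending arpeggio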


The inner process will usually be \arpeg1.


Two chordal (outer) processes are currently defined:

- macroRh: uses a macrorhythm pattern to determine pacing;

- chTop: derives its pacing from the top note melody itself. 


The large number of parameters makes it impractical to type each one as a chuck operation, so a couple of initialization functions are provided in Func:

- newCh: instantiate a new process, erasing what was in the BP before;

- makeCh: instantiate a new process, wrapping what was in the BP before. 


To call the function, use the form:

Func(\makeCh).doAction(name of new BP, inner process prototype, outer process prototype, chord MIDIBuf, melodic MIDIBuf, macrorhythm, microrhythm, arpeggiation types, adaptation types);
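
For example, to wrap whatever is already in BP(\ch1) with the same materials used in the \newCh call below, a makeCh call might look like this (only a sketch, assuming BP(\ch1) already exists):

// Func(\makeCh).doAction(\ch1, \arpeg1, \chTop, MBM(0)[0], MBM(0)[1], nil, \blockFollow, \block, \adp);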


This example uses chTop along with parameters designed to play each chord as a block to harmonize the eccentric melody from earlier examples. 



// use the above melody as the top note of a chord progression

Fact(\clav) => VC(\clav); // not necessary if you ran example 3 first (if VC(\clav) already exists)


// chord progressions have a lot of parameters

// so this is a single function call that sets all of them

// note that this does not set the melodic adaptation source

// prototype for this function call:

// Func(\newCh).doAction(newBPname, childProcessName, parentProcessName, chordMIDIRecBuf, melodyMIDIRecBuf, macroRhythmPattern, namePatternForMicrorhythm, namePatternForArpegPat, namePatternForMelodicAdaptation, mode, parms);


Func(\newCh).doAction(\ch1, \arpeg1, \chTop, MBM(0)[0], MBM(0)[1], nil, \blockFollow, \block, Pwrand(#[\intSplice, \delete], #[0.6, 0.4], inf)) => VC(\clav);

BP(\ch1).child.fitFunc = \chordFitNotes;


BP(\ch1).play;


// start adaptation

MBM(0)[1] =>.adapt BP(\ch1);


BP(\ch1).stop;


BP(\ch1).free;



5. Chordal process -- arpeggiation with macrorhythm



Fact(\clav) => VC(\clav); // not necessary if you already did this


Func(\newCh).doAction(\ch2, \arpeg1, \macroRh, MBM(0)[0], MBM(0)[1], MacRh(\m1), '16th', \bubbleup, \adp) => VC(\clav);


BP(\ch2).play;


// execute the following one statement at a time to observe the changes in behavior

// note that changes you make now don't take effect until the next chord (outer process event)


\sine =>.micro BP(\ch2); // change microrhythm to sine pattern

\bubbleup =>.arpeg BP(\ch2); // switch to an infinitely running arpeggiation pattern

// so each chord runs until the next starts

\bubbleup1 =>.arpeg BP(\ch2); // switch to a finitely running arpeggiation pattern

// (1 iteration of bubbleup) -- note pause between chords

MacRh(\prand) =>.macro BP(\ch2); // change pacing


'16th' =>.micro BP(\ch2); // arpeggiate in 16th notes

\bubbleup =>.arpeg BP(\ch2);

\up =>.arpeg BP(\ch2);

MacRh(\m1) =>.macro BP(\ch2);


// each chord chooses a new way to render itself

Prand([\bubbleup, \bubbledown, \up, \xrand], inf) =>.arpeg BP(\ch2);

Prand(['16th', \sine], inf) =>.micro BP(\ch2);


BP(\ch2).stop;


VC(\clav).free; // drop the voicer's resources

BP(\ch2).free;


By now it should be clear how flexible chord processes can be in performance. Simple commands cause dramatic changes in sound and texture.


The composition process is also more fluid and experimental. You can develop process components (macrorhythms, microrhythms, arpeggiation patterns, adaptation functions) on the fly and swap them in and out while playing, so you can hear the results immediately. Composition becomes more about playing with the material than about writing one-off patterns suited for a single purpose only. It's a step closer to the ideal of SuperCollider as an environment in which to play freely.


6. Using regular SynthDefs instead of Voicers


You may choose to use SynthDefs instead of Voicers. This is not recommended, because Voicer gives you additional functionality:


- drag 'n' drop gui capability: voicers can be assigned to a VoicerProxy that has been gui'ed, either by dragging it into the gui or by chucking. SynthDefs can't.

- Voicers cap the number of notes that can sound at once, reducing the risk of CPU overloads.

- Voicer aids in the creation of global controls for SynthDef arguments, and handles the mapping of nodes to the respective control buses automatically.

- Voicer allows live MIDI input directly into the process. SynthDef can't because it doesn't have a mechanism to retain references to synth nodes while notes are sustaining.


However, if you must use regular synthdefs, here is example 4 (chords harmonizing melody) using the default SynthDef:


// you must load the synth descriptors

SynthDescLib.global.read;


// Factories are used here too -- you should populate at least ~target

// if ~target is not a MixerChannel, you should also supply ~out as an output bus number

// the make function should return the name of the synthdef

(make: {

~target = MixerChannel(\synthTest, s, 2, 2);

\default

}, free: { ~target.free }) => Fact(\defaultSynth);


// instantiate by chucking into SY (not VC)

Fact(\defaultSynth) => SY(\default);


// arpegSynth is an arpeggiator that uses the \synthNote event instead of \voicerNote

// also we chuck into SY not VC

// these are the only changes from ex. 4

Func(\newCh).doAction(\ch1, \arpegSynth, \chTop, MBM(0)[0], MBM(0)[1], nil, Pn(\blockFollow, inf), Pn(\block, inf), Pwrand(#[\intSplice, \delete], #[0.6, 0.4], inf)) => SY(\default);


BP(\ch1).play;

BP(\ch1).stop;


SY(\default).free;


Because of the modular structure, retrofitting the existing classes to use this alternate event prototype was fairly easy. It took longer to get the synthNote event to work than it did to integrate it into the class interface! Thus, if \voicerNote and \synthNote don't meet your needs, you can roll your own event without too much trouble.