Common Design Patterns in SuperCollider

Do we have a compendium of commonly used design patterns when coding in SuperCollider? What are your favorites? How do you organize your code?

I feel that this topic can be very individual… but at the same time, it would be great to have references about how different people (especially super-experienced ones) are dealing with it.


Having something like this would be really awesome!

Maybe a good starting point would be to gather the patterns used on sccode and write some descriptions of them.

I’ve begun a project like this, collecting commonly used SynthDefs and the Patterns that make the “most sense” to use with each SynthDef. It’s called SynthDEFaults. IMO it would be great if SC could provide the user with a large default library of SynthDef and Pattern examples; it would smooth the learning curve.
Here is the code:

If you want to help build and/or contribute to this project, let me know!


This is great, I’ll dive into it for sure, but just to make it clear: I’m looking more for design patterns from a programming point of view, it’s more about how code is organized into (perhaps) projects. I’ll try to give some examples:

  • it looks like a common live coding pattern is indeed to write synthdefs and then improvise sequences that play them (synthdef + “Pbinds”). A sequence of sequences can be written to create the form of a piece. The rest might be fine tuning of parameters or durations.
  • a pattern I use is to create a class (or prototype) to manage a process. Instances will create and store sets of synths or nodeproxies, which accept controls through method calls, guis and midi.
    Furthermore, sometimes a class is a wrapper for a single node, adding features such as osc responders, or parameter coordination (reflecting changes of a parameter to other parameters), or (analysis) data caching. Other times, a class is a wrapper for managing multiple copies of a single synthdef (such as filter banks, or multiple coordinated sample players).
  • a process is carried out entirely within an Ndef.
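As a rough sketch of the first bullet — a hypothetical \ping SynthDef plus Pbinds, with a bounded “sequence of sequences” for the form (all names and values here are invented for illustration):

```supercollider
(
// a throwaway SynthDef to improvise over
SynthDef(\ping, { |out = 0, freq = 440, amp = 0.1, rel = 0.3|
	var env = Env.perc(0.01, rel).kr(doneAction: 2);
	Out.ar(out, (SinOsc.ar(freq) * env * amp) ! 2);
}).add;
)

// improvise sequences that play it
Pdef(\a, Pbind(\instrument, \ping, \degree, Pwhite(0, 7, inf), \dur, 0.25));
Pdef(\b, Pbind(\instrument, \ping, \degree, Pseq([0, 2, 4], inf), \dur, 0.125));

// a sequence of (time-bounded) sequences shapes the form of a piece
Pdef(\form, Pseq([Pfindur(8, Pdef(\a)), Pfindur(8, Pdef(\b))], 1)).play;
```

The rest — as noted — is fine tuning of parameters and durations, e.g. by re-evaluating the Pdefs while \form plays.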

I know this is very sketchy, and I would like to see more details spelled out clearly. The topic is huge, but I feel it would be an important step for communicating, sharing and reusing SC code.
I agree we could start by scanning through sccode, but I hope some of the hardcore long-time users could have some answers ready :slight_smile:


I think it’s hard to make a list of “things I’m doing right.” It may be more useful to approach the question from the other direction – things that are commonly done in a way that isn’t ideal, and suggest alternate approaches.

Off the top of my head:

Antipattern 1: The base SC class should do everything

This is probably a result of the fact that it’s hard enough to learn what’s in the class library, so the natural tendency is to learn that and then struggle through or hack around the limitations. This shows up in a lot of ways:

  • SynthDef should handle dynamic structures
  • SynthDef or Synth should handle controller mapping
  • Patterns should manage resources

What’s often forgotten in this anti-pattern is that, even if SC had a more powerful class to do the more complex job, it would still be necessary to have the base class supporting it.

The solution is some kind of superstructure that uses the base classes.

  • Dynamic SynthDefs: Use a helper function to generate variants with different names. (A far extreme of this is the crucial library’s Instr/Patch.)
  • Controller mapping: In my own libraries, I’ve got GenericGlobalControl which wraps a language-side value and a server-side control bus into one object.
  • Patterns: My own solution is a “process” object that encapsulates a pattern with all of the resources it needs. If the pattern plays onto an audio bus, the process creates the bus (and releases the bus when freed). If the pattern needs a buffer, the process creates and destroys it. Then the pattern is free to do what it does well (put information into events), while there is still a structure binding that pattern to other resources.
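For the first point, a helper function that stamps out named SynthDef variants might look something like this (the \additive name scheme and the parameters are invented for illustration):

```supercollider
(
// one function generates a family of SynthDefs,
// each variant registered under its own name
~makeAdditive = { |numPartials|
	SynthDef(("additive" ++ numPartials).asSymbol, { |out = 0, freq = 200, amp = 0.1, gate = 1|
		var sig = Mix.fill(numPartials, { |i|
			SinOsc.ar(freq * (i + 1)) / (i + 1)
		});
		var env = Env.asr(0.01, 1, 0.5).kr(doneAction: 2, gate: gate);
		Out.ar(out, (sig * env * amp / numPartials) ! 2);
	}).add;
};

[4, 8, 16].do { |n| ~makeAdditive.(n) };  // --> \additive4, \additive8, \additive16
)

x = Synth(\additive8, [freq: 110]);
x.release;
```

The SynthDef class itself stays simple; the helper function is the superstructure that produces the “dynamic” variants.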

Antipattern 2: GUIs should be interchangeable with values

Nope. A GUI is responsible for interaction. It isn’t data storage and it shouldn’t be treated as such.

// naive design: GUI is free-standing
z = Slider(nil, Rect(800, 200, 200, 50)).front;

p = Pbind(
	// and the pattern queries the GUI object directly
	\freq, Pfunc { z.value.linexp(0, 1, 200, 800) },
	\dur, 0.1
).play;

ERROR: Qt: You can not use this Qt functionality in the current thread. Try scheduling on AppClock instead.

Instead, data storage should be in dedicated variables or objects. The GUI feeds information into the data storage, and other tasks can query the stored data freely.

f = 440;  // data storage

z = Slider(nil, Rect(800, 200, 200, 50))
.value_(f.explin(200, 800, 0, 1))
// GUI --> data storage
.action_({ |view| f = view.value.linexp(0, 1, 200, 800) })
.front;

p = Pbind(
	// pattern queries data storage
	// but does not touch GUI
	\freq, Pfunc { f },
	\dur, 0.1
).play;


That actually leads to the Model-View-Controller design pattern for interfaces – which is definitely not easy to wrap your head around at first, but it really is the most flexible design. But this post is already a little long, so I’ll leave it at that for now.
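A toy sketch of that separation, using the class library’s dependants mechanism (`changed` / `SimpleController`) — the ~model layout here is invented, and this is only a hint at MVC, not a full framework:

```supercollider
(
// model: dedicated data storage, independent of any view
~model = (freq: 440);
~setFreq = { |freq|
	~model[\freq] = freq;
	~model.changed(\freq);  // notify all dependants
};

// view: the slider only feeds information into the model
~slider = Slider(nil, Rect(800, 200, 200, 50))
	.action_({ |view| ~setFreq.(view.value.linexp(0, 1, 200, 800)) })
	.front;

// any number of observers can react to model changes
~watcher = SimpleController(~model)
	.put(\freq, { |model| "freq is now %\n".postf(model[\freq]) });
)

// later: ~watcher.remove;  // detach the observer
```

Patterns, MIDI handlers or other views would read or watch ~model the same way, without ever touching the slider.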

There are probably others, but I’ve written enough for the moment.



Not sure if this is relevant to your workflow, but we have created the CuePlayer Quark, which encapsulates a common workflow for compositions driven by triggered cues. The structure also includes timeline scheduling functionality for each cue, among other things, and it is agnostic regarding the content of its cues, as it just triggers any SC code.


@jamshark70 this idea is super cool, I think it deserves a more complete tutorial or guide!

Could you please show us some examples related to the things you said in antipattern 1?

Controller mapping (you need the ddwCommon quark):

a = { |freq = 55, ffreq = 2000, rq = 0.1, amp = 0.1|
	var sig = Saw.ar(freq * [1, 1.005]);
	sig = RLPF.ar(sig, ffreq, rq);
	sig * amp
}.play;
f = GenericGlobalControl(\ffreq, nil, 2000, \freq);

a.set(\ffreq, f.asMap);  // like control bus

f.value = 500;

f * 2  // math usage
f.asPattern  // or in patterns

// a mapped control reads the bus --> In.kr(bus, 1)
a = { RLPF.ar(Saw.ar(55 * [1, 1.005]), \ffreq.kr(2000), 0.1, 0.1) }.play;
a.set(\ffreq, f.asMap);

Patterns and resources (you need the ddwChucklib quark and dependencies):

PR(\abstractProcess).clone {
    ~event = (eventKey: \singleSynthPlayer, instrument: \bufGrainPan);
    ~prep = {
        ~chan = MixerChannel(~collIndex, s, 2, 2);
        ~buf = Buffer.read(s, ~path);
    };
    ~freeCleanup = {
        [~chan, ~buf].free;
    };
    // subpatterns for BPStream
    // note that this refers to ~buf, but ~buf is not loaded
    // at the time of creating the PR. So we must use Plazy
    // to wait and evaluate ~buf.duration at play time.
    ~start = Plazy {
        Pwhite(0, ~buf.duration - Pkey(\time), inf) * ~buf.sampleRate
    };
    ~time = Pwhite(0.1, 0.3, inf);
    ~pan = Pwhite(0.6, 0.8, inf) * Pseq([-1, 1], inf);
    ~dur = Pwhite(1, 4, inf) * 0.125;
    ~asPattern = {
        Pbind(
            \bufnum, ~buf,
            \time, BPStream(\time),  // must come before \start, which reads Pkey(\time)
            \dur, BPStream(\dur),
            \start, BPStream(\start),
            \pan, BPStream(\pan)
        )
    };
} => PR(\bufPlayer);

BP(\buf).free;  // just in case, clear out anything already there
PR(\bufPlayer).chuck(BP(\buf), nil, (
    path: Platform.resourceDir +/+ "sounds/a11wlk01.wav"
));

MixingBoard(mixers: BP(\buf).chan);


// The Plazy trick works outside the PR definition too
BP(\buf).start = Plazy { Pn(Pseries(~buf.duration - 0.5, -0.1, { rrand(5, 20) }), inf) * ~buf.sampleRate };

BP(\buf).free;  // magically destroys the Buffer and MixerChannel



I just found out about this extension. Marvelous!

I have a question, though. Is it possible to trigger one cue from inside another cue? I was trying to trigger one cue from the last event of a previous timed cue.

a =
    // beat - function pairs
    timeline: [
        1, { "1 beat later".postln },
        2, { "2 beats later".postln }, // schedule something to happen 2 beats later
        3.5, {
            "3.5 beats later".postln;
        }
    ],
    timelineOptions: (mode: \beats, quant: 1) // this works in beats
);

a.put(2, {"whatever".postln});

This might seem useless, but I’m trying to achieve a Pfsm-ish behaviour, with each cue able to trigger any of the other cues. Pfsm doesn’t work because it doesn’t know when a cue is finished: it triggers the cues fine, but doesn’t wait for the last event in a cue before triggering the next one. Is there a way to access that information? Any ideas on how this could be done?

Cheers, and thanks for this wonderful quark!


Hello :slightly_smiling_face:,

Glad you found it useful. Could you expand on what you mean by:

but is not waiting for the last event in a cue to trigger the next one

Pfsm triggers the cues in whatever order it comes up with, but triggers them immediately, independently of the timing set in the timeline.

Let’s say the current cue has 3 timed functions. I’d like to trigger another cue when the last function in the current cue is done, not before. Does that make sense?

Hmm, ok here are some ideas and you can tell me if they solve the issue for you:

First of all, you can take into account the time at which you want the function to be triggered and just add a function on the timeline at that time. This means you have to know when the last function of the current cue ends. If you don’t know that, then the code doesn’t know it either, so you will need to solve this at the level of the function itself. This is an issue that comes up all the time with generative processes, I’m afraid. So, for example, if you have a function that produces a random number of grains, you would need to make sure that this function calls .trigger at the relevant point. Notice that we cannot always know this at the triggering of the process if the process is an iteration (let’s say, flip a coin every two seconds and then produce another event if the flip is 1). On the other hand, if the generative process is, for instance, “choose a random number from 1 to 200 and play a drone for that time”, then we have the information to schedule a TimeLine function after we generate the random number. It is the curse of the procedural :smile:
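The “drone for a known random time” case can be sketched with core classes only; here `a.trigger(2)` assumes `a` is the CuePlayer instance and 2 is the next cue, and ~droneCue is just an illustrative cue function:

```supercollider
// once the random duration is chosen, we have enough
// information to schedule the next trigger ourselves
~droneCue = {
	var durBeats = rrand(1, 200);  // "choose a random number from 1 to 200"
	("playing a drone for " ++ durBeats ++ " beats").postln;
	// ... start the drone here ...
	TempoClock.default.sched(durBeats, {
		a.trigger(2);  // hypothetical: a = the CuePlayer, 2 = the next cue
		nil            // nil --> do not reschedule
	});
};
```

For the coin-flip style of iteration, no such schedule exists up front; the iterating function itself has to call .trigger when it decides to stop.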

An example of one of the solutions:

a =
    // beat - function pairs
    timeline: [
        1, { "1 beat later".postln },
        2, { "2 beats later".postln }, // schedule something to happen 2 beats later
        3.5, {
            "An event that lasts 5 beats is triggered".postln;
        },
        8.5, {
            "triggering cue 2 at the point the previous event ends".postln;
            a.trigger(2);
        }
    ],
    timelineOptions: (mode: \beats, quant: 1) // this works in beats
);

a.put(2, {"whatever".postln});

Also the hook method can be used for evaluating code right before a cue is triggered:

.hook = value
From superclass: Cues
Register a function to be evaluated right before triggering a new cue. The function is passed the cuePlayer as an argument.

b =;
b.hook = { arg cueplayer; cueplayer.current.postln };

This is meant for resources and cleaning up, so I am not sure it is synchronous, though.

Hope that makes some sense :upside_down_face:

It does make sense :slight_smile: . Actually that’s exactly what I tried but my code threw an error, hence the post. It works now :sweat_smile:

Thanks for your help!
