ProxySpace, Ndefs or Standard?

This is a fantastic explanation. It’s easy to see how moving between different ways of playing synths makes it easy to break things. I love the idea of proxies, but there are still a lot of unknowns for me. I’m starting to see that a real and full understanding of the classes and how this all works is necessary. It’s a little hard to just play around until everything works. Yet, oddly, that’s sort of what proxies are for, it seems.

Thanks for this excellent explanation. I’ll take this knowledge into SC later and see if I can fix my issues.

I hate to take more of anyone’s time, but I have a few questions that are hard to even figure out how to ask:

I’ve read up a bit on Server Architecture (again) but am trying to really get a grip on this.

From the docs:
A Node is an addressable node in a tree of nodes run by the synth engine. There are two types, Synths and Groups. The tree defines the order of execution of all Synths. All nodes have an integer ID.

When we create synths, each one is itself a node, which may be placed in its own group or will end up in the default group.

With proxies (either via Ndefs or ProxySpace) it seems every instantiation is in its own Group.

But I still can’t seem to work out (even after reading pretty much every doc on BusPlug -> NodeProxy -> Ndef, or on ProxySpace) how all of this actually interacts. I understand the idea of a proxy, but I think I’m having trouble knowing how order of execution works (which is obviously a bit looser with proxies).

This: https://doc.sccode.org/Tutorials/JITLib/proxyspace_examples.html
is helpful to a degree, all examples work, but I’m having trouble relating the concepts in a workable sense. @jamshark70 - what you’ve explained makes loads of sense, but at a lower level. I’m trying to fathom the larger picture here and can’t quite seem to grasp it.

For instance, take the assignment ~out = \staub; in the above examples: how does one then control this proxy? The doc seems to focus more on ~out = SynthDef(... (and the lines with ~out.spawn elude me even further).

Again, I understand in basic terms that ~out = {...} is the same as Ndef(\out, {...}), but I’m having more trouble figuring out how these NodeProxies actually work on the server and with the client-side language.
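
Just to make that equivalence concrete, here is the minimal pairing I mean (assuming a booted default server s; the names are only placeholders, and these are two separate proxies — the point is only that both styles drive the same machinery):

// ProxySpace style: push a ProxySpace so ~names become NodeProxies
p = ProxySpace.push(s);
~out = { SinOsc.ar(220) * 0.1 };
~out.play;   // monitor the proxy's private bus on the hardware outputs
p.pop;       // leave the ProxySpace again

// Ndef style: an equivalent proxy, addressed by name rather than by environment slot
Ndef(\out, { SinOsc.ar(220) * 0.1 });
Ndef(\out).play;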

I’m not even sure what to ask. But given a path to look into, I will.

One good example of playing a SynthDef assigned to a proxy with a pattern playing would be wonderful.

Here is a complete example of a SynthDef being played by a pattern and routed and mixed through Ndefs.

I think this plus other examples in this thread demonstrate a lot of different techniques for combining synths, patterns and NodeProxies. I do hope this is helping.

(
SynthDef(\s1, {
	var freq = \freq.kr(220);
	var cutoff = \cutoff.kr(100);
	var fvel = \fvel.kr(8);
	var res = \res.kr(0.5).linlin(0, 1, 1, 0.001);
	var aeg = Env.asr.ar(doneAction:Done.freeSelf, gate:\gate.kr(1));
	var sig = RLPF.ar(Saw.ar(freq), aeg.linlin(0, 1, cutoff, cutoff * fvel), res);
	sig = sig * aeg * \amp.kr(0.3);
	sig = Splay.ar(sig);
	Out.ar(\out.kr(0), sig);
}).add;
)

// monitor - output to speakers
Ndef(\s1).play(vol:1);

// play the synthdef routed through the ndef
(
Pdef(\s1, 
	Pbind(
		\instrument, \s1,
		\out, Pfunc({ Ndef(\s1).bus.index }),
		\group, Pfunc({ Ndef(\s1).group }),
		\degree, Ppatlace([Pseq([0, 4], inf), Pseq([-1, 1, -2, 2], inf)], inf),
		\legato, 0.1,
		\dur, 0.25
	)
)
)
Pdef(\s1).play;

// route the ndef through a delay - output to speakers
Ndef(\delay).play;
Ndef(\delay)[0] = \mix -> {Ndef(\s1).ar};
Ndef(\delay).set(\mix0, 1);
(
Ndef(\delay).filter(10, {|in|
	var sig;
	var fb = LocalIn.ar(2);
	fb = DelayC.ar(fb.reverse, 1, [3/8, 5/8]);
	sig = fb * 0.7 + in;
	LocalOut.ar(sig);
	sig;
})
)

// route the delay through a pitchshift - output to speakers
Ndef(\ps).play;
Ndef(\ps)[0] = \mix -> {Ndef(\delay).ar};
Ndef(\ps).set(\mix0, 1);
(
Ndef(\ps).filter(10, {|in|
	PitchShift.ar(in, 2, 2, 0.01, 0.01)
})
)

// route the dry signal, delay, and pitchshift through reverb - output to speakers
Ndef(\verb).play(vol:0.5);
Ndef(\verb)[0] = \mix -> {Ndef(\s1).ar};
Ndef(\verb)[1] = \mix -> {Ndef(\delay).ar};
Ndef(\verb)[2] = \mix -> {Ndef(\ps).ar};
Ndef(\verb).filter(10, {|in| GVerb.ar(in, 10, 5, 1, 1) } );
// adjust mix
Ndef(\verb).set(\mix0, 1, \mix1, 1, \mix2, 1);
// adjust wet/dry
Ndef(\verb).set(\wet10, 1)
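
When you’re done, stopping the pattern and fading the proxies out could look something like this (just a tidy-up sketch, not part of the routing itself):

Pdef(\s1).stop;
Ndef(\s1).clear(3);     // fade out and free over 3 seconds
Ndef(\delay).clear(3);
Ndef(\ps).clear(3);
Ndef(\verb).clear(3);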

This is equivalent to ~out = SynthDef(\staub...) except that, having defined the SynthDef once, using \staub as the source means you don’t have to repeat the entire synth function.

So you would control it exactly the same way you control the SynthDef source.
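
To make that concrete, here is a minimal sketch inside a ProxySpace, using a hypothetical SynthDef \myDef (not the \staub example itself) that has freq and amp controls:

// hypothetical sustaining SynthDef with out, gate, freq and amp controls
SynthDef(\myDef, { |out, freq = 220, amp = 0.1, gate = 1|
	var env = Env.asr(0.01, 1, 0.5).kr(gate: gate, doneAction: 2);
	Out.ar(out, SinOsc.ar(freq) * env * amp ! 2);
}).add;

~out = \myDef;          // the name as source, same as ~out = SynthDef(\myDef, ...)
~out.play;
~out.set(\freq, 330);   // address the control through the proxy
~out.xset(\freq, 440);  // xset crossfades to the new value over the proxy's fadeTime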

Droptableuser provided one, so I’ll just add a note about the conceptual background to this.

Again, there are the two ways of using patterns here:

  • the “new note per event” way – where you would have the pattern assigned to the proxy, and the pattern uses the SynthDef. This is the reverse of what you wrote. Droptableuser’s example follows this model. (Note that \instrument in that example points to the SynthDef, not the Ndef – they are both named \s1 but as noted before, the default event type does not play Ndefs, so it must be the SynthDef.)

  • the “single note, changing” way – where you would assign the SynthDef to the proxy and separately run a pattern to set the proxy’s controls. (“Separately” includes the idea of assigning the pattern to a different numbered slot in the proxy.) A sketch of this follows just below.
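
A minimal sketch of that second approach (hypothetical names; assuming the \set role picks up the SynthDef’s controls, as it does for function sources):

(
SynthDef(\padSynth, { |out, freq = 220, cutoff = 800, amp = 0.1, gate = 1|
	var env = Env.asr(2, 1, 2).kr(gate: gate, doneAction: 2);
	Out.ar(out, LPF.ar(Saw.ar(freq ! 2), cutoff) * env * amp);
}).add;
)

Ndef(\pad, \padSynth).play;   // slot 0: one long-running synth built from the SynthDef

// slot 1: a pattern that only *sets* controls on that single synth
(
Ndef(\pad)[1] = \set -> Pbind(
	\dur, 0.5,
	\freq, Pwhite(100, 400, inf),
	\cutoff, Pexprand(300, 3000, inf)
);
)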

With proxies, you’re supposed to let the proxy manage the concrete nodes, and you address commands to the proxy abstraction layer. It’s good to understand how it works, but you aren’t required to understand all of it in order to use it.

hjh


I can’t thank you both @droptableuser and @jamshark70 enough for taking the time to explain and give these examples. This helps immensely!

Is there any benefit to using the Ndef/Pdef methods over ProxySpace? I can’t at this point imagine why I’d need more than one environment. ProxySpace syntax is a bit more concise. I’m going to try them both out, but why do people choose one vs. the other? Is it by project need (and what would be an example)? Or is it just personal preference – that I’d understand.

I guess I should have just asked Pros and Cons of each.

i think it’s mostly stylistic preference. but there is one thing i can think of that is kind of annoying. with proxyspace you can’t really define a function like this:

~myfunc = { ... };

as it will get interpreted as a nodeproxy. so you have to do something like

q = q ? ();
q.myfunc = { ... }

but defining functions on an Event can be confusing depending on how you call the function. additionally, you’ll eventually want to give functions logical names that are already defined somewhere in the Event hierarchy, and then you run into hard-to-intuit oddities.
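
for example, the calling gotcha looks like this (a hypothetical function name, just to show it):

q = q ? ();                        // keep q if it already exists, otherwise make an empty Event
q.myfunc = { |self, x| x * 2 };    // the Event itself arrives as the first argument
q.myfunc(3);                       // -> 6, because 3 lands in x, not in self
// and e.g. q.play = { ... } would be shadowed by Event's own play method and never get called this way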

my 2 cents is that the proxyspace syntax comes with too many little gotchas. using ndefs you don’t encounter that as much but you will have to type a bit more.

but there is no functional difference at all

Great info @droptableuser - thank you. Heck, the only time I use a ? in other programming languages is something like (something1 == something2) ? result_1 : result_2 - shorthand for an if statement. I have no idea what q = q ? (); even means. But I get the idea and have sort of intuited this.

I’m probably looking for the most ‘bang for the buck’ and ProxySpace seems to offer that, up to a point. And part of my original thread was that so much of the help and tutorials are about what I term the ‘standard’ environment – so it seems natural in a way to use a system that brings Ndefs into such a ‘standard’ environment. However, in doing so you are also still entering into NodeProxies, which break one out of that system (at least as I see it). But I see a lot of people doing incredible things in both.

I’m aiming for live coding, but probably recorded out as stems (if that isn’t obvious at this point). Play around and improvise, but all while recording with some control.

All of the help given shows that both ways can do this. I’m just trying to figure out which works best.

And there may not be a ‘best’. So, just working out what works based on intent.

Now, I haven’t gotten to recording of stems yet, but with the example you sent, @droptableuser, along with all of the in-depth help from @jamshark70 – I’m actually starting to get this a bit more. Your SynthDef (\s1) is really cool with that interlaced Pattern. I never would have figured this out and don’t even entirely know what some of it is. But much of this is starting to sink in. It seems that when using Ndefs, Pdefs (and probably other proxy defs like Tdefs) are key to keeping things in sync. This is really amazing.


I deleted my last response. Total user error with my SynthDef. It’s all working swimmingly in ProxySpace as well. Now I think I have a working reference code base for both. Thanks again!

A continuation of notes. What seems most evident to me now, after much of this discussion, is the value of really going through all of the tutorials available. I was caught up in NodeProxies, which are pretty cool shortcuts, but it seems like learning Tasks, Routines and forking is maybe necessary to understand what is happening in general. Short of the node structure, most of these options are quite available when working in a normal environment. Proxies are obviously well suited to live coding – however, it does seem the order of execution is still a factor depending on what one is doing.

The nature of code driven sound is playful, but still needs to be a bit more “thought out” than just playing around in a DAW, at least as far as ‘knob-twiddling’ and patching goes.

Hearing more advanced people, such as @jamshark70 and @droptableuser, explain these things is like having someone tell you how to mix color and apply it to the canvas – incredibly, incredibly helpful. But making a work of art is about more than just which tools you use. I’m still trying to grasp this in SuperCollider. However, it has taught me more about synthesis and sound design than I’d ever have imagined.

I am practicing and learning a bit every day. I feel like I did when learning guitar 30 years ago.


Thanks for this example!

I just have two questions when trying to understand this code:

You wrote:

Ndef(\verb).filter(10, {|in| GVerb.ar(in, 10, 5, 1, 1) } );

What does the “10” right after “.filter(” do?
Is it some sort of multiplier?
In the documentation I read:

.filter(i, func)

What about:

Ndef(\verb).set(\wet10, 1)

Is “\wet10” referring to the roomsize of GVerb or is it something totally different?
Thanks!

The “10” is the slot, or position, of the filter in the signal chain. With Ndefs you have slots 0 to infinity where you can place signals or filters. In the example, slot zero is the main signal and slot 10 contains a filter which processes it. Ten is chosen arbitrarily, but it leaves room for mixing in additional signals before the filter, e.g. in slots 1–9. You can add further filters after 10, which would receive the cumulative signal of slots 0–10.

The ‘wet10’ parameter is a NodeProxy convention. When you place a filter in the signal chain, a NamedControl is created whose name is ‘wet’ followed by the slot index (here, wet10). It controls the wet/dry balance of that filter: a value of zero is the dry signal, a value of one is the fully wet signal, and you can set anything in between.

If you use the \mix role there will likewise be a NamedControl named ‘mix’ followed by the slot index. It lets you adjust the level of the signal mixed into the output, which is mostly useful where you have multiple signals and want to adjust their levels, similar to a mixer.
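
A compact illustration of the slot layout and the generated controls (hypothetical proxies, not the ones from the example above):

Ndef(\chain).play;
Ndef(\chain)[0] = { Saw.ar(110 ! 2) * 0.1 };              // slot 0: main source
Ndef(\chain)[1] = \mix -> { PinkNoise.ar(0.05 ! 2) };     // slot 1: extra source, level via \mix1
Ndef(\chain)[10] = \filter -> { |in| LPF.ar(in, 800) };   // slot 10: filter, wet/dry via \wet10

Ndef(\chain).set(\mix1, 0.5);    // mix the pink noise in at half level
Ndef(\chain).set(\wet10, 0.3);   // 0 = dry, 1 = fully filtered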

Documentation on NodeProxy roles: - http://doc.sccode.org/Reference/NodeProxy_roles.html


Thanks for your explanation!
I hadn’t stumbled upon that page in the documentation so far.

I am a bit confused by the first example there:

a = NodeProxy(s);
a[0] = { |freq = 440, dt=0.1, rate=2| Ringz.ar(Impulse.ar(rate * [1, 1.2]), freq, dt)*0.1 };
a.play;
(
a[1] = \set -> Pbind(
    \dur, Prand([1, 0.5], inf),
    \freq, Pwhite(200.0, 1000, inf),
    \rate, Pstutter(4, Prand([1, 3, 6, 10], inf)),
    \dt, Pwhite(0.01, 0.1, inf)
)
);
// modify the source in the meanwhile:
a[0] = { |freq = 440, dt=0.1, rate=2| Ringz.ar(Dust.ar(rate * 10.dup), freq, dt)*0.1 };

a.nodeMap.postln; // the values are not set in the node map.
a.clear(3);

So there are a[0] and a[1]. Considering the audible output and the Node Tree:
Is a[1] used as an output and a[0] is only used as a source for that (not audible)?

If my assumption is true, how would one output both a[0] and a[1]?

my explanation above wasn’t exhaustive…

slot one in this case is setting parameters of the synth with a Pbind(), so it’s not really outputting sound; the audible signal still comes from slot zero. NodeProxy and ProxySpace are fairly vast and I can’t do them complete justice in this space. Hopefully I haven’t added to your confusion.
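
to answer the second part of your question: if you do want two audible sources, you can put a signal function in each slot and they simply get summed, e.g. (a throwaway sketch):

a = NodeProxy(s);
a.play;
a[0] = { SinOsc.ar(300 ! 2) * 0.05 };   // first audible source
a[1] = { Pulse.ar(150 ! 2) * 0.05 };    // second audible source, mixed with slot 0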

That’s sort of what my really vague question earlier in this thread was about. It’s really hard to understand and piece together the syntax needed for proxies, just because they are so involved and operate differently from what most of the other help material covers. There is a fair amount of documentation, but it is spread out. I personally think a good proxy tutorial encompassing some of this would be very welcome to a lot of us.

Null-State: https://www.youtube.com/channel/UCnmDTasybbexRtASuxuUwAQ is apparently continuing their tutorials next Monday and left off at some great ProxySpace lessons.

Hello, I hope this finds everyone well.

I’m still trying to assimilate much of the above info and examples. Incredibly helpful!

I have an example set of code here that I’m using just for audio recording. I’ve actually modified some code (mentioned above) that was written to record ProxySpace nodes. That was working well until I ran into some issues with ProxySpace itself.

I modified that Class to work with Ndefs, which seems to work appropriately for the most part - but once again, when I bring these files into another app with the same tempo settings, everything is slightly off the grid.

I can’t at this point tell what I am misunderstanding. Is it latency or something I’m doing wrong with Quantization? If anyone can help I’d be much obliged.

P.S. @droptableuser - using the snippet you showed above a few months ago with a straight multi-channel record seems to have the same results.

I can’t tell if it’s my patterns running off somehow, or not quantized or if it is something larger.

This runs a straight beat at 120 bpm (tempo 2) – in any case the files are in sync with each other, but they share some sort of offset; again, maybe latency, or maybe me just doing something wrong. I’ve been in the docs but can’t find what I’m missing.

Class:
NdefRecorder

NdefRecorder {

	var <nodes;
	var <>folder;
	var <>headerFormat = "aiff", <>sampleFormat = "float";
/*	var dateTime  = Date.getDate.format("%Y%m%d-%Hh%m");
	dateTime.postln;*/

	*new { |subfolder = nil |
		^super.newCopyArgs().init(subfolder)
	}

	init { | subfolder = nil |
		nodes  = ();
		if(subfolder != nil,
			{folder = Platform.userAppSupportDir +/+ "Recordings" +/+ Document.current.title +/+ subfolder },
			{folder = Platform.userAppSupportDir +/+ "Recordings" +/+ Document.current.title  }
		);

		File.mkdir(folder);
	}

	free {
		nodes.do(_.clear);
		nodes = nil;
	}

	add { |proxies|
		this.prepareNodes(proxies);
		{ this.open(proxies) }.defer(0.5);
	}

	prepareNodes { |proxies|
		proxies.do{ |proxy, i|
			var n = Ndef(proxy);
			n.play;
			n.postln;
			nodes.add(
				i -> RecNodeProxy.newFrom(n, 2)
			);
		}
	}

	open { |proxies|
		proxies.do{ |proxy, i|
			var dateTime  = Date.getDate.format("%Y%m%d-%Hh%M"); // %M = minutes; %m would insert the month again
			var fileName  = ("%/%-%.%").format(
				folder, dateTime, proxy.asCompileString, headerFormat
			);

			nodes[i].open(fileName, headerFormat, sampleFormat);
		}
	}

	record { |paused=false|
		nodes.do(_.record(paused, TempoClock.default, -1))
	}

	stop {
		this.close
	}

	close {
		nodes.do(_.close)
	}

	pause {
		nodes.do(_.pause)
	}

	unpause {
		nodes.do(_.unpause)
	}

	closeOne { |node|

	}
}

My Simple Ndef Test

(
SynthDef(\bplay,
	{arg out = 0, buf = 0, rate = 1, amp = 0.5, pan = 0, pos = 0, rel=15;
		var sig,env=1;
		sig = Mix.ar(PlayBuf.ar(2,buf,BufRateScale.ir(buf) * rate,1,BufDur.kr(buf)*pos*44100,doneAction:2));
		env = EnvGen.ar(Env.linen(0.0,rel,0),doneAction:0);
		sig = sig * env;
		sig = sig * amp;
		Out.ar(out,Pan2.ar(sig.dup,pan));
}).add;
)

(
Ndef(\hh).play;
Pdef(\hhmidi,
	Pbind(
		\type, \midi,
		\midiout, m,
		\midicmd, \noteOn,
		\chan, 1,
));

Pdef(\hhsynth,
	Pbind(
		\instrument, \bplay,
		\out, Pfunc({ Ndef(\hh).bus.index }),   // wrap in functions so bus/group are looked up per event
		\group, Pfunc({ Ndef(\hh).group }),
		\buf, d["Hats"][1],
));

Pdef(\hhseq,
	Pbind(

		// \dur, Pseq([0.25, 0.25, 0.5, 0.77, 0.25].scramble, inf),
		// \dur,Pbjorklund2(Pseq(l, inf).asStream,12,inf)/8,
		\dur, 0.125,
	));
)
(

Pdef(\hh,
	Ppar([
		Pdef(\hhmidi),
		Pdef(\hhsynth),
	])
	<> PtimeClutch(Pdef(\hhseq))
);
)




Pdef(\hh).play(quant: -1);
Pdef(\hh).stop;

(
TempoClock.default.tempo = 2;
~ndefr = NdefRecorder.new('test');
~ndefr.add([\hh]);
)
~ndefr.record;
~ndefr.stop;

One other note about all of the Pdefs - I’m attempting to capture MIDI as well from a pattern and was helped with this on another thread. That also may be the issue.

Not certain what process you’re following, but executing the code below and bringing the resulting multichannel wav file into Audacity, I’m not able to hear anything off. Also, exporting into separate files and bringing them into Reaper, I’m not able to hear anything off, nor am I able to see anything off in the waveform editor.

(
Ndef(\a, {
	SinOsc.ar(220!2) * Env.perc.kr(gate:\trig.tr) * 0.1;
});
Ndef(\b, {
	SinOsc.ar(220!2) * Env.perc.kr(gate:\trig.tr) * 0.1;
});
Ndef(\c, {
	SinOsc.ar(220!2) * Env.perc.kr(gate:\trig.tr) * 0.1;
})
)

Ndef(\a).play(out:0);
Ndef(\b).play(out:2);
Ndef(\c).play(out:4);

(
Pdef(\d, Ppar([
	Pbind(\type, \set, \id, Pfunc({Ndef(\a).nodeID}), \args, #[trig], \trig, 1),
	Pbind(\type, \set, \id, Pfunc({Ndef(\b).nodeID}), \args, #[trig], \trig, 1),
	Pbind(\type, \set, \id, Pfunc({Ndef(\c).nodeID}), \args, #[trig], \trig, 1),
]))
)
Pdef(\d).play;

s.record(numChannels:6);
s.stopRecording

Well, interesting. I loaded up what I had last recorded and you are correct. Everything is on the grid. I had done a LOT of tests last night and maybe I grabbed the wrong files when sending this. Just loaded into Reaper again and all looks perfect. Sorry about that.

Over the last several days I’ve been obsessively testing different workflows. One interesting and very stupid thing I did not realize (which has no bearing on the above tests) – you can run Ndefs (and other ‘defs’) in ProxySpace as well. Not sure when that will be useful, but I’m also trying to minimize the amount of code needed for various things. Unfortunately, the ProxyRecorder class seems to only work in ProxySpace, and my NdefRecorder modifications work only with Ndefs – but I’m sure there is a way to make both work. Thanks for checking my work @droptableuser.

All of the above examples are really helping me learn. With all of them there are more questions, but this is invaluable help.

From my simple grid test above.