Shouldn't Events be (directly) composable?

Currently if you do:

// need a non-default synthdef for some obscure reasons (I'll get to that)
(
SynthDef(\mysine2, { |out=0, amp=1, gate=1, freq=555, pan=0|
	var sig = SinOsc.ar(freq) * EnvGen.kr(Env.asr, gate, doneAction: 2);
	Out.ar(out, Pan2.ar(sig, pan, amp));
}).add;
)

((instrument: \mysine2, freq: Pkey(\freq) * 2) <> (freq: 300)).play; // silent

it does nothing of course (because <> is not actually defined for Event and, more annoyingly, since Event has know turned on, () <> () is actually nil), whereas prepending an empty Pbind obviously makes it work:

(Pbind() <> (instrument: \mysine2, freq: Pkey(\freq) * 2) <> (freq: 300)).play // sound

(because ‘<>’ is defined for patterns “on the left”, while you can have events on the right of <>; Pdef in fact relies on this to <> its envir.)

More annoyingly, replacing the instrument with the default one, i.e.

(Pbind() <> (instrument: \default, freq: Pkey(\freq) * 2) <> (freq: 300)).play // silent

also does nothing even with a Pbind, and I don’t know why. Also

(Pbind(\instrument, \default) <> (freq: Pkey(\freq) * 2) <> (freq: 300)).play // still silent
(Pbind(\instrument, \default, \freq, Pkey(\freq) * 2) <> (freq: 300)).play // sound

So, questions:

  • shouldn’t Events be directly composable with <>? I don’t see a lot of difference between an Event and a Pbind. Event could have an asPbind method, for example, and thus define a <> for itself too. In the absence of this, if you accidentally chain two events (which you easily can, with parentheses in a wrong place), you get no sound.

  • Why doesn’t the \default instrument work in the chained Pbind example, unlike my custom instrument?

Pkey(\freq) is not valid in an event. Only Pbind or a variant will evaluate it.

Events should contain fully resolved values. They should not contain patterns. You get no sound here because of sending an invalid frequency, I’m pretty sure (without testing, maybe I’m wrong).

It’s the same as the difference between [Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)] and Ptuple([Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)]). You could say “I don’t see a lot of difference between an array of patterns and a Ptuple” but at the end of the day, it’s still the case that collection objects don’t evaluate patterns contained within – only pattern streams do.
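To see the difference concretely (a quick sketch; note that Object:next simply returns the receiver, so a plain collection behaves as a constant stream):

```supercollider
// an Array of patterns is inert data...
p = [Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)];
p.next;                   // just returns the array itself, patterns unevaluated

// ...while a Ptuple stream evaluates its member patterns
Ptuple(p).asStream.next;  // e.g. [0.37, 0.82] -- two freshly generated values
```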

Perhaps a way to say it is that patterns are active while collections are passive, with regard to patterns inside them.

It would be reasonable to define event1 <> event2 to return event2.copy.putAll(event1).

Pbind’s pairs are ordered while Event’s are not. I’m guessing it would be fragile to convert an unordered set of pairs into an ordered one – would work under some restrictions, but it’s only a matter of time before someone writes (dur: Pwhite(0.1, 0.5, inf), freq: Pkey(\dur).linexp(0.1, 0.5, 100, 1000)).asPbind and gets annoyed that it doesn’t work. So I would vote no on that.
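A minimal sketch of that suggested definition (hypothetical, not in the class library); the left operand overrides the right, matching the argument order of Pchain:

```supercollider
+ Event {
	// event1 <> event2: event2's values serve as defaults, event1 wins conflicts
	<> { |anEvent|
		^anEvent.copy.putAll(this)
	}
}

// then, e.g.:
// (freq: 400) <> (freq: 300, dur: 0.5)  -> ( 'freq': 400, 'dur': 0.5 )
```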



Actually, you’re quite right. I was fooled by the default frequency in my Synth being pretty close to the one I thought I was setting. With a synth change like

(SynthDef(\mysine2, {  |out=0, amp=1, gate=1, freq=200, pan=0| // ...

it becomes obvious the freq is not actually getting set in the chain…

Pbind’s pairs are ordered while Event’s are not.

Indeed, I overlooked that… since I’ve been dabbling with making my own Event subclass that does remember the user-set order of keys. Since Pdef actually chains its envir, you can use it in a somewhat unorthodox way, as

Pdef(\testE, Pbind())
Pdef(\testE).envir = Pbind(\freq, 200)
Pdef(\testE).play // ok, freq is set
Pdef(\testE, Pbind(\freq, Pkey(\freq)*2))
Pdef(\testE).play // multiplies the envir one
PdefGui(Pdef(\testE), 10) // not showing envir, of course

And it works to play, but of course this has the downside that you can’t set into it anymore, nor can you use PdefGui properly on it anymore (envir won’t show).

On the other hand, <> could still make some sense for Events (or even Environments), perhaps as an equivalent for .copy.putAll, but I haven’t given it a lot of thought.

Actually a simple “record override” like copy.putAll is not a terribly useful notion of Event composition because there’s no way to do any lookups in the “prior” environment, i.e. no way to get a Pkey equivalent.

So instead one can use proto chaining, and a “recursive resolve”

+ Event {

	/* "record override" -- not very useful:
	<> { arg anEvent;
		^anEvent.copy.putAll(this)
	}
	*/

	<> { arg anEvent;
		this.proto_(anEvent);
		^this
	}

	atRec { arg key;
		var val = this.at(key), env = this;
		while({ val.isKindOf(AbstractFunction) }, {
			val = env.proto.use { val.value };
			env = env.proto;
		});
		^val
	}
}

At least for some basic recursive lookup this seems to work, i.e.

e = (freq: {~freq * 2}) <> (freq: 200, bar: 24);
d = (freq: {~freq + ~bar}) <> e;

e.atRec(\freq) // 400
d.atRec(\freq) // 424

I’m guessing there is a way to break this since the ASTs that BinaryOpFunction builds aren’t as comprehensive as Patterns.

Also, there’s no notion of sequencing here, just composition, and one needs to decide when to resolve the values, i.e. before playing:

+ Event {
	resolve {
		var ev = this.copy;
		this.keysDo { |k| ev[k] = this.atRec(k) };
		^ev
	}
}

e.resolve // ( 'freq': 400 )
d.resolve // ( 'freq': 424 )

I found out where this approach breaks, namely

~freq = 11
e = (freq: r { loop { yield (~freq * 2) } }) <> (freq: 200, bar: 24)
e.resolve // ( 'freq': 22 )

Routines are AbstractFunctions too, but they don’t look up in the use environment for some reason, i.e.

~foo = 12
f = {~foo + 2}
(foo: 44).use { f.value } // -> 46
// but
r = r { loop { yield (~foo + 2) } }
(foo: 44).use { r.value } // -> 14
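For plain Functions, the class library at least offers inEnvir to bind a function to a given environment explicitly; a Routine, by contrast, restores the environment that was saved when it was created (or last yielded), which is why the surrounding use has no effect above:

```supercollider
~foo = 12;
f = { ~foo + 2 };
g = f.inEnvir((foo: 44)); // bind f to a specific environment
g.value;                  // -> 46, regardless of currentEnvironment at call time
f.value;                  // -> 14, still reads the current environment
```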

I’m not seeing a way to fix this right now. Adding extra nesting like yield ({~freq} * 2) has the problem of decoupling the proto advance/recursion from the value recursion. In Patterns you don’t really have this problem because the association is not externally held, i.e. each pattern holds enumerable refs to those patterns it depends on for data.

Also, generally

f = {42}
g = f + 3
g.value // -> 45; In contrast
h = r { loop { (f + 3).yield } } // needs a double "resolve"
h.value // -> a BinaryOpFunction
h.value.value // 45

Now I get to appreciate why embedInStream “recurses” as a function but “returns values” with yield, so it can pop that value out of an arbitrary number of nestings that are not known in advance… And yeah, there’s a way to fix this, but it’s not quite transparent to the routine writer, i.e.

f = {42}
h = r { loop { (f + 3).value.yield } } 
h.value // 45

The problem with applying this to the Event chaining here is that Routine needs to do the environment change, and it can’t access the environment in my setup.

This idea – that events can contain arbitrary future calculations, and compose them arbitrarily – is interesting, but I would suggest to implement that in something other than Event.

I think it’s necessary to have an Event that is simply data storage with the capacity to play. That’s in keeping with Collections in general. Collections in SC are generally for already-resolved data.

So your very first example on this topic –

((instrument: \mysine2, freq: Pkey(\freq) * 2) <> (freq: 300)).play;

– is already outside of what Events are designed to do. It wasn’t correct in the first place to expect Pkey(\freq) to resolve.

So then “Actually a simple ‘record override’ like copy.putAll is not a terribly useful notion of Event composition because there’s no way to do any lookups in the ‘prior’ environment, i.e. no way to get a Pkey equivalent” also becomes moot, because Events don’t do Pkey.

It may be valuable to have some object like an Event that does do Pkey – and if there are other changes in the class library that are needed to make this other object interchangeable with Event, I’d be open to that.

But I’m highly skeptical of changing the basic nature of Event. It is now a container for the results of calculations. You’re proposing to change it into a container for future calculations. That is a massive change to the fundamental assumptions behind Events (or even behind collections in general – then, why not a function-composable array?), with a non-trivial risk of breaking existing user code. And object-oriented modeling already has a solution for that: don’t change the base class, add something else that uses the base class.


I haven’t had much time to play with this today, but as a quick note, after the “obvious hack” on Routine

+ Routine {
	envir { ^environment }
	envir_ { arg env; environment = env; ^this }

	valueInEnvir { |env ... args|
		var oldenv = this.envir, val;
		this.envir = env;
		val = this.value(args);
		this.envir = oldenv;
		^val
	}
}

I can now do in + Event

	atRec { arg key;
		var val = this.at(key), env = this;
		while({ val.isKindOf(AbstractFunction) }, {
			if(val.isKindOf(Routine)) {
				val = val.valueInEnvir(env.proto)
			} {
				val = env.proto.use { val.value }
			};
			env = env.proto;
		});
		^val
	}

And on a quick check this works now as expected:

e = (freq: r { yield (~freq * 2) }) <> (freq: 200, bar: 24)
e.resolve // -> ( 'freq': 400 )

So (horrors), I could even turn Events into Patterns basically as they can “pull” from streams now.

I can obviously extend valueInEnvir to the Function-based stuff too, simply so the interface is prettier (basically it’s equivalent to f.inEnvir(e).value, but it turns out I need to touch more than Function because of the BinaryOp business).

On the other hand, I’ve indeed been thinking along the lines you’ve mentioned, that a “mini-container” might be better than having atRec directly resolve (i.e. apply) functions/routines, simply because of compatibility issues with stuff like Pmono etc., which sticks callbacks (i.e. functions) into Events, and I probably don’t want to be force-evaluating those (on resolve)… Cleanups are a similar issue, but those probably should be done differently anyway. I’m still evaluating what the best fixup for that is. It’s actually one of the reasons I’ve been toying with “enriched Events” like in this experiment.

Strongly suggested to use protect here. If the operation throws an error, then a protect error handler will restore the original environment anyway. Without it, an error would leave the routine in a corrupted state. I’d demonstrated this in the other thread.

I was trying to express that, no matter how cool it is to have streams and functions resolve automagically in an Event (or Event-like structure) – and this is a brilliant idea actually – we still need to have a concept of Event as a container for results of calculations done outside of the Event. That is, your concept need not (even should not) supersede what Event already is.

A similar thing comes up with SynthDef sometimes – a user wonders why it doesn’t handle graph topologies that adapt to argument values, or x or y or z. My position on that is that we need a concept of SynthDef matching the server’s restrictions. Having a “physical” SynthDef doesn’t preclude other structures that use SynthDefs to do things that SynthDef by itself can’t do. But often, people see SynthDef everywhere in the documentation and assume that SynthDef is supposed to be responsible for everything. I don’t think so. SynthDef fulfills a necessary role, as it is. Event fulfills a necessary role, as it is.

I suppose we have to agree to disagree about that, but I think I’m on pretty solid ground. Part of the point of Design Patterns is to reduce the risk of breakage by choosing development strategies other than fundamentally altering base classes for every new requirement. I can’t help but note that the thrust of this thread has been “how to make Event do what I want” rather than “how to create a new superstructure that does what I want.” The former is invasive and risky while the other is not, so I think the latter should be preferred as a default position.


Let’s not overstate how much surgery I’ve done here on Events. I’ve just reinterpreted its data… on demand. Those methods I’ve added can be entirely in user functions.

If you wonder how this is related to (fixing) cleanups: imagine that, instead of cleanup functions, we had cleanup routines, implemented with the obvious logic “do cleanup once, then yield nil”. It would not matter how many times that routine-based cleanup gets called; it is intrinsically idempotent.

There’s a way to obtain the exact same behavior with functions, of course, but it needs a bit more packing, i.e. the actual cleanup function has to be the return value of another function (closure) like

{ |userfun| var done = false; { if(done.not) { done = true; userfun.value } } }
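For instance, with that wrapper (makeIdempotent is a hypothetical name, just for illustration), repeated invocations become harmless:

```supercollider
(
var makeIdempotent = { |userfun|
	var done = false;
	// the returned closure runs userfun at most once
	{ if(done.not) { done = true; userfun.value } }
};
var cleanup = makeIdempotent.value({ "releasing resources".postln });
cleanup.value; // posts once
cleanup.value; // no-op from here on
)
```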

The other thing that made me ponder more explicit event composition is that – conceptually – one can think of events that carry the cleanup-function reference (which they presently do anyway, in the addToCleanup field) as events obtained by composing with a cleanup-generating event. So one approach could be to consider “sticking idempotent cleanups” just in events, and not have pattern streams keep any local copies of such cleanup structures.

It’s probably obvious, but I’ll say it anyway: with this approach a cleanup generator/source (e.g. Pfset) would always send its cleanup by “composing it” into the Event, not just on the first event it generates (as happens right now). Conceptually, all events (not just the first one) generated by a Pfset (with non-nil cleanup) carry a cleanup promise. So, yeah, I am considering making this conceptual idea the actual implementation of cleanups.

And further downstream (again in the dataflow perspective), you compose whatever cleanups you receive from upstream (i.e. typically from subpatterns’ streams) with your own cleanups, if you generate any. (I said “typically” because e.g. a Pchain’s event dataflow graph differs from its sub-pattern graph. A Pchain looks like the root of a “flat” tree from the pattern perspective (i.e. the abstract syntax tree perspective), but it actually “pipes data” between its children, so from a dataflow perspective, a Pchain creates a linear chain of streams among its children, with the Pchain itself at the most downstream point.)

I’ve just reinterpreted its data… on demand. Those methods I’ve added can be entirely in user functions.


I guess I should explain a bit why I’m being cautious about it. Scztt has mentioned “technical debt.” One of the ways SC has incurred technical debt is by a developer saying “Wouldn’t it be cool if…?” and then someone had committed an implementation, without much review or discussion, which worked most of the time but failed in some cases (e.g. Pfset vs Pchain, or my own failed design, the previous version of Rest, which you didn’t see). Then someone comes into the project and starts noticing, and becoming annoyed with, gaps or inconsistencies.

The cause of those gaps and inconsistencies was originally moving too quickly… and, I learned the hard way in the past that a clever idea to sidestep a problem needs extra scrutiny. I’m trying now to play that role of scrutinizing – bear with me.

If you wonder how this is related to (fixing) cleanups…

I hadn’t. This appears/appeared to be a separate topic… not anymore, but initially.

Now that I might understand a bit more…

So one approach could be to consider “sticking idempotent cleanups” just in events, and not have pattern streams keep any local copies of such cleanup structures. … with this approach a cleanup generator/source (e.g. Pfset) would always [put] its cleanup into every Event, not just on the first event it generates

That’s a good idea (with a couple of tweaked words) – very good idea.

I do have a question, though: Why do you feel that event composition is the best way to do this?

Event composition, in a general sense, is something that a user could do for any purpose. Cleanup is a specific purpose. A/ If events are composed as a calculation strategy, then the composed-in calculations have to be resolved before the event takes action (i.e. before or during .play) – but composed-in cleanups should not resolve at this time. How do you tell the difference? B/ Composed-in cleanups should resolve at .stop time, but other composed-in operations should not. (Sorry if you’ve answered this already – I don’t see it.)

I’m wondering if it would be simpler/clearer to take the idea of “cleanups stored in every event,” but just use an array under a specific key. Currently we use arrays in specific keys to represent cleanup deltas, but an array in a specific key could just as easily represent the total of all cleanups that have to be done. So, e.g., Pgroup would stash its cleanup into that array before passing the event down, and children could add more cleanups before yielding.

I suppose a cleanup event could mark itself as such with some flag – but this “bear of very little brain” doesn’t, at this stage, grasp why a complex, nested, multi-layer Event structure is preferable to a simple array. One rule of thumb is to prefer the simplest implementation that gets the job done – so what is the job that event composition does that an array of cleanups doesn’t handle?

I hope I’m not stepping on toes (anymore) – I think this is an interesting way to simplify cleanups, but I’m not sure I’m seeing the whole justification for event composition here.

There’s a way to obtain the exact same behavior with functions, of course, but it needs a bit more packing, i.e. the actual cleanup function has to be the return value of another function…

I just realized, Thunk is already an idempotent function in SC. So you don’t need an extra layer of function wrapping, nor a routine (so the funny business of hacking Routine’s environment is not necessary at all, actually). Just Thunk { ... cleanup ... }.
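Indeed – Thunk caches its function’s result on first evaluation and drops the function reference afterwards, so the body can run at most once:

```supercollider
t = Thunk { "cleanup runs".postln; \done };
t.value; // evaluates the function: posts "cleanup runs", returns \done
t.value; // returns the cached \done without evaluating again
```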


I should add here that PLbindef from misc lib implements this sort of “duality” between events/environments and Pbinds. Quoting some example from its help page:

(
PLbindef(\x,
    \instrument, \sin_grain,
    \dur, 0.2,
    \midinote, Pwhite(60, 90)
)
)

// PLbindefEnvironment has been made and assigned to the variable ~x in currentEnvironment, check
~x

// now the PLbindefEnvironment can also be treated as a player
~x.play

// set params while playing
~x.att = Pwhite(0.01, 0.2)


PLbindefEnvironment : Environment { var <name, <plbindef; //...

PLbindef : Pbindef { var <sourceEnvir, <refEnvir; //...

where sourceEnvir is a PLbindefEnvironment.

PLbindefEnvironment is actually affected by the bug/feature/optimization discussed here, if one were to actually access those sub-fields via use (i.e. ~x.use { ~att = ... }). But the help examples don’t do that; instead they “save typing” by using one-letter environment variables for the PLbindefEnvironments… (which of course isn’t saving typing compared to use if there are more than a couple of fields to set).

As far as I can tell, there’s no direct composition implemented for PLbindefEnvironments; one has to do it via the PLbindefs.

As I discovered with more SC experimenting, that actually already works to a good extent in Event… but only on play, so you don’t get a “preview” of what’s gonna happen with a mere next or some similar non-playing method.

(degree: {rrand(1,8)}, dur: 0.2).play // hit a few times

// And even with "outboard" streams
r = (1..4).iter
(degree: r, dur: 0.2).play // hit a few times

The degree line actually makes 'freq' a BinaryOpFunction, because the default event prototype plugs the degree function we gave into a formula.

It’s a bit obscure where the function call happens precisely, but basically everything sent to the server as synth args gets asControlInput called on it. And for the whole class tree of AbstractFunctions that method does value, e.g.:

{3}.asControlInput // -> 3

Unfortunately dur is trouble when changed/set via a function:

(degree: 5, dur: {0.2}).play
// ERROR: Message 'schedBundleArrayOnClock' not understood.
// RECEIVER: a BinaryOpFunction

Also regarding my original topic

There’s a pre-defined method called

composeEvents { arg event; ^this.copy.putAll(event) }

but oddly it’s defined in Environment, although (of course) Event inherits it. Furthermore, Event does define next as having precisely those semantics. The documentation for Event even phrases that as composition:

Combines an event given in the argument with the current event. This is used to enable events to be composed.

(a: 6, b: 7).next((c: 100));

And unsurprisingly, the implementation of next actually calls composeEvents.

So, at this point it looks to me like the missing <> definition in Event is more of an oversight than something deliberate…

I’m actually still a bit miffed why that composeEvents was even defined separately, when Dictionary (from which Environment inherits) has a method/operator that does exactly the same thing, albeit called ++

++ { arg dict; ^this.copy.putAll(dict) }

Perhaps someone found objectionable the idea that ++ does replacement/override, so they used a different name later on in the Environment subclass… <> would be a better name, given the (replacement/override) semantics used.

(foo: 1) ++ (foo: 5, bar: 2)  // -> ( 'bar': 2, 'foo': 5 )

But, for an extra layer of confusion, next and composeEvents actually are “the other way around”, i.e. composition order is reversed in next

(zz: 1).next((zz: 2)) // -> ( 'zz': 1 )
(zz: 1).composeEvents((zz: 2)) // -> ( 'zz': 2 )
(zz: 1) ++ (zz: 2) // -> ( 'zz': 2 )

because next actually does:

next { arg inval; ^composeEvents(inval, this) }

Also, the composition of Events by existing classlib methods, while not auto-composing functions inside Events, does allow some level of “manual composition”

(degree: {~woot.value}, dur: 0.2).next((woot: 19)).play
(degree: {~woot.value}, dur: 0.2).next((woot: {rrand(1,8)})).play
// But just the following doesn't work
(degree: {~woot}, dur: 0.2).next((woot: {rrand(1,8)})).play

That’s one of the reasons why I’m skeptical of it as a general approach. Currently it works for synth arguments largely accidentally (I’m fairly sure it wouldn’t have worked before asControlInput) and it isn’t defined with any consistency outside of that context. (I’m not even certain it will work consistently within that context – it’s probably not too long a trek before you run into some edge cases.)

Making it work consistently would be a “nice to have” but I’m not sure it’s worth as much developer effort as, say, cleaning up the inconsistencies in unit generator initial samples. If I were a “program manager” making decisions about where to allocate developer hours, I’d weight the latter much more heavily. (But SC is a do-ocracy, so if you’re interested, that issue is yours for the taking.)

The initial idea here was quite different: in composeEvents, one event overrides the other’s values, but you were proposing a composite event made of composites of the data entries. It’s an interesting idea but it needs to be designed from the ground up – if the approach is “well, Event already almost does it,” there will be mistakes in the interface eventually.

And I haven’t changed my mind about this: “I think it’s necessary to have an Event that is simply data storage with the capacity to play. That’s in keeping with Collections in general. Collections in SC are generally for already-resolved data.” That doesn’t necessarily object to the idea of an Event alternative with the power to resolve pending calculations over composited data – but Events are central to the pattern-based sequencing workflow. I’d be extremely cautious about changing their fundamental nature (but less cautious about using object-oriented polymorphism to offer an extended alternate).


Yes, that was the later, fancier idea, which would keep a reference to the “old target” event:

	/* "record override" -- not very useful:
	<> { arg anEvent;
		^anEvent.copy.putAll(this)
	}
	*/

	<> { arg anEvent;
		this.proto_(anEvent);
		^this
	}

Both my proposals also happen to reverse the composition order, just like next in Event turns out to do compared with composeEvents; see the previous post towards the end, where I realized this later (and edited it in). But to repeat that observation here:

(zz: 1).next((zz: 2)) // -> ( 'zz': 1 )
(zz: 1).composeEvents((zz: 2)) // -> ( 'zz': 2 )
(zz: 1) ++ (zz: 2) // -> ( 'zz': 2 )

And (later discovery), so that next does not feel lonely, it also has a synonym: transformEvent:

(zz: 1).transformEvent((zz: 2)) // -> ( 'zz': 1 )

And this one wants to be a slightly more general interface, alas it’s documented on the page for nil!

(\zz -> 1).transformEvent((zz: 2)) // -> ( 'zz': 1 )
(_[\zz] = 1).transformEvent((zz: 2)) // -> ( 'zz': 1 )

And of some interest, this one actually modifies the target argument event, so it’s not actually identical to next. The comment for it in the code says

	// Pattern support
	transformEvent { arg event;

The association and function variants of this method also have this zero-copy semantics on the target event, i.e. they allow in-place modification.

But transformEvent is not actually being called in the classlib, so I think it was deprecated in favor of next, which is used in such (Pattern) contexts…

Obviously, a <> on Events should have the argument order of next, not of ++, because

(Pbind(\zz, 1) <> (zz: 2)).asStream.next(()) // -> ( 'zz': 1 )

For my own illumination, I’ve also tested these:

a = (zz: 1, od: 2)
b = (zz: 2, mo: 5)

// b gives defaults for a "with union" for uncommons
a.blend(b, 0) // -> ( 'zz': 1, 'od': 2, 'mo': 5 )
a.blend(b, 0) == (b ++ a) // true

// b overrides a "with union" for uncommons
a.blend(b, 1) // -> ( 'zz': 2, 'od': 2, 'mo': 5 )
a.blend(b, 1) == (a ++ b) // true

// b gives defaults for a "with intersection"; 
// i.e. uncommons dropped "both sides"
a.blend(b, 0, false) // -> ( 'zz': 1 )

// b overrides a "with intersection" ...
a.blend(b, 1, false)  // -> ( 'zz': 2 )

But it gets a bit more interesting, as the above are not all the “combos”

// b overrides a on common keys, 
// but a gets to keep just its own uncommons
// could be called "get news only if interested in topic(s)"
a ++ a.blend(b, 1, false) // -> ( 'od': 2, 'zz': 2 )

// the dual of the above, although I don't have a good name for it
// keep common stuff from a, uncommons from b
b ++ a.blend(b, 0, false) // -> ( 'mo': 5, 'zz': 1 )

There are also some boring combinations that yield either a or b back, e.g.

(a ++ a.blend(b, 0, false)) == a // true
(b ++ a.blend(b, 1, false)) == b // true

I suppose I could look through the paper in the related discussion again and find the more CS-ish names proposed for these, but I can’t be bothered right now. composeEvents (and ++) works exactly like “record override” as defined in that paper.

Actually, I did look even in the longer paper of Cardelli and Mitchell, but those combinations aren’t given names there, although they are obtainable by a so-called restriction applied before override. The restriction removes keys not found in a given set.

From a more practical perspective, there’s nothing too deep going on in the above. There are 4 set “combinations” of keys (union, intersection, and the two original sets of keys) deciding which key set goes into the result, combined with another decision bit of “who does the override” (or who “wins the conflict”) when setting the values on the common keys. Of those 8 combinations in total, 6 are non-trivial and two give back the respective starting dicts. Alternatively explained, there are 3 orthogonal “decision bits”: whether to keep keys that exist only in a, likewise for keys that exist only in b, and who gets to set the values on the common keys. (There’s actually a 4th “bit” if one wants to consider not keeping common keys at all, i.e. then there are 3 choices for the common stuff, not two. And that’s making the decision “a priori”, just based on keys, not values. Bracha and Lindstrom actually defined their “merge” to produce the common keys only if they agreed on values, and be undefined otherwise. Considering the values too makes the “decision space” larger still.)
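A quick sketch of that “restriction” operation in SC terms (~restrict is a hypothetical helper, just for illustration) – with it, the extra combos above fall out as restrict-then-override:

```supercollider
(
// keep only the entries of dict whose keys are in keySet
~restrict = { |dict, keySet|
	var out = dict.class.new;
	keySet.do { |k| if(dict[k].notNil) { out[k] = dict[k] } };
	out
};

a = (zz: 1, od: 2);
b = (zz: 2, mo: 5);

~restrict.value(b, a.keys);      // -> ( 'zz': 2 )
a ++ ~restrict.value(b, a.keys); // -> ( 'od': 2, 'zz': 2 ), i.e. restrict b to a's keys, then override
)
```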

To refocus this discussion on SC though, I expect the very basic constraint on a <> defined for Events to be that it produces the same result as Pchain, at least for non-function fields, i.e.

a = (foo: 2)
b = (foo: 1, bar: 0)
Pchain(a, b) == (foo: 2, bar: 0) == (a <> b)

which of course doesn’t happen at the moment because the rightmost expression is nil with the standard classlib.


e = ()
p = Pbind()

e <> p // should be non-nil and return a Pchain(e, p) at least.
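A minimal sketch of such a definition (hypothetical, untested against the full class library) covering both cases – Event on the right and Pattern on the right:

```supercollider
+ Event {
	<> { |that|
		^if(that.isKindOf(Pattern)) {
			Pchain(this, that)       // event filters/overrides the pattern's output
		} {
			that.copy.putAll(this)   // plain events: left side overrides, as in Pchain
		}
	}
}
```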

Well, some of that dur eval was easy to fix.

+ AbstractFunction {

	schedBundleArrayOnClock { |clock, bundleArray, lag = 0, server, latency|
		^this.value.schedBundleArrayOnClock(clock, bundleArray, lag, server, latency)
	}
}

Now playing an event with function dur directly works:

(dur: { 0.1 * rrand(1,4) }).play // ok now

But via Pbinds there are still issues due to delta

Pbind(\dur, { 0.1 * rrand(1, 4) }).play
// ERROR: Primitive '_Event_Delta' failed. Wrong type.

There’s some (optimized) C++ delta code that doesn’t like non-numeric types for dur:

	delta {
		_Event_Delta
	}

I’ve managed to fix that too with a fallback

+ Event {

	delta {
		_Event_Delta
		//" fallback!".postln;
		if(this.at('delta').notNil) {
			^this.at('delta').value
		} {
			^this.at('dur').value * this.at('stretch').value
		}
	}
}
Strangely, if I allocate a stack variable in there, so I don’t call this.at('delta') twice, it doesn’t work. I think the fallback from primitives doesn’t allow anything else to be added to the stack?!

Hm, there’s not a particularly good reason why Event:delta doesn’t read:

	delta {
		^if(this[\delta].notNil) {
			this[\delta].value
		} {
			value(this[\dur] * this[\stretch])
		}
	}

I still have some doubts. This is the kind of thing where, if the architecture to support the functionality isn’t designed from the ground up, eventually you will run into something that seems like it should work, but it doesn’t. The current approach here is very likely to push that boundary further into the distance, but not remove the boundary. (It’s impressive that it gets as far as it does without being designed.)

I still think we need a concept of an Event that is primarily for already resolved data. Again, I don’t object to an alternative to Event that does (thoroughly) support future calculations, but I continue to have reservations about doing this in Event itself.


Well, the thing is, Event already does a whole bunch of calculations, from degree to freq etc., all implemented by defining the non-terminals (e.g. freq) as functions in Event.default.

I agree that my “fancy” proposal to auto-link and resolve a chain of Events (e.g. via the proto field[s]) is probably pushing the system too far, making it too “Lisp-y”… And that would also be fairly unnecessary, since one can do that more explicitly via callbacks. (I still think a basic <> that does work with Patterns on the right should be added to Event though. I almost forgot about that; I’ll write a simple implementation soon-ish.)

That’s a/ in the default Event prototype, not in Event’s implementation; and b/ in a specific context, not as a general mechanism.

I’m totally ok with being inspired by the default event prototype’s built-in calculations to conceive of a new kind of behavior. I just don’t think it should go into Event itself.

