Shouldn't Events be (directly) composable?

Currently if you do:

// need a non-default synthdef for some obscure reasons (I'll get to that)
(SynthDef(\mysine2, {  |out=0, amp=1, gate=1, freq=555, pan=0| 
	var sing = EnvGen.ar(Env.asr(), gate, doneAction: 2) * SinOsc.ar(freq);
	OffsetOut.ar(out, Pan2.ar(sing, pan, amp));
}).add;)

((instrument: \mysine2, freq: Pkey(\freq) * 2) <> (freq: 300)).play; // silent

it does nothing, of course, because <> is not actually defined for Event; more annoyingly, since Event has know turned on, () <> () actually evaluates to nil. Prepending an empty Pbind, on the other hand, makes it work, obviously:

(Pbind() <> (instrument: \mysine2, freq: Pkey(\freq) * 2) <> (freq: 300)).play // sound

(because '<>' is defined for patterns "on the left", while you can have events on the right of <>). Pdef in fact relies on this to <> its envir.

More annoyingly, replacing the instrument with the default one, i.e.

(Pbind() <> (instrument: \default, freq: Pkey(\freq) * 2) <> (freq: 300)).play // silent

also does nothing even with a Pbind, and I don't know why. Also

(Pbind(\instrument, \default) <> (freq: Pkey(\freq) * 2) <> (freq: 300)).play // still silent
(Pbind(\instrument, \default, \freq, Pkey(\freq) * 2) <> (freq: 300)).play // sound

So, questions:

  • shouldn't Events be directly composable with <>? I don't see a lot of difference between an Event and a Pbind. Event could have asPbind as a method, for example, and thus define a <> for itself too. In the absence of this, if you accidentally chain two events (which you easily can with parentheses in the wrong place), you get no sound.

  • Why doesn't the \default instrument work in the chained Pbind example, unlike my custom instrument?

Pkey(\freq) is not valid in an event. Only Pbind or a variant will evaluate it.

Events should contain fully resolved values. They should not contain patterns. You get no sound here because an invalid frequency is being sent, I'm pretty sure (without testing; maybe I'm wrong).

It's the same as the difference between [Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)] and Ptuple([Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)]). You could say "I don't see a lot of difference between an array of patterns and a Ptuple" but at the end of the day, it's still the case that collection objects don't evaluate patterns contained within; only pattern streams do.

Perhaps a way to say it is that patterns are active while collections are passive, with regard to patterns inside them.
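
A quick way to see the difference (the Ptuple values are random, so they will differ):

// the array is just a container; asking it for a value hands back the patterns unevaluated
[Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)].next
// -> the same array of (unevaluated) patterns

// the pattern stream actively evaluates the patterns it contains
Ptuple([Pwhite(0.0, 0.5, inf), Pwhite(0.5, 1.0, inf)]).asStream.next
// -> e.g. [ 0.213, 0.874 ]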

It would be reasonable to define event1 <> event2 to return event2.copy.putAll(event1).
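
For example, (freq: 880) <> (freq: 300, dur: 0.2) would then be equivalent to:

(freq: 300, dur: 0.2).copy.putAll((freq: 880)) // -> an Event with 'freq': 880 and 'dur': 0.2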

Pbind's pairs are ordered while Event's are not. I'm guessing it would be fragile to convert an unordered set of pairs into an ordered one: it would work under some restrictions, but it's only a matter of time before someone writes (dur: Pwhite(0.1, 0.5, inf), freq: Pkey(\dur).linexp(0.1, 0.5, 100, 1000)).asPbind and gets annoyed that it doesn't work. So I would vote no on that.

hjh


Actually, you're quite right. I was fooled by the default frequency in my SynthDef being pretty close to the one I thought I was setting. With a synth change like

(SynthDef(\mysine2, {  |out=0, amp=1, gate=1, freq=200, pan=0| // ...

it becomes obvious the freq is not actually getting set in the chain...

Pbind's pairs are ordered while Event's are not.

Indeed, I overlooked that... since I've been dabbling with making my own Event subclass that does remember the user-set order of keys. Since Pdef actually chains its envir, you can use it in a somewhat more unorthodox way:

Pdef(\testE, Pbind())
Pdef(\testE).envir = Pbind(\freq, 200)
Pdef(\testE).play // ok, freq is set
Pdef(\testE).stop
Pdef(\testE, Pbind(\freq, Pkey(\freq)*2))
Pdef(\testE).play // multiplies the envir one
PdefGui(Pdef(\testE), 10) // not showing envir, of course

And it does play, but of course this has the downside that you can't set into it anymore, nor can you use PdefGui properly on it anymore (the envir won't show).

On the other hand, <> could still make some sense for Events (or even Environments), perhaps as an equivalent of .copy.putAll, but I haven't given it a lot of thought.

Actually a simple "record override" like copy.putAll is not a terribly useful notion of Event composition because there's no way to do any lookups in the "prior" environment, i.e. no way to get a Pkey equivalent.

So instead one can use proto chaining and a "recursive resolve":

+ Event {

	/* "record override" not very useful
	<> { arg anEvent;
		^anEvent.copy.putAll(this)
	}*/
	<> { arg anEvent;
		this.proto_(anEvent); ^this;
	}

	atRec { arg key;
		var val = this.at(key), env = this;
		while({ val.isKindOf(AbstractFunction) }, {
			val = env.proto.use { val.value };
			env = env.proto;
		});
		^val
	}
}

At least for some basic recursive lookup this seems to work, i.e.

e = (freq: {~freq * 2}) <> (freq: 200, bar: 24);
d = (freq: {~freq + ~bar}) <> e;

e.atRec(\freq) // 400
d.atRec(\freq) // 424

I'm guessing there is a way to break this since the ASTs that BinaryOpFunction builds aren't as comprehensive as Patterns.

Also, there's no notion of sequencing here, just composition, and one needs to decide when to resolve the values, i.e. before playing:

+ Event {
	resolve {
		var ev = this.copy;
		this.keysDo { |k| ev[k] = this.atRec(k) };
		^ev
	}
}

e.resolve // ( 'freq': 400 )
d.resolve // ( 'freq': 424 )

I found out where this approach breaks, namely

~freq = 11
e = (freq: r { loop { yield (~freq * 2) } }) <> (freq: 200, bar: 24)
e.resolve // ( 'freq': 22 )

Routines are AbstractFunction too, but they don't look up in the use environment, presumably because a Routine remembers and reinstates its own environment when it runs, i.e.

~foo = 12
f = {~foo + 2}
(foo: 44).use { f.value } // -> 46
// but
r = r { loop { yield (~foo + 2) } }
(foo: 44).use { r.value } // -> 14

I'm not seeing a way to fix this right now. Adding extra nesting like yield ({~freq} * 2) has the problem of decoupling the proto advance/recursion from the value recursion. In Patterns you don't really have this problem because the association is not externally held, i.e. each pattern holds enumerable refs to those patterns it depends on for data.

Also, generally

f = {42}
g = f + 3
g.value // -> 45; In contrast
h = r { loop { (f + 3).yield } } // needs a double "resolve"
h.value // -> a BinaryOpFunction
h.value.value // 45

Now I get to appreciate why embedInStream "recurses" as a function but "returns values" with yield, so it can pop that value out of an arbitrary number of nestings that are not known in advance... And yeah, there's a way to fix this, but it's not quite transparent to the routine writer, i.e.

f = {42}
h = r { loop { (f + 3).value.yield } } 
h.value // 45

The problem with applying this to the Event chaining here is that the Routine needs to do the environment change, and it can't access the environment in my setup.

This idea, that events can contain arbitrary future calculations and compose them arbitrarily, is interesting, but I would suggest implementing it in something other than Event.

I think it's necessary to have an Event that is simply data storage with the capacity to play. That's in keeping with Collections in general. Collections in SC are generally for already-resolved data.

So your very first example on this topic,

((instrument: \mysine2, freq: Pkey(\freq) * 2) <> (freq: 300)).play;

is already outside of what Events are designed to do. It wasn't correct in the first place to expect Pkey(\freq) to resolve.

So then "Actually a simple 'record override' like copy.putAll is not a terribly useful notion of Event composition because there's no way to do any lookups in the 'prior' environment, i.e. no way to get a Pkey equivalent" also becomes moot, because Events don't do Pkey.

It may be valuable to have some object like an Event that does do Pkey, and if there are other changes in the class library that are needed to make this other object interchangeable with Event, I'd be open to that.

But I'm highly skeptical of changing the basic nature of Event. It is now a container for the results of calculations. You're proposing to change it into a container for future calculations. That is a massive change to the fundamental assumptions behind Events (or even behind collections in general; then, why not a function-composable array?), with a non-trivial risk of breaking existing user code. And object-oriented modeling already has a solution for that: don't change the base class, add something else that uses the base class.

hjh

I haven't had much time to play with this today, but as a quick note: after the "obvious hack" on Routine

+ Routine {
	envir {	^environment }
	envir_ { arg env; environment = env; ^this }

	valueInEnvir { |env ... args|
		var oldenv = this.envir, val;
		this.envir = env;
		val = this.value(args);
		this.envir = oldenv;
		^val
	}
}

I can now do this in + Event:

	atRec { arg key;
		var val = this.at(key), env = this;
		while ({val.isKindOf(AbstractFunction)}, {
			if(val.isKindOf(Routine),
				{ val = val.valueInEnvir(env.proto) },
				{ val = env.proto.use { val.value } });
			env = env.proto;
		});
		^val
	}

And on a quick check this works now as expected:

e = (freq: r { yield (~freq * 2) }) <> (freq: 200, bar: 24)
e.resolve // -> ( 'freq': 400 )

So (horrors), I could basically even turn Events into Patterns, as they can "pull" from streams now.

I can obviously extend valueInEnvir to the Function-based stuff too, simply so the interface is prettier (basically it's equivalent to f.inEnvir(e).value, but it turns out I need to touch more than Function because of the BinaryOp business).

On the other hand, I've indeed been thinking along the lines you've mentioned, that a "mini-container" might be better than having atRec directly resolve (i.e. apply) functions/routines, simply because of compatibility issues with things like Pmono, which stick callbacks (i.e. functions) into Events, and I probably don't want to be force-evaluating those (on resolve)... Cleanups are a similar issue, but those should probably be handled differently anyway; I'm still evaluating what the best fixup for that is. It's actually one of the reasons I've been toying with "enriched Events" like in this experiment.

I strongly suggest using protect here. If the operation throws an error, a protect handler will still restore the original environment; without it, an error would leave the routine in a corrupted state. I'd demonstrated this in the other thread.
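
Roughly like this, reusing the envir accessors from the extension above (a sketch, untested):

+ Routine {
	valueInEnvir { |env ... args|
		var oldenv = this.envir;
		this.envir = env;
		// protect restores the old environment even if the routine throws
		^{ this.value(args) }.protect { this.envir = oldenv }
	}
}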

I was trying to express that, no matter how cool it is to have streams and functions resolve automagically in an Event (or Event-like structure), and this is a brilliant idea actually, we still need to have a concept of Event as a container for results of calculations done outside of the Event. That is, your concept need not (even should not) supersede what Event already is.

A similar thing comes up with SynthDef sometimes: a user wonders why it doesn't handle graph topologies that adapt to argument values, or x or y or z. My position on that is that we need a concept of SynthDef matching the server's restrictions. Having a "physical" SynthDef doesn't preclude other structures that use SynthDefs to do things that SynthDef by itself can't do. But often, people see SynthDef everywhere in the documentation and assume that SynthDef is supposed to be responsible for everything. I don't think so. SynthDef fulfills a necessary role, as it is. Event fulfills a necessary role, as it is.

I suppose we have to agree to disagree about that, but I think I'm on pretty solid ground. Part of the point of Design Patterns is to reduce the risk of breakage by choosing development strategies other than fundamentally altering base classes for every new requirement. I can't help but note that the thrust of this thread has been "how to make Event do what I want" rather than "how to create a new superstructure that does what I want." The former is invasive and risky while the latter is not, so I think the latter should be preferred as a default position.

hjh

Let's not overstate how much surgery I've done here on Events. I've just reinterpreted their data... on demand. Those methods I've added can be entirely in user functions.

If you wonder how this is related to (fixing) cleanups: imagine that instead of cleanup functions we had cleanup routines, implemented with the obvious logic "do the cleanup once, then yield nil". It would not matter how many times such a routine-based cleanup got called; it is intrinsically idempotent.
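
A minimal sketch of that logic:

// run the cleanup body once; every later call just returns nil
c = Routine { "doing the cleanup".postln; nil.yield };
c.next; // posts "doing the cleanup"
c.next; // nil, nothing happens
c.next; // nil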

There's a way to obtain the exact same behavior with functions, of course, but it needs a bit more packing, i.e. the actual cleanup function has to be the return value of another function (closure) like

{ |userfun| var done = false; { if(done.not) { done = true; userfun.value } } }
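
For example (with ~once and ~cleanup as made-up names, just to show the wrapping):

~once = { |userfun| var done = false; { if(done.not) { done = true; userfun.value } } };
~cleanup = ~once.value { "doing the cleanup".postln };
~cleanup.value; // posts once
~cleanup.value; // further calls do nothing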

The other thing that made me ponder more explicit event composition is that, conceptually, one can think of events that carry a cleanup-function reference (which they do presently anyway, in the addToCleanup field) as events obtained by composing with a cleanup-generating event. So one approach could be to consider "sticking idempotent cleanups" just in events, and not have pattern streams keep any local copies of such cleanup structures. It's probably obvious, but I'll say it anyway: with this approach a cleanup generator/source (e.g. Pfset) would always send its cleanup by "composing it" into the Event, not just on the first event it generates (as happens right now). Conceptually, all events (not just the first one) generated by a Pfset (with a non-nil cleanup) carry a cleanup promise. So, yeah, I am considering making this conceptual idea the actual implementation of cleanups.

And further downstream (again in the dataflow perspective), you compose whatever cleanups you receive from upstream (i.e. typically from subpatterns' streams) with your own cleanups, if you generate any. (I said "typically" because e.g. a Pchain's event dataflow graph differs from its sub-pattern graph. A Pchain looks like the root of a "flat" tree from the pattern perspective (i.e. the abstract syntax tree perspective), but it actually "pipes data" between its children, so from a dataflow perspective, a Pchain creates a linear chain of streams among its children, with the Pchain itself at the most downstream point.)

I've just reinterpreted their data... on demand. Those methods I've added can be entirely in user functions.

Ok.

I guess I should explain a bit why I'm being cautious about it. Scztt has mentioned "technical debt." One of the ways SC has incurred technical debt is by a developer saying "Wouldn't it be cool if...?" and then committing an implementation, without much review or discussion, which worked most of the time but failed in some cases (e.g. Pfset vs Pchain, or my own failed design, the previous version of Rest, which you didn't see). Then someone comes into the project and starts noticing, and becoming annoyed with, gaps or inconsistencies.

The cause of those gaps and inconsistencies was originally moving too quickly... and I learned the hard way in the past that a clever idea to sidestep a problem needs extra scrutiny. I'm now trying to play that scrutinizing role; bear with me.

If you wonder how this is related to (fixing) cleanups...

I hadn't. This appears/appeared to be a separate topic... not anymore, but initially.

Now that I might understand a bit more...

So one approach could be to consider "sticking idempotent cleanups" just in events, and not have pattern streams keep any local copies of such cleanup structures. ... with this approach a cleanup generator/source (e.g. Pfset) would always [put] its cleanup into every Event, not just on the first event it generates

That's a good idea (with a couple of tweaked words); a very good idea.

I do have a question, though: Why do you feel that event composition is the best way to do this?

Event composition, in a general sense, is something that a user could do for any purpose. Cleanup is a specific purpose. A/ If events are composed as a calculation strategy, then the composed-in calculations have to be resolved before the event takes action (i.e. before or during .play), but composed-in cleanups should not resolve at this time. How do you tell the difference? B/ Composed-in cleanups should resolve at .stop time, but other composed-in operations should not. (Sorry if you've answered this already; I don't see it.)

I'm wondering if it would be simpler/clearer to take the idea of "cleanups stored in every event," but just use an array under a specific key. Currently we use arrays in specific keys to represent cleanup deltas, but an array in a specific key could just as easily represent the total of all cleanups that have to be done. So, e.g., Pgroup would stash its cleanup into that array before passing the event down, and children could add more cleanups before yielding.
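
Something like this, say (a sketch; \allCleanups is a made-up key name, not an existing classlib key):

e = (freq: 440);

// an upstream pattern (e.g. Pgroup) stashes its cleanup before passing the event down
e[\allCleanups] = (e[\allCleanups] ? []) ++ [{ "free the group".postln }];

// a child adds its own cleanup before yielding
e[\allCleanups] = (e[\allCleanups] ? []) ++ [{ "release the bus".postln }];

// at stop time, run the accumulated total
e[\allCleanups].do(_.value);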

I suppose a cleanup event could mark itself as such with some flag, but this "bear of very little brain" doesn't, at this stage, grasp why a complex, nested, multi-layer Event structure is preferable to a simple array. One rule of thumb is to prefer the simplest implementation that gets the job done; so what is the job that event composition does that an array of cleanups doesn't handle?

I hope I'm not stepping on toes (anymore); I think this is an interesting way to simplify cleanups, but I'm not sure I'm seeing the whole justification for event composition here.

There's a way to obtain the exact same behavior with functions, of course, but it needs a bit more packing, i.e. the actual cleanup function has to be the return value of another function...

I just realized: Thunk is already an idempotent function in SC. So you don't need an extra layer of function wrapping, nor a routine (so the funny business of hacking Routine's environment is not necessary at all, actually). Just Thunk { ... cleanup ... }.
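
Quick check (the return value is arbitrary, just to make the caching visible):

t = Thunk { "doing the cleanup".postln; 42 };
t.value; // posts "doing the cleanup", returns 42
t.value; // returns the cached 42; the cleanup does not run again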

hjh

I should add here that PLbindef from misc lib implements this sort of "duality" between events/environments and Pbinds. Quoting an example from its help page:

(
PLbindef(\x,
    \instrument, \sin_grain,
    \dur, 0.2,
    \midinote, Pwhite(60, 90)
)
)

// PLbindefEnvironment has been made and assigned to the variable ~x in currentEnvironment, check
~x

// now the PLbindefEnvironment can also be treated as a player
~x.play


// set params while playing
~x.att = Pwhite(0.01, 0.2)

Implementation-wise

PLbindefEnvironment : Environment { var <name, <plbindef; //...

PLbindef : Pbindef { var <sourceEnvir, <refEnvir; //...

where sourceEnvir is a PLbindefEnvironment.

PLbindefEnvironment is actually affected by the bug/feature/optimization discussed here if one were to actually access those sub-fields via use (i.e. ~x.use { ~att = ... }). But the help example[s] don't do that; instead they "save typing" by using one-letter environment variables for the PLbindefEnvironments... (which of course isn't saving typing compared to use if there are more than a couple of fields to set).

As far as I can tell, there's no direct composition implemented for PLbindefEnvironments; one has to do it via the PLbindefs.

As I discovered with more SC experimenting, that actually already works to a good extent in Event... but only on play, so you don't get a "preview" of what's going to happen with a mere next or some similar non-playing method.

(degree: {rrand(1,8)}, dur: 0.2).play // hit a few times

// And even with "outboard" streams
r = (1..4).iter
(degree: r, dur: 0.2).play // hit a few times

That line actually ends up making 'freq' a BinaryOpFunction, because the degree function we gave gets plugged into a formula.

It's a bit obscure precisely where the function call happens, but basically everything sent to the server as synth args gets asControlInput called on it, and for the whole AbstractFunction class tree that method calls value, e.g.:

{3}.asControlInput // -> 3

Unfortunately dur is trouble when changed/set via a function:

(degree: 5, dur: {0.2}).play
// ERROR: Message 'schedBundleArrayOnClock' not understood.
// RECEIVER: a BinaryOpFunction

Also, regarding my original topic:

There's a pre-defined method called

composeEvents { arg event; ^this.copy.putAll(event) }

but oddly it's defined in Environment, although (of course) Event inherits it. Furthermore, Event does define next as having precisely those semantics. The documentation for Event even phrases that as composition:

.next(inval)
Combines an event given in the argument with the current event. This is used to enable events to be composed.

(a: 6, b: 7).next((c: 100));

And unsurprisingly the implementation of Event.next actually calls composeEvents.

So, at this point it looks to me like the missing <> definition in Event is more of an oversight than something deliberate...

I'm actually still a bit miffed as to why composeEvents was even defined separately, when Dictionary (from which Environment inherits) has a method/operator that does exactly the same thing, albeit called ++:

++ { arg dict; ^this.copy.putAll(dict) }

Perhaps someone found the idea that ++ does replacement/override objectionable, so they used a different name later on in the Environment subclass... <> would be a better name, given the (replacement/override) semantics used.

(foo: 1) ++ (foo: 5, bar: 2)  // -> ( 'bar': 2, 'foo': 5 )

But, for an extra layer of confusion, next and composeEvents are actually "the other way around", i.e. the composition order is reversed in next:

(zz: 1).next((zz: 2)) // -> ( 'zz': 1 )
(zz: 1).composeEvents((zz: 2)) // -> ( 'zz': 2 )
(zz: 1) ++ (zz: 2) // -> ( 'zz': 2 )

because Event.next actually does:

next { arg inval; ^composeEvents(inval, this) }

Also, the composition of Events by existing classlib methods, while not auto-composing functions inside Events, does allow some level of "manual composition":

(degree: {~woot.value}, dur: 0.2).next((woot: 19)).play
(degree: {~woot.value}, dur: 0.2).next((woot: {rrand(1,8)})).play
// But just the following doesn't work
(degree: {~woot}, dur: 0.2).next((woot: {rrand(1,8)})).play

That's one of the reasons why I'm skeptical of it as a general approach. Currently it works for synth arguments largely accidentally (I'm fairly sure it wouldn't have worked before asControlInput) and it isn't defined with any consistency outside of that context. (I'm not even certain it will work consistently within that context; it's probably not too long a trek before you run into some edge cases.)

Making it work consistently would be a "nice to have" but I'm not sure it's worth as much developer effort as, say, cleaning up the inconsistencies in unit generator initial samples. If I were a "program manager" making decisions about where to allocate developer hours, I'd weight the latter much more heavily. (But SC is a do-ocracy, so if you're interested, that issue is yours for the taking.)

The initial idea here was quite different: in composeEvents, one event overrides the other's values, but you were proposing a composite event made of composites of the data entries. It's an interesting idea but it needs to be designed from the ground up; if the approach is "well, Event already almost does it," there will be mistakes in the interface eventually.

And I haven't changed my mind about this: "I think it's necessary to have an Event that is simply data storage with the capacity to play. That's in keeping with Collections in general. Collections in SC are generally for already-resolved data." That isn't necessarily an objection to the idea of an Event alternative with the power to resolve pending calculations over composited data, but Events are central to the pattern-based sequencing workflow. I'd be extremely cautious about changing their fundamental nature (but less cautious about using object-oriented polymorphism to offer an extended alternate).

hjh

Yes, that was the later, fancier idea, which would keep a reference to the "old target" event:

	/* "record override" not very useful
	<> { arg anEvent;
		^anEvent.copy.putAll(this)
	}*/
	<> { arg anEvent;
		this.proto_(anEvent); ^this;
	}

Both my proposals also happen to reverse the composition order, just like next in Event turns out to do compared with composeEvents; see the end of my previous post, where I realized this (and edited it in). But to repeat that observation here:

(zz: 1).next((zz: 2)) // -> ( 'zz': 1 )
(zz: 1).composeEvents((zz: 2)) // -> ( 'zz': 2 )
(zz: 1) ++ (zz: 2) // -> ( 'zz': 2 )

And (a later discovery), so that next does not feel lonely, it also has a synonym, transformEvent:

(zz: 1).transformEvent((zz: 2)) // -> ( 'zz': 1 )

And this one wants to be a slightly more general interface; alas, it's documented on the page for nil!

(\zz -> 1).transformEvent((zz: 2)) // -> ( 'zz': 1 )
(_[\zz] = 1).transformEvent((zz: 2)) // -> ( 'zz': 1 )

And of some interest, this one actually modifies the target argument event, so it's not actually identical to next. The comment for it in the code says:

	// Pattern support
	transformEvent { arg event;
		^event.putAll(this);
	}

The association and function variants of this method also have these zero-copy semantics on the target event, i.e. they allow in-place modification.

But transformEvent is not actually called anywhere in the classlib, so I think it was superseded by next, which is what gets used in such (Pattern) contexts...

Obviously, a <> on Events should use the argument order of next, not that of ++, because

(Pbind(\zz, 1) <> (zz: 2)).iter.next // -> ( 'zz': 1 )

For my own illumination, I've also tested these:

a = (zz: 1, od: 2)
b = (zz: 2, mo: 5)

// b gives defaults for a "with union" for uncommons
a.blend(b, 0) // -> ( 'zz': 1, 'od': 2, 'mo': 5 )
a.blend(b, 0) == a.next(b) // true

// b overrides a "with union" for uncommons
a.blend(b, 1) // -> ( 'zz': 2, 'od': 2, 'mo': 5 )
a.blend(b, 1) == (a ++ b) // true

// b gives defaults for a "with intersection"; 
// i.e. uncommons dropped "both sides"
a.blend(b, 0, false) // -> ( 'zz': 1 )

// b overrides a "with intersection" ...
a.blend(b, 1, false)  // -> ( 'zz': 2 )

But it gets a bit more interesting, as the above are not all the "combos":

// b overrides a on common keys, 
// but a gets to keep just its own uncommons
// could be called "get news only if interested in topic(s)"
a ++ a.blend(b, 1, false) // -> ( 'od': 2, 'zz': 2 )

// the dual of the above, although I don't have a good name for it
// keep common stuff from a, uncommons from b
b ++ a.blend(b, 0, false) // -> ( 'mo': 5, 'zz': 1 )

There are also some boring combinations that yield either a or b back, e.g.

a ++  (a.blend(b, 0, false)) == a
a.next(a.blend(b, 0, false)) == a

I suppose I could look through the paper in the related discussion again and find the proper CS names proposed for these, but I can't be bothered right now. composeEvents (and ++) does work exactly like "record override" as defined in that paper.

Actually, I did even look in the longer paper by Cardelli and Mitchell, but those combinations aren't given names there either, although they are obtainable by a so-called restriction applied before override. The restriction removes keys not found in a given set.

From a more practical perspective, there's nothing too deep going on in the above. There are 4 set "combinations" of keys (union, intersection, and the two original key sets) deciding which key set goes into the result, combined with another decision bit of "who does the override" (or who "wins the conflict") setting the values on the common keys. Of those 8 combinations in total, 6 are non-trivial, but two give back the respective starting dicts. Alternatively explained, there are 3 orthogonal "decision bits": whether to keep keys that exist only in a, likewise for keys that exist only in b, and who gets to set the values on the common keys. (There's actually a 4th "bit" if one wants to consider not keeping the common keys at all, but I haven't considered that above; then there are 3 choices for the common stuff, not two. And that's making the decision "a priori", just based on keys, not values. Bracha and Lindstrom actually defined their "merge" to produce the common keys only if they agreed on values, and to be undefined otherwise. Considering the values too makes the "decision space" larger still.)
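
For the record, a little sketch of those decision bits as a plain function (~combine is a made-up name, not a classlib method):

// keepOnlyA / keepOnlyB decide whether keys unique to a / b survive;
// bWins decides who sets the values on the common keys
~combine = { |a, b, keepOnlyA = true, keepOnlyB = true, bWins = true|
	var out = ();
	a.keysValuesDo { |k, v|
		if(b.includesKey(k)) {
			out[k] = if(bWins) { b[k] } { v };
		} {
			if(keepOnlyA) { out[k] = v };
		};
	};
	b.keysValuesDo { |k, v|
		if(keepOnlyB and: { a.includesKey(k).not }) { out[k] = v };
	};
	out
};

~combine.((zz: 1, od: 2), (zz: 2, mo: 5))               // same contents as a ++ b above
~combine.((zz: 1, od: 2), (zz: 2, mo: 5), bWins: false) // same contents as a.next(b) above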


To refocus this discussion on SC though: I'd expect the very basic constraint that a <> defined for Events produces the same result as Pchain, at least for non-function fields, i.e.

a = (foo: 2)
b = (foo: 1, bar: 0)
Pchain(a, b).iter.next == (foo: 2, bar: 0) // true
(a <> b) == (foo: 2, bar: 0)               // should also hold

which of course doesn't happen at the moment, because a <> b is nil with the standard classlib.

Also

e = ()
p = Pbind()

e <> p // should be non-nil and return a Pchain(e, p) at least.
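
A minimal sketch of such a <> (assuming we want the left-hand Event's values to win, matching Pchain, and deferring to Pchain whenever the right-hand side is a Pattern):

+ Event {
	<> { arg that;
		^if(that.isKindOf(Pattern)) {
			Pchain(this, that)     // Event <> Pattern: defer to Pchain
		} {
			that.copy.putAll(this) // Event <> Event: left-hand keys override
		}
	}
}

(foo: 2) <> (foo: 1, bar: 0) // -> an Event with 'foo': 2 and 'bar': 0
() <> Pbind()                // -> a Pchain

This satisfies both expectations above, though it makes no attempt at the function/stream resolution discussed earlier.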

Well, some of that dur eval was easy to fix.

+ AbstractFunction {

	schedBundleArrayOnClock { |clock, bundleArray, lag = 0, server, latency|

		^this.value.schedBundleArrayOnClock(clock, bundleArray, lag, server, latency)
	}
}

Now playing an event with a function dur directly works:

(dur: { 0.1 * rrand(1,4) }).play // ok now

But via Pbinds there are still issues due to delta

Pbind(\dur, { 0.1 * rrand(1, 4) }).play
// ERROR: Primitive '_Event_Delta' failed. Wrong type.

There's some (optimized) C++ delta code that doesn't like non-numeric types for dur...

	delta {
		_Event_Delta
		^this.primitiveFailed;
    }

I've managed to fix that too, with a fallback:

+ Event {

	delta {
		_Event_Delta
		//"Event.delta fallback!".postln;
		if (this.at('delta').notNil) {
			^this.at('delta').value
		} {
			^this.at('dur').value * this.at('stretch').value
		}
	}
}

Strangely, if I allocate a local variable in there so I don't call this.at('delta') twice, it doesn't work. I think the fallback from primitives doesn't allow anything else to be added to the stack?!

Hm, there's not a particularly good reason why Event:delta doesn't read:

	delta {
		_Event_Delta
		^if(this[\delta].notNil) {
			this[\delta].value
		} {
			value(this[\dur] * this[\stretch])
		}
	}

I still have some doubts. This is the kind of thing where, if the architecture to support the functionality isn't designed from the ground up, eventually you will run into something that seems like it should work, but it doesn't. The current approach here is very likely to push that boundary further into the distance, but not remove the boundary. (It's impressive that it gets as far as it does without being designed.)

I still think we need a concept of an Event that is primarily for already-resolved data. Again, I don't object to an alternative to Event that does (thoroughly) support future calculations, but I continue to have reservations about doing this in Event itself.

hjh

Well, the thing is, Event already does a whole bunch of calculations, from degree to freq etc., all implemented by defining the non-terminals (e.g. freq) as functions in Event.default.

I agree that my "fancy" proposal to auto-link and resolve a chain of Events (e.g. via the proto field[s]) is probably pushing the system too far, making it too "Lisp-y"... And it would also be fairly unnecessary, since one can do that more explicitly via callbacks. (I still think a basic <> that does work with Patterns on the right should be added to Event though. I almost forgot about that; I'll write a simple implementation soon-ish.)

a/ in the default Event prototype, not in Event's implementation; and b/ in a specific context, not as a general mechanism.

I'm totally ok with being inspired by the default event prototype's built-in calculations to conceive of a new kind of behavior. I just don't think it should go into Event itself.

hjh
