Downsizing the Class Library

There’s a long history of discussion around splitting, refactoring, or otherwise downsizing the core class library.

There has been enough recent discussion on the Class Library Development Group thread that it deserves its own thread. So, I’ll do what I can to port the relevant discussion from that thread (about 15 posts). Hopefully this improves continuity and makes the discussion easier to follow here.

Suggestions are welcome for alternative approaches to corral the discussion (and alternative names for the topic).

Feel free to (and please do!) copy in bits of discussion from other threads or past discussions on lists, etc.

NOTE: this is related to, and likely dependent on, an improved Quark/Package Manager system, but that discussion has its own thread: Quark Versioning / Dependency Management.

3 Likes

From the Class Library Development Group thread:

@rdd echoed a longstanding question, “Is the ‘tiny core with many libraries’ approach worth considering?”, and invited comments.

@josh mentions that it’s likely necessary to have a robust Quark system with a dependency manager before refactoring the classlib.

@shiihs clarifies the distinction between quarks and “libraries”, and points toward (the lack of) namespaces to manage collisions with third-party code.

@jamshark70 compares this to Pd’s concept of a vanilla library, with the concern that you shouldn’t need to install externals to get a decent-sounding set of basic filters.

@jordan: +1 for a smaller library,

and later brings up the complicated issue of how we handle (and have historically handled) breaking changes to the library.

@muellmusik brings some historical context and raises the important point: it can only happen if one or a few core people really commit to a coherent plan.

@jordan: SuperCollider does not have a non-user space because there are no private methods. … Breaking everything and implementing access specifiers might allow us to never break anything again… that might be worth it.

@muellmusik: while there’s no namespace, there are conventions like the “pr” prefix for private methods; “the case for splitting things should consider the specifics carefully and where things are really causing issues”; also the JITLib case, and a library/quark that provides the split-off libs for easy transition/recovery.
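
For readers unfamiliar with the convention: “private” methods are marked only by a pr name prefix, which the language does not enforce. A minimal illustration (SomeClass and its methods are hypothetical):

SomeClass {
    compute { ^this.prCompute * 2 }
    prCompute { ^5 }   // "private" by convention only
}

SomeClass().compute;     // -> 10
SomeClass().prCompute;   // -> 5: nothing stops external callers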

The discussion of “public/private” methods continues for a few posts (whether private methods should be more strictly enforced/designed); see the original thread for those.

@jamshark70: arriving at a “standard” way to do things (in terms of documentation, examples, user support) might also suggest what to trim and what not to. (There’s a chance I mixed up the context here.)

The ClassLib Dev Group meeting touched on this topic:

JITLib surfaces in multiple discussions as a candidate to be Quark’d:

@julian is in favour of integrating the functionality into the common library, not of moving it out.

@muellmusik sees potential in integrating it (the way buses and node ordering are largely abstracted away) … If that could be abstracted into a middle layer that was clearly part of core…

@Dionysis would like JITLib to stay in the core

@dsheiba comments on good and bad models in other languages and the role of test coverage in a large library base, also noting that mono-repos have the advantage of no version clashes, easy CI/CD, easy testing, and easy update procedures,

which is reiterated here, adding also that git can be an impediment to new users/students.

A smaller class library has also shown up on the SC4 Wishlist.

Thinking through concrete examples of what refactoring looks like is useful and necessary; e.g., @jordan has raised an example on the repo’s Discussions board: Remove Object.as* method to make refactoring easier (#6065).

3 Likes

As I read back over these quotes, it seems to me that the risk of making things worse is fairly high, if splitting the class library is undertaken for the sake of splitting it. I think there needs to be some guiding principle, 1/ to organize the effort, and 2/ to provide a reason by which to explain the changes.

Standardizing common operations would be one such principle: For requirement x, 1/ what should be the standard way? 2/ Does that standard way make the cut to be core, or would it be better as an extension? 3/ Then, some currently available ways will be relegated to “nonstandard” status – do they move into quarks? Who maintains the quarks?

E.g., async operations. Perhaps the new standard way would be something based on promises, as Jordan has proposed. Or perhaps it’s CondVar. Then, core or not? (I think core.) And… s.sync, Condition, etc… go to a BCThreadSync quark (BC = backward compatibility)… and help is updated to remove s.sync (and users will get annoyed when third-party tutorials use s.sync and it doesn’t work anymore – not that this is a sufficient reason to not do it, just highlighting that there needs to be a reason).
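
To make the “standard way” contrast concrete, here is a minimal sketch of the two idioms (assuming the default server s is booted; the sound-file path is just the bundled example):

(
fork {
    var cond = CondVar(), done = false;
    s.sync;   // older idiom: Routine-based waiting built on Condition
    Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav",
        action: { done = true; cond.signalOne });
    cond.wait { done };   // newer idiom: wait on a CondVar predicate
    "buffer ready".postln;
};
)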

The long-term goal would be to reduce the surface area that “developers” are responsible for maintaining (acknowledging that a split between developers and users is not ideal from a community perspective) – to reduce the risk of SC dying under an ever-increasing maintenance burden. That’s inevitably going to be disruptive to user code – but less disruptive than the software ceasing to exist (as happened to pd-extended – well, pd-extended was “reborn” as Purr Data, but not under the main Pd team).

Rambled a bit… anyway, I’m in favor of doing the homework first: pick 3-5 topics for cleanup, review existing methods, make some decisions. See how that goes, then pick a few more topics, and so on.

hjh

2 Likes

Two major concerns have been raised:

  1. breaking backwards compatibility
  2. trimming the core so much that the default install is lackluster/less powerful from a new user’s perspective

@scztt speaks to those points elsewhere:

I honestly was surprised when I first read that some people wanted to trim the core.
So, a few questions and remarks about this:


Establishing the reason why we’d like to reduce the core is important. What is it?

Is it to ease the devs’ work? To get better technical performance? To have more readable documentation? Devs’ OCD about optimisation :stuck_out_tongue:?

EDIT: I’m sorry for this reference to OCD, and would like to apologize to anyone who might have been hurt by it. This wasn’t what I intended. I recognize OCD as a serious condition, and by no means would I make fun of anybody suffering from it.


How do we include new users’ perspectives in our discussions?

Most of the people I see discussing these issues are ‘power users’. The students I occasionally teach SuperCollider to often have trouble getting into it, and they’d never subscribe to this forum. Still, I’d say that changes made to SC should be made in their favor first, not in favor of people who already know how to hack this or that but would like an easier way to do something (in case a change would create a conflict between power users and beginners).


Directly related to the previous point: why is it a bad thing to have too many ways to do something?

SC was the first ‘complicated’ language I learnt (compared to Sonic Pi or Processing, which have rather small libraries). So I’ve done things ‘wrong’ because there was a suboptimal way to do what I was doing. Several times, I realised I was going in the wrong direction and had to rewrite everything from scratch. But I’ve done things. The suboptimal approach suited my knowledge at that moment in time, and got me to design what I had in mind. I think this is a good thing.


What does vanilla mean?

At the restaurant, you order a ‘Mc Deluxe Bacon 3000’, but in the kitchen, they prepare a ‘burger’. I think we can split the core with the package manager and have a complicated way to load modules and everything, but when you download SC for the first time, you shouldn’t have to be aware that you’re in fact downloading a puzzle of software. It’s called ‘SuperCollider 3.14’, it contains a lot of modules you’re not going to use, but if you try using them, they’re ready. And if you want to use only this or that, you can later uninstall or restrict the rest. So you download ‘extended’ by default, and if needed, you can get back to ‘vanilla’?


One good practical example: QtCollider

I use this extensively. I love the synthesis part of SC, but when it comes to real-time music, my brain is too small and my fingers too big to do proper livecoding. So I click things. I think QtCollider is obviously outside the core (sclang).

And still, it’s a desirable feature for art students, projects with graphic designers, creation of ‘easy-to-use’ tools, etc. I think this is out of the ‘core’, but present in the default install?

Funnily enough, QtCollider relies on primitives. Some pure sclang classes might be kept in core while ‘lower’ functionalities are kicked out. What a headache!

I’m currently working on a Quark that proposes new Views. Everything inherits from UserView. That means it will take this form: ‘Quark depends on QtCollider depends on Core’. Do we already have multi-layered dependencies, or do we only have ‘Quark depends on Core’ for now?
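
For what it’s worth, quark-to-quark dependencies are already supported: a quark’s .quark metadata file can declare other quarks it depends on, so multi-layer chains do exist (the names below are made up):

// hypothetical MyViews.quark metadata file
(
    name: "MyViews",
    summary: "custom views built on UserView",
    version: "0.1.0",
    dependencies: [ "SomeOtherQuark" ]
)

A dependency on QtCollider itself can’t be declared this way, though, since it ships with the main install rather than as a quark.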

2 Likes

You cannot put things that rely on C++ code outside the main project without a massive investment of dev time. There is no mechanism for “language plugins” at the moment.

3 Likes

Some of this is due to outdated (and occasionally bad) design. It is not possible to fix these problems, as doing so would break backwards compatibility. By splitting things into removable chunks, the bad ones can be removed and new, easier-to-use pieces put in their place. In my opinion, being able to remove and improve the old is the main benefit of a quark-based approach.

1 Like

I agree with many of @Dindoleon’s points.
I also have some teaching (and of course learning) experience with students of various majors using the following software programs:

  • SuperCollider, Max (pd, csound, editing etc)
  • Audacity, Cockos Reaper, Audition, Ableton Live, Magix Samplitude Silver, (Logic Pro, Nuendo, ProTools)
  • Paul’s Extreme Stretch
  • MakeMusic Finale, Avid Sibelius
  • etc.

Students of computer-science-related subjects or those with programming experience learn SuperCollider with less of a learning curve, although some of them, who learn SuperCollider very quickly and write well-structured code, do not really understand the acoustic, compositional and musical basics behind the code. I think this shows that SuperCollider is somewhat biased towards the technical aspect, especially programming skills.

Musicians, including composers, generally find SuperCollider more difficult to learn than Max or Pd. Musicians are usually used to GUI-based programs. I have not taught Max or Pd to non-musicians, but students of other subjects usually find it difficult to use SuperCollider as well.

Where do these difficulties come from?

This difficulty starts with the installation of plug-ins and Quarks, not to mention the server-client structure, which the OS has recently been blocking due to security issues.

I would have to spend too much time checking that all the students had them installed correctly.
In contrast, GUI-based programs such as Audacity and Reaper are easy to teach.

In short, using SuperCollider requires computer science skills, programming knowledge and a programmer’s mind, and reducing the core libraries and increasing the number of Quarks will increase this difficulty for the following reasons:

  1. Backwards compatibility. No need to explain again!

  2. There are too many Quarks, many of them are not well maintained, and there is a lack of help documents. It is perfectly fine for Quarks that are not listed in the official Quark directory file (https://github.com/supercollider-quarks/quarks/blob/master/directory.txt) not to provide a help document following the given help-document guide. However, many Quarks in the directory file do not seem to be well maintained and do not provide help documents, although they should. Quarks listed there should be considered published, not private to their publishers, so maintenance and help documents are important.

  3. Installing a Quark may cause compilation errors or overwrite existing class methods. There are too many such cases (a minimal illustration follows this list). Why does this happen? The reason is that Quarks are maintained not by central developers but according to individual needs. Reducing the size of the core library increases the number of such cases, and it becomes a vicious circle.
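
A minimal illustration of the collision problem (the Quark names and the shout method are hypothetical): if two installed Quarks extend the same class with the same method, the compiler warns about the overwrite and one definition silently wins, depending on compilation order.

// in QuarkA/classes/extString.sc:
+ String {
    shout { ^this.toUpper ++ "!" }
}

// in QuarkB/classes/extString.sc: the same selector on the same class.
// Compiling both posts an overwrite warning, and only one definition survives.
+ String {
    shout { ^this.toUpper ++ "!!!" }
}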

To overcome these three problems, the skills and knowledge required are not musical or music-theoretical, but computer science and programming skills.

If SuperCollider is meant not only for prominent users (musicians, artists and researchers who are also developers) but also for ordinary users (including musicians), the following is required:

  • An easy-to-use installation method:
    • sc3-plugins should be in the core library
    • Useful Quarks should also be in the core library
  • Well-described help documents

Although there are no competitors to SuperCollider for algorithmic composition, it is very strange that

  • SuperCollider has no way of creating scores like nslider or the Bach library in Max.
  • It does not have its own way of creating MusicXML to export the score.
  • It has no way of using pitch names, pitch-class sets and pitch intervals, so getting the MIDI note number and frequency from a pitch name is not supported by the core class library (a sketch of such a helper appears below).
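
As an illustration, a hypothetical helper (not part of any library; single-digit octaves only) could convert a pitch name to a MIDI note number, from which .midicps gives the frequency:

(
// hypothetical pitch-name helper, e.g. "c#4" -> 61
~noteNumber = { |name|
    var steps = (c: 0, d: 2, e: 4, f: 5, g: 7, a: 9, b: 11);
    var letter = name[0].toLower.asString.asSymbol;
    var alter = switch(name[1], $#, 1, $b, -1, 0);
    var octave = name.last.digit;
    steps[letter] + alter + ((octave + 1) * 12)
};
~noteNumber.("c#4");           // -> 61
~noteNumber.("c#4").midicps;   // -> ~277.18 Hz
)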

So we should not only discuss how to reduce the sclang core classes, but also what features SuperCollider should have to make it more useful and practical for musicians, artists and researchers.

Many years ago, I asked about Pd equivalents of nslider and kslider in the Pd user groups on Telegram. Someone wrote that I should (or could) implement them myself if I needed to. Oh… no… I hope this kind of perspective is not shared by the SuperCollider developers.

I hope that SuperCollider will become a more beloved tool not only for computer science majors, but also for other majors, including musicians. (I feel that the SC user base is getting smaller than before. I hope this is just my misperception.)

2 Likes

With a proper dependency system, students could just download prko’s-supercollider-flavour and all the quarks and their dependencies would be imported at once.

The difficulties might start here, but is this the main hurdle? Generally, I have found SuperCollider’s design to cause the most issues, things like:

  • poor or no error messages;
  • old examples and inconsistent documentation, caused by a large and unmaintainable code base;
  • unexpected behaviour, e.g., (numChannels: 2).numChannels (demonstrated after this list);
  • and synchronisation between server and client.
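
A sketch of the unexpected-behaviour example above: numChannels happens to be a real method defined on Object, so it shadows the stored key instead of triggering Event’s pseudo-method lookup.

(numChannels: 2).numChannels;        // -> 1: Object's method wins
(numChannels: 2).at(\numChannels);   // -> 2: the stored value
(foo: 2).foo;                        // -> 2: no 'foo' method exists, so the key is found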

I take your conclusion that SuperCollider should be accessible to a wide range of people and that the user base is shrinking, but I see that as the reason for changing to a quark/module-based system, so that the older stuff can be deprecated and replaced by newer, easier systems. While quarks are not necessary for this, the alternative is to outright break backwards compatibility, which I don’t think we’d ever reach a consensus on.

Perhaps a quark-based system could be implemented with minimal breaking changes?

Importantly, a modular system also solves some of the backwards-compatibility issues SuperCollider might have in the future, whereas right now, there just isn’t a real mechanism to change things. I believe the last breaking change was 20 years ago and done by James McCartney? That isn’t a sustainable system.

2 Likes

Thanks for the summary, mike!

Could you give an example of this? I think the problem is that there are many different ways of writing SC, with each camp considering the other’s style “bad”.

I’ll also ask the same.

Testing gets so much more complex (there are still core packages which would then need “integration” tests across different versions – due to the duck-typing nature of sclang and the lack of any kind of type system, this gets really hard, as there is no other way than writing many tests); existing code and extensions break; tutorials get deprecated; and on top of that, SC currently has no package manager – doing this properly would either require writing a lexer in another language or adding networking capabilities to sclang, both non-trivial tasks, and then one would still need to write a package manager :smiley:

Logging and documentation can already be improved now, and while unexpected behavior is definitely a thing within sclang, I don’t see why this couldn’t be fixed through quarks by starting a new dialect.


May I suggest an alternative pathway to “downsize” the library: introducing namespaces, so one can tailor the environment according to one’s needs? Maybe also introduce something like monkey patching to “overrule” any existing code within a scope?

IMO this would give us the best of both worlds: a backwards-compatible layer as the default environment, while allowing more custom, newer and explicit setups for more advanced users.
For .scd/.sc files, the namespace from sc.legacy import * could be applied automatically to maintain compatibility, and .scdi/.sci files could explicitly state their imports and use, for example, from JitLib_v2 import Ndef or whatever.

The legacy code scope would be put into more of a maintenance mode, aiming to maintain support for newer systems and do some bug fixes, while anything outside of it could go as crazy as one would want. This would of course introduce a code-maintenance overhead and could lead to a potential “fork” (as in: why should I respect the bulky new code / legacy code?), but I think it’s better to build bridges between both styles than to burn existing paths.

2 Likes

No, it is not the main obstacle, but it does discourage students from wanting to learn SuperCollider.

One more addition:
I think the new version of sclang should load classes dynamically, and the quarks used should be specified in the SCD document (in each code file? oh no…).

Perhaps it’s worth speculating about what it would take to have dynamic class loading.

Here’s a troublesome case:

MyClass {
    var <>a, <>c;
    ...
}

Now, in the middle of a session, I change it to var <>a, <>b, <>c; and reload.

Let’s assume there were some instances of the old object definition – two-slot objects. Those instances would be invalidated: 1/ they never allocated a third slot, so attempts to access c are likely to return garbage or crash; 2/ x.b would get data that should be c. IMO, then, adding slots to objects at runtime is dangerous and shouldn’t be allowed. Loading new extensions should probably not recompile class files that had been previously loaded. (To support that would require awareness in the compiler of code deltas, and a scan of all reachable slots to apply those deltas, and updating bytecodes to change slot indices … this all frightens me.)

Fortunately, we’re not talking about that. Adding extension files can only add new classes and new methods, and override methods. (+ MyObject { var newThing; } I believe isn’t allowed.) But it’s worth establishing why a dynamic full recompile is full of nasty problems that are better just avoided.
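
For clarity, a legal extension file adds or overrides methods only, e.g. (building on the hypothetical MyClass above):

// legal: adds a method; extension methods may use the class's instance variables
+ MyClass {
    sum { ^a + c }
}

// illegal: extensions cannot add instance variables
// + MyClass { var newThing; }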

Adding an extension I think would mainly require rebuilding the “big table.” It should be possible to allocate new memory, copy existing entries, and then update with new stuff from the extension. (The big table is a 2d lookup with one axis being every class and Meta_ class, and the other axis being a union of all method names.) There may be interpreter details I’m overlooking.

Removing an extension may be riskier, e.g. a = MyObject.new and then dynamically removing the extension. Then a’s class has no definition anymore = probable crash. So this should probably be disallowed too.

Take all this as speculative (especially the “should be possible” comments), but the caveats may define some boundaries of what should and shouldn’t be attempted.

hjh

2 Likes

This is my perspective on this specific matter.

Regarding benefits, the primary deficiency (compared to modern languages) doesn’t appear to lie in the capacity to dynamically reconfigure classes, especially considering how short recompilation time is now; it’s not that bad. What users genuinely desire, I imagine, is a linter to support them in the process of crafting their classes. This tool alone would avoid the need to recompile the class library so often and would prove considerably more advantageous. Also, it would avoid the need for language modifications (how complex would that be, right? Can we even do this right now?) by using an external tool instead. Such a thing could substantially improve the situation without requiring direct language rewriting.

I see the discussion about decreasing the size of the standard library in the same way. That doesn’t seem to be the central problem, and can even create new ones. Sclang is used in a specific way; one doesn’t just write sclang scripts from the command line as we write Python scripts, so we would need a minimal sclang flavor, etc. We should not exaggerate the ‘problem’, I think. I just don’t get all this comparison with pd-vanilla as a model.

Introducing modularization would be a positive step forward. Implementing namespaces is another favorable option. The concept of “workspaces” holds considerable promise and could offer significant benefits. Additionally, pursuing an isolated runtime appears to be a readily achievable and impactful change.

1 Like

Smalltalk handles redefinition of classes using a ClassBuilder tool: a new class is built and existing instances are mutated. My understanding is that, given your example (a two-slot object whose class is redefined to have three slots), your object would be rebuilt as a three-slot object using the new class definition (a remains a, c remains c, and b gets the new class’s default value). @rdd may have some experience with this procedure… No idea whether this would be practical in sclang, but it is apparently a solved problem anyhow.

2 Likes

Another option could be something like a --file-watch parameter to sclang: every time a class file is changed and saved, the class library would automatically be recompiled. Maybe that’s not so hard to achieve.
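
In the meantime, something similar can be sketched in userland by polling a file’s modification time with SkipJack (the path is a placeholder; note that recompiling resets the interpreter, which also stops the watcher itself):

(
~path = "~/quarks/MyQuark/classes/MyClass.sc".standardizePath;
~mtime = File.mtime(~path);
SkipJack({
    if(File.mtime(~path) != ~mtime) {
        thisProcess.recompile;   // resets the whole interpreter state
    }
}, dt: 1, name: "classWatcher");
)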

1 Like

For me, since recompiling takes basically no time, the benefit of dynamic classes would be more that it’s sometimes hard to rebuild an interpreter state that is, e.g., the result of some random process…

1 Like

Many would argue that in the Python world (using it as an example because it’s an interpreted OOP language, a similar scenario), it’s preferable to restart the interpreter rather than reload a module. As far as I understand, reloading modules may lead to unforeseen outcomes (for example: old instances created before the reload retain the state and behavior of the old class definition, which leads to inconsistencies in behavior between old and new instances – but it all depends on the context, of course). The autoreload tool aims to mitigate these issues by refreshing functions and parts of older classes. Yet there are drawbacks: 1. Code replacement can fail. 2. Functions deleted prior to reloading aren’t removed.

Maybe tools like a --file-watch option, plus auto-evaluation of a test environment to test the class you’re hacking, could be better ideas?

IDK, correct me if I’m wrong or missing something.

We already had something like that, and nobody really used it (maybe it would be used if it were more convenient or built in):

1 Like

I didn’t know that. It could be useful. Auto-reload could be more accessible, even integrated into the IDE. I’ve never heard of this Ruby tool before. Also, it doesn’t need to be specific to unit testing. It could be any custom code to create an environment.