Here’s an idea I’ve had for a bit: Making a Quarks-like package system for SuperCollider plugins. Now I’ve gone and done it. It’s pretty easy to use as most of it happens through a simple GUI. You simply click a plugin, click install and it should clone, compile and install the plugin for you.
Note: I’ve only tested on macOS and Linux. And I’m still missing some important plugin packages that are more advanced (flucoma and vstplugin are examples of this). If anyone wants to help make those work, please step up!
I’ve moved some packages to a draft-packages folder for now since they don’t follow the usual style of CMake setup as generated by the cookiecutter template. I will need to figure out how to deal with that. The biggest issue with them is that they don’t have an install target.
If one wants to test these out, it’s as easy as running `Plugins.includeDraftPackages_(true)` before running the other commands.
This looks great! @madskjeldgaard, I’m presuming the intention is that we may be able to manage sc3-plugins via this route at some point?
If that’s the direction (and I’d approve!) it would be great to include this as part of the official SC distribution once all the kinks are worked out.
I added another little feature: It now automatically clones the SuperCollider repo and uses that to compile the plugins so now all you need to do is literally press a button. Woohoo
I’m going to ref @scztt and @MarcinP here, as this is along the lines of functionality that has been mentioned as a would-like-to-have as part of the SC ecosystem.
Cool! SuperCollider is really missing a package manager for Server plugins.
Question: you mention that the package manager would clone and compile the plugin from source. However, on Windows and macOS, building from source does not work out of the box, you have to first install the necessary toolchains. Also, as you’ve mentioned, certain plugins are more difficult to build than others. Why not distribute the actual binaries (as many system package managers do)?
I think it’s worth looking at Pd’s built-in package manager “Deken”; it makes the process of installing and uploading Pd externals almost trivial. Of course, someone needs to host the binaries. (In the case of Pd, the server is run by the IEM in Graz.)
Toolchains are definitely a problem and an obstacle, especially for those who may not have them installed for other projects. I think I can eventually solve the “more difficult” builds by adding pre- and post-build commands to the spec.
Regarding binaries - I think that’s a great idea! My reason for going the build route first is that a lot of plugins don’t distribute binaries (though my hope is that all future projects will include them if they use the cookiecutter recipe, which we recently updated to do this automatically on GitHub). So, to include as many plugins as possible - how would we do this in practice? Getting everyone to include binaries on their GitHub projects won’t be possible, but maybe we could set up a git repo which automatically builds binaries for a bunch of projects, and then pull them from there into this package manager?
Ideally we could have both approaches in the system - allowing both building from source and installing prebuilt binaries?
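To make the “both approaches” idea concrete, here is a minimal sketch of what a hybrid install decision could look like. This is purely illustrative: the spec format, field names, and URLs are all invented for the example, not part of the actual project.

```python
import platform

# Hypothetical plugin spec: all names and fields here are invented for
# illustration. A spec may offer prebuilt binaries per platform, a
# source-build recipe, or both.
SPEC = {
    "name": "mkplugins",
    "binaries": {
        # platform (as reported by platform.system()) -> prebuilt asset URL
        "Linux": "https://example.org/mkplugins-linux.zip",
        "Darwin": "https://example.org/mkplugins-macos.zip",
    },
    "source": {
        "repo": "https://github.com/madskjeldgaard/mkplugins",
        "build_cmds": ["cmake -B build", "cmake --build build",
                       "cmake --install build"],
    },
}

def choose_install_method(spec, system=None, prefer_binary=True):
    """Pick 'binary' when a prebuilt asset exists for this platform,
    otherwise fall back to building from source (and vice versa)."""
    system = system or platform.system()
    has_binary = system in spec.get("binaries", {})
    has_source = "source" in spec
    if prefer_binary and has_binary:
        return ("binary", spec["binaries"][system])
    if has_source:
        return ("source", spec["source"]["repo"])
    if has_binary:
        return ("binary", spec["binaries"][system])
    raise ValueError(f"No install method available for {system}")
```

With a spec like the one above, a Windows user would automatically fall back to a source build (since no Windows binary is listed), while Linux and macOS users would get the prebuilt asset.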
Great points - I’m totally open to ideas on this subject!
So to include as many plugins as possible - how would we do this in practice?
With Deken, people can just upload binaries to the Deken server with a simple command line tool. Usually, this is done by the maintainers themselves, but occasionally this is also done by other users. For example, someone finds a Pd library that is not available yet on Deken [for their particular platform], decides to build it themselves and then uploads the binaries to make it available to everyone.
IMO, having a centralized server is much better than relying on individual git repos.
As a side note: Deken really supports any kind of Pd libraries, i.e. also those that only consist of Pd abstractions. I think in the long run it would also be great for SuperCollider to have a unified package manager for language and server extensions.
Thanks so much @madskjeldgaard !
We certainly need a package manager, yes. We need to be aware of the long-term implications of choosing a particular implementation though.
I just wanted to relay some points from previous discussion devs had on the issue of server package manager:
IIRC, building from source was considered too problematic to support long-term. I’m glad you got it to work, but can you (or the community) maintain this long-term for various platforms? (maybe the answer is yes, that would be nice)
and yes, this difficulty varies between platforms; it’s almost a non-issue on Linux, more difficult on macOS and maybe most difficult on Windows
it seems that the consensus was closer to offering a package manager for prebuilt binaries. Maintaining URL schemes for downloads for various platforms/architectures is certainly an issue though…
We were also discussing re-using an existing package manager for this. It’s a balance between adapting something from a different ecosystem and reinventing the wheel…
Maintenance “cost” is definitely something to consider. SC dev resources are rather slim these days, and that’s an argument against adding yet another sub-project for this community to maintain. In that case, it would be nice if we could use something that’s already built out. But if there’s a desire to maintain this project, then maybe this is the way to go.
A mixed source/binary approach would be the most powerful, I think. IF this is feasible to maintain…
As for a centralized database - this one I’m not really in favor of. At least not in the sense that the SC community would need to provide the binary builds; that maintenance burden is one of the main issues with sc3-plugins. IMO, if this really is to be a centralized plugin manager, I’d like to see it as a database that can download plugins from any location (GitHub, GitLab, static links), with applicable logic to download binaries etc.
And yes, if we decide that this project is the way to go for the plugins, we should eventually combine this and the quarks.
All in all, thanks for working on this, Mads. I hope we can find the best path forward to bring this functionality to SC!
For a binary package manager, someone has to provide the binaries. Of course, you could just submit links, but I don’t think it’s too much to ask to upload the actual files. One obvious problem with links is link rot… On the other hand, uploading files would need a server with sufficient storage.
Again, I would ask everyone to have a look at Pd’s package manager “Deken”! IMO, it has worked really well for us so far and I believe it can serve as a model for a future SC package manager.
I agree that binaries would be the way to go for wider distribution (and of course great if building from source is also supported). This may also help to keep maintenance burden on the plugin authors rather than the sc devs.
Is it possible to pull binaries by a specific release tag on github and unpack them with/out system security checks? If so, this would take care of the hosting problem. It would also let users pull a specific plugin release (e.g. if you know version x.y works for your piece from 2020), or could support dependency lists as is done with Quarks now (though dependencies are less of an issue for plugins).
In the previous conversations there was a gravitation towards a GitHub-centric solution, where GitHub Actions would be used to create binaries and upload release artifacts to GH releases (I don’t think anybody wants this to be locked in to GH, of course; GH simply provides multiple services to make this work, so it was a starting point).
Now, automating this so that the correct binary, in the correct version, is pulled for the correct system and architecture - that’s certainly a tricky problem.
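The tag-plus-platform lookup might look something like the sketch below. The GitHub URL scheme for release assets (`/releases/download/<tag>/<asset>`) is real and stable, but the asset naming convention shown here is an assumption; plugin authors would have to agree on something like it for automated selection to work.

```python
import platform

def release_asset_url(owner, repo, tag, system=None, machine=None):
    """Build the download URL for a GitHub release asset, assuming authors
    name assets '<repo>-<tag>-<os>-<arch>.zip' (a made-up convention)."""
    system = (system or platform.system()).lower()    # e.g. 'linux', 'darwin', 'windows'
    machine = (machine or platform.machine()).lower() # e.g. 'x86_64', 'arm64'
    # Normalize common architecture aliases so lookups are consistent
    arch = {"amd64": "x86_64", "aarch64": "arm64"}.get(machine, machine)
    asset = f"{repo}-{tag}-{system}-{arch}.zip"
    # GitHub's stable URL scheme for downloading release assets:
    return f"https://github.com/{owner}/{repo}/releases/download/{tag}/{asset}"
```

For example, `release_asset_url("madskjeldgaard", "mkplugins", "v0.1.0")` on a Linux x86_64 machine would resolve to the `mkplugins-v0.1.0-linux-x86_64.zip` asset of that tagged release.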
Is it possible to pull binaries by a specific release tag on github and unpack them with/out system security checks?
@mike circumventing security checks is probably not a good idea, right?
For building from source, we’d expose a security issue - potentially malicious code could get built and executed on the user’s machine, possibly bypassing the OS’s security measures. On the other hand, when binaries are downloaded, the person publishing them is responsible for e.g. signing them on macOS (and possibly Windows).
I agree that we should have a close look at Deken!
Didn’t mean to imply circumventing security checks, just to raise the issue that it’s likely to be something to address if things are automatically pulled and unpacked.
SC is already knee deep in GH, so I don’t think GH lock-in is a primary concern at this point.
I raised the versioning feature just to put it on the horizon, I don’t think it would (or should) be part of a first version of this feature. I recall getting this dialed in for Quarks was non-trivial.
From a quick look, Deken is centrally managed, but anyone can upload externals, and security is only loosely supported: authors are asked to sign their packages with GPG keys (you must have GPG installed and a GPG key for signing), which users can then verify on their own.
On creating and uploading externals with Deken.
And this is cool: Deken uses something called objectlists, which list the objects within a library and can be searched by the package manager. So if you know of an object you want but don’t know which library it’s in, you can just search for the object. This would be great to have for Quarks, many of which are essentially black-box libraries with lots of useful stuff inside that is undiscoverable. The idea could be generalized to lists of not just classes in a Quark, but also useful methods or even just thematic keywords.