Libscsynth

Has anyone here worked with libscsynth as a shared library? It doesn't seem to be very well documented, but I once managed to compile a C++ program in which libscsynth was linked as a shared library, and the server started and connected to JACK. As far as I could tell, it works. I was just wondering what people have done with it. In theory, it could even work with things like inline-cpp (hs), or work side by side with other systems. Why has it never happened?

Are there any examples out there?

EDIT: Maybe SCAU used it somehow? I forgot about it.

The “internal” Server is basically just libscsynth called directly from sclang. In its current state, libscsynth is a bit limited. Most importantly, it insists on managing the audio I/O, which makes it useless for plugins and any projects that already do their own audio I/O. I have always wanted to implement a “plugin driver”, i.e. a fake audio backend where the user must provide the audio buffers, similar to the “AudioUnit” backend (used by SCAU), but not tied to a specific platform. I think this would make libscsynth much more attractive. For example, libpd is quite popular because it can be easily embedded into all sorts of applications.

thank you

It's true that SuperCollider typically manages its own audio I/O; I was just wondering if there's a way to achieve more flexible integration using libscsynth. Specifically, I'm curious about the possibility of creating multiple SuperCollider JACK clients within the same environment by using a separate World object for each instance, while letting the host program do its own thing (also using the JACK framework). This could be achieved through a function like World* createScsynthInstance() or something along those lines. I wonder if this approach could give us finer control over how scsynth integrates into a broader JACK-based audio system (in a way, inspired by non-mixer and friends). Maybe this approach does not bring much (probably that's the case), as you said, but I wondered what it could potentially allow.

Not yet, but as I said, it’s on my (eternal) to-do list.

Scsynth already supports multiple World instances, the problem is just that World insists on managing the audio devices. A “plugin driver” would be a “fake” audio backend with some kind of “process” method that takes input/output buffers. The user stays in control of the actual audio I/O. This way you could embed any number of scsynth instances in your application, e.g. as audio plugins. This would be very similar to libpd.

Maybe this approach does not bring much (probably that's the case), as you said, but I wondered what it could potentially allow.

? I literally said:

I think this would make libscsynth much more attractive.


Yes, I understand, and that would be a significant enhancement. I was referring specifically to scsynth as a JACK client, since a JACK client does not represent the hardware I/O but a node in a graph. The host application also creates its DSP processes as JACK clients anyway (much like nodes in scsynth), so this could let both systems work side by side; I don't want to create a similar environment, but something that works together. Other applications like non-mixer use the JACK framework as their "node tree", and apps like jest can quickly JIT-compile Faust or C++ (and possibly other languages with some tweaks) into a DSP module exposed as a JACK client, etc. However, I understand that this is rather incidentally a context specific to Linux and the JACK API.

I was referring to it as it is now.

Yeah, that is a deal breaker on macOS/Windows, but I was wondering whether the same happens with the JACK API (or whether other problems come up down the line).

It’s also a deal breaker on Linux for certain use cases, e.g. plugins or offline processing.

To your question: in theory you should already be able to create several World instances (in the same process) and they should appear as individual Jack clients. Give it a try and report back :slight_smile:


One can use scsynth as an AUX insert on Linux, since everything goes through JACK (there is a quark to control Ardour from SC, which could help with that).

Thank you buddy

I don’t think this is what most people would call a plugin :slight_smile: I’m really talking about (cross-platform) VST/LV2/CLAP plugins, plugins for game engines, embedding scsynth in other computer music environments, etc.

CLAP looks interesting, and it already has many wrappers for AU, VST, LADSPA, etc.

In any case, what people call "plugins" usually requires a GUI, right? (It could be done as with Faust, just using the metadata from a SynthDef, or something like that, maybe.)

Yes, DAW plugins always have a GUI interface. (If the plugin does not provide its own native GUI, the DAW will typically generate a generic UI from its parameters.)

There are, of course, other types of plugins which don’t necessarily have a GUI, e.g. Pd externals.

(It could be done as with Faust, just using the metadata from a synthdef, or something like that maybe)

I don’t quite see how Faust fits in the picture. The obvious approach would be to embed scsynth in a JUCE plugin. Note that there is no single solution; it depends on how much flexibility you want. At the minimum, you could host a single synth and link plugin parameters to synth arguments. At the maximum, you could embed a full code editor so that users can write SC code directly in their DAW.

In the Pd world, the former would correspond to Camomile (GitHub - pierreguillot/Camomile: an audio plugin with Pure Data embedded, i.e. a single Pd patch as a plugin), and the latter would be equivalent to PlugData (https://plugdata.org/).

In the SC world, the former has been attempted with SuperColliderAU. plugincollider (GitHub - asb2m10/plugincollider: SuperCollider as a VST3 plugin) would be somewhere in between.

Faust can also use JUCE to build plugins with a GUI, and Faust code already carries this metadata. But SC also has metadata attached to SynthDefs, for example, which could be used to create the interface. That was the parallel. But that's just yet another idea, of course, and the implications are pretty different, too.

But SC also has metadata attached to synthdefs, for example. That could be used to create the interface.

Well, you can just parse the SynthDef and create the plugin parameters. No need for Faust :slight_smile: Although in practice you would need some configuration file, because the SynthDef itself contains too little information about the parameters. That's basically how SuperColliderAU works.
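As a sketch of that idea (all names and the config shape here are made up for illustration): the SynthDef gives you parameter names and defaults, and a side-car config supplies what the SynthDef format lacks, such as ranges. Merging the two could look like this:

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical illustration: combine the bare control list recovered from a
// SynthDef (names + default values) with a side-car config that adds ranges,
// which the SynthDef format itself does not carry.
struct PluginParam {
    std::string name;
    float defaultValue;
    float min, max;
};

struct Range { float min, max; };

std::vector<PluginParam> buildParams(
        const std::map<std::string, float>& synthDefControls, // from the SynthDef
        const std::map<std::string, Range>& config)           // from a config file
{
    std::vector<PluginParam> params;
    for (const auto& [name, def] : synthDefControls) {
        // Fall back to a unit range when the config says nothing
        // about this control.
        Range r{0.0f, 1.0f};
        if (auto it = config.find(name); it != config.end())
            r = it->second;
        params.push_back({name, def, r.min, r.max});
    }
    return params;
}
```

A plugin host would then expose one DAW parameter per entry and forward changes to the corresponding synth argument.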

I was not talking about using Faust; I just mentioned that the SynthDef metadata could potentially cover everything one would need to generate a GUI. Yes, that's a sclang feature; it never reaches scsynth. But even a description file would function pretty much the same way.

I'm not sure whether, when SCAU was written, SynthDef metadata was already implemented as it is today (i.e. an Event to which you can add any information). I think it wasn't.

I was not talking about using Faust; I just mentioned that the SynthDef metadata could potentially cover everything one would need to generate a GUI.

I see what you mean. But as you said, the metadata file is not part of the SynthDef, so it doesn’t really matter if you use sclang’s metadata or roll your own config file. The advantage of the latter is that it is independent of the client. E.g. you can just use a plain old JSON file that can be easily parsed and written by any client.
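For example, such a side-car file could look something like this (the format is entirely hypothetical; only the idea of pairing a SynthDef with externally stored parameter ranges comes from the discussion above):

```json
{
  "synthdef": "mySynth",
  "params": {
    "freq": { "min": 20.0, "max": 20000.0, "default": 440.0, "unit": "Hz" },
    "amp":  { "min": 0.0,  "max": 1.0,     "default": 0.1 }
  }
}
```

Any client, sclang or otherwise, could read and write this alongside the .scsyndef file.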