That GUI processing causes xruns, and that small buffer sizes like 64 cause lots of xruns with a good sound card that works fine on other systems, are both Linux problems; I'm quite sure of it from experience. On the internet I see recipes and advice, and people saying they achieved it or not, but no real results. Is it a distro problem? Is it the kernel scheduling model? Is it the drivers? Is it Jack? Is it Sunday? I don't know.
@eckel An idea: can you run the very same test with another OS/distro on the same machine? That could give you some more clues and a bigger picture. But it requires some work, and I wouldn't do it myself (there are two things I dislike about electroacoustic music: cables and system configuration).
I also suspect that every system has a point of instability, so the problem may be that you want an extreme setup with variables you can't control. My advice: most people set things up so they work fine for most cases. Nonetheless, I like these threads because they are very informative and something can always come out of them.
Not all GUI processing causes xruns, and not every real-time process reacts with xruns. When I run this naive Pd test patch, I can print to the Pd post window 100 times a second and load the audio thread's CPU to 80% without causing a single xrun. Pd runs with Jack at a block size of 64 at 44.1 kHz and a delay setting of 1 ms.
Ok, thanks for clarifying that. It isn't necessarily obvious that they are separate.
It may be that Qt is more CPU-expensive than Tcl/Tk. (One thing that annoys me to no end in Pd is its antique pixelated appearance on Linux and Windows: c'mon, it's 2021. But if Tcl/Tk doesn't antialias and Qt does, then Qt is doing more work. Why that work has to run at a high enough priority to bomb an audio thread, I don't know.)
Ah, I see that I misread the post. I should have grouped "Pd, with Jack at a blocksize" together, rather than reading it as "Pd, with Jack, at a [Pd] blocksize…"
Over the years, I've seen a lot of confusion in both SC and Pd forums over control blocks vs audio buffers (I misunderstood some aspects myself for a long time), and I saw the term "block" through that lens, which was not right in this case. I'm sorry about that.
May I suggest one other benchmark? Run the test in command-line sclang, but also open a server status window with Server.default.makeGui. This will also trigger Qt string drawing, but in sclang rather than in the IDE. Maybe there's a difference (in thread priority?). Sclang forces all GUI operations onto a lower-priority thread, so if you can get more juice with a 64-sample buffer doing GUI work in sclang vs. the IDE, that might tell us something.
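For the record, a minimal sketch of that benchmark as I imagine it (the stress-test part is a placeholder; substitute whatever synth load was used in the earlier tests):

```supercollider
// Run from command-line sclang, NOT the IDE, so that any Qt string
// drawing happens on sclang's lower-priority GUI thread.
s = Server.default;
s.waitForBoot {
	s.makeGui;  // Qt status window; its text updates exercise string drawing
	// ...then start the same stress-test synths used in the earlier
	// benchmarks here, and watch for xruns at a 64-sample buffer.
};
```

If this configuration survives loads that the IDE's status bar cannot, that would point at thread priority rather than Qt itself.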
Now that I run Ubuntu Studio 20.04, I experience occasional xruns running 128x2. That setting was rock solid on my last distribution with the wifi off. I assumed something was wrong with the distribution, or in any case something I now need to "adjust". My machine is a grand old Latitude E6500.
Thank you for your suggestion. I thought about this at some point earlier in the whole process, but then lost track of it among all the other ideas for analysing the problem. It indeed allows me to run much more with this test code:
Thank you! I thought I had given it enough memSize, but it needed more. This is what I can achieve with supernova under the same conditions as the last tests performed for this discussion.
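For anyone following along, raising the server's real-time memory pool looks roughly like this (the multiplier here is purely illustrative, not the value used in the tests above):

```supercollider
// memSize is the real-time memory pool in kilobytes; the default (8192)
// can be too small for allocation-heavy loads. Set it before booting.
Server.default.options.memSize = 8192 * 16;  // illustrative value
Server.default.reboot;
```

The setting applies to both scsynth and supernova, since both read it from ServerOptions at boot.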
Aha… So there is a concrete difference between two parts of the SC suite. This perhaps can't be entirely blamed on Qt.
That would be a valid question to raise on GitHub: why does string drawing in sclang StaticText objects not break server performance, while string drawing for the IDE's status bar does break it?