I was wondering whether GUIs could also be used to set parameters in a Pattern.asScore for NRT synthesis?
It seems the GUI examples so far always presume a realtime context. The GUI itself would surely have to stand outside the NRT code. My question is how to best incorporate a GUI within an NRT setup (working with Patterns).
If anyone has experience or ideas with this, I'd be happy to hear about it!
You could use a GUI to define information (including sequences, modulation curves, etc.) to be rendered into a Score.
The Score is provided all at once to the NRT server: prepared in advance and then rendered as a batch-style job. All of the information must go into the Score in advance, so there's no possibility of interaction during rendering.
GUIs are by definition about interaction. Because NRT rendering is never interactive, the typical uses of GUIs aren't compatible with NRT. There is literally no way to tweak a knob during NRT rendering and have it affect the sound, which is probably why you don't see many NRT GUI examples.
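To make the workflow concrete, here is a minimal sketch of the "everything decided in advance" approach: a Pbind with fixed values is converted to a Score and batch-rendered. The parameter values, durations, and file paths are placeholders of my own, not anything prescribed by SuperCollider.

```supercollider
(
// All parameters are fixed before rendering: the Score is built
// from the Pbind and handed to the NRT server in one batch.
var score, freq = 440, dur = 0.25;

score = Pbind(
    \instrument, \default,  // assumes the default SynthDef
    \freq, Pseq([freq, freq * 1.5, freq * 2], 8),
    \dur, dur
).asScore(6);  // render 6 seconds of the pattern into a Score

// batch-render to disk; nothing can be changed after this call
score.recordNRT(
    "/tmp/nrt.osc",       // temporary OSC command file
    "/tmp/nrt-out.aiff"   // output sound file
);
)
```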
Thanks! Yes, I figured the parameters would be set beforehand in the pattern, so excluding interaction.
I was imagining something like a VarGui in which, instead of a play button, there would be a "render" one. I'm not clear, though, on how to set up/run the GUI so that the NRT/pattern score catches the set parameters and can be executed by the GUI itself…
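One way to sketch that (not VarGui itself, which belongs to the miSCellaneous_lib quark, but plain sclang GUI classes standing in for it): sliders hold the values, and the render button's action reads them, bakes them into a Pbind, converts it to a Score, and calls recordNRT. Specs, paths, and durations here are placeholders.

```supercollider
(
// Sliders set the values; the "render" button builds and renders the Score.
var win, freqSlider, durSlider;

win = Window("NRT params", Rect(100, 100, 320, 140)).front;

freqSlider = EZSlider(win, Rect(10, 10, 300, 30), "freq",
    ControlSpec(100, 1000, \exp, 0, 440));
durSlider = EZSlider(win, Rect(10, 50, 300, 30), "dur",
    ControlSpec(0.05, 1, \lin, 0, 0.25));

Button(win, Rect(10, 90, 300, 30))
    .states_([["render"]])
    .action_({
        // read the current slider values and bake them into the Score
        var score = Pbind(
            \instrument, \default,
            \freq, freqSlider.value,
            \dur, durSlider.value
        ).asScore(5);  // 5 seconds of the pattern
        score.recordNRT("/tmp/nrt.osc", "/tmp/nrt-out.aiff");
    });
)
```

The point is that the button action is the only moment of interaction: it snapshots the GUI state into an ordinary, fully static Score, so nothing in the NRT pass itself needs to know about the GUI.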