I am thinking about switching to SuperCollider from Max/MSP. One thing I often use in Max and can’t seem to figure out how to do in SuperCollider is querying data from public web APIs, such as weather APIs. In Max, the maxurl object (which seems to be a wrapper around libcurl) makes this fairly simple.
Here is a video about the max object as an example:
How would one get data from similar APIs using SuperCollider?
You can do a very limited version of this with the Download class, but you can’t specify headers, which probably makes it useless for most non-trivial APIs.
The other option is to just run curl and grab the output:
"curl 'https://api.weather.gov/favicon.ico' -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:109.0) Gecko/20100101 Firefox/109.0' -H 'Accept: image/avif,image/webp,*/*' -H 'Accept-Language: en-US,en;q=0.5' -H 'Accept-Encoding: gzip, deflate, br' -H 'Referer: https://api.weather.gov/'".unixCmdGetStdOut.parseJSON
Thank you! The second method seems to work. Are the commands executed on the host machine or on the SuperCollider server? I am just wondering because curl and curl.exe work as a command, but Invoke-RestMethod does not…
And another question: if the request is a stream, it works fine using curl from the terminal, but if I run the same command using the method you outlined in SuperCollider, it seems to never return, and it also does not post the messages from the stream. Afterwards I also cannot run any other curl requests from SuperCollider.
Sadly, SuperCollider has no network capabilities beyond OSC messaging.
Using unixCmd with curl is one way to get around this, but I don’t consider it reliable because
it is platform specific (curl needs to be installed),
it blocks your sclang thread (at least via unixCmdGetStdOut) - so while it crawls for data you can’t do anything else anymore (like playing patterns), and
it is hard to get working with extensive auth procedures like an OAuth API.
I found the most viable way is to spin up a Python or Node script which handles the internet communication for me and to exchange information/instructions with this service via OSC - not nice, but it works, and it doesn’t block sclang while the request is running.
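To give a rough idea of the sclang side of such a setup (the helper script itself is not shown; its port and the OSC address names here are just assumptions):

(
// sclang side of a hypothetical OSC bridge: a helper script is assumed to
// listen on 127.0.0.1:57121 for "/request" and to reply to sclang's language
// port with "/response" carrying the body of the HTTP response as a string.
var helper = NetAddr("127.0.0.1", 57121);

OSCdef(\webResponse, { |msg|
	var body = msg[1].asString;   // OSC string arguments arrive as Symbols
	body.parseJSON.postln;
}, '/response');

// ask the helper to fetch a URL; sclang stays responsive in the meantime
helper.sendMsg('/request', "https://api.weather.gov/points/39.7456,-97.0892");
)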
Also keep in mind that .parseJSON is a somewhat faulty JSON implementation: the JSON {"foo": 42} becomes {"foo": "42"}, so our number turns into a string.
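A quick way to see this, and a workaround if you know a value should be numeric:

(
var d = "{\"temperature\": 21.5}".parseJSON;
d["temperature"].class.postln;     // -> String, not Float
d["temperature"].asFloat.postln;   // -> 21.5, converted back by hand
)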
On unixy systems, this can be handled by redirecting the command’s stdout to a file (which can be done asynchronously with unixCmd) and then reading the local file after the curl process exits. There’s no built-in method for this, but it isn’t especially hard. I think we should encourage this approach over unixCmdGetStdOut (which does block the interpreter, as you noted).
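Something along these lines should work on macOS/Linux (untested sketch; the temp path and URL are placeholders):

(
var tmp = "/tmp/sc_curl_result.json";   // assumed writable temp location
var cmd = "curl -s 'https://api.weather.gov/points/39.7456,-97.0892' -H 'User-Agent: sc-example' > %".format(tmp.quote);

// unixCmd runs the command through the shell without blocking sclang;
// the action function fires once the process has exited.
cmd.unixCmd({ |exitCode|
	if(exitCode == 0) {
		File.readAllString(tmp).parseJSON.postln;
	} {
		("curl failed with exit code " ++ exitCode).postln;
	};
});
)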
I imagine this must be possible in Windows too, but I don’t know the DOS shell well enough.
Yes, but then you have to implement a file watcher in sclang, and this still doesn’t help if you need extensive auth capabilities or have to crawl a long-lived connection like a WebSocket stream.
One would probably need PID/lock files to do this properly.
On Windows this becomes even harder because AFAIK you can’t write to a file that is opened by another process, which can lead to a race condition.
No. unixCmd has an action function. You put the async finalizer in the action function, just as you would do for reading a buffer.
That’s true.
Perhaps I’m missing something, but shouldn’t there be no need to do that? The curl (or equivalent) process writes the file. After the curl process terminates (after, not during), SC reads from the file which the curl process wrote and closed. Reading from a file that no other process is holding open sounds like a standard thing to do.
Admittedly I’m assuming here that it’s a simple request and retrieval, not the more complex cases that you mentioned (but it was already noted that this trick won’t help those cases, so that’s not a reason to reject the approach for the cases where it does apply).