Recently I found a folder named “dancer-gesture-analyzer” inside a folder named “CascadeProject” on my computer. It contains a neat library for analysing movement data captured by wearable sensors and stored as arrays of sensor values. The README.md says:
Dancer Gesture Analyzer
A SuperCollider project that analyzes movement data from wearable sensors worn by dancers. The system evaluates movement gestures in terms of human expression, character, and personality traits.
Features
- Parse timestamped sensor data files
- Analyze movement qualities (weight, time, space, flow)
- Interpret expressive characteristics based on Laban Movement Analysis principles
- Real-time visualization of movement qualities
- Map movement to sonic feedback
The library is quite useful for research into gestures. It mentions no authors! I would like to contact the authors to engage in dialogue, as I plan to use this library for my own research. Google did not provide any helpful hints. Does anyone here know the authors?
I don’t know if it’s by them, but you could ask my colleagues Gerhard Eckel and David Pirrò here at IEM Graz (Institut 17, Institute of Electronic Music and Acoustics), who did a lot of artistic research of this kind in the past.
When I have a bit more time I’d be very curious to see if I could wrangle it into something realtime…
EDIT: looking through the code now, and it’s a goldmine! This is really, really good stuff if it works as advertised. Just the kind of stuff I’ve been looking for.
Delighted to see that this dancer-gesture-analysis library seems useful to the community.
I am putting it to use in a dance/mocap/animation project with Godot as the game and 3D-animation engine. The full library is at GitHub - iani/sc-hacks-redux (redoing sc-hacks from scratch; under development as of Jan. 2021). It is in a state of intense flux as I am building the interface to Godot, to connect to Martin Carlé’s Godot plugins there. But I am pushing the dancer-gesture-analysis folder here: GitHub - iani/DanceGestureAnalyser (SuperCollider library for analysis of gesture data from dancers).
I have made some corrections to the DataParser class, which did not work for me out of the box.
For the GestureAnalyser class one needs to provide an implementation of the ‘variance’ method, which I have not done yet (a rough sketch of one possibility follows below).
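In case anyone wants to experiment before I get to it, here is a minimal sketch of what such a method might compute: sample variance per dimension over an array of data frames. Everything except the method’s purpose is my assumption, not the library’s actual code:

```
(
// Hypothetical sketch: sample variance per dimension, given an
// Array of equal-length data frames (each frame an Array of values).
var variance = { |frames|
    frames.flop.collect { |samples| // flop groups values per dimension
        var m = samples.mean;
        samples.collect({ |x| (x - m).squared }).sum / (samples.size - 1)
    }
};
// e.g. three 4-element frames -> four per-dimension variances:
variance.([[1, 2, 3, 4], [2, 3, 4, 5], [3, 4, 5, 6]]).postln;
)
```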
kflak, I would be very interested indeed to hear how you want to approach wrangling it into something real-time, as this is what I want to address next. One could work with small subsets of the data, processed as they come in, in batches of say 100 entries extracted from the full data array as it accumulates during recording or playback (rough sketch below). But even better would be to write some UGens that operate on kr buffers into which these data are written in real time.
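To make the batch idea concrete (all names here are made up, nothing is from the library, and ‘process’ stands in for whatever analysis one runs):

```
(
// Rough sketch of batch processing: accumulate incoming frames,
// process each completed batch of 100.
var data = Array.new;
var batchSize = 100, processed = 0; // index of first unprocessed frame
var process = { |batch| "processing a batch of % frames".format(batch.size).postln };
var addFrame = { |frame|
    data = data.add(frame);
    if (data.size - processed >= batchSize) {
        process.(data.copyRange(processed, processed + batchSize - 1));
        processed = processed + batchSize;
    };
};
// simulate 250 incoming 61-element sensor vectors:
250.do { addFrame.(Array.fill(61, { 1.0.rand })) };
)
```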
I haven’t dug into it yet, but my initial impulse is to do some kind of windowing, processing the last n samples at regular intervals. I guess my first point of entry is to port it over to Common Lisp, as I am mostly doing SuperCollider stuff in cl-collider these days. That way I don’t think there would be any need to create any UGens, as CL should be more than performant enough for these kinds of things.
If feasible, I could also imagine doing this on different timescales: one as fast as possible, one on a medium timescale (on the order of a few seconds), and finally one process on a longer scale (see the sketch below). That could open up a nice range of sonification/composition possibilities…
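Roughly this kind of thing, sketched in plain sclang for readability (I would end up doing it in cl-collider; all names, rates, and window sizes here are made up):

```
(
// Made-up sketch: the same stand-in analysis polled on three timescales.
var data = Array.new;
var analyse = { |frames, label|
    "% window: % frames available".format(label, frames.size).postln;
};
// [poll interval in seconds, window size in frames, label]:
[[0.25, 8, \fast], [2.0, 30, \medium], [10.0, 150, \slow]].do { |spec|
    fork {
        loop {
            spec[0].wait;
            analyse.(data.keep(spec[1].neg), spec[2]); // negative keep = last N frames
        }
    };
};
// simulate a 15 Hz stream of 24-element vectors (8 sensors * x, y, z);
// stop everything with Cmd-. :
fork { loop { data = data.add(Array.fill(24, { 1.0.rand })); (1/15).wait } };
)
```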
Hi kflak,
Thanks for answering. There are some significant overlaps in our approach:
> I haven’t dug into it yet, but my initial impulse is to do some kind of windowing, processing the last n samples at regular intervals.
Yes, that is my take. It is not too difficult to implement, and moreover it should be performant enough in regular sclang on a recent laptop (1-2 years old), for windows of ca. 30 frames (= 1 second of recording at ca. 30 Hz, the regular Rokoko sampling rate), analysed every half second (i.e. at 2 Hz), with vectors of ca. 61 elements (the data vectors we get from Rokoko).
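A quick way to sanity-check that estimate in sclang, using per-dimension variance as a stand-in for the actual analysis:

```
(
// Time one analysis pass over a 30-frame window of 61-element vectors;
// at 2 Hz this has to finish comfortably within 0.5 seconds.
var window = Array.fill(30, { Array.fill(61, { 1.0.rand }) });
{
    window.flop.collect { |samples|
        var m = samples.mean;
        samples.collect({ |x| (x - m).squared }).sum / (samples.size - 1)
    }
}.bench; // posts the elapsed time
)
```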
> I guess my first point of entry is to port it over to Common Lisp, as I am mostly doing SuperCollider stuff in cl-collider these days. That way I don’t think there would be any need to create any UGens, as CL should be more than performant enough for these kinds of things.
I have not worked with cl-collider, but as said, sclang would be performant and powerful enough for the job in real time. However, for other statistical algorithms, such as calculating the distances or angles formed by joints and then performing Laban Movement Analysis on them, something more efficient might be needed.
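For instance, a joint angle computation could look roughly like this (my own sketch, nothing library- or Rokoko-specific):

```
(
// Sketch: the angle at joint b formed by segments b->a and b->c,
// given 3D joint positions as [x, y, z] arrays.
var jointAngle = { |a, b, c|
    var u = a - b, v = c - b; // arithmetic is elementwise on arrays
    var cosine = (u * v).sum / ((u * u).sum.sqrt * (v * v).sum.sqrt);
    cosine.clip(-1.0, 1.0).acos // radians
};
// e.g. a made-up shoulder/elbow/wrist triple -> elbow angle in degrees:
jointAngle.([0.0, 1.5, 0.0], [0.3, 1.2, 0.0], [0.5, 1.0, 0.2]).raddeg.postln;
)
```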
Perhaps we can exchange ideas about this in the future and, if you like, try something with shared data, e.g. comparing the outcomes of different approaches.
> Yes, that is my take. It is not too difficult to implement, and moreover it should be performant enough in regular sclang on a recent laptop (1-2 years old), for windows of ca. 30 frames (= 1 second of recording at ca. 30 Hz, the regular Rokoko sampling rate), analysed every half second (i.e. at 2 Hz), with vectors of ca. 61 elements (the data vectors we get from Rokoko).
I am working with much smaller data sets: 8 accelerometers with 3 datapoints each (x, y, z) coming in at 15 Hz. sclang should definitely be performant enough for this. However, I just find the developer experience of Common Lisp/Emacs far superior to sclang’s, so I’ve migrated most of my tools over there already.
> However, for other statistical algorithms, such as calculating the distances or angles formed by joints and then performing Laban Movement Analysis on them, something more efficient might be needed.
Sounds good for the kind of mocap I imagine you’re working with, but with my sensor kit I don’t have access to that kind of data.
> Perhaps we can exchange ideas about this in the future and, if you like, try something with shared data, e.g. comparing the outcomes of different approaches.
Yes, absolutely! I will start working on this sometime at the end of June; I just have to wrap up a couple of productions first… If I know myself, though, I’m sure I’ll procrastinate on the tasks for those and start working on this project before that.