Live waveform view anyone?

Has anyone succeeded in creating a live waveform view in a GUI, similar to SoundFileView, but one that can display a buffer whose contents change dynamically? For example, to visualise recording.

I am trying to build this as a GUI for a live granulator, where the user would interact with the waveform, move start/end flags, etc.

Yes, I do this in my main code. Here is a slightly simplified version of what I use, originally based on code by @jamshark70 but updated so that nothing has to be written to disk (as I recall; I can't find the original post right now).

(
s.waitForBoot{
	s.newBufferAllocators;
	~bufLength = 10; // in seconds
	~buf = Buffer.alloc(s, ~bufLength * s.sampleRate);
	~fps = 30; 
	~framesPerTrig = 1/~fps * s.sampleRate;
	
	~sfv = SoundFileView(nil, Rect(5, 100, 1363, 70))
	.alloc(~bufLength * s.sampleRate, 1, s.sampleRate)
	.gridOn_(false)
	.drawsBoundingLines_(false)
	.timeCursorOn_(true)
	.front
	.alwaysOnTop_(true);
	
	// Fetch the newly written region of the server buffer and push it into
	// the SoundFileView; the region may wrap around the end of the buffer.
	~updateWaveform = {|buf, view, framesPerTrig|
		var lastPhase;
		OSCdef(\phase, { |msg|
			{
				var data1, data2;
				var phase = msg[3];
				var cond = Condition.new;
				// start of the region written since the last trigger
				lastPhase = (phase - framesPerTrig).wrap(0, buf.numFrames);
				// if the write position wrapped past the buffer end, fetch in two parts
				if (lastPhase > phase)
				{
					if (phase > 0)
					{
						buf.getn(0, phase, { |d|
							data2 = d;
							cond.unhang;
						});
						cond.hang;
					};
					if (lastPhase < buf.numFrames)
					{
						lastPhase = lastPhase.max(buf.numFrames - framesPerTrig);
						buf.getn(lastPhase, buf.numFrames - lastPhase, { |d|
							data1 = d;
							cond.unhang;
						});
						cond.hang
					}
				}
				{
					lastPhase = lastPhase.max(phase - framesPerTrig);
					buf.getn(lastPhase, phase - lastPhase, { |d|
						data1 = d;
						cond.unhang;
					});
					cond.hang;
				};
				
				if (lastPhase > phase)
				{
					if (data2.notNil) { view.set(0, data2) };
					if (data1.notNil) { view.set(lastPhase, data1) };
				}
				{ view.set(lastPhase, data1) };
				view.timeCursorPosition_(phase);
			}.fork(AppClock);
		}, '/phase');
	};
	
	SynthDef(\time, {
		var trig = \trig.tr(1);
		var buf = \buf.kr(0);
		var phase = Phasor.ar(trig, 1, 0, BufFrames.ir(buf));
		var sig = LFNoise0.ar(10) * Trig.ar(Dust.ar(5)); // some signal to test.
		// var sig = SoundIn.ar(0); // or use live input
		BufWr.ar(sig, buf, phase);
		SendReply.ar(Impulse.ar(\refreshRate.kr(30)), '/phase', phase);
	}).add;
	
	s.sync;
	
	x = Synth(\time, [buf: ~buf, refreshRate: ~fps]);
	~updateWaveform.(~buf, ~sfv, ~framesPerTrig);
}
)
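To get the start/end flags the original question asks about, one option is to read the view's selection and map it onto a granulator. Here is a rough, untested sketch of the idea, assuming ~buf and ~sfv from the code above; the \grain SynthDef and its parameter names are placeholders of mine, not part of the original code.

(
SynthDef(\grain, {
	var trig = Impulse.ar(\density.kr(20));
	// pick random grain positions between the selection bounds (normalized 0..1)
	var pos = TRand.ar(\start.kr(0), \end.kr(1), trig);
	var sig = GrainBuf.ar(2, trig, \grainDur.kr(0.1), \buf.kr(0), 1, pos);
	Out.ar(0, sig * \amp.kr(0.2));
}).add;
)

~grain = Synth(\grain, [buf: ~buf]);

// when the user drags out a selection in the view, map it to start/end positions
~sfv.mouseUpAction_({ |view|
	var sel = view.selection(0); // [startFrame, numFrames] of selection 0
	~grain.set(
		\start, sel[0] / ~buf.numFrames,
		\end, (sel[0] + sel[1]) / ~buf.numFrames
	);
});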

Hi, this may be a little off-topic, as it concerns an extension (the FluCoMa toolkit), and maybe you are seeking answers using the native SuperCollider tools.
FluCoMa has a handy tool for visualising sound and its analysis data, the FluidWaveform. It is less flexible than the native SC views, but it is a fast way to achieve what I think you're looking for.
Installing FluCoMa just for the FluidWaveform is kind of like buying a smart fridge to use the clock, but it might interest you to look into their work anyway, as it is very useful for working with grains.
Here's some sample code that does what you describe.
The next step (playing the grain) is pretty easy, since you have the slice frames saved in the global variable ~currentSlice, which you can use as control input for a Synth using the GrainBuf UGen, or anything else! The temporary buffers created for visualisation aren't a good choice for playback, as they are overwritten quite regularly.

s.boot;
(
//Create a buffer filled with *something*
~src = Buffer.loadCollection(s,
	Env([0.0] ++ ({rrand(-1.0, 1.0)} ! 30) ++ [0, 0],
		{rrand(0.1, 1)} ! 31,
		\sin
	).asSignal(4096)
);
)

(
~currentSlice = [0, ~src.numFrames - 1]; // starting slice: whole buffer
//FluidWaveform reads 2-channel buffers as containing onsets and offsets.
//It's useful for many other FluCoMa tools to have the slices stored on the server as well.
~sliceFrames = Buffer.alloc(s, numFrames: 1, numChannels: 2);
)
//set channels 1 and 2, as the buffer only has 1 frame.
~sliceFrames.setn(0, ~currentSlice);

(
~view = View(bounds:Rect(0, 0, 512, 400)).alwaysOnTop_(true).name_("Dynamic Waveform Viewing");
~srcWf = FluidWaveform.new(~src, ~sliceFrames, standalone:false);
~sliceWf = FluidWaveform.new(~src, standalone:false); //we start with the original buffer
~view.layout = VLayout(~srcWf, ~sliceWf);
~srcWf.asView.acceptsMouse = false; //FluidWaveform responds unexpectedly to mouse events
~sliceWf.asView.acceptsMouse = false;
~view.front;
//The two waveforms are equal at first
)

(//here we determine mouse behaviour: click and hold to set the start frame, release to set the end frame and update the view.
//it would not be complicated to make the view also update while the mouse moves with the button pressed!
//we could also dynamically "compose" the slice buffer as the mouse moves, making the visuals update much faster
//(using mouseMoveAction, conditionals, and FluidBufCompose's startChan/numChans arguments)
var sliceSelection = [0, 0];
var sliceBuf; //here we will store the slices
var prevBuf; //useful for freeing buffers we aren't using anymore

~view.mouseDownAction = {|view, x, y ...args|
	sliceSelection[0] = ~src.numFrames * x / view.bounds.width;
};

~view.mouseUpAction = {|view, x, y ...args|
	sliceSelection[1] = ~src.numFrames * x / view.bounds.width;
	~currentSlice = sliceSelection.sort.copy; //we always want to make sure the first frame is smaller
	~sliceFrames.setn(0, ~currentSlice);
	prevBuf = sliceBuf;
	sliceBuf = Buffer(s);
	FluidBufCompose.processBlocking(s, ~src,
		startFrame: ~currentSlice[0], numFrames: ~currentSlice[1] - ~currentSlice[0],
		destination: sliceBuf, action: {
			~sliceWf.layers.pop;
			~sliceWf.addAudioLayer(sliceBuf);
			[~srcWf, ~sliceWf].do(_.refresh); // refresh the waveforms to draw the new buffers
			prevBuf.free;

	});

}
)
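As described above, playing the selection is then a matter of feeding ~currentSlice into a Synth with GrainBuf. A minimal, untested sketch; the \slicePlayer SynthDef and its parameter names are my placeholders, not part of the FluCoMa example:

(
SynthDef(\slicePlayer, { |buf = 0, start = 0, end = 1|
	var trig = Impulse.ar(\density.kr(15));
	// grain positions restricted to the selected slice (normalized 0..1)
	var pos = TRand.ar(start, end, trig);
	var sig = GrainBuf.ar(2, trig, \grainDur.kr(0.15), buf, 1, pos);
	Out.ar(0, sig * \amp.kr(0.3));
}).add;
)

(
x = Synth(\slicePlayer, [
	buf: ~src,
	start: ~currentSlice[0] / ~src.numFrames,
	end: ~currentSlice[1] / ~src.numFrames
]);
)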

Other analysis tools in FluCoMa could quickly analyse data in your selected slice and display it on the waveform also!
