Problem loading a buffer while rendering in NRT

Hello!

I am trying to render a pattern score in NRT mode. I can load an existing file from my computer and play it with PlayBuf, but when I try to create my own sample from, let's say, Signal, for some reason the sample does not seem to get loaded into the buffer.

I've posted some example code below.
Buffer 1 is the existing sample from my computer, and buffer 2 is made from Signal.
The gaps you hear in the audio are where PlayBuf plays from buffer 2, so there is no audio.
To test it, please set your own render location.

Why is this happening? Any help would be really appreciated…
thank you!

(
var server = Server(\nrt,
    options: ServerOptions.new
.numOutputBusChannels_(2)
.numInputBusChannels_(2)
),
def = SynthDef(\buf1, { |out, bufnum, rate = 1, time = 0.1, start = 0, amp = 0.1,dur|
	var release= Line.kr(0,1,(rate.reciprocal),doneAction: 0);
	var sig = PlayBuf.ar(1, bufnum, rate*(2048 / Server.default.sampleRate),Impulse.kr(rate), startPos: start);
    Out.ar(out, (sig*0.5 ).dup);
});

def.add;  // the pattern needs the def in the SynthDescLib

x = PmonoArtic(
	\buf1,
    \bufnum,  Pseq([1,2],inf),
	\rate, Pseq([5,8,3],inf)*1,
    \start, 0,
    \time, 0.1,
	\dur, Pkey(\rate).reciprocal,
    \legato, 1,
).asScore(duration: 10, timeOffset: 0.001);

// the score also needs the def and buffer
x.add([0.0, [\d_recv, def.asBytes]]);

x.add([0.0, Buffer.new(server,2048,1,bufnum:1).allocReadMsg(Platform.resourceDir +/+ "sounds/a11wlk01-44_1.aiff")]);

x.add([0.0, Buffer.new(server,2048,1,bufnum:2).loadCollection(Signal.sineFill(2048, 1.0/[1, 2]))]);
x.sort;

~outFile = "~/Desktop/nrttiming3.aiff";  // set your own render location
x.recordNRT(
    outputFilePath: ~outFile.standardizePath,
    sampleRate: 44100,
    headerFormat: "AIFF",
    sampleFormat: "int16",
    options: server.options,
    duration: 10
);

server.remove;
)

I’m afraid I can’t really explain the reason for this, but I’ve had the same problem in a very old project of mine (at a time when SC files were still rich text files…). The solution was to first read in the soundfile in real time and then use that buffer’s bufnum when adding the buffer to the score:

s.sendMsg(\b_allocRead, ~b = s.bufferAllocator.alloc(1), "/path/to/my/soundfile");

Then, when adding the buffer to my score:

~score.add([0.0, [\b_allocRead, ~b, "/path/to/my/soundfile"]]);

… don’t know if that still works (at least it did back then).

Hey thank you for the reply!

Sorry, I got a bit confused. If I understood correctly, the example you posted has to do with reading a file from disk (something like “buffer 1” in my example).
How would you do the equivalent, but using Signal to generate a wavetable instead?
How would you do it in my example?

thank you!

Sorry, I may have misunderstood your question. Reading it again I saw you wrote

Buffer 1 is the existing sample from my computer, and 2 is made from Signal. The gaps you hear in the audio is when Playbuf plays from buffer 2 , so there is no audio.

Does that mean only the buffer that you were trying to fill with a signal isn’t audible in the recording? (I must confess I didn’t test your code)

Doing loadCollection here is analogous to doing Buffer.read – it won’t work for NRT because it’s performing the operation Right Now instead of providing messages for the Score.

Also: loadCollection writes the signal to a temporary disk file, then tells the server to read from the disk file, and then deletes the temporary file (because you don’t want temp files eating up space permanently).

So the solution here, I think, is to take apart the process and do the steps by hand.

f = SoundFile("~/waveform.aiff".standardizePath)
// .sampleRate_(44100)  // default properties are OK for a wavetable
// .headerFormat_("aiff")
// .sampleFormat_("float")
// .numChannels_(1)
;

if(f.openWrite) {
	f.writeData(yourSignal);
	f.close;
};

Then in the Score, use an allocReadMsg where the path is f.path.
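For example, reusing bufnum 2 from your original score (a sketch; f here is the SoundFile written above, and the server/bufnum values are taken from your example):

x.add([0.0, Buffer.new(server, 2048, 1, bufnum: 2).allocReadMsg(f.path)]);

Unlike loadCollection, allocReadMsg only builds the OSC message; the rendering server performs the read itself when it plays the Score.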

Then render.

Only after rendering, File.delete(f.path) the temp file. (Why after rendering? The rendering server needs to read the file. If you delete the file after producing the Score but before rendering, then the file won’t be available.)

hjh

Hi nufets!
That’s ok!
Well yes, only buffer 2 (the one being filled with the signal) is not audible, hence the gaps; the one you can hear (buffer 1) is the sample read from disk.

Thank you so much!
this seems to work fine and I am now able to load a custom made signal!
I just have one more question that occurred to me now. I am trying to resample the audio from its original size down to a desired one (in my example 2048), so I can then use it as you suggested. The thing is that now I can’t write the file.

(

var temp,array;
var samplesize=2.pow(11);

//open a file from disk and resize it
var file = SoundFile.new;
file.openRead(Platform.resourceDir +/+ "sounds/a11wlk01-44_1.aiff");
temp = FloatArray.newClear(file.numFrames);
file.readData(temp);
file.close;
array = temp.asArray.resamp1(samplesize).copy;
//

f =SoundFile("~/waveform.aiff".standardizePath)
// .sampleRate_(44100)  // default properties are OK for a wavetable
// .headerFormat_("aiff")
// .sampleFormat_("float")
// .numChannels_(1)
;

if(f.openWrite) {
	f.writeData(array);
	f.close;
};

)

I guess a similar reason explains why the code below does not work either.

(

var samplesize=2.pow(11);
var array = [1].resamp0(samplesize);
//

f =SoundFile("~/waveform.aiff".standardizePath)
// .sampleRate_(44100)  // default properties are OK for a wavetable
// .headerFormat_("aiff")
// .sampleFormat_("float")
// .numChannels_(1)
;

if(f.openWrite) {
	f.writeData(array);
	f.close;
};

)

Basically, I can’t take an array and later export it to an audio file.
The thing is, when I used loadCollection, I would normally load arrays into a Buffer and play them. Why can’t I write them to an audio file in a similar manner?

Thank you in advance!!

Hi @eskay - I think in this case I can give a valid answer :wink: - as laid out in the documentation, writeData takes an argument rawArray, which is a bit vague but explained in the subsequent description: it must be either a FloatArray or a Signal. So you only have to explicitly cast your array to a FloatArray:

if(f.openWrite) {
	f.writeData(array.as(FloatArray));
	f.close;
};

Hey!

thank you so much! that worked lovely!

One more question popped up as I was working. I am trying to add some busses to the score: a control bus and an audio bus. For some reason the control bus works fine, but the audio bus does not (if you change the output of the \buf1 synth to 0, you can hear the LFO control bus working fine).
Any ideas why?
thank you!!

(
var server = Server(\nrt,
	options: ServerOptions.new
	.numOutputBusChannels_(2)
	.numInputBusChannels_(2)
),
def = SynthDef(\buf1, { |out, bufnum, rate = 1, time = 0.1, start = 0, amp = 0.1,dur,lfobuf|
	var release= Line.kr(0,1,(rate.reciprocal),doneAction: 0);
	var sig = PlayBuf.ar(1, bufnum, In.kr(lfobuf,1)*rate*(2048 / Server.default.sampleRate),Impulse.kr(rate), startPos: start);
	Out.ar(out, (sig*0.5 ).dup);
}),
defLfo = SynthDef(\lfo, { |out=0, freq=100|
	Out.kr(out, LFNoise2.kr(freq).exprange(20.0,100))
}),

defout = SynthDef(\out, { |out=0,inbus, freq=100|
	Out.ar(out, In.ar(inbus,2))
});

def.add;  // the pattern needs the def in the SynthDescLib
defLfo.add;  
defout.add;  


x = Ppar([
	PmonoArtic(
		\buf1,
		\bufnum,  Pseq([1,2],inf),
		\rate, Pseq([5,8,3],inf)*1,
		\start, 0,
		\time, 0.1,
		\lfobuf,10,
		\out,11,
		//\out,0, // put out 0 to show control bus working fine
		\dur, Pkey(\rate).reciprocal,
		\legato, 1,
	),
	PmonoArtic(
		\lfo,
		\freq,20,
		\out,10,
		\dur, Pkey(\freq).reciprocal,
		\legato, 1,
	),
		PmonoArtic(
		\out,
		\freq,1,
		\inbus,11,
		\out,0,
		\dur, Pkey(\freq).reciprocal,
		\legato, 1,
	)
]
).asScore(duration: 10, timeOffset: 0.001);

// the score also needs the def and buffer
x.add([0.0, [\d_recv, def.asBytes]]);
x.add([0.0, [\d_recv, defLfo.asBytes]]);
x.add([0.0, [\d_recv, defout.asBytes]]);



x.add([0.0, Buffer.new(server,2048,1,bufnum:1).allocReadMsg(Platform.resourceDir +/+ "sounds/a11wlk01-44_1.aiff")]);
x.add([0.0, Buffer.new(server,2048,1,bufnum:2).loadCollection(Signal.sineFill(2048, 1.0/[1, 2]))]);
x.add([0.0, Bus.new(\control,10,1,server)]);
x.add([0.0, Bus.new(\audio,11,1,server)]);

x.sort;


~outFile ="~/Desktop/test.aiff".standardizePath;
x.recordNRT(
	outputFilePath: ~outFile.standardizePath,
	sampleRate: 44100,
	headerFormat: "AIFF",
	sampleFormat: "int16",
	options: server.options,
	duration: 10
);

server.remove;
)

There’s no bus-allocation message, so you don’t need those two Bus lines in the score.

s.boot;

s.dumpOSC(1);

b = Buffer.alloc(s, 2048, 1);
-> Buffer(0, 2048, 1, 48000.0, nil)
[ "/b_alloc", 0, 2048, 1, 0 ]  // this needs to be in the score

c = Bus.audio(s, 2);
-> Bus(audio, 4, 2, localhost)
// AND NO MESSAGE HERE hence nothing to add to the Score

The real issue I suspect is order of execution. Probably you need one group for source synths, and another group after that for \out.

I at least get some audio this way:

(
var server = Server(\nrt,
	options: ServerOptions.new
	.numOutputBusChannels_(2)
	.numInputBusChannels_(2)
),
def = SynthDef(\buf1, { |out, bufnum, rate = 1, time = 0.1, start = 0, amp = 0.1,dur,lfobuf|
	var release= Line.kr(0,1,(rate.reciprocal),doneAction: 0);
	var sig = PlayBuf.ar(1, bufnum, In.kr(lfobuf,1)*rate*(2048 / Server.default.sampleRate),Impulse.kr(rate), startPos: start);
	Out.ar(out, (sig*0.5 ).dup);
}),
defLfo = SynthDef(\lfo, { |out=0, freq=100|
	Out.kr(out, LFNoise2.kr(freq).exprange(20.0,100))
}),

defout = SynthDef(\out, { |out=0,inbus, freq=100|
	Out.ar(out, In.ar(inbus,2))
});

// normal node IDs clash with patterns below, it seems?
var srcGroup = Group.basicNew(server, 500);
var outGroup = Group.basicNew(server, 501);

def.add;  // the pattern needs the def in the SynthDescLib
defLfo.add;
defout.add;


x = Ppar([
	PmonoArtic(
		\buf1,
		\bufnum,  Pseq([1,2],inf),
		\rate, Pseq([5,8,3],inf)*1,
		\start, 0,
		\time, 0.1,
		\lfobuf,10,
		\out,11,
		//\out,0, // put out 0 to show control bus working fine
		\dur, Pkey(\rate).reciprocal,
		\legato, 1,
		\group, srcGroup,
	),
	PmonoArtic(
		\lfo,
		\freq,20,
		\out,10,
		\dur, Pkey(\freq).reciprocal,
		\legato, 1,
		\group, srcGroup,
	),
	PmonoArtic(
		\out,
		\freq,1,
		\inbus,11,
		\out,0,
		\dur, Pkey(\freq).reciprocal,
		\legato, 1,
		\group, outGroup,
	)
]
).asScore(duration: 10, timeOffset: 0.001);

x.add([0.0, srcGroup.newMsg(server)]);
x.add([0.0, outGroup.newMsg(srcGroup, \addAfter)]);

// the score also needs the def and buffer
x.add([0.0, [\d_recv, def.asBytes]]);
x.add([0.0, [\d_recv, defLfo.asBytes]]);
x.add([0.0, [\d_recv, defout.asBytes]]);



x.add([0.0, Buffer.new(server,2048,1,bufnum:1).allocReadMsg(Platform.resourceDir +/+ "sounds/a11wlk01-44_1.aiff")]);

// oh and this is old...? I did get some errors about "no buffer data"
x.add([0.0, Buffer.new(server,2048,1,bufnum:2).loadCollection(Signal.sineFill(2048, 1.0/[1, 2]))]);

x.sort;


~outFile ="~/Desktop/test.aiff".standardizePath;
x.recordNRT(
	outputFilePath: ~outFile.standardizePath,
	sampleRate: 44100,
	headerFormat: "AIFF",
	sampleFormat: "int16",
	options: server.options,
	duration: 10
);

server.remove;
)

hjh

Sorry for the late reply, but that worked perfectly!
Thank you so much for your answer!!