ADelay: use of tempBufferPos

Good day friends!

I am trying to understand the circular buffer implementation inside the ADelay sample plugin provided in the SDK folder.
Here is the section that does the audio processing:

int32 numChannels = SpeakerArr::getChannelCount (arr);
int32 delayInSamples = std::max<int32> (1, (int32)(mDelay * processSetup.sampleRate));
for (int32 channel = 0; channel < numChannels; channel++)
{
	float* inputChannel = data.inputs[0].channelBuffers32[channel];
	float* outputChannel = data.outputs[0].channelBuffers32[channel];
	int32 tempBufferPos = mBufferPos;
	for (int32 sample = 0; sample < data.numSamples; sample++)
	{
		outputChannel[sample] = mBuffer[channel][tempBufferPos];
		mBuffer[channel][tempBufferPos] = inputChannel[sample];
		tempBufferPos++;
		if (tempBufferPos >= delayInSamples)
			tempBufferPos = 0;
	}
}
mBufferPos += data.numSamples;
while (delayInSamples && mBufferPos >= delayInSamples)
	mBufferPos -= delayInSamples;

Why do we create a tempBufferPos?

int32 tempBufferPos = mBufferPos;

Why don't we directly increment mBufferPos inside the sample loop, instead of incrementing tempBufferPos per sample and then cumulatively incrementing mBufferPos += data.numSamples afterwards? Is it some kind of performance optimization?

For comparison, I implemented the circular buffer without using temp variables, and it seems to produce the desired delay, but somehow a fire-crackle-like sound gets added to the audio. What could be wrong with the implementation below?

for (int32 channel = 0; channel < numChannels; channel++)
{
    float* inputChannel = data.inputs[0].channelBuffers32[channel];
    float* outputChannel = data.outputs[0].channelBuffers32[channel];
    for (int32 sample = 0; sample < data.numSamples; sample++)
    {
        mCircularBuffers[channel][mCircularBufferWriteHead] = inputChannel[sample];
        mCircularBufferWriteHead++;
        if (mCircularBufferWriteHead >= mCircularBufferLength)
            mCircularBufferWriteHead = 0;
        mDelayReadHead = mCircularBufferWriteHead - delayInSamples;
        if (mDelayReadHead < 0)
            mDelayReadHead += mCircularBufferLength;
        outputChannel[sample] = (mCircularBuffers[channel][(int)mDelayReadHead] + inputChannel[sample]) / 2;
    }
}


Your implementation works only for one channel. That’s why there is the temporary variable in the ADelay example, to let it work for any channel count.

Thanks, and sorry if my initial post was not very clear.

mCircularBuffers is a 2D array: the first index is the channel number and the second index is that channel's buffer.


So my implementation is targeted at multiple channels. And while the delay works on all channels, I can hear a crackling sound in the output. Can you please suggest what could be causing it?

You increment mCircularBufferWriteHead for every sample of every channel; that's wrong. Just unroll your second loop (for sample = 0…) and you will see what's wrong.