I am receiving this ASBD (AudioStreamBasicDescription) from a CMSampleBuffer produced by a ReplayKit broadcast session on iOS:
mFormatID = kAudioFormatLinearPCM
mFormatFlags = 14
mChannelsPerFrame = 2
mBytesPerPacket = 4
mFramesPerPacket = 1
mBytesPerFrame = 4
mBitsPerChannel = 16
mSampleRate = 44100
The AudioBufferList has mNumberBuffers = 1, and its single AudioBuffer has mNumberChannels = 2.
I need to pass the encode-and-transmit session an AudioBufferList with the same ASBD but with mNumberBuffers = 2 and mNumberChannels = 1 for each buffer.
The streaming session's specification gives the following instructions:
"The audio samples are copied by the enqueueAudioBuffer method synchronously to internal buffers. The timestamp must have a valid mHostTime field. The audio buffer must contain floating point deinterleaved samples matching the session's channel count and sample rate."
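If I read that spec correctly, two conversions seem to be needed: deinterleaving the stereo samples into two mono buffers, and converting from integer to floating point (mFormatFlags = 14 appears to indicate packed, signed-integer, big-endian PCM, not float). Here is a sketch of the per-sample conversion I believe is required — it is my own untested attempt, not code from the session's SDK; the resulting arrays would then back the two AudioBuffers of an AudioBufferList with mNumberBuffers = 2:

```swift
import Foundation

// Deinterleave 16-bit interleaved stereo samples (L R L R ...) into two
// Float32 mono channels, scaling from the Int16 range to [-1.0, 1.0).
// NOTE: if the source really is big-endian (mFormatFlags = 14 suggests it),
// each sample may first need a byte swap via Int16(bigEndian:).
func deinterleaveToFloat(_ interleaved: [Int16]) -> (left: [Float], right: [Float]) {
    let frameCount = interleaved.count / 2
    var left = [Float](repeating: 0, count: frameCount)
    var right = [Float](repeating: 0, count: frameCount)
    for frame in 0..<frameCount {
        left[frame]  = Float(interleaved[2 * frame])     / 32768.0
        right[frame] = Float(interleaved[2 * frame + 1]) / 32768.0
    }
    return (left, right)
}
```

Is this the right general approach before filling the two-buffer AudioBufferList, or does Core Audio provide a converter that should be used instead?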
The audio played back in the app is garbled, and I think the reason is that the number of channels does not match the session's channel count.
I am new to Core Audio on iOS.
How can I convert the AudioBufferList into two buffers with one channel each?
Thanks