I am trying to get the sample buffer from captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:, process it, and then append it to an AVAssetWriter. The whole pipeline works, but it becomes really slow and I get a low frame rate on older devices.
I thought of moving the work into a dispatch_async to improve performance, but that leads to an EXC_BAD_ACCESS error as soon as the sample buffer is accessed.
How can I fix this while keeping the processing on a background queue?
queue1 = dispatch_queue_create("testqueue", DISPATCH_QUEUE_SERIAL);
...
-(void) captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection{
dispatch_async(queue1, ^{
...
if(captureOutput == videoOutput){
//I process the buffer by appending an image to an adaptor
if([writerVideoInput isReadyForMoreMediaData] && recordingAssetWriter.status == AVAssetWriterStatusWriting)
[adaptor appendPixelBuffer:pxBuffer withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)]; //<-- here I get EXC_BAD_ACCESS
}
if(captureOutput == audioOutput){
...
// I then append the audio buffer
if([assetWriterAudioInput isReadyForMoreMediaData] && recordingAssetWriter.status == AVAssetWriterStatusWriting)
[assetWriterAudioInput appendSampleBuffer:sampleBuffer];
...
}
});
}
From the captureOutput:didOutputSampleBuffer:fromConnection: discussion in the AVCaptureVideoDataOutput.h header file: clients that need to reference the sample buffer outside the scope of that method must CFRetain it and then CFRelease it when they are finished with it.
So it looks like you need to retain those sample buffers, because they go out of scope before your block runs! Don't forget to release them later, otherwise you'll leak a lot of memory.
I had forgotten that ARC in Objective-C does not manage Core Foundation objects. The header file then goes on to warn against holding on to the sample buffers for too long, lest you drop frames.
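A minimal sketch of the fix, reusing the names from the question (queue1, adaptor, pxBuffer, and the writer inputs are assumed to be declared elsewhere): CFRetain the buffer before it escapes the delegate callback, and CFRelease it once the block has finished with it.

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Keep the buffer alive past the end of this method; AVFoundation may
    // otherwise recycle it before the async block runs.
    CFRetain(sampleBuffer);
    dispatch_async(queue1, ^{
        if (captureOutput == videoOutput) {
            if ([writerVideoInput isReadyForMoreMediaData] &&
                recordingAssetWriter.status == AVAssetWriterStatusWriting) {
                [adaptor appendPixelBuffer:pxBuffer
                      withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            }
        } else if (captureOutput == audioOutput) {
            if ([assetWriterAudioInput isReadyForMoreMediaData] &&
                recordingAssetWriter.status == AVAssetWriterStatusWriting) {
                [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            }
        }
        // Balance the CFRetain above, or the buffer pool will run dry and
        // the capture session will start dropping frames.
        CFRelease(sampleBuffer);
    });
}
```

Keep the work inside the block short: the capture session owns a small pool of sample buffers, so retaining too many of them at once (or blocking queue1 for long stretches) will cause dropped frames, exactly as the header warns.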