I am building a UIImage from a CMSampleBuffer. A function accesses the pixel data in the CMSampleBuffer, converts the YCbCr planes into an ABGR bitmap, and wraps the result in a UIImage. I call it from the main thread with:
let priority = DISPATCH_QUEUE_PRIORITY_DEFAULT
dispatch_async(dispatch_get_global_queue(priority, 0), { () -> Void in
    let image = self.imageFromSampleBuffer(frame)
    dispatch_async(dispatch_get_main_queue(), { () -> Void in
        self.testView.image = image
        self.testView.hidden = false
    })
})
This keeps the UI and main thread responsive, as I had hoped. The function that processes the buffer is:
func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
    let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let chromaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
    let lumaBuffer = UnsafeMutablePointer<UInt8>(lumaBaseAddress)
    let chromaBuffer = UnsafeMutablePointer<UInt8>(chromaBaseAddress)
    var rgbaImage = [UInt8](count: 4 * width * height, repeatedValue: 0)
    for var x = 0; x < width; x++ {
        for var y = 0; y < height; y++ {
            let lumaIndex = x + y * lumaBytesPerRow
            let chromaIndex = (y / 2) * chromaBytesPerRow + (x / 2) * 2
            let yp = lumaBuffer[lumaIndex]
            let cb = chromaBuffer[chromaIndex]
            let cr = chromaBuffer[chromaIndex + 1]
            let ri = Double(yp) + 1.402 * (Double(cr) - 128)
            let gi = Double(yp) - 0.34414 * (Double(cb) - 128) - 0.71414 * (Double(cr) - 128)
            let bi = Double(yp) + 1.772 * (Double(cb) - 128)
            let r = UInt8(min(max(ri, 0), 255))
            let g = UInt8(min(max(gi, 0), 255))
            let b = UInt8(min(max(bi, 0), 255))
            rgbaImage[(x + y * width) * 4] = b
            rgbaImage[(x + y * width) * 4 + 1] = g
            rgbaImage[(x + y * width) * 4 + 2] = r
            rgbaImage[(x + y * width) * 4 + 3] = 255
        }
    }
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let dataProvider: CGDataProviderRef = CGDataProviderCreateWithData(nil, rgbaImage, 4 * width * height, nil)!
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.NoneSkipFirst.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue)
    let cgImage = CGImageCreate(width, height, 8, 32, width * 4, colorSpace!, bitmapInfo, dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)!
    let image = UIImage(CGImage: cgImage)
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
    return image
}
If I put a breakpoint just before the function returns, I can use "Quick Look" and see the image (and it is what I would expect). However, once the function returns, I cannot use image
anywhere else and Quick Look always fails. If I attempt to set a UIImageView to the returned image, nothing in the UI changes:
testView.image = image // The UIImageView does not update.
If I try to access the image in any other way (e.g., to attempt to save it to Parse), the code crashes with EXC_BAD_ACCESS. Again, if I save the image to Parse within the above function, it appears in the backend database as expected.
I have also tried calling the processing function directly, without dispatching to the global and main queues. The result is always the same.
I believe this is because the image data is not being retained. I have tried declaring both the image and the CGImage at class and file scope, thinking that would keep a reference alive, but neither changes the outcome. I am new enough to Swift that I clearly do not understand how ARC works in this case.
A few times while debugging within the function, the first Quick Look showed "unavailable", but waiting a few seconds and clicking again made the image appear. Is it possible the data simply takes longer to become available, perhaps a GPU-to-CPU transfer? If so, how do I check for or wait out that delay to avoid the crash?
How do I maintain a reference? Is there a better way to handle the image created from the CMSampleBuffer?
The problem is the way in which the CGImage is being created; using dataProvider and CGImageCreate is the specific issue. CGDataProviderCreateWithData does not copy or retain the bytes it is handed, so the provider keeps pointing at the rgbaImage array, which is deallocated as soon as imageFromSampleBuffer returns. That is why everything works inside the function (Quick Look, saving to Parse) but any later access crashes with EXC_BAD_ACCESS: holding a reference to the UIImage or CGImage at class or file scope does not help, because the backing pixel data itself is gone. A working solution is to render into a bitmap context instead, writing the pixels through CGBitmapContextGetData and creating the image with CGBitmapContextCreateImage, which copies the bitmap so the image owns its own data.
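Here is a sketch of that approach, keeping the question's conversion loop and Swift 2 era APIs and changing only how the bitmap storage is created and turned into an image. Treat it as a sketch rather than a drop-in implementation; it has not been run against a live capture session.

```swift
import UIKit
import CoreMedia
import CoreVideo

func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(pixelBuffer, 0)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, 0) }

    let lumaBuffer = UnsafeMutablePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0))
    let chromaBuffer = UnsafeMutablePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1))
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let chromaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)

    // Let the context allocate and own the pixel storage (bytesPerRow = 0
    // means "choose for me") instead of wrapping a short-lived Swift array.
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGImageAlphaInfo.NoneSkipFirst.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue
    let context = CGBitmapContextCreate(nil, width, height, 8, 0, colorSpace, bitmapInfo)!
    let destBytesPerRow = CGBitmapContextGetBytesPerRow(context)
    let dest = UnsafeMutablePointer<UInt8>(CGBitmapContextGetData(context))

    // Same YCbCr -> RGB conversion as the question, written straight into
    // the context's buffer (note the context's own row stride is used).
    for y in 0..<height {
        for x in 0..<width {
            let yp = Double(lumaBuffer[x + y * lumaBytesPerRow])
            let chromaIndex = (y / 2) * chromaBytesPerRow + (x / 2) * 2
            let cb = Double(chromaBuffer[chromaIndex]) - 128
            let cr = Double(chromaBuffer[chromaIndex + 1]) - 128
            let offset = y * destBytesPerRow + x * 4
            dest[offset]     = UInt8(min(max(yp + 1.772 * cb, 0), 255))                  // B
            dest[offset + 1] = UInt8(min(max(yp - 0.34414 * cb - 0.71414 * cr, 0), 255)) // G
            dest[offset + 2] = UInt8(min(max(yp + 1.402 * cr, 0), 255))                  // R
            dest[offset + 3] = 255
        }
    }

    // CGBitmapContextCreateImage copies the context's bitmap, so the
    // returned image remains valid after this function returns.
    let cgImage = CGBitmapContextCreateImage(context)!
    return UIImage(CGImage: cgImage)
}
```

Because the copied bitmap is owned by the CGImage, the returned UIImage can be assigned to the UIImageView, saved to Parse, or inspected with Quick Look at any point afterwards.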