I am receiving a CMSampleBuffer from the front camera of my iPhone. Currently its size is 1920x1080, and I want to scale it down to 1280x720. I want to use the vImageScale function but I can't get it working correctly. The pixel format from the camera is kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, so I have tried the following, but it outputs a weird green image which isn't correct:
private var scaleBuffer: vImage_Buffer = {
    var scaleBuffer = vImage_Buffer()
    let newHeight = 720
    let newWidth = 1280
    scaleBuffer.data = UnsafeMutableRawPointer.allocate(byteCount: newWidth * newHeight * 4,
                                                        alignment: MemoryLayout<UInt>.size)
    scaleBuffer.width = vImagePixelCount(newWidth)
    scaleBuffer.height = vImagePixelCount(newHeight)
    scaleBuffer.rowBytes = newWidth * 4
    return scaleBuffer
}()
func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        return
    }

    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    // create vImage_Buffer out of CVImageBuffer
    var inBuff = vImage_Buffer()
    inBuff.width = UInt(CVPixelBufferGetWidth(imageBuffer))
    inBuff.height = UInt(CVPixelBufferGetHeight(imageBuffer))
    inBuff.rowBytes = CVPixelBufferGetBytesPerRow(imageBuffer)
    inBuff.data = CVPixelBufferGetBaseAddress(imageBuffer)

    // perform scale
    let err = vImageScale_CbCr8(&inBuff, &scaleBuffer, nil, 0)
    if err != kvImageNoError {
        print("Can't scale a buffer")
        return
    }

    CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

    var newBuffer: CVPixelBuffer?
    let attributes: [NSObject: AnyObject] = [
        kCVPixelBufferCGImageCompatibilityKey: true as AnyObject,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true as AnyObject
    ]

    let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                              Int(scaleBuffer.width), Int(scaleBuffer.height),
                                              kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, scaleBuffer.data,
                                              Int(scaleBuffer.width) * 4,
                                              nil, nil,
                                              attributes as CFDictionary?, &newBuffer)

    guard status == kCVReturnSuccess,
          let b = newBuffer else {
        return
    }

    // Do something with the buffer to output it
}
What's going wrong here? Looking at this answer here, it looks like I need to scale the "Y" and the "UV" planes separately. How can I do that in Swift and then combine them back into one CVPixelBuffer?
The imageBuffer that's returned from CMSampleBufferGetImageBuffer actually contains two discrete planes: a luminance (Yp) plane and a chrominance (CbCr) plane. (Note that for 4:2:0, the chrominance plane is half the width and half the height of the luminance plane.) This is discussed in this sample code project. Your code wraps the entire pixel buffer in a single vImage_Buffer and scales it as if it were one CbCr plane, which is most likely why the output comes out as a garbled green image.
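For reference, here's how each plane can be wrapped in its own vImage_Buffer while the pixel buffer is locked. This is a sketch based on the captureOutput code in your question; the sourceYp and sourceCbCr names are mine:

// Plane 0 is luminance (Yp, 8 bits per pixel); plane 1 is interleaved
// chrominance (CbCr, 16 bits per Cb/Cr pair at half the luma dimensions).
var sourceYp = vImage_Buffer(data: CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0),
                             height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(imageBuffer, 0)),
                             width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(imageBuffer, 0)),
                             rowBytes: CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0))

var sourceCbCr = vImage_Buffer(data: CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1),
                               height: vImagePixelCount(CVPixelBufferGetHeightOfPlane(imageBuffer, 1)),
                               width: vImagePixelCount(CVPixelBufferGetWidthOfPlane(imageBuffer, 1)),
                               rowBytes: CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1))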
This gets you almost there. I don't have experience with Core Video's CVPixelBufferCreateWithBytes, but this code will create the scaled Yp and CbCr buffers and convert them to an interleaved ARGB buffer:
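(The exact listing isn't reproduced above, so what follows is a sketch of that approach. It assumes the sourceYp and sourceCbCr buffers from the previous snippet, and it assumes video-range BT.601 content to match kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange; for 709 content, swap in kvImage_YpCbCrToARGBMatrix_ITU_R_709_2.)

import Accelerate

// Allocate the destination buffers; vImageBuffer_Init chooses a suitable
// rowBytes. Error results are ignored here for brevity.
var scaledYp = vImage_Buffer()
var scaledCbCr = vImage_Buffer()
var argbBuffer = vImage_Buffer()
_ = vImageBuffer_Init(&scaledYp, 720, 1280, 8, vImage_Flags(kvImageNoFlags))    // 8-bit luma
_ = vImageBuffer_Init(&scaledCbCr, 360, 640, 16, vImage_Flags(kvImageNoFlags))  // 16-bit CbCr pairs
_ = vImageBuffer_Init(&argbBuffer, 720, 1280, 32, vImage_Flags(kvImageNoFlags)) // 32-bit ARGB

// Scale each plane with the vImage variant that matches its layout.
_ = vImageScale_Planar8(&sourceYp, &scaledYp, nil, vImage_Flags(kvImageNoFlags))
_ = vImageScale_CbCr8(&sourceCbCr, &scaledCbCr, nil, vImage_Flags(kvImageNoFlags))

// Describe video-range 4:2:0 pixels: Yp in 16...235, CbCr in 16...240,
// biased around 128. These values are my assumption for the
// BiPlanarVideoRange format.
var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 16, CbCr_bias: 128,
                                         YpRangeMax: 235, CbCrRangeMax: 240,
                                         YpMax: 235, YpMin: 16,
                                         CbCrMax: 240, CbCrMin: 16)

// Generate the YpCbCr-to-ARGB conversion once (this can be cached).
var conversionInfo = vImage_YpCbCrToARGB()
_ = vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_601_4!,
                                                  &pixelRange,
                                                  &conversionInfo,
                                                  kvImage420Yp8_CbCr8,
                                                  kvImageARGB8888,
                                                  vImage_Flags(kvImageNoFlags))

// Interleave the two scaled planes into a single ARGB8888 image,
// using 255 as the alpha value.
_ = vImageConvert_420Yp8_CbCr8ToARGB8888(&scaledYp, &scaledCbCr, &argbBuffer,
                                         &conversionInfo, nil, 255,
                                         vImage_Flags(kvImageNoFlags))

In a real capture callback you'd allocate the destination buffers and generate the conversion once (as you already do with scaleBuffer) rather than per frame, check the vImage_Error results instead of discarding them, and call free(scaledYp.data) and friends when you're finished with the buffers.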