I'm using the `AVVideoComposition` API to get `CIImage`s from a local video, and after scaling down the `CIImage` I'm getting `nil` when trying to get the `CVPixelBuffer`. Before scaling down the source frame, I'm getting the original frame's `CVPixelBuffer`.

Is there any reason the buffer is `nil` after scaling down?
Sample:

```swift
AVVideoComposition(asset: asset) { [weak self] request in
    let source = request.sourceImage
    let pixelBuffer = source.pixelBuffer // returns the buffer
    let scaledDown = source.transformed(by: .init(scaleX: 0.5, y: 0.5))
    let scaledPixelBuffer // returns nil
}
```
I think the last line in your sample is incomplete. Did you mean `let scaledPixelBuffer = scaledDown.pixelBuffer`? If so, then yes, this won't work. The reason is that the `pixelBuffer` property is only available if the `CIImage` was created directly from a `CVPixelBuffer` (see the docs for `CIImage.pixelBuffer`). The `CIImage` that is passed to the composition block was created from a pixel buffer provided by AVFoundation, which is why `source.pixelBuffer` returns a value. But when you apply a filter or transform to it, you need to render the resulting image into a pixel buffer explicitly using a `CIContext`, otherwise you won't get a result.

If you want to change the size of the video frames the composition is using, you can use an `AVMutableVideoComposition` instead and set its `renderSize` to your desired size after it is initialized:
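A minimal sketch of that approach, assuming `asset` is the `AVAsset` you already have and reusing the 0.5 scale from your sample. Note that the handler must call `request.finish(with:context:)`, which renders the result into a pixel buffer for you:

```swift
import AVFoundation
import CoreImage

let composition = AVMutableVideoComposition(asset: asset) { request in
    // Scale the frame down; finish(with:context:) renders the result
    // into a pixel buffer, so no manual CIContext work is needed here.
    let scaledDown = request.sourceImage
        .transformed(by: CGAffineTransform(scaleX: 0.5, y: 0.5))
    request.finish(with: scaledDown, context: nil)
}

// Match the output size to the scaled frames; otherwise the
// half-size frames would be rendered onto the original-size canvas.
composition.renderSize = CGSize(width: composition.renderSize.width * 0.5,
                                height: composition.renderSize.height * 0.5)
```

You can then assign this composition to your `AVPlayerItem` or export session as usual.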