I'm using a pre-trained CoreML model inside an ARKit app. I'm capturing images from the ARCamera and placing them into a CVPixelBuffer for processing:
let pixelBuffer: CVPixelBuffer? = sceneView.session.currentFrame?.capturedImage
ARKit captures pixel buffers in a YCbCr format. To render these images correctly on an iPhone's display, you need to access the luma and chroma planes of the pixel buffer and convert the full-range YCbCr values to sRGB using a float4x4 ycbcrToRGBTransform matrix. So I understand how to handle color.
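For reference, the conversion matrix in question looks like the one below. This is a sketch in Swift using simd; the coefficients are the full-range BT.601 values used in Apple's ARKit Metal rendering template, where the matrix lives in a Metal shader as a float4x4:

```swift
import simd

// Full-range YCbCr -> RGB conversion matrix (BT.601), as used in
// Apple's ARKit Metal rendering template. The four arguments are
// columns, matching float4x4's column-major layout in Metal.
let ycbcrToRGBTransform = simd_float4x4(
    SIMD4<Float>(+1.0000, +1.0000, +1.0000, +0.0000),
    SIMD4<Float>(+0.0000, -0.3441, +1.7720, +0.0000),
    SIMD4<Float>(+1.4020, -0.7141, +0.0000, +0.0000),
    SIMD4<Float>(-0.7010, +0.5291, -0.8860, +1.0000)
)

// In the fragment shader the conversion is then:
// rgb = ycbcrToRGBTransform * float4(y, cb, cr, 1.0)
```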
But I'd like to know: can I change the resolution of the captured AR images in the CVPixelBuffer? How do I do it? I need the processing overhead to be as low as possible.
Yes, you can. The CoreMLHelpers repo linked below includes CVPixelBuffer resizing helpers (a resizePixelBuffer function built on vImage), or you can scale the buffer yourself with Core Image.
Reference: https://github.com/hollance/CoreMLHelpers/tree/master/CoreMLHelpers
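Here is a minimal sketch of the Core Image approach. It assumes a 32BGRA destination buffer is acceptable for your model's input; the function name is my own, not part of any framework:

```swift
import CoreImage
import CoreVideo

/// Scales a pixel buffer to the given size using Core Image.
/// Sketch only: renders into a newly created 32BGRA buffer.
func resizedPixelBuffer(from src: CVPixelBuffer,
                        width: Int,
                        height: Int,
                        context: CIContext = CIContext()) -> CVPixelBuffer? {
    // Compute the scale factors from the source dimensions.
    let sx = CGFloat(width) / CGFloat(CVPixelBufferGetWidth(src))
    let sy = CGFloat(height) / CGFloat(CVPixelBufferGetHeight(src))
    let scaled = CIImage(cvPixelBuffer: src)
        .transformed(by: CGAffineTransform(scaleX: sx, y: sy))

    // Create the destination buffer and render the scaled image into it.
    var dst: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, nil, &dst)
    guard let output = dst else { return nil }
    context.render(scaled, to: output)
    return output
}
```

For best performance, reuse a single CIContext (and ideally a CVPixelBufferPool) across frames rather than creating them per call. Also note: if you run the model through Vision (VNCoreMLRequest), Vision scales and crops the input to the model's expected size for you, so you may not need manual resizing at all.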