I'm trying to create a CVPixelBufferRef from an IOSurfaceRef so that I can read its pixel values.
The IOSurfaceRef comes from the Syphon framework's (website and repo) SyphonOpenGLClient. My code is inspired by the Simple Client example that can be found here, and is based on Apple's documentation for CVPixelBufferCreateWithIOSurface.
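For reference, the declaration I'm working from looks roughly like this (paraphrased from the CVPixelBuffer.h header, so nullability annotations and macros are omitted):

    CVReturn CVPixelBufferCreateWithIOSurface(CFAllocatorRef allocator,
                                              IOSurfaceRef surface,
                                              CFDictionaryRef pixelBufferAttributes,
                                              CVPixelBufferRef *pixelBufferOut);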
Here it is:
m_client = [[SyphonOpenGLClient alloc] initWithServerDescription:[serverDescription copy]
                                                          context:CGLGetCurrentContext()
                                                          options:nil
                                                  newFrameHandler:^(SyphonOpenGLClient *client) {
    // Get the new frame produced by the client.
    SyphonImageBase *frame = (SyphonImageBase *)[client newFrameImage];

    // This works: the IOSurfaceRef exists and reports the expected dimensions.
    m_width = IOSurfaceGetWidth((IOSurfaceRef)[frame surface]);
    m_height = IOSurfaceGetHeight((IOSurfaceRef)[frame surface]);

    // This doesn't: the call leaves buffer NULL.
    CVPixelBufferRef buffer = NULL;
    CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, (IOSurfaceRef)[frame surface], NULL, &buffer);
    NSLog(@"Buffer: %p", buffer); // Logs 0x0
}];
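For context, once buffer comes back non-NULL, this is roughly how I expect to read pixels out of it (a sketch assembled from the CVPixelBuffer documentation; the packed 4-byte-per-pixel BGRA layout is my assumption about what Syphon delivers, not something I've verified):

    CVPixelBufferLockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);
    uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(buffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(buffer);
    // Pixel at (x, y), assuming 4 bytes per pixel.
    size_t x = 0, y = 0;
    uint8_t *pixel = base + y * bytesPerRow + x * 4;
    NSLog(@"Pixel at (0,0): %u %u %u %u", pixel[0], pixel[1], pixel[2], pixel[3]);
    CVPixelBufferUnlockBaseAddress(buffer, kCVPixelBufferLock_ReadOnly);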
I'm really not a CoreVideo specialist; I'm mostly tinkering.
I would simply like to understand how to read pixel values out of an IOSurfaceRef.
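In fact, I'm not even sure CoreVideo is required: would locking the surface directly, along these lines, be a valid approach? (Again just a sketch, using frame from the handler above; I don't know whether this is safe for a surface the GPU may still be writing to.)

    IOSurfaceRef surface = (IOSurfaceRef)[frame surface];
    IOSurfaceLock(surface, kIOSurfaceLockReadOnly, NULL);
    uint8_t *base = (uint8_t *)IOSurfaceGetBaseAddress(surface);
    size_t bytesPerRow = IOSurfaceGetBytesPerRow(surface);
    // ... read pixels here, again assuming a 4-byte-per-pixel format ...
    IOSurfaceUnlock(surface, kIOSurfaceLockReadOnly, NULL);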
Thank you.