I'm looking to aggregate live representations of all windows. Much like Mission Control (Exposé), I want extremely fast access to the image buffer of any given NSWindow or screen. Ideally, I want to composite these live images in my own OpenGL context so I can manipulate them (scaling and moving the window captures around).
Things that are too slow:

- `CGDisplayCreateImage`
- `CGWindowListCreateImage`
- `CGDisplayIDToOpenGLDisplayMask` & `CGLCreateContext` & `CGBitmapContextCreate`
Any other ideas? I'm trying to achieve 60 fps capture/composite/output but the best I can get with any of these methods is ~5 fps (on a retina display capturing the entire screen).
Unfortunately, I haven't found a way to quickly capture the framebuffers of individual windows, but I figured out the next best thing: a method for quickly capturing the live view of the entire screen(s) into OpenGL.
AVFoundation Setup
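A minimal sketch of the setup, assuming capture of the main display via `AVCaptureScreenInput`. The class name `ScreenCapturer`, the ivar names, the BGRA pixel format, and the use of the main dispatch queue are illustrative choices on my part, not prescribed here; the essentials are a screen input feeding an `AVCaptureVideoDataOutput`, plus a `CVOpenGLTextureCache` created against the CGL context you render with.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreGraphics/CoreGraphics.h>
#import <OpenGL/OpenGL.h>

@interface ScreenCapturer : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@end

@implementation ScreenCapturer {
    AVCaptureSession        *_session;
    CVOpenGLTextureCacheRef  _textureCache;  // tied to our CGL context
    CVOpenGLTextureRef       _texture;       // most recent frame's texture
}

- (void)startWithCGLContext:(CGLContextObj)ctx pixelFormat:(CGLPixelFormatObj)pf
{
    _session = [[AVCaptureSession alloc] init];

    // Screen input for the main display; any CGDirectDisplayID works here.
    AVCaptureScreenInput *input =
        [[AVCaptureScreenInput alloc] initWithDisplayID:CGMainDisplayID()];
    input.minFrameDuration = CMTimeMake(1, 60);  // request 60 fps
    if ([_session canAddInput:input]) [_session addInput:input];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey :
                              @(kCVPixelFormatType_32BGRA) };
    // A dedicated serial queue would be more typical than the main queue.
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    if ([_session canAddOutput:output]) [_session addOutput:output];

    // The cache must be created against the same CGL context/pixel format
    // that the resulting textures will be drawn with.
    CVOpenGLTextureCacheCreate(kCFAllocatorDefault, NULL, ctx, pf,
                               NULL, &_textureCache);

    [_session startRunning];
}
// (delegate callback and cleanup methods continue below)
```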
On Each `AVCaptureVideoDataOutput` Frame
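On each frame, the delegate pulls the `CVPixelBuffer` out of the sample buffer and turns it into a GL texture through the cache. Continuing the hypothetical `ScreenCapturer` sketch above:

```objc
// AVCaptureVideoDataOutputSampleBufferDelegate callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef frame = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Wrap the GPU-resident pixel buffer in an OpenGL texture -- no trip
    // through CPU memory.
    CVOpenGLTextureRef texture = NULL;
    CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, _textureCache, frame, NULL, &texture);
    if (err != kCVReturnSuccess || texture == NULL) return;

    CVBufferRelease(_texture);  // NULL-safe; drops the previous frame
    _texture = texture;

    // To draw, bind CVOpenGLTextureGetName(_texture) to
    // CVOpenGLTextureGetTarget(_texture) -- usually GL_TEXTURE_RECTANGLE_ARB --
    // in the CGL context the cache was created with.
}
```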
Cleanup
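After rendering each frame, release the texture and flush the cache so recycled pixel buffers don't accumulate. Again part of the same sketch, with `_texture` and `_textureCache` being the ivars assumed above:

```objc
// Call once per frame, after rendering with _texture.
- (void)frameCleanup
{
    CVBufferRelease(_texture);
    _texture = NULL;
    CVOpenGLTextureCacheFlush(_textureCache, 0);
}
@end
```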
The big difference between this and some other implementations that get the `AVCaptureVideoDataOutput` image into OpenGL as a texture is that they may use `CVPixelBufferLockBaseAddress`, `CVPixelBufferGetBaseAddress`, `glTexImage2D`, and `CVPixelBufferUnlockBaseAddress`. The issue with that approach is that it's terribly redundant and slow: `CVPixelBufferLockBaseAddress` makes sure the memory it's about to hand you is not GPU memory, copying it all into general-purpose CPU memory. This is bad! After all, we'd just be copying it straight back to the GPU with `glTexImage2D`.
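For contrast, that slow path looks roughly like this (an illustrative fragment only; `pixelBuffer` stands in for the frame's `CVImageBufferRef`):

```objc
// The redundant pattern: pull the frame into CPU memory, then immediately
// copy it back to the GPU.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA,
             (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
             (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
             0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
             CVPixelBufferGetBaseAddress(pixelBuffer));  // CPU -> GPU copy
// (real code must also account for CVPixelBufferGetBytesPerRow padding)
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```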
So instead, we can take advantage of the fact that the `CVPixelBuffer` is already in GPU memory with `CVOpenGLTextureCacheCreateTextureFromImage`.

I hope this helps someone else... the `CVOpenGLTextureCache` suite is terribly documented, and its iOS counterpart `CVOpenGLESTextureCache` is only slightly better documented.

60 fps at 20% CPU capturing the 2560x1600 desktop!