AVCaptureStillImageOutput and AVCaptureVideoDataOutput produce different images


I am creating a real-time camera filter app.

I use AVCaptureVideoDataOutputSampleBufferDelegate to capture the video frames and then apply the filter:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:(CVPixelBufferRef)imageBuffer options:nil];
    CGRect sourceExtent = sourceImage.extent;

    CIImage *filteredImage = [self customFilterImage:sourceImage];

    // Crop the source rect so its aspect ratio matches the preview view
    // (aspect fill), keeping the crop centred.
    CGFloat sourceAspect = sourceExtent.size.width / sourceExtent.size.height;
    CGFloat previewAspect = _videoPreviewViewBounds.size.width / _videoPreviewViewBounds.size.height;

    CGRect drawRect = sourceExtent;
    if (sourceAspect > previewAspect)
    {
        // Source is wider than the preview: trim the sides.
        drawRect.origin.x += (drawRect.size.width - drawRect.size.height * previewAspect) / 2.0;
        drawRect.size.width = drawRect.size.height * previewAspect;
    }
    else
    {
        // Source is taller than the preview: trim the top and bottom.
        drawRect.origin.y += (drawRect.size.height - drawRect.size.width / previewAspect) / 2.0;
        drawRect.size.height = drawRect.size.width / previewAspect;
    }

    [_videoPreviewView bindDrawable];

    if (_eaglContext != [EAGLContext currentContext])
        [EAGLContext setCurrentContext:_eaglContext];

    if (filteredImage) {
        [_ciContext drawImage:filteredImage inRect:_videoPreviewViewBounds fromRect:drawRect];
    }

    [_videoPreviewView display];
}
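
For reference, the capture session and preview are set up roughly like this (a sketch from memory, not the exact code; the property names match the ones used above, and setupCaptureSession is just a stand-in name):

#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <GLKit/GLKit.h>

- (void)setupCaptureSession
{
    // GLKView backed by an EAGL context, with a CIContext rendering into it.
    _eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    _videoPreviewView = [[GLKView alloc] initWithFrame:self.view.bounds context:_eaglContext];
    [self.view addSubview:_videoPreviewView];
    [_videoPreviewView bindDrawable];
    _videoPreviewViewBounds = CGRectMake(0, 0, _videoPreviewView.drawableWidth, _videoPreviewView.drawableHeight);
    _ciContext = [CIContext contextWithEAGLContext:_eaglContext
                                           options:@{kCIContextWorkingColorSpace : [NSNull null]}];

    // Capture session with the camera input and a BGRA video data output
    // delivering frames to the delegate above on a serial queue.
    _captureSessionQueue = dispatch_queue_create("capture_session_queue", DISPATCH_QUEUE_SERIAL);
    _captureSession = [[AVCaptureSession alloc] init];
    _captureSession.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [_captureSession addInput:input];

    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    [videoDataOutput setSampleBufferDelegate:self queue:_captureSessionQueue];
    [_captureSession addOutput:videoDataOutput];

    [_captureSession startRunning];
}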

This works fine and I can see the filtered live image.

[Screenshot: live image with the filter applied]
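
The customFilterImage: method used above is not shown here; for illustration, it is just a method of this shape (CISepiaTone below is only a stand-in, my real filter chain is different):

- (CIImage *)customFilterImage:(CIImage *)inputImage
{
    // Stand-in filter for illustration only.
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@0.8 forKey:kCIInputIntensityKey];
    return filter.outputImage;
}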

When the button is pressed, I want to take a snapshot of what I see in the camera preview and save it as an image on the phone. The image should be good quality.

I use exactly the same filter here as well.

I use AVCaptureStillImageOutput to take the snapshot.

- (IBAction)clickPhotoBtn:(id)sender {

    dispatch_async(_captureSessionQueue, ^{
        AVCaptureConnection *connection = [_captureImageOutput connectionWithMediaType:AVMediaTypeVideo];

        [_captureImageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer) {
                CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
                CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:(CVPixelBufferRef)imageBuffer options:nil];

                // Apply the same filter that is used for the live preview.
                CIImage *filteredImage = [self customFilterImage:sourceImage];

                // Render the filtered image and save it to the photo library.
                CIContext *context = [CIContext contextWithOptions:nil];
                CGImageRef imageRef = [context createCGImage:filteredImage fromRect:filteredImage.extent];
                UIImage *lastImage = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];
                CGImageRelease(imageRef); // release the CGImage created above

                UIImageWriteToSavedPhotosAlbum(lastImage, nil, nil, nil);
            }
            else {
                NSLog(@"Could not capture still image: %@", error);
            }
        }];
    });
}
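
For completeness, _captureImageOutput is added to the same capture session and, as far as I recall, configured with a BGRA pixel format so that CMSampleBufferGetImageBuffer returns an uncompressed pixel buffer in the completion handler (again a sketch, not the exact code):

_captureImageOutput = [[AVCaptureStillImageOutput alloc] init];
_captureImageOutput.outputSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
if ([_captureSession canAddOutput:_captureImageOutput]) {
    [_captureSession addOutput:_captureImageOutput];
}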

The problem is that the picture taken this way is not the same as what I see on the phone screen in real time.

[Screenshot: captured snapshot image]

It is much brighter than the live preview.

What have I missed here?
