iOS app video capture slows down in low light


I have an iOS app that uses the phone's front camera and sets up an AVCaptureSession to read the incoming camera data. I set up a simple frame counter to check the speed of the incoming data, and to my surprise the frame rate (measured with the imagecount variable in the code) is very slow when the camera is in low light, but as soon as I move the phone into a brightly lit area the frame rate almost triples. I would like to keep the high image-processing frame rate throughout, and I have set the minFrameDuration variable to 30 fps, but that didn't help. Any ideas why this behaviour happens?

Code to create the capture session is below:

#pragma mark Create and configure a capture session and start it running

- (void)setupCaptureSession
{

NSError *error = nil;

// Create the session
session = [[AVCaptureSession alloc] init];

// Configure the session to produce lower resolution video frames, if your
// processing algorithm can cope. We'll specify low quality for the
// chosen device.
session.sessionPreset = AVCaptureSessionPresetLow;

// Find a suitable AVCaptureDevice
//AVCaptureDevice *device=[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSArray *devices = [AVCaptureDevice devices];
AVCaptureDevice *frontCamera;
AVCaptureDevice *backCamera;

for (AVCaptureDevice *device in devices) {

    if ([device hasMediaType:AVMediaTypeVideo]) {

        if ([device position] == AVCaptureDevicePositionFront) {
            frontCamera = device;
        }
        else {
            backCamera = device;
        }
    }
}


// Create a device input with the front camera and add it to the session.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera
                                                                    error:&error];

if (!input) {
    // Handle the error appropriately.
}
[session addInput:input];

// Create a VideoDataOutput and add it to the session
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

// Configure your output.
dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
dispatch_release(queue);

// Specify the pixel format
output.videoSettings =
[NSDictionary dictionaryWithObject:
 [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                            forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// If you wish to cap the frame rate to a known value, such as 30 fps, set
// minFrameDuration.
output.minFrameDuration = CMTimeMake(1,30);

//Start the session running to start the flow of data

[session startRunning];

}

#pragma mark Delegate routine that is called when a sample buffer was written

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
   fromConnection:(AVCaptureConnection *)connection
{
  // Counter to track the frame rate
  imagecount++;

  // Display the count on screen to gauge how fast frames are being processed
  NSString *recognized = [[NSString alloc] initWithFormat:@"IMG COUNT - %d", imagecount];
  [self performSelectorOnMainThread:@selector(debuggingText:) withObject:recognized waitUntilDone:YES];

}
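
For what it's worth, the delivery rate can also be measured directly from the sample-buffer timestamps instead of a raw counter. A rough sketch of that idea, assuming a CMTime instance variable (here called lastTimestamp) is added to the class and the code is placed inside the same delegate method:

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
CMTime current = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (CMTIME_IS_VALID(lastTimestamp)) {
    // Seconds elapsed between this frame and the previous one
    Float64 delta = CMTimeGetSeconds(CMTimeSubtract(current, lastTimestamp));
    if (delta > 0) {
        NSLog(@"Instantaneous frame rate: %.1f fps", 1.0 / delta);
    }
}
lastTimestamp = current;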

There is 1 answer below.


When there is less light, the camera needs a longer exposure to achieve the same signal-to-noise ratio in each pixel, and a frame can never be delivered faster than its exposure allows: if the sensor needs a 1/10 s exposure, you will get at most about 10 fps. That is why you would expect the frame rate to drop in low light.

You are setting minFrameDuration to 1/30 s in an attempt to prevent long-exposure frames from slowing down the frame rate, but what you actually want is maxFrameDuration. As written, your code only says the frame rate can be no faster than 30 fps; it is still free to drop to 10 fps, or even 1 fps. Setting the maximum frame duration to 1/30 s instead forces every frame to be delivered within that time, which caps the exposure and keeps the rate up, at the cost of darker, noisier frames in low light.

Also, the documentation says to bracket any changes to these parameters with lockForConfiguration: and unlockForConfiguration:, so it may be that your change simply didn't take effect.
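
To illustrate, here is a minimal sketch of how that might look using AVCaptureDevice's activeVideoMinFrameDuration / activeVideoMaxFrameDuration properties (available on iOS 7 and later), bracketed by the lock/unlock calls. It assumes frontCamera is the device you selected in setupCaptureSession and that its active format supports 30 fps:

NSError *configError = nil;
if ([frontCamera lockForConfiguration:&configError]) {
    // A maximum frame duration of 1/30 s means each frame must be delivered
    // within 1/30 s, i.e. the camera may not drop below ~30 fps.
    frontCamera.activeVideoMaxFrameDuration = CMTimeMake(1, 30);
    // Also capping the minimum duration at 1/30 s pins the rate at 30 fps.
    frontCamera.activeVideoMinFrameDuration = CMTimeMake(1, 30);
    [frontCamera unlockForConfiguration];
}
else {
    NSLog(@"Could not lock the device for configuration: %@", configError);
}

On older iOS versions the corresponding videoMinFrameDuration / videoMaxFrameDuration properties on the AVCaptureConnection play the same role.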