Captured video is so laggy when adding overlays


I am building an app with AVAssetWriter where I add an overlay to a video. It works great when I don't add overlays, but when I do, the captured video becomes laggy and the frame comes out looking cropped in half (as you can see in the screenshot).

Here is my addOverlayToImage function:

func addOverlayToImage(from filteredImage: UIImage) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(self.imageView.frame.size, false, 1.0)
    self.imageView.layer.render(in: UIGraphicsGetCurrentContext()!)
    let imageWithText = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return imageWithText!
}

I call the function inside captureOutput:

func captureOutput(_ captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
   self.bufferVideoQueue.async() {
     let imageWithOverlay = self.addOverlayToImage(from: self.filteredImage)
     let buffer = self.imageToBuffer(from: imageWithOverlay)
     self.assetWriterPixelBufferInput?.append(buffer!, withPresentationTime: self.currentTime)
   }
}

And the imageToBuffer function:

func imageToBuffer(from image: UIImage) -> CVPixelBuffer? {
    let attrs = [
        String(kCVPixelBufferCGImageCompatibilityKey) : kCFBooleanTrue,
        String(kCVPixelBufferCGBitmapContextCompatibilityKey) : kCFBooleanTrue
    ] as [String : Any]
    var buffer : CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(image.size.width), Int(image.size.height), kCVPixelFormatType_32ARGB, attrs as CFDictionary, &buffer)
    guard (status == kCVReturnSuccess) else {
        return nil
    }

    CVPixelBufferLockBaseAddress(buffer!, CVPixelBufferLockFlags(rawValue: 0))
    let pixelData = CVPixelBufferGetBaseAddress(buffer!)

    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
    let context = CGContext(data: pixelData, width: Int(image.size.width), height: Int(image.size.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(buffer!), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)

    context?.translateBy(x: 0, y: image.size.height)
    context?.scaleBy(x: 1.0, y: -1.0)

    UIGraphicsPushContext(context!)
    image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    UIGraphicsPopContext()
    CVPixelBufferUnlockBaseAddress(buffer!, CVPixelBufferLockFlags(rawValue: 0))

    return buffer
}
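As an aside (not part of the original post): `imageToBuffer` calls `CVPixelBufferCreate` on every frame, which is an expensive allocation to do 30 times per second. A `CVPixelBufferPool` lets you reuse buffers instead. This is a sketch under assumed dimensions (1080×1920); adapt the size and format keys to your actual output.

```swift
import CoreVideo

// Sketch: allocate a pool once, then dequeue reusable buffers per frame
// instead of creating a fresh CVPixelBuffer each time.
// The 1080x1920 size is an assumption; match it to your video dimensions.
var pool: CVPixelBufferPool?
let pixelBufferAttrs: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32ARGB,
    kCVPixelBufferWidthKey as String: 1080,
    kCVPixelBufferHeightKey as String: 1920,
    kCVPixelBufferCGImageCompatibilityKey as String: true,
    kCVPixelBufferCGBitmapContextCompatibilityKey as String: true
]
CVPixelBufferPoolCreate(kCFAllocatorDefault, nil,
                        pixelBufferAttrs as CFDictionary, &pool)

func dequeueBuffer() -> CVPixelBuffer? {
    guard let pool = pool else { return nil }
    var buffer: CVPixelBuffer?
    // Returns a recycled buffer when one is free, allocates otherwise.
    let status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &buffer)
    return status == kCVReturnSuccess ? buffer : nil
}
```

With this in place, `imageToBuffer` would call `dequeueBuffer()` and skip the `CVPixelBufferCreate` path entirely.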

And a screenshot from the video:

[screenshot: video frame cropped in half]


2 Answers

Answer 1 (score: 5)

Have you tried configuring the AVAssetWriterInput to specify that the data arrives in real time (set expectsMediaDataInRealTime to true)? This makes a big difference when writing real-time (live camera) data; if it isn't set, the writer may not keep up and the output can be laggy.

https://developer.apple.com/documentation/avfoundation/avassetwriterinput

https://developer.apple.com/documentation/avfoundation/avassetwriterinput/1387827-expectsmediadatainrealtime
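A minimal sketch of that setup. The `videoSettings` values here (codec, 1080×1920) are assumptions; use whatever output settings your writer already has.

```swift
import AVFoundation

// Sketch: configure the writer input for live-camera data.
// Dimensions and codec are illustrative assumptions.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 1080,
    AVVideoHeightKey: 1920
]
let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
// Tell the writer to favor real-time throughput over batching.
writerInput.expectsMediaDataInRealTime = true
```

It also helps to check `writerInput.isReadyForMoreMediaData` before each `append`; appending while the input isn't ready is a common source of dropped or backed-up frames.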

Answer 2 (score: 5)

I don't have a definitive answer because I haven't seen that error, but I suspect it's happening because the processing you're doing on each frame takes longer than the frame duration (probably 1/30th of a second).

My suggestion would be to reduce the time as much as possible. From what I can see you're creating a UIImage from a UIView and then converting that into a CVPixelBuffer.

All of this is happening every frame, however, your content doesn't seem like it needs to change every frame.

I would suggest you store the buffer and, in captureOutput(...), add some logic to check whether the content has changed. If it hasn't, reuse the stored buffer; if it has, recalculate it. Judging from your screenshot, that should only happen about once a minute, so it shouldn't affect the video.
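The caching idea above can be sketched like this. The `overlayText` comparison is a hypothetical change-detection check (it is not in the original code); substitute whatever signals that your overlay content has actually changed.

```swift
import CoreVideo
import UIKit

// Sketch: cache the rendered overlay buffer and only redo the expensive
// UIImage -> CVPixelBuffer work when the overlay content changes.
// `overlayText`, `addOverlayToImage`, and `imageToBuffer` are assumed to
// exist on the same class as in the question; names are illustrative.
var cachedBuffer: CVPixelBuffer?
var cachedOverlayText: String?

func overlayBuffer(for text: String) -> CVPixelBuffer? {
    // Content unchanged: skip rendering entirely and reuse the last buffer.
    if text == cachedOverlayText, let buffer = cachedBuffer {
        return buffer
    }
    // Content changed: take the expensive path once, then cache the result.
    let image = addOverlayToImage(from: filteredImage)
    let buffer = imageToBuffer(from: image)
    cachedBuffer = buffer
    cachedOverlayText = text
    return buffer
}
```

Then captureOutput(...) calls `overlayBuffer(for:)` instead of rebuilding the buffer on every frame.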

Finally, you're executing the code asynchronously; this might be causing issues, so I would recommend removing that part and executing the code directly in the delegate method. (NOTE: disregard this if the docs instruct you to do it asynchronously.)