Create frames from video in Swift (iOS)


I have a video file with a duration of 3 seconds. I need to create 30 frames as UIImages, capturing one image every 0.1 second. I tried to use AVAssetImageGenerator and CMTimeMake, but I always get 30 identical images, or 15 of one image and 15 of another.

Please help me understand how to make this kind of slideshow from the video, or suggest a better way to do it.

Please see the code below:

static func generate_Thumbnails(forVideoWithURL url : URL) -> [UIImage]? {
    let asset = AVAsset(url: url)
    var result: [UIImage] = []
    let assetImgGenerator = AVAssetImageGenerator(asset: asset)
    assetImgGenerator.appliesPreferredTrackTransform = true
    for i in 1...30 {
        let time: CMTime = CMTimeMake(value: Int64(i), timescale: 10)
        do {
            let img: CGImage = try assetImgGenerator.copyCGImage(at: time, actualTime: nil)
            let frameImg: UIImage = UIImage(cgImage: img)
            result.append(frameImg)
        } catch {
            //return nil
        }
    }
    return result
}
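For reference, AVAssetImageGenerator's default time tolerances allow it to return a frame near the requested time (typically the nearest keyframe), which is a common cause of long runs of identical thumbnails like the ones described above. Below is a minimal sketch, assuming the same 3-second clip, that sets the tolerances to zero so each 0.1-second request resolves to its exact frame; the function name is only for illustration:

import AVFoundation
import UIKit

func generateExactThumbnails(forVideoWithURL url: URL) -> [UIImage] {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Ask for exact frame times instead of letting the generator snap to a nearby keyframe.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    var result: [UIImage] = []
    for i in 1...30 {
        let time = CMTime(value: CMTimeValue(i), timescale: 10)   // 0.1 s, 0.2 s, ..., 3.0 s
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            result.append(UIImage(cgImage: cgImage))
        }
    }
    return result
}

Requesting exact times is slower, because the decoder may have to decode forward from the previous keyframe for every request, but for a 3-second clip that cost should be negligible.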

There are 2 answers below.

BEST ANSWER

I tried the solution from Amin Benarieb, and it seems to work:

static func toImages(fromVideoUrl url: URL) -> [UIImage]? {
    let asset = AVAsset(url: url)
    guard let reader = try? AVAssetReader(asset: asset),
          let videoTrack = asset.tracks(withMediaType: .video).first else { return nil }
    // Ask the reader to decode frames as BGRA pixel buffers.
    let outputSettings = [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)]
    let trackReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)
    reader.add(trackReaderOutput)
    reader.startReading()
    var images = [UIImage]()
    // Pull sample buffers until the reader finishes (or fails).
    while reader.status == .reading {
        autoreleasepool {
            if let sampleBuffer = trackReaderOutput.copyNextSampleBuffer(),
               let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                let ciImage = CIImage(cvImageBuffer: imageBuffer)
                images.append(UIImage(ciImage: ciImage))
            }
        }
    }
    return images
}
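One caveat worth noting with this approach: UIImage(ciImage:) produces an image that is not backed by a CGImage, and some UIKit code paths may not handle such images well (encoding to JPEG/PNG data, for example). If that becomes a problem, one option is to render each frame through a CIContext first; a minimal sketch, with a hypothetical helper name:

import CoreImage
import UIKit

let sharedContext = CIContext()  // reuse one context; creating a CIContext is expensive

func cgBackedImage(from image: UIImage) -> UIImage? {
    // If the image already wraps a CGImage, there is nothing to do.
    guard let ciImage = image.ciImage else { return image }
    guard let cgImage = sharedContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}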
ANOTHER ANSWER

I haven't read the docs for AVAssetImageGenerator, but in practice I've only ever been able to generate one image per second. So you should be able to get an image at 1, 2, and 3 seconds, but not 30 distinct frames. Here is the code I use to generate images, which is very similar to yours.

private func getPreviewImage(forURL url: URL, atSeconds seconds: Double) -> UIImage? {
    let asset = AVURLAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true

    let timestamp = CMTime(seconds: seconds, preferredTimescale: 100)

    do {
        let imageRef = try generator.copyCGImage(at: timestamp, actualTime: nil)
        return UIImage(cgImage: imageRef)
    } catch let error as NSError {
        print("Image generation failed with error \(error)")
        return nil
    }
}
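A small usage sketch for the 3-second clip from the question, under the one-image-per-second assumption above (it assumes getPreviewImage(forURL:atSeconds:) lives in the same type; the helper name is hypothetical):

func makeSlideshowImages(from url: URL) -> [UIImage] {
    var frames: [UIImage] = []
    // Grab one frame at each whole second of the 3-second clip.
    for second in 1...3 {
        if let image = getPreviewImage(forURL: url, atSeconds: Double(second)) {
            frames.append(image)
        }
    }
    return frames
}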