generateCGImagesAsynchronously produces duplicate images according to actualTime

I am able to get the frames from a video with AVAssetImageGenerator, using generateCGImagesAsynchronously. Whenever a result succeeds, I print the requestedTime in seconds as well as the actualTime for that image. In the end, however, two of the generated images are identical and share the same actualTime, even though my step value and the times in frameForTimes are evenly spaced.

Here is a snippet of my printed requested and actual times in seconds for each image:

Requested: 0.9666666666666667
Actual: 0.9666666666666667

Requested: 1.0
Actual: 1.0

Requested: 1.0333333333333334
Actual: 1.0333333333333334

Requested: 1.0666666666666667
Actual: 1.0666666666666667

Requested: 1.1
Actual: 1.1

Requested: 1.1333333333333333
Actual: 1.1

Requested: 1.1666666666666667
Actual: 1.135

Everything looks fine until the frame corresponding to 1.1 seconds in the video is generated; from that point on I get two identical images, and the actualTime lags behind the requestedTime for the rest of the process.

I've already tried adjusting the way I compute the frame times, but it seems to be correct. I multiply the track's frames per second by the video duration to get the total number of frames, and I divide the total duration by that sample count so the CGImages are requested at evenly spaced times:

let videoDuration = asset.duration
print("video duration: \(videoDuration.seconds)")

let videoTrack = asset.tracks(withMediaType: AVMediaType.video)[0]
let fps = videoTrack.nominalFrameRate

// Total number of frames to request, based on the track's nominal frame rate.
var frameForTimes = [NSValue]()
let sampleCounts = Int(videoDuration.seconds * Double(fps))

// Duration expressed in the asset's timescale, and the step between
// requested frames (integer division, so any remainder is truncated).
let totalTimeLength = Int(videoDuration.seconds * Double(videoDuration.timescale))
let step = totalTimeLength / sampleCounts

for i in 0 ..< sampleCounts {
    let cmTime = CMTimeMake(value: Int64(i * step), timescale: Int32(videoDuration.timescale))
    frameForTimes.append(NSValue(time: cmTime))
}
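
As a sanity check (not part of my actual code), I printed a few computed times with illustrative values to confirm the spacing; the 600 timescale and 30 fps below are made up, not taken from my video:

import CoreMedia

// Illustrative only: with a 600 timescale and 30 fps, the step is 600 / 30 = 20,
// so each requested time should land exactly 1/30 s after the previous one.
let illustrativeTimescale: Int32 = 600
let illustrativeStep = 20
for i in 0 ..< 5 {
    let cmTime = CMTimeMake(value: Int64(i * illustrativeStep), timescale: illustrativeTimescale)
    print(cmTime.seconds) // 0.0, 0.0333..., 0.0666..., 0.1, 0.1333...
}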

I've also tried adjusting the way in which I create the images (see this):

imageGenerator.generateCGImagesAsynchronously(forTimes: frameForTimes) { (requestedTime, cgImage, actualTime, result, error) in
    // Called once per requested time, not necessarily on the main queue.
    if let cgImage = cgImage {
        print("Requested: \(requestedTime.seconds), Actual: \(actualTime.seconds)")
        let image = UIImage(cgImage: cgImage)
        // scale image if you want
        frames.append(image)
    }
}

I also set both tolerances to zero before calling generateCGImagesAsynchronously:

imageGenerator.requestedTimeToleranceBefore = CMTime.zero
imageGenerator.requestedTimeToleranceAfter = CMTime.zero
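
With both tolerances at zero I would expect requestedTime and actualTime to always match, so a helper like the following (hypothetical, not part of my code above) should never print anything when called from the completion handler:

import CoreMedia

// Hypothetical check: with both tolerances set to .zero,
// requestedTime and actualTime should be identical for every frame.
func checkTimes(requested: CMTime, actual: CMTime) {
    if CMTimeCompare(requested, actual) != 0 {
        print("Mismatch: requested \(requested.seconds) s, got \(actual.seconds) s")
    }
}

Based on the times above, it starts printing at the 1.1333 s request and keeps printing for every frame after that.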

I expected the actual times to be consistent with the requested times, and each produced image to be different. Looking through the images, there is always a duplicate regardless of the video being tested, and it normally occurs towards the middle or end.
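
To confirm it isn't just a visual impression, I also flagged repeated actualTime values with a small helper like this (a sketch; recordActualTime is hypothetical, and I call it with each actualTime from inside the result handler):

import CoreMedia

// Hypothetical duplicate check: remember the previous actualTime and flag repeats.
var lastActualTime: CMTime?
func recordActualTime(_ actualTime: CMTime) {
    if let last = lastActualTime, CMTimeCompare(last, actualTime) == 0 {
        print("Duplicate frame at \(actualTime.seconds) s")
    }
    lastActualTime = actualTime
}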

Edit:

I found this, this, and this, which mention the same problem, but I've had no success with any of them.
