Not able to Extract each and every frame from Video


I have a 7-second MOV video at 30 fps, but when I loop the code over multiple frame times it doesn't work: I only ever get 7-8 unique frames, even though all the image files are written. The requested times are all different, but the actual times I get back are always the same few values.

MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video" ofType:@"MOV"]]];
moviePlayer.shouldAutoplay = NO;


NSMutableArray *timesm = [[NSMutableArray alloc] init];
for (int i = 0; i < frames; i++) {
    CMTime frameTime = CMTimeMakeWithSeconds(videoDurationSeconds * i / (float)frames, 600);
    [timesm addObject:[NSValue valueWithCMTime:frameTime]];

    // Convert CMTime to seconds with floating-point division; the original
    // integer division (value/600) truncated every request to a whole second.
    double seconds = (double)frameTime.value / (double)frameTime.timescale;
    UIImage *thumbnail = [moviePlayer thumbnailImageAtTime:seconds timeOption:MPMovieTimeOptionExact];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"zshipra %i.jpg", i]];
    NSData *imageData = UIImageJPEGRepresentation(thumbnail, 0.7);
    [imageData writeToFile:savedImagePath atomically:NO];
}

I also tried AVAssetImageGenerator, following the example on the Apple developer site, but both approaches give only 7 frames from a 7-second video.

AVAssetImageGenerator *generate1 = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generate1.appliesPreferredTrackTransform = YES;
NSError *err = NULL;
CMTime videoDuration = asset1.duration;
float videoDurationSeconds = CMTimeGetSeconds(videoDuration);
// Parenthesize the multiplication; (int)videoDurationSeconds*30 casts first
// and drops the fractional seconds before multiplying.
int frames = (int)(videoDurationSeconds * 30);

NSMutableArray *timesm = [[NSMutableArray alloc] init];
for (int i = 0; i < frames; i++) {
    CMTime frameTime = CMTimeMakeWithSeconds(videoDurationSeconds * i / (float)frames, 600);
    [timesm addObject:[NSValue valueWithCMTime:frameTime]];
}


[generate1 generateCGImagesAsynchronouslyForTimes:timesm
                                completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {

    UIImage *myImage = [[UIImage alloc] initWithCGImage:image];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    // Note: the whole-second timestamp below is identical for every frame
    // generated within the same second, so later files can overwrite earlier ones.
    NSString *savedImagePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"ankur-%@ %i.jpg", image, (int)[[NSDate date] timeIntervalSince1970]]];
    NSData *imageData = UIImageJPEGRepresentation(myImage, 0.7);
    [imageData writeToFile:savedImagePath atomically:NO];

    NSString *requestedTimeString = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, requestedTime));
    NSString *actualTimeString = (NSString *)CFBridgingRelease(CMTimeCopyDescription(NULL, actualTime));
    NSLog(@"Requested: %@; actual %@", requestedTimeString, actualTimeString);

    if (result == AVAssetImageGeneratorSucceeded) {
        // Do something interesting with the image.
    }

    if (result == AVAssetImageGeneratorFailed) {
        NSLog(@"Failed with error: %@", [error localizedDescription]);
    }

    if (result == AVAssetImageGeneratorCancelled) {
        NSLog(@"Canceled");
    }
}];
Best Answer

I tested your code, and with two changes all of the frame images are generated successfully.

1) Specify the tolerance options. They default to kCMTimePositiveInfinity, which lets the generator snap every request to the nearest keyframe; constrain them to one frame's duration:

generate1.requestedTimeToleranceAfter  = CMTimeMakeWithSeconds(1/30.0, videoDuration.timescale);
generate1.requestedTimeToleranceBefore = CMTimeMakeWithSeconds(1/30.0, videoDuration.timescale);

2) Build the request times in the asset's native timescale rather than in seconds:

NSMutableArray *timesm = [[NSMutableArray alloc] init];
for (int i = 0; i < frames; i++) {
    // One frame every 1/30 s, expressed in the track's own timescale.
    CMTime frameTime = CMTimeMake(i * (videoDuration.timescale / 30.0f), videoDuration.timescale);
    [timesm addObject:[NSValue valueWithCMTime:frameTime]];
}
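Putting both changes together, here is a minimal sketch of the whole corrected setup. It assumes `asset1` is an `AVAsset` for the 30 fps MOV; the names `frameTimes`, `frameCount`, and the `frame-%04d.jpg` naming scheme are mine, not from the original code:

```
// Sketch only: assumes asset1 is an AVAsset for a 30 fps video.
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset1];
generator.appliesPreferredTrackTransform = YES;

CMTime duration = asset1.duration;
// Allow at most one frame's worth of drift in either direction,
// instead of the default kCMTimePositiveInfinity.
generator.requestedTimeToleranceBefore = CMTimeMakeWithSeconds(1/30.0, duration.timescale);
generator.requestedTimeToleranceAfter  = CMTimeMakeWithSeconds(1/30.0, duration.timescale);

int frameCount = (int)(CMTimeGetSeconds(duration) * 30.0);
NSMutableArray *frameTimes = [NSMutableArray arrayWithCapacity:frameCount];
for (int i = 0; i < frameCount; i++) {
    // i frames * (timescale / 30) ticks per frame, in the asset's native timescale.
    CMTime t = CMTimeMake((int64_t)(i * (duration.timescale / 30.0)), duration.timescale);
    [frameTimes addObject:[NSValue valueWithCMTime:t]];
}

[generator generateCGImagesAsynchronouslyForTimes:frameTimes
                                completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                    CMTime actualTime, AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result != AVAssetImageGeneratorSucceeded) {
        NSLog(@"Frame generation failed: %@", error.localizedDescription);
        return;
    }
    // Name each file by frame index derived from actualTime, so frames
    // generated within the same second get distinct filenames.
    int frameIndex = (int)llround(CMTimeGetSeconds(actualTime) * 30.0);
    NSString *docs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSString *path = [docs stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"frame-%04d.jpg", frameIndex]];
    UIImage *frame = [[UIImage alloc] initWithCGImage:image];
    [UIImageJPEGRepresentation(frame, 0.7) writeToFile:path atomically:YES];
}];
```

Keying the filenames to a frame index also sidesteps the whole-second timestamp collisions in the original completion handler, which overwrote every file written within the same second.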