AVAssetWriterInputPixelBufferAdaptor and CMTime


I'm writing some frames to video with AVAssetWriterInputPixelBufferAdaptor, and the behavior w.r.t. time isn't what I'd expect.

If I write just one frame:

 [videoWriter startSessionAtSourceTime:kCMTimeZero];
 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:kCMTimeZero];

this gets me a video of length zero, which is what I expect.

But if I go on to add a second frame:

 // 3000/600 = 5 sec, right?
 CMTime nextFrame = CMTimeMake(3000, 600); 
 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:nextFrame];

I get ten seconds of video, where I'm expecting five.

What's going on here? Does withPresentationTime somehow set both the start of the frame and the duration?

Note that I'm not calling endSessionAtSourceTime, just finishWriting.


3 Answers

Answer 1

According to the documentation for the -[AVAssetWriterInput appendSampleBuffer:] method:

For track types other than audio tracks, to determine the duration of all samples in the output file other than the very last sample that's appended, the difference between the sample buffer's output DTS and the following sample buffer's output DTS will be used. The duration of the last sample is determined as follows:

  1. If a marker sample buffer with kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration is appended following the last media-bearing sample, the difference between the output DTS of the marker sample buffer and the output DTS of the last media-bearing sample will be used.
  2. If the marker sample buffer is not provided and if the output duration of the last media-bearing sample is valid, it will be used.
  3. If the output duration of the last media-bearing sample is not valid, the duration of the second-to-last sample will be used.

So you are in situation #3:

  • The first sample's duration is 5 s, based on the PTS difference between the first and second samples.
  • The second sample's duration is also 5 s, because the duration of the second-to-last sample was reused.

One way to bound the last frame's duration is sketched below.
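
This is a minimal sketch, assuming the same videoWriter/adaptor/pxBuffer setup as in the question (writerInput here stands for the adaptor's asset writer input): calling endSessionAtSourceTime: before finishing pins the end of the timeline, so the writer clamps the last sample to the session's end time instead of reusing the previous sample's duration.

 [videoWriter startSessionAtSourceTime:kCMTimeZero];
 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:kCMTimeZero];
 [adaptor appendPixelBuffer:pxBuffer withPresentationTime:CMTimeMake(3000, 600)];

 // End the session at 5 s (3000/600); the second frame now marks the end
 // of the movie rather than inheriting the first frame's 5 s duration.
 [videoWriter endSessionAtSourceTime:CMTimeMake(3000, 600)];
 [writerInput markAsFinished];
 [videoWriter finishWriting];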
Answer 2

Have you tried using this as your first call, so that the session start time and the presentation times all share the same 600 timescale?

CMTime t = CMTimeMake(0, 600);
[videoWriter startSessionAtSourceTime:t];
[adaptor appendPixelBuffer:pxBuffer withPresentationTime:t];
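
If the timescale mismatch is the culprit, the second frame would then be appended with the same 600 timescale (a sketch continuing the snippet above):

CMTime nextFrame = CMTimeMake(3000, 600); // 5 s, same 600 timescale
[adaptor appendPixelBuffer:pxBuffer withPresentationTime:nextFrame];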
Answer 3

Try looking at this example and reverse-engineering it to add one frame five seconds later...

Here is the sample code link: git@github.com:RudyAramayo/AVAssetWriterInputPixelBufferAdaptorSample.git

Here is the code you need:

- (void)testCompressionSession
{
    CGSize size = CGSizeMake(480, 320);

    NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    NSError *error = nil;

    unlink([betaCompressionDirectory UTF8String]);

    //----initialize compression engine
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);
    if (error)
        NSLog(@"error = %@", [error localizedDescription]);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                     sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);

    if ([videoWriter canAddInput:writerInput])
        NSLog(@"I can add this input");
    else
        NSLog(@"I can't add this input");

    [videoWriter addInput:writerInput];

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //---
    // insert demo debugging code to write the same image repeated as a movie

    // Retain the CGImage: the UIImage that owns it is autoreleased, and the
    // block below runs later on another queue.
    CGImageRef theImage = CGImageRetain([[UIImage imageNamed:@"Lotus.png"] CGImage]);

    dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
    int __block frame = 0;

    [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
        while ([writerInput isReadyForMoreMediaData])
        {
            if (++frame >= 120)
            {
                [writerInput markAsFinished];
                [videoWriter finishWriting];
                [videoWriter release];
                CGImageRelease(theImage);
                break;
            }

            CVPixelBufferRef buffer = [self pixelBufferFromCGImage:theImage size:size];
            if (buffer)
            {
                // 20 fps: frame n is presented at n/20 seconds.
                if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
                    NSLog(@"FAIL");
                else
                    NSLog(@"Success:%d", frame);
                CFRelease(buffer);
            }
        }
    }];

    NSLog(@"outside for loop");
}


- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef)options, &pxbuffer);
    // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);

    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    // Draw the CGImage directly into the pixel buffer's backing memory.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4 * size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
    NSParameterAssert(context);

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    // Returned with a +1 retain count; the caller must CFRelease it.
    return pxbuffer;
}
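
As the commented-out CVPixelBufferPoolCreatePixelBuffer line above hints, once the writer has started you can draw buffers from the adaptor's pool instead of creating a fresh one per frame. A minimal sketch with a hypothetical helper name (note the pool is NULL until -startWriting has been called):

- (CVPixelBufferRef)pooledPixelBufferFromAdaptor:(AVAssetWriterInputPixelBufferAdaptor *)adaptor
{
    CVPixelBufferRef pxbuffer = NULL;
    // The pool vends buffers matching the sourcePixelBufferAttributes the
    // adaptor was created with, avoiding a per-frame allocation.
    CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         adaptor.pixelBufferPool,
                                                         &pxbuffer);
    return (status == kCVReturnSuccess) ? pxbuffer : NULL;
}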