I have an iPhone camera attachment that captures video at 9 FPS and delivers each frame as a UIImage. I'm trying to stitch these images together with AVFoundation to create a timelapse video of what the camera sees.
I'm not sure how to convert frame counts and timestamps to achieve the time compression I want.
For example, I want 1 hour of real-life footage to become 1 minute of timelapse. That tells me I need to keep every 60th frame and append it to the timelapse.
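As a sanity check on that ratio, the capture modulus can be derived from the two durations (illustrative arithmetic only; these variable names are not in my real code):

NSTimeInterval realDuration  = 60.0 * 60.0; // 1 hour of real footage, in seconds
NSTimeInterval lapseDuration = 60.0;        // desired timelapse length, in seconds
// 3600 / 60 = 60x compression, so keep 1 frame out of every 60
NSUInteger captureMod = (NSUInteger)(realDuration / lapseDuration);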
Does the code below accomplish the 60-seconds-to-1-second timelapse conversion, or do I need some more multiplication/division by kRecordingFPS?
#define kRecordingFPS 9
#define kTimelapseCaptureFrameWithMod 60

// frameCount is the number of frames the camera has output so far
if (frameCount % kTimelapseCaptureFrameWithMod == 0)
{
    // ...
    // convert the image and prepare it for recording
    [self appendImage:image
               atTime:CMTimeMake(currentRecordingFrameNumber++, kRecordingFPS)];
}
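For reference, my understanding is that CMTimeMake(value, timescale) represents value/timescale seconds, so successive appends would be stamped 1/9 s apart regardless of how much real time passed between captured frames:

CMTime first  = CMTimeMake(0, kRecordingFPS); // 0/9 s
CMTime second = CMTimeMake(1, kRecordingFPS); // 1/9 s
CMTimeShow(second);                           // prints {1/9 = 0.111}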
Your code appends 1 frame out of every 60 to the film, each stamped 1/9 s after the previous one, and increments currentRecordingFrameNumber by one each time.
The resulting film length should be frameCount / (60 * 9) seconds. If you capture 32400 frames (1 hour at 9 FPS), 540 of them are appended, and played back at 9 FPS that gives 540 / 9 = 60 seconds of timelapse.
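A quick sketch of that arithmetic (variable names here are illustrative):

NSUInteger frameCount     = 32400;               // 1 hour captured at 9 FPS
NSUInteger appendedFrames = frameCount / 60;     // 540 frames make it into the film
double filmSeconds = (double)appendedFrames / 9.0; // 540 / 9 = 60 s of timelapse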
Try printing the CMTime with CMTimeShow() at every call to appendImage:atTime: to check that each frame gets the presentation time you expect.
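For example, a one-line check inside the method (assuming appendImage:atTime: is the method shown in the question):

- (void)appendImage:(UIImage *)image atTime:(CMTime)time
{
    // Logs e.g. {12/9 = 1.333}; consecutive calls should be 1/9 s apart
    CMTimeShow(time);
    // ... existing conversion and append code ...
}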