I want to take a video (e.g. a 16:9 clip shot with an iPhone) and fit and center it in a square with a custom background color. My code goes like this:
- (void)videoOutput
{
if (!self.firstAsset) {
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
[alert show];
return;
}
AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.firstAsset.duration)
ofTrack:[[self.firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.firstAsset.duration);
AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
AVAssetTrack *videoAssetTrack = [[self.firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize videoSize = videoAssetTrack.naturalSize;
NSLog(@"Video Size W:%f, H:%f",videoSize.width,videoSize.height);
CGFloat scaleRatio = 600.0 / videoSize.width;
[videolayerInstruction setTransform:CGAffineTransformMakeScale(scaleRatio, scaleRatio) atTime:kCMTimeZero];
[videolayerInstruction setOpacity:0.0 atTime:self.firstAsset.duration];
mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
CGFloat renderWidth = 600;
CGFloat renderHeight = 600;
mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
[self applyVideoEffectsToComposition:mainCompositionInst size:CGSizeMake(600, 600)];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
[NSString stringWithFormat:@"FinalVideo-%d.mov",arc4random() % 1000]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL=url;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.shouldOptimizeForNetworkUse = YES;
exporter.videoComposition = mainCompositionInst;
[exporter exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
[self exportDidFinish:exporter];
});
}];
}
- (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition size:(CGSize)size
{
UIImage *borderImage = [self imageWithColor:[UIColor greenColor] rectSize:CGRectMake(0, 0, size.width, size.height)];
CALayer *backgroundLayer = [CALayer layer];
[backgroundLayer setContents:(id)[borderImage CGImage]];
backgroundLayer.frame = CGRectMake(0, 0, size.width, size.height);
[backgroundLayer setMasksToBounds:YES];
AVPlayerItem *playerItem2 = [[AVPlayerItem alloc] initWithAsset:secondAsset];
AVPlayer *videoPlayer2 = [AVPlayer playerWithPlayerItem:playerItem2];
AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:videoPlayer2];
CGSize videoSize = [[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
[videoLayer setBackgroundColor:[UIColor whiteColor].CGColor];
videoLayer.frame = CGRectMake(0, (600-337.5)/2, 600, 337.5);
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
[parentLayer addSublayer:backgroundLayer];
[parentLayer addSublayer:videoLayer];
composition.animationTool = [AVVideoCompositionCoreAnimationTool
videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}
The original video and the exported result are shown in the images below. As you can see in the exported video, the frame for the overlaid video is correct, but the video inside it does not maintain its aspect ratio. If I instead make the videoLayer frame square, the aspect ratio is preserved.

I am stuck at this early stage. Ultimately I am trying to build a WYSIWYG editor for square videos and apply scale, translation and rotation transformations to the videoLayer that will be rendered into the square video. Any help with this specific question and beyond is much appreciated.
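For illustration, the kind of per-clip transform I expect to end up applying to the layer instruction is roughly the following; the zoom, rotation and offset values are placeholders, and videoSize / videolayerInstruction are the ones from the code above:
CGFloat zoom = 600.0 / videoSize.width;                   // placeholder: fit the width to the square
CGFloat angle = 0.0;                                       // placeholder rotation, in radians
CGFloat tx = 0.0;                                          // placeholder horizontal offset
CGFloat ty = (600.0 - videoSize.height * zoom) / 2.0;      // placeholder: center vertically
CGAffineTransform transform = CGAffineTransformMakeTranslation(tx, ty);
transform = CGAffineTransformRotate(transform, angle);
transform = CGAffineTransformScale(transform, zoom, zoom); // applied order: scale, rotate, translate
[videolayerInstruction setTransform:transform atTime:kCMTimeZero];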
I ran into exactly this problem, and now I have a solution. If you want to change the video from a rectangle to a square, you need to crop the video and set the AVMutableVideoCompositionInstruction's backgroundColor to the color you want.
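For the crop part, one way to do it is to crop a centered square out of the track with the layer instruction's setCropRectangle:atTime: (reusing videoAssetTrack and videolayerInstruction from the code above; the centered-square geometry here is just an example), roughly like this:
CGSize natural = videoAssetTrack.naturalSize;
CGFloat side = MIN(natural.width, natural.height);
// Whatever the cropped track does not cover is filled with the instruction's backgroundColor.
CGRect squareCrop = CGRectMake((natural.width - side) / 2.0, (natural.height - side) / 2.0, side, side);
[videolayerInstruction setCropRectangle:squareCrop atTime:kCMTimeZero];
As for the backgroundColor part: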
instruction.backgroundColor = /* CGColorRef */;
Quote from Apple:
/* Indicates the background color of the composition. Solid BGRA colors only are supported; patterns and other color refs that are not supported will be ignored. If the background color is not specified the video compositor will use a default backgroundColor of opaque black. If the rendered pixel buffer does not have alpha, the alpha value of the backgroundColor will be ignored. */
@property (nonatomic, retain, nullable) __attribute__((NSObject)) CGColorRef backgroundColor CF_RETURNS_RETAINED;
The wrong way to set the instruction's backgroundColor is:
instruction.backgroundColor = [UIColor blueColor].CGColor;
The correct one is:
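A sketch of that, assuming the point is to create the CGColorRef yourself in a device RGB color space instead of borrowing the one backing a UIColor (the blue components are just an example):
CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
CGFloat components[4] = {0.0, 0.0, 1.0, 1.0}; // R, G, B, A — an example blue
CGColorRef background = CGColorCreate(rgb, components);
instruction.backgroundColor = background; // the property is declared retain, so we can release our reference
CGColorRelease(background);
CGColorSpaceRelease(rgb);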
Done!
By the way, the videoLayer and the videoComposition's renderSize must be set to the naturalSize of the original video; you cannot set videoLayer.frame to a custom CGRect.
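Applied to the code in the question, that means roughly the following sketch (using videoAssetTrack's naturalSize for both the render size and the layer passed to the animation tool):
// In -videoOutput: render at the track's natural size instead of a hard-coded 600 x 600
CGSize natural = videoAssetTrack.naturalSize;
mainCompositionInst.renderSize = natural;
// ...and in -applyVideoEffectsToComposition:size:, size the video layer to match:
videoLayer.frame = CGRectMake(0, 0, natural.width, natural.height);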