AVPlayer from AVMutableComposition with audio and video won't play

I'm trying to play a video from a composition of both video and audio. However, I have a problem: the player status never reaches AVPlayerStatusReadyToPlay.

If I add either the video asset or the audio asset directly to the player item, it plays fine, so I know there is no problem with the assets themselves.
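
For reference, the single-asset playback that does work looks roughly like this (same player and assets as in the code below):

    AVPlayerItem *directItem = [AVPlayerItem playerItemWithAsset:videoAsset]; // or audioAsset
    self.mPlayer = [AVPlayer playerWithPlayerItem:directItem];
    [self.mPlayer play];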

This is the composition code that never becomes ready to play:

- (void)loadPlayer {
    NSURL *videoURL = **;
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];

    NSURL *audioURL = **;
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];

    NSArray *keys = [NSArray arrayWithObject:@"duration"];
    [videoAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {

        NSError *error = nil;
        AVKeyValueStatus durationStatus = [videoAsset statusOfValueForKey:@"duration" error:&error];

        switch (durationStatus) {
            case AVKeyValueStatusLoaded:
                _videoDuration = videoAsset.duration;
                if (_audioDuration.flags == kCMTimeFlags_Valid) {
                    [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                }
                break;
        }
    }];

    [audioAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^() {

        NSError *error = nil;
        AVKeyValueStatus durationStatus = [audioAsset statusOfValueForKey:@"duration" error:&error];

        switch (durationStatus) {
            case AVKeyValueStatusLoaded:
                _audioDuration = audioAsset.duration;
                if (_videoDuration.flags == kCMTimeFlags_Valid) {
                    [self loadPlayWithVideoAsset:videoAsset withDuration:_videoDuration andAudioAsset:audioAsset withDuration:_audioDuration];
                }
                break;
        }
    }];
}

- (void)loadPlayWithVideoAsset:(AVURLAsset *)videoAsset withDuration:(CMTime)videoDuration andAudioAsset:(AVURLAsset *)audioAsset withDuration:(CMTime)audioDuration {

    AVMutableComposition *composition = [AVMutableComposition composition];

    //Video
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    NSError *videoError = nil;
    if (![compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                                        ofTrack:videoTrack
                                         atTime:kCMTimeZero
                                          error:&videoError]) {
        NSLog(@"videoError: %@", videoError);
    }

    //Audio
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
    NSError *audioError = nil;
    if (![compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                                        ofTrack:audioTrack
                                         atTime:kCMTimeZero
                                          error:&audioError]) {
        NSLog(@"audioError: %@", audioError);
    }

    NSInteger compare = CMTimeCompare(videoDuration, audioDuration);

    if (compare == 1) {
        //The video is larger
        CMTime timeDiff = CMTimeSubtract(videoDuration, audioDuration);
        [compositionAudioTrack insertEmptyTimeRange:CMTimeRangeMake(audioDuration, timeDiff)];
    }
    else {
        CMTime timeDiff = CMTimeSubtract(audioDuration, videoDuration);
        [compositionVideoTrack insertEmptyTimeRange:CMTimeRangeMake(videoDuration, timeDiff)];
    }

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    self.mPlayer = [AVPlayer playerWithPlayerItem:playerItem];
    self.mPlaybackView = [[AVPlayerPlaybackView alloc] initWithFrame:CGRectZero];
    [self.view addSubview:self.mPlaybackView];
    [self.mPlayer addObserver:self forKeyPath:@"status" options:0 context:AVPlayerPlaybackViewControllerStatusObservationContext];
}
- (void)observeValueForKeyPath:(NSString*) path ofObject:(id)object change:(NSDictionary*)change context:(void*)context
{
    if (self.mPlayer.status == AVPlayerStatusReadyToPlay) {
        [self.mPlaybackView setPlayer:self.mPlayer];
        isReadyToPlay = YES;
        _playVideoBtn.hidden = NO;
    }
}
- (void) playVideo {
    if (YES || isReadyToPlay) {
        [self.mPlayer play];
    }
}

1 Answer

In my experience, AVPlayer plays an AVMutableComposition only when the underlying resources are bundled with the app. If the video resource lives on the network, AVPlayer won't play the AVMutableComposition, even though the AVPlayerItem and AVPlayer report a status of "ready to play".
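
A quick way to check whether this is what you're hitting is to point the exact same composition code at files bundled with the app instead of remote URLs (a rough sketch; the resource names below are just placeholders for whatever you bundle):

    // Placeholders for bundled media files
    NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
    NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4a"];
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVURLAsset *audioAsset = [AVURLAsset URLAssetWithURL:audioURL options:nil];
    // Feed these into the same loadPlayer / loadPlayWithVideoAsset:... path as above.

If the composition plays from the bundled copies but not over the network, that matches the behaviour described above.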