I've been trying to figure out AVCapture for the last couple of days and am struggling to save a video. My understanding is that you call
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
and then at some later time you call
[movieFileOutput stopRecording];
which should then trigger the delegate method
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
After that, I should be able to save the movie with something like
UISaveVideoAtPathToSavedPhotosAlbum([outputFileURL path], nil, nil, nil);
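(Passing a completion target and selector instead of nil there would at least report whether the save succeeded; the documented callback signature for UISaveVideoAtPathToSavedPhotosAlbum is video:didFinishSavingWithError:contextInfo:, e.g.:)

UISaveVideoAtPathToSavedPhotosAlbum([outputFileURL path], self,
    @selector(video:didFinishSavingWithError:contextInfo:), NULL);

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"saved %@, error: %@", videoPath, error);
}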
But apparently I'm not doing it correctly. When I start the session and then call startRecordingToOutputFileURL:, it immediately calls the didFinishRecordingToOutputFileAtURL: delegate method, and I can't figure out why. Here is my code:
-(void)viewDidAppear:(BOOL)animated{
    [super viewDidAppear:animated];

    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = self.imagePreview.bounds; // UIView *imagePreview
    [self.imagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [self getCamera];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
    NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:@"Test"] stringByAppendingString:@".mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

    [session addInput:input];
    [session addOutput:movieFileOutput];
    [session commitConfiguration];
    [session startRunning];

    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    [NSTimer timerWithTimeInterval:7 target:self selector:@selector(stopRun) userInfo:nil repeats:NO];

    /*
    [self initializeCamera];
    */
}

-(void)stopRun{
    [movieFileOutput stopRecording];
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
    NSLog(@"capture done url: %@", outputFileURL);
    UISaveVideoAtPathToSavedPhotosAlbum([outputFileURL path], nil, nil, nil);
}

-(AVCaptureDevice*)getCamera{
    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera;
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *device in devices) {
        NSLog(@"Device name: %@", [device localizedName]);
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionBack) {
                NSLog(@"Device position : back");
                backCamera = device;
            }
            else {
                NSLog(@"Device position : front");
                frontCamera = device;
            }
        }
    }
    return frontCamera;
}
Sorry that it's so lengthy. I hope a lot of this code can be useful to someone else.
Disclaimer: I am not an Objective-C programmer, and it's only been 15 days since I first started reading about the language itself.
I had to do something similar. Here is the code that worked for me; I grabbed it from different Stack Overflow questions and examples on developer.apple.com. I have commented out the code I didn't need for my working prototype. You can play around with it.
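In outline it looked something like this (a trimmed sketch rather than the full file: the method name startCapture is mine, it assumes session and movieFileOutput instance variables and that the class adopts AVCaptureFileOutputRecordingDelegate, and creating the output directory, deleting any leftover file, using a .mov extension, and actually scheduling the timer are the parts that differ from the question's code):

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (void)startCapture {
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"ERROR: trying to open camera: %@", error);
        return;
    }
    [session addInput:input];

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:movieFileOutput];
    [session startRunning];

    // AVCaptureMovieFileOutput writes QuickTime movies, so use a .mov extension.
    // Recording fails immediately (the delegate fires at once with an error) if the
    // target directory doesn't exist or a file already sits at the output URL,
    // so create the directory and delete any stale file before starting.
    NSString *documents = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *archives = [documents stringByAppendingPathComponent:@"archives"];
    [[NSFileManager defaultManager] createDirectoryAtPath:archives withIntermediateDirectories:YES attributes:nil error:nil];
    NSString *moviePath = [archives stringByAppendingPathComponent:@"Test.mov"];
    [[NSFileManager defaultManager] removeItemAtPath:moviePath error:nil];

    [movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:moviePath] recordingDelegate:self];

    // scheduledTimerWithTimeInterval: adds the timer to the current run loop;
    // plain timerWithTimeInterval: only creates it, so stopRun would never fire.
    [NSTimer scheduledTimerWithTimeInterval:7 target:self selector:@selector(stopRun) userInfo:nil repeats:NO];
}

- (void)stopRun {
    [movieFileOutput stopRecording];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    if (error) {
        // An immediate callback lands here; the error explains why recording never started.
        NSLog(@"recording failed: %@", error);
        return;
    }
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([outputFileURL path])) {
        UISaveVideoAtPathToSavedPhotosAlbum([outputFileURL path], nil, nil, nil);
    }
}

Checking the error in that delegate callback is the first thing to do in the question's situation: an immediate call to didFinishRecordingToOutputFileAtURL: almost always carries an NSError saying the file couldn't be created, for example because the archives directory was never created or a file from a previous run already exists at the URL.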