I'm coding something that:
- records video+audio with the built-in camera and mic (AVCaptureSession; see the sketch after this list),
- does some processing on the video and audio sample buffers in real time,
- saves the result into a local .mp4 file using AVAssetWriter,
- then (later) reads that file (video+audio) back using AVAssetReader,
- does some other processing on the sample buffers (for now it does nothing),
- and writes the result into a final video file using a second AVAssetWriter.
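Here's roughly what the audio capture side looks like (a minimal sketch; the queue label is illustrative and error handling is omitted):

    // Inside a setup method of a class that adopts
    // AVCaptureAudioDataOutputSampleBufferDelegate.
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Microphone input.
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    AVCaptureDeviceInput *micInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
    if ([session canAddInput:micInput]) [session addInput:micInput];

    // Audio sample buffers are delivered to the delegate callback on a serial queue.
    AVCaptureAudioDataOutput *audioOutput = [[AVCaptureAudioDataOutput alloc] init];
    [audioOutput setSampleBufferDelegate:self
                                   queue:dispatch_queue_create("audio.capture", DISPATCH_QUEUE_SERIAL)];
    if ([session canAddOutput:audioOutput]) [session addOutput:audioOutput];

    [session startRunning];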
Everything works well, but I have an issue with the audio format:

When I capture audio sample buffers from the capture session, I log about 44 buffers/sec, which seems normal. When I read the .mp4 file back, I only log about 3-5 audio sample buffers/sec! Yet the two files look and sound exactly the same (in QuickTime).
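Here's roughly how I'm counting (illustrative logging in the AVCaptureAudioDataOutputSampleBufferDelegate callback; `self.audioOutput` is a hypothetical property holding the audio data output):

    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (output != self.audioOutput) return; // ignore video buffers

        // One callback == one CMSampleBuffer, which may hold many audio frames.
        CMItemCount frames = CMSampleBufferGetNumSamples(sampleBuffer);
        NSLog(@"audio buffer: %ld frames, pts = %f",
              (long)frames,
              CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)));
    }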
- I didn't set any audio settings on the capture session (Apple doesn't allow it).

I configured the outputSettings of the two audio AVAssetWriterInputs as follows:
    NSDictionary *settings = @{
        AVFormatIDKey:               @(kAudioFormatLinearPCM),
        AVNumberOfChannelsKey:       @(2),
        AVSampleRateKey:             @(44100.0),
        AVLinearPCMBitDepthKey:      @(16),
        AVLinearPCMIsNonInterleaved: @(NO),
        AVLinearPCMIsFloatKey:       @(NO),
        AVLinearPCMIsBigEndianKey:   @(NO)
    };
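Each of the two audio writer inputs is created with these settings, along these lines (the `assetWriter` variable name is illustrative):

    AVAssetWriterInput *audioWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:settings];
    // Live capture feeds the first writer, so it expects real-time data.
    audioWriterInput.expectsMediaDataInRealTime = YES;
    if ([assetWriter canAddInput:audioWriterInput]) {
        [assetWriter addInput:audioWriterInput];
    }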
I pass nil as the outputSettings of the audio AVAssetReaderTrackOutput in order to receive the samples as stored in the track (per the documentation).
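Concretely, the reader side looks something like this (asset loading and variable names are illustrative):

    AVAsset *asset = [AVAsset assetWithURL:fileURL]; // the intermediate .mp4
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];

    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];
    // nil outputSettings: samples are vended as stored in the track.
    AVAssetReaderTrackOutput *audioReaderOutput =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack
                                                   outputSettings:nil];
    if ([reader canAddOutput:audioReaderOutput]) [reader addOutput:audioReaderOutput];
    [reader startReading];

    CMSampleBufferRef buf;
    while ((buf = [audioReaderOutput copyNextSampleBuffer])) {
        NSLog(@"read buffer with %ld frames", (long)CMSampleBufferGetNumSamples(buf));
        CFRelease(buf);
    }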
So the sample rate should be 44100 Hz all the way from the capture session to the final file. Why am I reading only a few audio sample buffers per second? And why does it work anyway? My intuition is that it won't work so well once I actually have to manipulate the samples (for example, I need to update their timestamps).
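For the timestamp part, I was planning on something like this (a sketch assuming one timing entry per buffer; `offset` is a hypothetical CMTime):

    // Return a copy of `original` with its presentation time shifted by `offset`.
    CMSampleBufferRef ShiftedCopy(CMSampleBufferRef original, CMTime offset)
    {
        CMSampleTimingInfo timing;
        CMSampleBufferGetSampleTimingInfo(original, 0, &timing);
        timing.presentationTimeStamp = CMTimeAdd(timing.presentationTimeStamp, offset);

        CMSampleBufferRef shifted = NULL;
        CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                              original,
                                              1,       // one timing entry for the whole buffer
                                              &timing,
                                              &shifted);
        return shifted; // caller is responsible for CFRelease
    }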
I tried several other settings (such as kAudioFormatMPEG4AAC), but AVAssetReaderTrackOutput only accepts uncompressed (Linear PCM) audio output settings.
Thanks for your help :)