The app I'm writing contains two parts:
- An audio player that plays stereo MP3 files
- Video conferencing using WebRTC
Each part works perfectly in isolation, but the moment I try to use them together, one of two things happens:
- The video conference audio fades out and we just hear the audio files (in stereo)
- We get audio output from both, but the audio files are played in mono, coming out of both ears equally
My digging has taken me down a few routes:
https://developer.apple.com/forums/thread/90503
&
https://github.com/twilio/twilio-video-ios/issues/77
These suggest that the issue could be with the audio session category, mode, or options. However, I've tried lots of the combinations and am struggling to get anything working as intended.
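One thing that has helped me narrow it down is logging what the shared session actually ends up configured as after the conference connects, since SDKs often reconfigure the session behind your back. This is just a debugging sketch of my own (the properties are the real AVAudioSession API, the function name is mine):

```swift
import AVFoundation

// Dump the current state of the shared audio session, so you can see
// whether the WebRTC SDK has overwritten your category/mode/options.
func logAudioSessionState() {
    let session = AVAudioSession.sharedInstance()
    print("category: \(session.category.rawValue)")
    print("mode: \(session.mode.rawValue)")
    print("options: \(session.categoryOptions.rawValue)")
    print("current route: \(session.currentRoute)")
}
```

Calling this before and after joining the conference shows exactly which setting changes when the stereo playback breaks.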
Does anyone have a better understanding of the audio options who could point me in the right direction?
My most recent combination:
import AVFoundation

class BBAudioClass {
    static private var audioCategory: AVAudioSession.Category = .playAndRecord
    static private var audioCategoryOptions: AVAudioSession.CategoryOptions = [
        .mixWithOthers,
        .allowBluetooth,
        .allowAirPlay,
        .allowBluetoothA2DP
    ]
    static private var audioMode: AVAudioSession.Mode = .default

    static func setCategory() {
        do {
            let audioSession = AVAudioSession.sharedInstance()
            try audioSession.setCategory(
                BBAudioClass.audioCategory,
                mode: BBAudioClass.audioMode,
                options: BBAudioClass.audioCategoryOptions
            )
        } catch {
            // Don't swallow the error silently; log it so misconfiguration is visible.
            print("Failed to set audio session category: \(error)")
        }
    }
}
Update
I managed to get everything working as I wanted by:
- Starting the audio session
- Connecting to the video conference (at this point all audio is mono)
- Forcing all output to the speaker
- Forcing output back to the headphones
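The steps above can be sketched roughly as follows. The `overrideOutputAudioPort` calls are the real AVAudioSession API for forcing the route; the function name and the placement of the conference-join step are my own assumptions:

```swift
import AVFoundation

// A sketch of the workaround sequence, assuming the session has already
// been configured with .playAndRecord (see BBAudioClass.setCategory above).
func applyStereoWorkaround() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setActive(true)                    // 1. start the audio session
        // 2. connect to the video conference here (audio is mono at this point)
        try session.overrideOutputAudioPort(.speaker)  // 3. force all output to the speaker
        try session.overrideOutputAudioPort(.none)     // 4. release the override, routing back to headphones
    } catch {
        print("Audio route workaround failed: \(error)")
    }
}
```

My guess is that the pair of overrides forces the session to renegotiate the output route, at which point stereo playback is restored, but I'd like a proper explanation.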
Obviously this is a crazy thing to have to do, but it does prove that the combination can work. It would be great if anyone knew WHY this works, so that I can get things working properly the first time without going through all these hacky steps.