System audio streaming on Android over Webrtc


I am trying to build a screen-sharing application on Android using WebRTC. I am able to share the screen using MediaProjection + WebRTC, but I am not able to share the system audio. MediaProjection added support for capturing system audio from API 29 (Android 10) with the help of AudioPlaybackCaptureConfiguration. However, the app crashes when I assign the audio source from the AudioRecord to the PeerConnection AudioTrack.

    MediaProjectionManager mediaProjectionManager =
            (MediaProjectionManager) mContext.getApplicationContext().getSystemService(
                    Context.MEDIA_PROJECTION_SERVICE);

    MediaProjection sMediaProjection =
            mediaProjectionManager.getMediaProjection(
                    MPResultCode,
                    MPData
            );

    AudioPlaybackCaptureConfiguration config = new AudioPlaybackCaptureConfiguration.Builder(sMediaProjection)
            .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
            .build();

    AudioFormat audioFormat = new AudioFormat.Builder()
            .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
            .setSampleRate(8000)
            .setChannelMask(AudioFormat.CHANNEL_IN_MONO)
            .build();

    AudioRecord audioRecord = new AudioRecord.Builder()
            .setAudioFormat(audioFormat)
            .setBufferSizeInBytes(BUFFER_SIZE_IN_BYTES)
            .setAudioPlaybackCaptureConfig(config)
            .build();

    // The app crashes here: org.webrtc.AudioSource expects a native source pointer,
    // not the int constant returned by AudioRecord.getAudioSource()
    AudioSource audioSource = new AudioSource(audioRecord.getAudioSource());
    AudioTrack localAudioTrack = factory.createAudioTrack("AudioTrack", audioSource);
    localAudioTrack.setEnabled(true);
    mLocalMediaStream.addTrack(localAudioTrack);

Streaming mic audio works fine if I configure the AudioSource as below:

    AudioSource audioSource = factory.createAudioSource(new MediaConstraints()); 

How do I configure the WebRTC AudioTrack using the AudioRecord object?

1 Answer

I resolved this issue by modifying the WebRTC framework itself. Let me explain how it works. In the WebRTC stack, audio recording runs in WebRtcAudioRecord.java. If you want to capture system audio, you need to pass the MediaProjection object into WebRtcAudioRecord.java so that its internal AudioRecord is built with an AudioPlaybackCaptureConfiguration. Here is an example implementation: https://github.com/ant-media/WebRTCAndroidSDK/pull/1 You can check it. Any comment is welcome.
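
A minimal sketch of the idea, assuming you build libwebrtc (or a fork of it) from source. The class name `WebRtcAudioRecord` matches the upstream `org.webrtc.audio` package, but the field names, the `setMediaProjection` hook, and the `createAudioRecord` method shown here are assumptions for illustration; the exact surrounding code differs between WebRTC revisions:

    // Hypothetical patch to org.webrtc.audio.WebRtcAudioRecord (names are assumptions).
    import android.media.AudioAttributes;
    import android.media.AudioFormat;
    import android.media.AudioPlaybackCaptureConfiguration;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.media.projection.MediaProjection;
    import android.os.Build;

    class WebRtcAudioRecord {
        // Set from the app (before the call starts) with the projection obtained
        // from the user's screen-capture consent.
        private static MediaProjection mediaProjection;

        public static void setMediaProjection(MediaProjection projection) {
            mediaProjection = projection;
        }

        private AudioRecord createAudioRecord(int sampleRate, int channelConfig,
                                              int bufferSizeInBytes) {
            if (mediaProjection != null
                    && Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
                // Capture system playback (e.g. media apps) instead of the mic.
                AudioPlaybackCaptureConfiguration config =
                        new AudioPlaybackCaptureConfiguration.Builder(mediaProjection)
                                .addMatchingUsage(AudioAttributes.USAGE_MEDIA)
                                .build();
                return new AudioRecord.Builder()
                        .setAudioFormat(new AudioFormat.Builder()
                                .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                                .setSampleRate(sampleRate)
                                .setChannelMask(channelConfig)
                                .build())
                        .setBufferSizeInBytes(bufferSizeInBytes)
                        .setAudioPlaybackCaptureConfig(config)
                        .build();
            }
            // Fall back to the normal microphone path.
            return new AudioRecord(MediaRecorder.AudioSource.VOICE_COMMUNICATION,
                    sampleRate, channelConfig, AudioFormat.ENCODING_PCM_16BIT,
                    bufferSizeInBytes);
        }
    }

This way the app keeps using the ordinary `factory.createAudioSource(new MediaConstraints())` path; WebRTC's own capture thread then reads the playback-capture AudioRecord, so you never need to hand a raw AudioRecord to `AudioSource` (which expects a native pointer, hence the crash).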