Can we stream live audio from an Android phone using FFmpeg?


I'm using the ffmpeg_kit_flutter package to stream data to an RTSP server in Flutter.

iOS: working

Android: not working

Command used:

'ffmpeg -f avfoundation -i ":0" -acodec aac -f rtsp -rtsp_transport tcp "$Url"'

When I ran the `ffmpeg -devices` command on Android, it returned the following response, from which I learned that Android doesn't support avfoundation; it has android_camera instead. Does android_camera support audio too?

Command: 'ffmpeg -devices'

Response:

I/flutter (10620): logs:  libavutil      57. 28.100 / 57. 28.100
I/flutter (10620): logs:  libavcodec     59. 37.100 / 59. 37.100
I/flutter (10620): logs:  libavformat    59. 27.100 / 59. 27.100
I/flutter (10620): logs:  libavdevice    59.  7.100 / 59.  7.100
I/flutter (10620): logs:  libavfilter     8. 44.100 /  8. 44.100
I/flutter (10620): logs:  libswscale      6.  7.100 /  6.  7.100
I/flutter (10620): logs:  libswresample   4.  7.100 /  4.  7.100
I/flutter (10620): logs:Devices:
I/flutter (10620):  D. = Demuxing supported
I/flutter (10620):  .E = Muxing supported
I/flutter (10620):  --
I/flutter (10620): logs: D  android_camera   
I/flutter (10620): logs: D  lavfi            
I/flutter (10620): logs: DE video4linux2,v4l2

Commands I tried on Android:

FFmpegKit.execute('-y -f android_camera -i 0:1 -r 30 -c:a aac -f rtsp -rtsp_transport tcp "$Url"');

FFmpegKit.execute('-y -f android_camera -i 0:1 -r 30 -c:a libmp3lame -qscale:a 2 "/storage/emulated/0/Download/androidvideo.mp3"');

FFmpegKit.execute('-y -f android_camera -i 0:0 -r 30 -c:a wavpack -b:a 64k "/storage/emulated/0/Download/androidvideo.wav"');

The following command records video, but there is no audio in it.

FFmpegKit.execute('-video_size hd720 -f android_camera -camera_index 1 -i anything -r 10 -t 00:00:15 "$dir/androidvideo.mp4"');

2 Answers

Answer 1:

The android_camera input device in FFmpeg for Android does not support audio capture. It only captures video from the Android camera; it does not capture audio from the device's microphone.

Answer 2:

FFmpeg for Android supports only video capture from the camera, via the android_camera device.

On iOS, by contrast, avfoundation supports both video and audio.

As an option, the audio could be recorded by native means (MediaRecorder, or AAudio / OpenSL ES via JNI, for example) and then mixed with the video recorded from the camera.

ffmpeg would then take the audio data from an audio buffer (a temporary file is the simplest option; an in-memory buffer is also possible, but how to configure that in the command string would need further investigation).
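As a rough sketch of that mixing step, the helper below builds an ffmpeg command that muxes a natively recorded audio file with the video-only file produced by android_camera. The file names are hypothetical placeholders, and the string would be handed to FFmpegKit.execute(...) in the app; this is an illustration of the approach, not the package's own API.

```java
public class MuxCommand {
    // Build an ffmpeg argument string that combines a video-only file
    // (from android_camera) with a separately recorded audio file
    // (e.g. produced natively with MediaRecorder). Paths are hypothetical.
    static String buildMuxCommand(String videoPath, String audioPath, String outPath) {
        return String.join(" ",
            "-y",
            "-i", videoPath,   // input 0: video-only recording
            "-i", audioPath,   // input 1: native audio recording
            "-c:v", "copy",    // keep the video stream as-is (no re-encode)
            "-c:a", "aac",     // encode the audio to AAC
            "-map", "0:v:0",   // take video from the first input
            "-map", "1:a:0",   // take audio from the second input
            "-shortest",       // stop when the shorter stream ends
            outPath);
    }

    public static void main(String[] args) {
        // In the app, this string would be passed to FFmpegKit.execute(...)
        System.out.println(buildMuxCommand(
            "/storage/emulated/0/Download/androidvideo.mp4",
            "/storage/emulated/0/Download/audio.m4a",
            "/storage/emulated/0/Download/combined.mp4"));
    }
}
```

The `-c:v copy` avoids re-encoding the video, so the mux step is fast even on a phone; only the audio track is encoded.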