I am using WebRTC to create a peer connection and stream audio between browsers, but how can I play the audio stream and visualize it (for example as a waveform) on both the sending and the receiving side? Does anyone know of an example?
Thanks
Take a look at @cwilso's excellent demos on webaudiodemos.appspot.com, in particular Audio Recorder (which feeds audio from getUserMedia into Web Audio, analyses the data and draws it to a canvas element) and Live Input Effects (which does something similar but uses WebGL for the visualisation).
@paul-lewis's Audio Room also uses WebGL.
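For reference, here is a minimal sketch of the approach those demos take: feed a MediaStream into the Web Audio API through an AnalyserNode and draw its time-domain data onto a canvas each animation frame. The element ids (localCanvas, remoteCanvas, remoteAudio) and the peerConnection variable are placeholders for whatever your page already has; adapt them to your setup.

    // Visualize any MediaStream (local or remote) as a waveform on a <canvas>.
    function drawWaveform(stream, canvas) {
      const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
      const source = audioCtx.createMediaStreamSource(stream);
      const analyser = audioCtx.createAnalyser();
      analyser.fftSize = 2048;
      source.connect(analyser);
      // Don't connect the analyser to audioCtx.destination for the local stream,
      // or you'll hear your own microphone echoed back.

      const data = new Uint8Array(analyser.fftSize);
      const ctx = canvas.getContext('2d');

      (function render() {
        requestAnimationFrame(render);
        analyser.getByteTimeDomainData(data); // time-domain samples, 0..255
        ctx.clearRect(0, 0, canvas.width, canvas.height);
        ctx.beginPath();
        const sliceWidth = canvas.width / data.length;
        for (let i = 0; i < data.length; i++) {
          const x = i * sliceWidth;
          const y = (data[i] / 255) * canvas.height;
          i === 0 ? ctx.moveTo(x, y) : ctx.lineTo(x, y);
        }
        ctx.stroke();
      })();
    }

    // Transmitting side: visualize the microphone stream you send.
    navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
      drawWaveform(stream, document.getElementById('localCanvas'));
      // ...then add the stream's tracks to your RTCPeerConnection as usual.
    });

    // Receiving side: visualize (and play) the remote stream.
    peerConnection.ontrack = (event) => {
      const remoteStream = event.streams[0];
      document.getElementById('remoteAudio').srcObject = remoteStream; // playback
      drawWaveform(remoteStream, document.getElementById('remoteCanvas'));
    };

The same drawWaveform function works for both ends because createMediaStreamSource accepts any MediaStream, whether it came from getUserMedia or from the peer connection's ontrack event.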