I'm attempting to build a modular synthesizer using the Web Audio API. The main thing I can't figure out is how to do "gates", or in other words, how to have an audio signal trigger JS functions or events.
For instance, say I have a low-frequency square wave, and I want a function to be triggered every time the square wave's amplitude goes above a certain value. What is the best way to go about creating that kind of listener?
There are no ordinary callbacks in Web Audio. The only "listener" you can create is one that uses a `ScriptProcessorNode`.

Connect it to the node where you want to look for the square wave's magnitude. Define a `scriptNode.onaudioprocess` function and it will be called on each pass through the audio graph you've built. You can then grab the input using `audioProcessingEvent.inputBuffer` and `inputDataMic = inputBuffer.getChannelData(0)` (assuming you give yourself access to the event that caused your callback to be invoked).

These calls give you access to the actual data values coming in. You must look for your square wave in each audio frame. You may find an `AnalyserNode` helpful to identify the frames that have power around the frequencies of your square wave.