My app (written in Swift) does realtime processing based on audio signals.
I need one function that receives the left and right input buffers (2 channels from a USB microphone) and one function that fills the buffers for the output (2 channels as well).
I used to use EZAudio, but I have memory problems with the 2-channel 96 kHz format, and since EZAudio is no longer maintained I would like to switch to either Superpowered or AudioKit.
My problem is that I cannot get at the buffer-processing functions in either library.
Superpowered: I added #import "SuperpoweredIOSAudioIO.h" to the bridging header.
I added SuperpoweredIOSAudioIODelegate to my ViewController. This automatically added the interruption, permission and mapChannels functions, but not audioProcessingCallback.
I tried the following things:
audio = SuperpoweredIOSAudioIO(delegate: self, preferredBufferSize: 12, preferredMinimumSamplerate: 96000, audioSessionCategory: AVAudioSessionCategoryPlayAndRecord, channels: 2, audioProcessingCallback: audioProcessingCallback, clientdata: UnsafeMutablePointer)
audio.start()
and
func audioProcessingCallback(buffers: UnsafeMutablePointer<UnsafeMutablePointer<Float>>, inputChannels: UInt32, outputChannels: UInt32, numberOfSamples: UInt32, sampleRate: UInt32, hostTime: UInt64) -> Bool {
return true
}
But I get the error:
Cannot convert value of type '(UnsafeMutablePointer<UnsafeMutablePointer<Float>>, UInt32, UInt32, UInt32, UInt32, UInt64) -> Bool' to expected argument type 'audioProcessingCallback!' (aka 'ImplicitlyUnwrappedOptional<@convention(c) (Optional<UnsafeMutableRawPointer>, Optional<UnsafeMutablePointer<Optional<UnsafeMutablePointer<Float>>>>, UInt32, UInt32, UInt32, UInt32, UInt64) -> Bool>')
I could not find any examples of this library being used from Swift...
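Reading the error, it seems the callback is imported as a @convention(c) function pointer taking optional pointers, so it cannot be an ordinary method and cannot capture Swift context. I assume something like the following untested sketch is expected, with `self` smuggled through `clientdata` (all names besides the initializer parameters are my guesses):

```swift
// Sketch only: the callback type is taken from the compiler error above, and
// a @convention(c) closure cannot capture Swift state, so the controller is
// passed through the opaque `clientdata` pointer instead.
let callback: @convention(c) (
    UnsafeMutableRawPointer?,                            // clientdata
    UnsafeMutablePointer<UnsafeMutablePointer<Float>?>?, // buffers
    UInt32,                                              // inputChannels
    UInt32,                                              // outputChannels
    UInt32,                                              // numberOfSamples
    UInt32,                                              // samplerate
    UInt64                                               // hostTime
) -> Bool = { clientdata, buffers, _, _, numberOfSamples, _, _ in
    guard let clientdata = clientdata else { return false }
    // Recover the controller from the opaque pointer (no captures allowed here).
    let controller = Unmanaged<ViewController>.fromOpaque(clientdata).takeUnretainedValue()
    // ... process `buffers` here, e.g. hand them to controller's algorithm ...
    return true
}

audio = SuperpoweredIOSAudioIO(delegate: self,
                               preferredBufferSize: 12,
                               preferredMinimumSamplerate: 96000,
                               audioSessionCategory: AVAudioSessionCategoryPlayAndRecord,
                               channels: 2,
                               audioProcessingCallback: callback,
                               clientdata: Unmanaged.passUnretained(self).toOpaque())
```

This compiles in my head but I have not verified it against the Superpowered headers, so the exact optionality of the parameters may differ.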
With AudioKit, here is what I did:
let mic = AKMicrophone()
installTap(mic)
AudioKit.output = mic
AudioKit.start()
func installTap(_ input: AKNode) {
    input.avAudioNode.installTap(onBus: 0, bufferSize: 1024, format: AudioKit.format) { [weak self] (buffer, time) -> Void in
        self?.signalTracker(didReceivedBuffer: buffer, atTime: time)
    }
}
func signalTracker(didReceivedBuffer buffer: AVAudioPCMBuffer, atTime time: AVAudioTime) {
    let samples = UnsafeBufferPointer(start: buffer.floatChannelData?[0], count: 1024)
    audioProcess.ProcessDataCaptureWithBuffer(samples, numberOfSamples: UInt32(1024))
}
My algorithm does receive the incoming buffers, but it doesn't seem to run in "realtime" — there is noticeable lag (sorry, it's hard to explain).
Thanks!
If you need to do realtime processing, you should not use Swift (or Objective-C). Currently the way to do this in AudioKit is to create an AUAudioUnit subclass and do your processing within it. However, if you just need the audio tap to be faster, then AKLazyTap is a good solution. It's different from a normal tap in that you have to poll it for data, but this approach allows for buffer re-use, so you can call it as quickly as you'd like.
Here's an example of using AKLazyTap to get peak:
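Roughly like this (a sketch against the AudioKit 4 API — the exact `AKLazyTap` signatures may differ in your version, so check the headers):

```swift
import UIKit
import AudioKit
import AVFoundation

class ViewController: UIViewController {

    let mic = AKMicrophone()
    var tap: AKLazyTap?
    var buffer: AVAudioPCMBuffer?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Attach the lazy tap to the microphone's underlying AVAudioNode.
        tap = AKLazyTap(node: mic.avAudioNode)

        // One reusable buffer — no per-callback allocation.
        buffer = AVAudioPCMBuffer(pcmFormat: mic.avAudioNode.outputFormat(forBus: 0),
                                  frameCapacity: 4_096)

        AudioKit.output = mic
        AudioKit.start()

        // Poll as often as you like; each poll drains everything that has
        // arrived since the last one.
        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [weak self] _ in
            guard let self = self, let tap = self.tap, let buffer = self.buffer else { return }
            var peak: Float = 0
            while tap.copyNextBufferTo(buffer) {
                if let samples = buffer.floatChannelData?[0] {
                    for i in 0..<Int(buffer.frameLength) {
                        peak = max(peak, abs(samples[i]))
                    }
                }
            }
            print("peak:", peak)
        }
    }
}
```

A `Timer` is fine for metering like this; for lower-latency polling you could drive the same loop from a `CADisplayLink` instead.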