I'm using the Ant Media SDK to develop a WebRTC-capable app. I need to send some frames every 300 ms to a recognition service. I have found that there's a listener that passes the frames:
surfaceTextureHelper.startListening((VideoFrame frame) -> {
    // some code
});
The thing is, how can I get a byte[] from that frame? With the Camera1 API you had setPreviewCallback and received a byte[] with the frame directly. How can I do the same conversion here?
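In case it helps frame the question, here is a minimal sketch of the conversion I have in mind, assuming the org.webrtc classes bundled with the SDK (VideoFrame.I420Buffer and YuvHelper); the class and method names (FrameBytes, toNV21) are just placeholders I made up:

import java.nio.ByteBuffer;

import org.webrtc.VideoFrame;
import org.webrtc.YuvHelper;

public final class FrameBytes {

    // Converts a VideoFrame into an NV21 byte[], the layout Camera1's
    // setPreviewCallback delivered by default.
    public static byte[] toNV21(VideoFrame frame) {
        // toI420() gives a CPU-readable buffer even for texture-backed frames;
        // the returned buffer must be released when we are done with it.
        VideoFrame.I420Buffer i420 = frame.getBuffer().toI420();
        try {
            int width = i420.getWidth();
            int height = i420.getHeight();

            // NV21 size: full-resolution Y plane plus interleaved chroma at quarter size.
            ByteBuffer nv21 = ByteBuffer.allocateDirect(width * height * 3 / 2);

            // I420ToNV12 with the U and V planes swapped produces NV21 (VU interleaving).
            YuvHelper.I420ToNV12(
                    i420.getDataY(), i420.getStrideY(),
                    i420.getDataV(), i420.getStrideV(),
                    i420.getDataU(), i420.getStrideU(),
                    nv21, width, height);

            byte[] bytes = new byte[nv21.capacity()];
            nv21.rewind();
            nv21.get(bytes);
            return bytes;
        } finally {
            i420.release();
        }
    }
}

Inside the startListening callback I would then call this only every 300 ms (e.g. by comparing System.currentTimeMillis() against the last send time) before handing the byte[] to the recognition service. Is something like this the right approach, or is there a more direct way?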
Thanks in advance