I want to stream an external video from my iOS device. The video comes from a live stream: an RTSP server or an HLS URL (not from the iPhone camera).
Currently I can stream my camera video from the iPhone using VideoCore (which internally uses CameraSource and MicSource), but now the video I want to stream comes from a URL, similar to how Periscope streams video from a GoPro camera.
Problem 1: I don't know how to extract audio and video from an RTSP URL.
Problem 2: I don't know how to create a CameraSource or MicSource from the extracted video and audio.
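For Problem 2, what I imagine is a custom source that, instead of capturing from the camera or mic, simply accepts buffers pushed from outside. VideoCore's real sources are C++ classes, so the Swift below is only an illustration of the shape I'm after, not VideoCore's actual API; ExternalMediaSource and its delegate are names I made up.

```swift
import CoreMedia
import CoreVideo

/// Illustrative only: the shape of a "source" that is fed frames and samples
/// from outside, instead of capturing them the way CameraSource/MicSource do.
/// This is NOT VideoCore's real API.
protocol ExternalMediaSourceDelegate: AnyObject {
    func source(_ source: ExternalMediaSource, didOutput pixelBuffer: CVPixelBuffer, at time: CMTime)
    func source(_ source: ExternalMediaSource, didOutput audioSamples: CMSampleBuffer)
}

final class ExternalMediaSource {
    weak var delegate: ExternalMediaSourceDelegate?

    /// Called by whatever extracts video frames from the stream (see the extractor sketch below).
    func push(pixelBuffer: CVPixelBuffer, at time: CMTime) {
        delegate?.source(self, didOutput: pixelBuffer, at: time)
    }

    /// Called by whatever extracts audio from the stream.
    func push(audioSamples: CMSampleBuffer) {
        delegate?.source(self, didOutput: audioSamples)
    }
}
```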
Do you know where to find an example or could you help me with this technical challenge?
I found a first approach for the first problem:
Then create a callback with CADisplayLink, which will call displayPixelBuffer: at every vsync; in that method, get the pixel buffer and send it to the output. For audio, do similar work in the prepare callback using AURenderCallbackStruct.
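Here is roughly what I picture for the video half of that approach, assuming the stream is the HLS URL played through an AVPlayer (AVFoundation has no native RTSP support, so the RTSP case would need a separate decoder). PlayerFrameExtractor and the onPixelBuffer handoff are names I made up; the audio side (the prepare/render callbacks) is not shown.

```swift
import AVFoundation
import UIKit

/// Pulls decoded frames out of an AVPlayer that is playing the HLS URL.
final class PlayerFrameExtractor: NSObject {

    private let player: AVPlayer
    private let videoOutput: AVPlayerItemVideoOutput
    private var displayLink: CADisplayLink?

    /// Called with the most recent decoded pixel buffer whenever a new frame is available.
    var onPixelBuffer: ((CVPixelBuffer, CMTime) -> Void)?

    init(url: URL) {
        // BGRA is convenient if the buffer is later handed to an encoder or Metal/OpenGL.
        let attributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)

        let item = AVPlayerItem(url: url)
        item.add(videoOutput)
        player = AVPlayer(playerItem: item)
        super.init()
    }

    func start() {
        // CADisplayLink fires at every vsync, as described in the approach above.
        let link = CADisplayLink(target: self, selector: #selector(displayPixelBuffer(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
        player.play()
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
        player.pause()
    }

    @objc private func displayPixelBuffer(_ link: CADisplayLink) {
        // Ask the video output whether a new frame is ready for "now".
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                            itemTimeForDisplay: nil) else {
            return
        }
        // Hand the frame to whatever consumes it (e.g. a custom VideoCore-style source).
        onPixelBuffer?(pixelBuffer, itemTime)
    }
}
```

So the open question is mainly how to wire the pixel buffers (and the tapped audio) delivered this way into VideoCore in place of CameraSource and MicSource.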