Is it possible for me to generate CMSampleBuffers in real time and stream them to a TV via AirPlay, similarly to how I can render those frames directly to an AVSampleBufferDisplayLayer on the iOS device?
I know that I can use AirPlay with the HTTP Live Streaming (HLS) protocol, but, as mentioned, in my case the video is being generated on the iOS device itself, so I want to stream it in "real time" over AirPlay.
For audio it appears to be doable with AVSampleBufferRenderSynchronizer and AVSampleBufferAudioRenderer (as outlined in https://developer.apple.com/documentation/avfoundation/streaming_and_airplay/supporting_airplay_in_your_app). But I can't find any information regarding video.
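For reference, the audio path from that Apple article looks roughly like this: attach an AVSampleBufferAudioRenderer to an AVSampleBufferRenderSynchronizer and feed it buffers on demand. This is just a sketch; `nextAudioSampleBuffer()` is a hypothetical stand-in for whatever generates your audio CMSampleBuffers.

```swift
import AVFoundation

let audioRenderer = AVSampleBufferAudioRenderer()
let synchronizer = AVSampleBufferRenderSynchronizer()
synchronizer.addRenderer(audioRenderer)

// Feed buffers whenever the renderer can accept more data.
let feederQueue = DispatchQueue(label: "audio.feeder")
audioRenderer.requestMediaDataWhenReady(on: feederQueue) {
    while audioRenderer.isReadyForMoreMediaData {
        guard let buffer = nextAudioSampleBuffer() else {  // hypothetical source
            audioRenderer.stopRequestingMediaData()
            return
        }
        audioRenderer.enqueue(buffer)
    }
}

// Start playback; the synchronizer drives the renderer's timeline.
synchronizer.setRate(1.0, time: .zero)
```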
I haven't tested it yet, but I think iOS 17 may have added support for this via AVSampleBufferVideoRenderer.
The docs are very sparse, but it at least exists publicly; see: https://developer.apple.com/documentation/avfoundation/avsamplebuffervideorenderer
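If it works the way the audio renderer does, usage would presumably mirror it: AVSampleBufferVideoRenderer adopts AVQueuedSampleBufferRendering, so it should attach to the same AVSampleBufferRenderSynchronizer. An untested sketch, with `nextVideoSampleBuffer()` as a placeholder for your frame generator:

```swift
import AVFoundation

if #available(iOS 17.0, *) {
    let videoRenderer = AVSampleBufferVideoRenderer()
    let synchronizer = AVSampleBufferRenderSynchronizer()
    synchronizer.addRenderer(videoRenderer)

    // Enqueue generated frames whenever the renderer asks for more.
    let feederQueue = DispatchQueue(label: "video.feeder")
    videoRenderer.requestMediaDataWhenReady(on: feederQueue) {
        while videoRenderer.isReadyForMoreMediaData {
            guard let buffer = nextVideoSampleBuffer() else {  // hypothetical source
                videoRenderer.stopRequestingMediaData()
                return
            }
            videoRenderer.enqueue(buffer)
        }
    }

    synchronizer.setRate(1.0, time: .zero)
}
```

Whether the system actually routes this renderer's output over AirPlay the way it does for AVSampleBufferAudioRenderer is exactly the part that would need testing on a device.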