In my mobile app, I would like to stream live video to YouTube via RTMP. According to the Wikipedia article Adaptive bitrate streaming, when streaming video, the client is responsible for requesting a higher or lower bitrate:
The streaming client is made aware of the available streams at differing bit rates, and segments of the streams by a manifest file. When starting, the client requests the segments from the lowest bit rate stream. If the client finds the download speed is greater than the bit rate of the segment downloaded, then it will request the next higher bit rate segments. Later, if the client finds the download speed for a segment is lower than the bit rate for the segment, and therefore the network throughput has deteriorated, then it will request a lower bit rate segment.
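To make the quoted download-side logic concrete, here is a minimal Swift sketch of that segment-selection rule. The `Rendition` type and `nextRendition` function are hypothetical illustrations, not part of any real player API:

```swift
import Foundation

/// One rendition advertised in the manifest (hypothetical model type).
struct Rendition {
    let bitrate: Double   // bits per second
    let url: URL
}

/// Picks the next rendition using the rule quoted above: start at the
/// lowest bitrate, step up while the measured download speed exceeds the
/// bitrate of the segment just downloaded, step down when it falls below.
func nextRendition(renditions: [Rendition],        // sorted ascending by bitrate
                   current: Int,                   // index of the rendition just downloaded
                   measuredDownloadBps: Double) -> Int {
    if measuredDownloadBps > renditions[current].bitrate {
        return min(current + 1, renditions.count - 1)  // network has headroom: step up
    } else if measuredDownloadBps < renditions[current].bitrate {
        return max(current - 1, 0)                     // throughput deteriorated: step down
    }
    return current
}
```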
Since I'm concerned with uploading a live stream from a mobile device (Android & iOS), I wonder what the roles of the server and the client are. Would YouTube's end have to request higher-bitrate content, or is the decision made on the mobile side?
Does anyone know which server YouTube uses for RTMP and whether it supports adaptive bitrate when uploading a live stream? Thanks for your help.
I made an adaptive-bitrate RTMP encoding prototype on an iOS device using VideoCore and streamed it to YouTube, so it appears that the YouTube Live Streaming API supports adaptive bitrate encoding for RTMP streams. In that setup the adaptation happens on the client side: the app measures its own upload throughput and adjusts the encoder's output bitrate accordingly, rather than the server requesting a different bitrate.
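For reference, this is roughly the kind of upload-side control loop involved, sketched in Swift. The `LiveEncoder` protocol, `UploadBitrateController` class, and the 10%/20% step sizes and bitrate bounds are hypothetical placeholders, not VideoCore's actual API:

```swift
import Foundation

/// Hypothetical interface over a live video encoder like the one VideoCore
/// wraps; the names here are illustrative only.
protocol LiveEncoder: AnyObject {
    var videoBitrate: Int { get set }   // bits per second
}

/// Adjusts the *encoder's* bitrate based on how fast the RTMP connection is
/// actually draining, which is the upload-side mirror of the playback logic
/// quoted in the question.
final class UploadBitrateController {
    private let encoder: LiveEncoder
    private let minBitrate = 300_000      // assumed floor, 300 kbps
    private let maxBitrate = 4_000_000    // assumed ceiling, 4 Mbps

    init(encoder: LiveEncoder) {
        self.encoder = encoder
    }

    /// Call periodically with the measured outbound throughput.
    func didMeasureThroughput(bitsPerSecond: Int) {
        let current = encoder.videoBitrate
        if bitsPerSecond > Int(Double(current) * 1.2) {
            // Upload keeps up comfortably: step the encoder up by 10%.
            encoder.videoBitrate = min(current + current / 10, maxBitrate)
        } else if bitsPerSecond < current {
            // Frames are backing up: drop the encoder by 20% to relieve pressure.
            encoder.videoBitrate = max(current - current / 5, minBitrate)
        }
    }
}
```

The server (YouTube's RTMP ingest) simply accepts whatever bitrate the encoder sends; all of the decision-making in this sketch lives on the mobile device.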