MediaStreamSource video streaming in UWP

I just started experimenting with MediaStreamSource in UWP. I took the MediaStreamSource streaming sample from Microsoft and tried to rewrite it to support mp4 instead of mp3. I changed nothing but the InitializeMediaStreamSource part, which now looks like this:

{
    var clip = await MediaClip.CreateFromFileAsync(inputMP3File);
    var audioTrack = clip.EmbeddedAudioTracks.First();
    var property = clip.GetVideoEncodingProperties();

    // initialize Parsing Variables
    byteOffset = 0;
    timeOffset = new TimeSpan(0);

    var videoDescriptor = new VideoStreamDescriptor(property);
    var audioDescriptor = new AudioStreamDescriptor(audioTrack.GetAudioEncodingProperties());

    MSS = new MediaStreamSource(videoDescriptor)
    {
        Duration = clip.OriginalDuration
    };

    // hooking up the MediaStreamSource event handlers
    MSS.Starting += MSS_Starting;
    MSS.SampleRequested += MSS_SampleRequested;
    MSS.Closed += MSS_Closed;

    media.SetMediaStreamSource(MSS);
}    

My problem is that I cannot find a single example where a video stream is used instead of audio, so I can't figure out what's wrong with my code. If I set the MediaElement's Source property to the given mp4 file, it plays like a charm. If I pick an mp3 and leave the videoDescriptor out, it works as well. But if I try to do the same with a video, nothing happens: the SampleRequested event is triggered and no error is thrown, which makes it really hard to debug. It's a real pain. :S I'm also still not sure whether I should pass the audioDescriptor as a second argument to the MediaStreamSource or not; since I've got one mixed stream, I guess it's not needed.
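
For reference, this is what passing both descriptors would look like (just a sketch using the same variables as above; I haven't confirmed it makes a difference):

// Sketch only: MediaStreamSource also has a constructor that takes two
// descriptors, so both streams could be described explicitly like this.
MSS = new MediaStreamSource(videoDescriptor, audioDescriptor)
{
    Duration = clip.OriginalDuration
};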

1 Answer

I have a solution for building a working video MediaStreamSource from bitmap files, but unfortunately I have not found a solution for an RGBA buffer yet. First of all, read the MediaStreamSource class documentation: https://learn.microsoft.com/en-us/uwp/api/windows.media.core.mediastreamsource

I'm creating an MJPEG MediaStreamSource:

var mediaStreamSource = new MediaStreamSource(
    new VideoStreamDescriptor(
        VideoEncodingProperties.CreateUncompressed(
            CodecSubtypes.VideoFormatMjpg, size.Width, size.Height
        )
    )
);

Then set a buffer time:

mediaStreamSource.BufferTime = TimeSpan.FromSeconds(1);

Then subscribe to the SampleRequested event to supply each requested frame:

mediaStreamSource.SampleRequested += async (MediaStreamSource sender, MediaStreamSourceSampleRequestedEventArgs args) =>
{
    // Take a deferral because the sample is produced asynchronously.
    var deferral = args.Request.GetDeferral();
    try
    {
        // Timestamp relative to when playback was started.
        var timestamp = DateTime.Now - startedAt;

        // Load the JPEG and hand it over as an MJPEG sample.
        var file = await Windows.ApplicationModel.Package.Current.InstalledLocation.GetFileAsync(@"Assets\grpPC1.jpg");
        using (var stream = await file.OpenReadAsync())
        {
            args.Request.Sample = await MediaStreamSample.CreateFromStreamAsync(
                stream.GetInputStreamAt(0), (uint)stream.Size, timestamp);
        }
        args.Request.Sample.Duration = TimeSpan.FromSeconds(5);
    }
    finally
    {
        deferral.Complete();
    }
};

As you can see in my sample, I use CodecSubtypes.VideoFormatMjpg and a hardcoded path to a jpeg file that I permanently reuse as the MediaStreamSample. What still needs to be researched is which CodecSubtypes value to set in order to use an RGBA (4 bytes per pixel) bitmap like this:

var buffer = new Windows.Storage.Streams.Buffer(size.Width * size.Height * 4);
// latestBitmap is SoftwareBitmap
latestBitmap.CopyToBuffer(buffer);
args.Request.Sample = MediaStreamSample.CreateFromBuffer(buffer, timestamp);
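
One candidate that might be worth trying here (I have not verified it) is CodecSubtypes.VideoFormatRgb32, which describes an uncompressed 32-bit-per-pixel format; the stream descriptor would then be built like this:

// Unverified sketch: an uncompressed 32 bpp video stream descriptor.
// Whether the pipeline accepts BGRA8 SoftwareBitmap buffers with this
// subtype still needs to be tested.
var rgbSource = new MediaStreamSource(
    new VideoStreamDescriptor(
        VideoEncodingProperties.CreateUncompressed(
            CodecSubtypes.VideoFormatRgb32, size.Width, size.Height
        )
    )
);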