
MicrosoftDX / AzureRTMPIngestLib

Licence: other
A library for Windows UWP applications to enable live multibitrate camera capture to be published to Azure Media Services over RTMP

Programming Languages

C++
C#

About this library

Universal Windows applications often run on devices that have on-device or connected media capture peripherals such as cameras and microphones. What if you want to build a Windows application that captures audio/video content using these peripherals and streams it live to a multitude of viewers? To gain scale you would most likely want to leverage a cloud-based media processing and delivery platform such as Azure Media Services. But how do you get the media from your custom universal application to Azure? This library contains types and APIs that can be used from Universal Windows Platform (UWP) applications running on Windows to publish locally captured audio/video content to Azure Media Services live over RTMP. The library also supports advanced features such as local transcoding for adaptive/multi-bitrate streaming, low-latency encoding, handling timestamp discontinuities when reusing a single Azure channel, and a few others.

Adding the library to your UWP project

You can download and install the setup package (Microsoft.Media.RTMP.Setup.vsix) on your development machine. Once the VSIX is installed, you can add a reference to the library by navigating to the Universal Windows Extensions dialog in Visual Studio and selecting “Microsoft RTMP Publish Library for Windows”. Alternatively, you can add the source C++ project (Microsoft.Media.RTMP) to the same solution alongside your UWP app, and then add a reference to the C++ project directly from your app. This option may make step-through debugging of the library code from your app code more straightforward.

Using the library

You will use the MediaCapture class, available as part of the Windows Runtime APIs, to integrate the library functionality into your app. The MediaCapture class is what you would normally use to capture audio/video in your UWP application. For more on MediaCapture, see https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.capture.mediacapture.aspx. To start, add the Microsoft.Media.RTMP namespace to your app code. Before you start capturing media, create an instance of the RTMPPublishSession type. The RTMPPublishSession constructor accepts a collection of PublishProfile instances. Each PublishProfile instance represents one target profile that you want your captured media to be transcoded to, plus some additional details about the RTMP packaging. To construct a PublishProfile instance, provide the following information (a short sketch of putting this together follows the list):

  • The target server type as an instance of the RTMPServerType enumerated type. Currently only RTMPServerType.Azure is supported.
  • The RTMP publish endpoint URL
  • The target encoding profile as an instance of the MediaEncodingProfile class. For more on MediaEncodingProfile, see https://msdn.microsoft.com/en-us/library/windows/apps/windows.media.mediaproperties.mediaencodingprofile.aspx
  • The suggested key frame interval (GOP size) in seconds. This parameter is optional and defaults to 4 seconds if not supplied.
  • The RTMP chunk size, in bytes. This parameter is optional and defaults to 128 bytes if not specified.
  • A unique name for the RTMP stream for this target profile. This parameter is optional and the library generates a unique name if not supplied.
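
For illustration, here is a minimal C# sketch that puts the items above together into a single PublishProfile and an RTMPPublishSession. Only the type names (PublishProfile, RTMPServerType, RTMPPublishSession, MediaEncodingProfile) come from the description above; the constructor shape, the property names (EndpointUri, TargetEncodingProfile, KeyFrameInterval, ClientChunkSize, StreamName) and the placeholder ingest URL are assumptions, so check the library headers or the included sample application for the exact API.

```csharp
using System.Collections.Generic;
using Windows.Media.MediaProperties;
using Microsoft.Media.RTMP;

// Minimal sketch - the property names below are assumed from the parameter
// list above, not taken from the library's actual headers.
var targetProfile = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.HD720p);

var publishProfile = new PublishProfile(RTMPServerType.Azure)
{
    // Placeholder Azure channel ingest URL - replace with your own.
    EndpointUri = "rtmp://example.channel.mediaservices.windows.net:1935/live/abc123",
    TargetEncodingProfile = targetProfile,
    KeyFrameInterval = 4,  // seconds; optional, defaults to 4
    ClientChunkSize = 128  // bytes; optional, defaults to 128
    // StreamName is optional; the library generates a unique name if omitted.
};

var session = new RTMPPublishSession(new List<PublishProfile> { publishProfile });
```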

If you want to deliver an adaptive bitrate stream to your viewers, your content needs to be transcoded into the different bitrate/resolution combinations you want to make available as part of that stream. You can of course deliver a single bitrate over RTMP and leverage the Azure Live Encoding and dynamux features, which consume your single-bitrate stream and dynamically convert it to all the target bitrate/resolution pairs you need. If that is what you want to do, simply provide a list containing a single PublishProfile instance to the RTMPPublishSession constructor, with the details of the one bitrate profile you want to publish over RTMP, and then set up the transcoding details in your Azure environment. Alternatively, if you want to leverage your client and produce the different bitrate/resolution pairs for adaptive streaming right on the client, provide a list containing one PublishProfile instance per target encoding profile to the RTMPPublishSession constructor, and the library will do the rest (see the sketch below).
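As a rough illustration of the client-side multi-bitrate option, the sketch below builds one PublishProfile per target quality and hands the whole list to the RTMPPublishSession constructor. As in the previous sketch, the property names are assumptions, the ingest URL is a placeholder, and the chosen VideoEncodingQuality values are only examples.

```csharp
using System.Collections.Generic;
using System.Linq;
using Windows.Media.MediaProperties;
using Microsoft.Media.RTMP;

// Placeholder Azure channel ingest URL - replace with your own.
string ingestUrl = "rtmp://example.channel.mediaservices.windows.net:1935/live/abc123";

// One rendition per target quality; all publish to the same Azure channel.
var qualities = new[]
{
    VideoEncodingQuality.HD1080p,
    VideoEncodingQuality.HD720p,
    VideoEncodingQuality.Wvga
};

var profiles = qualities.Select(q => new PublishProfile(RTMPServerType.Azure)
{
    EndpointUri = ingestUrl,                                   // assumed property name
    TargetEncodingProfile = MediaEncodingProfile.CreateMp4(q)  // one target encoding per profile
}).ToList();

var session = new RTMPPublishSession(profiles);
```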

Once this is all set up, and assuming your MediaCapture instance is also initialized and ready to go, call GetCaptureSinkAsync() on the RTMPPublishSession instance. It returns a capture sink object that you pass to the StartRecordToCustomSinkAsync() method on the MediaCapture instance. When capture stops, the RTMPPublishSession.SessionClosed event is raised. The SessionClosedEventArgs object contains two useful properties, LastVideoTimestamp and LastAudioTimestamp: the timestamps attached to the last video and audio RTMP packets sent to the Azure endpoint before the session ended. These are handy if you intend to use the same Azure channel for publishing but start and stop the publish process many times. Each Azure channel expects media to be “continuous” in terms of timestamps (unless you reset the channel between stops and starts, which is a time-consuming process), whereas a new publishing session normally starts its timestamps from 0. If you store the values mentioned above, you can start subsequent sessions at those timestamps (adding a second or two to each) by supplying them via the PublishProfile.VideoTimestampBase and PublishProfile.AudioTimestampBase properties, as in the sketch below.
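The following sketch shows how these pieces could fit together, written as members of your capture page or view model class. It assumes mediaCapture has already been initialized; the recording profile passed to StartRecordToCustomSinkAsync, the handler signature, and the timestamp offset are assumptions, while the member names (GetCaptureSinkAsync, SessionClosed, SessionClosedEventArgs, LastVideoTimestamp, LastAudioTimestamp, VideoTimestampBase, AudioTimestampBase) come from the description above.

```csharp
using System.Threading.Tasks;
using Windows.Media.Capture;
using Windows.Media.MediaProperties;
using Microsoft.Media.RTMP;

private SessionClosedEventArgs _lastClose; // timestamps from the previous session

private async Task StartPublishingAsync(MediaCapture mediaCapture, RTMPPublishSession session)
{
    session.SessionClosed += (sender, args) =>
    {
        // Keep the last RTMP timestamps so a later session on the same Azure
        // channel can continue the timeline instead of restarting at 0.
        _lastClose = args;
    };

    // Ask the library for its capture sink and hand it to MediaCapture.
    var sink = await session.GetCaptureSinkAsync();

    // The recording profile describes the captured (source) stream; the library
    // transcodes it to the PublishProfile targets supplied earlier.
    // HD720p here is just an example.
    var recordProfile = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.HD720p);
    await mediaCapture.StartRecordToCustomSinkAsync(recordProfile, sink);
}

// Later, when building profiles for a follow-up session on the same channel
// (property types and the "add a second or two" offset are assumptions):
// nextProfile.VideoTimestampBase = _lastClose.LastVideoTimestamp + oneOrTwoSeconds;
// nextProfile.AudioTimestampBase = _lastClose.LastAudioTimestamp + oneOrTwoSeconds;
```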

If there is a failure or error of any sort, the PublishFailed event is raised, followed by the SessionClosed event. Once the SessionClosed event is raised (whether under normal conditions or after an error), you should not attempt to use the existing RTMPPublishSession object any more; if you need to publish again, create a new instance (a brief sketch of this pattern follows the sample application notes below).

Sample application

The sample application provided with the library demonstrates a complete publishing workflow. To use it, first create an Azure Media Services instance and create at least one channel in it without any encoding support. Then create a program within the channel, but leave it stopped. When you run the app, supply your Azure credentials; the app lets you pick the channel, then the devices (camera/microphone) you want to use, and your target encoding profiles. Once you have made your selections, click the publish button and head back to the Azure portal. Start the program and then copy the publish URL; you should be able to view the stream in any player, such as Azure Media Player.
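For reference, here is a brief sketch of the error-handling pattern described above. Only the event and type names (PublishFailed, SessionClosed, RTMPPublishSession, PublishProfile) come from the description; the handler signatures are assumptions.

```csharp
// Assumed handler signatures - verify against the library headers.
session.PublishFailed += (sender, args) =>
{
    // Surface or log the failure; SessionClosed will be raised next.
};

session.SessionClosed += (sender, args) =>
{
    // Do not reuse this RTMPPublishSession. To publish again, build fresh
    // PublishProfile instances (optionally seeding VideoTimestampBase /
    // AudioTimestampBase from the stored timestamps) and construct a new
    // RTMPPublishSession.
};
```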
