2/2 Azure Media Services: Automatically Encode and Publish Media

Published on Mar 29, 2021
By Vladimir / Software Engineer

After part 1 about Azure Media Services: Automatically Encode and Publish, it is time for the second and final part. Let's start with the actual encoding process:

1. Detect that a blob has been uploaded to a container

Set up a function that gets triggered when a file is uploaded to a particular container. Mine looks like this:

/// <summary>
/// The name of the container whose uploads will trigger the function
/// </summary>
private const string SOURCE_CONTAINER_NAME = "raw-videos";

[FunctionName("BlobTriggeredVideoEncoder")]
public static async Task Run([BlobTrigger(SOURCE_CONTAINER_NAME + "/{name}", Connection = "AzureWebJobsStorage")] ICloudBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"Video encoding function triggered by blob upload: Name:{name} Size: {myBlob.Properties.Length} Bytes"); 
    // ...
    // the invocations of all the methods described below take place here;
    // the complete Run method can be found later in the post
}

This function will be triggered whenever a file is uploaded to the raw-videos/ container of the storage account connected to my AMS instance. Note that I am using an ICloudBlob as the parameter. The complete Run method can be found after step 6.

2. Create an empty input asset

First, the method that creates the empty input asset:

private static Asset createEmptyInputAsset(string fileName, MediaServicesConfigWrapper amsConfig, IAzureMediaServicesClient client, ILogger log)
{
    Guid assetGuid = Guid.NewGuid();
    string fileNameCleaned = fileName.Replace('.', '-').Replace(' ', '-');
    string assetName = "INPUT-ASSET-" + fileNameCleaned + "-" + assetGuid.ToString();
    string assetDescription = "Input asset for file " + fileName;

    Asset assetParams = new Asset(null, assetName, null, assetGuid, DateTime.Now, DateTime.Now, null, assetDescription, null, STORAGE_ACCOUNT_NAME, AssetStorageEncryptionFormat.None);

    Asset asset = client.Assets.CreateOrUpdate(amsConfig.ResourceGroup, amsConfig.AccountName, assetName, assetParams);

    string destinationContainer = "asset-" + asset.AssetId;
    log.LogInformation($"--- Created empty asset with ID: {asset.AssetId}, name: {assetName}, desinationContainer: {destinationContainer}");

    return asset;
}

There is some basic file name sanitization taking place at the beginning of the method. The STORAGE_ACCOUNT_NAME variable corresponds to the name of the storage account connected to the AMS account. The empty asset will be created in the root of that storage account and displayed within the list of assets of the AMS account.

The fileName is the name of the uploaded file. It is only used to ensure that all the resources get adequate names.

The amsConfig parameter is an object that carries the configuration details for the connection to AMS.

The client is the object through which the actual connection to AMS takes place.

The method returns a reference to the newly created asset.
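The MediaServicesConfigWrapper class itself is not shown in this post. A minimal sketch of what it might hold, assuming the values come from the Function App's application settings; the property and setting names beyond ResourceGroup and AccountName are my assumptions:

/// <summary>
/// Hypothetical config wrapper; ResourceGroup and AccountName are the properties
/// used throughout this post, the rest are assumed to exist for authentication
/// </summary>
public class MediaServicesConfigWrapper
{
    public string SubscriptionId { get; set; } = Environment.GetEnvironmentVariable("AMS_SUBSCRIPTION_ID");
    public string ResourceGroup { get; set; } = Environment.GetEnvironmentVariable("AMS_RESOURCE_GROUP");
    public string AccountName { get; set; } = Environment.GetEnvironmentVariable("AMS_ACCOUNT_NAME");
    public string AadTenantId { get; set; } = Environment.GetEnvironmentVariable("AMS_AAD_TENANT_ID");
    public string AadClientId { get; set; } = Environment.GetEnvironmentVariable("AMS_AAD_CLIENT_ID");
    public string AadSecret { get; set; } = Environment.GetEnvironmentVariable("AMS_AAD_SECRET");
}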

3. Copy the blob from step 1 into the asset from step 2

Next, the blob whose upload triggered the function needs to be copied into the newly created asset, so that it can be used as input for the encoding process. Unfortunately, Azure does not provide a way of combining steps 2 and 3.

private static Task<string> startCopyBlobContainerToAsset(CloudBlob sourceBlob, Asset asset, MediaServicesConfigWrapper amsConfig, IAzureMediaServicesClient client, ILogger log)
{
    // Get a read-write SAS for the blob container backing the asset
    var response = client.Assets.ListContainerSas(amsConfig.ResourceGroup, amsConfig.AccountName, asset.Name, permissions: AssetContainerPermission.ReadWrite, expiryTime: DateTime.UtcNow.AddHours(4));
    var sasUri = new Uri(response.AssetContainerSasUrls.First());
    CloudBlobContainer destinationBlobContainer = new CloudBlobContainer(sasUri);

    log.LogInformation($"--- Initiated copy of the blob {sourceBlob.Name} to the asset {asset.Name}");

    // Copy the source blob into the destination container associated with the asset, asynchronously
    return BlobStorageHelper.CopyBlobAsync(sourceBlob, destinationBlobContainer.GetBlockBlobReference(sourceBlob.Name));
}

The method initiates the copying of the blob asynchronously.
The sourceBlob parameter is the reference to the blob that triggered the function.
The asset parameter is the empty input asset created earlier.

The returned object is the asynchronous task of copying the blob.
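The BlobStorageHelper.CopyBlobAsync helper is not shown in this post. A minimal sketch of what it could look like, assuming the legacy WindowsAzure.Storage SDK used in the rest of the code; the read SAS on the source is my assumption, since the source container is private:

public static class BlobStorageHelper
{
    public static async Task<string> CopyBlobAsync(CloudBlob sourceBlob, CloudBlockBlob destinationBlob)
    {
        // Make the source blob readable for the storage service performing the copy
        string sasToken = sourceBlob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(4)
        });

        // Hand the copy off to the storage service and poll until it is no longer pending
        string copyId = await destinationBlob.StartCopyAsync(new Uri(sourceBlob.Uri.AbsoluteUri + sasToken));
        await destinationBlob.FetchAttributesAsync();
        while (destinationBlob.CopyState.Status == CopyStatus.Pending)
        {
            await Task.Delay(TimeSpan.FromSeconds(1));
            await destinationBlob.FetchAttributesAsync();
        }

        if (destinationBlob.CopyState.Status != CopyStatus.Success)
        {
            throw new InvalidOperationException($"Blob copy failed with status {destinationBlob.CopyState.Status}");
        }

        return copyId;
    }
}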

4. Acquire a transform

public static Transform getOrCreateTransform(MediaServicesConfigWrapper amsConfig, IAzureMediaServicesClient client, ILogger log)
{
    Transform transform = client.Transforms.Get(amsConfig.ResourceGroup, amsConfig.AccountName, TRANSFORM_NAME);

    log.LogInformation($"--- Requested transform with name {TRANSFORM_NAME}");

    if (transform == null)
    {
        List<TransformOutput> transformOutputs = new List<TransformOutput>();
        Preset preset = new BuiltInStandardEncoderPreset(ENCODING_PRESET_NAME);
        transformOutputs.Add(new TransformOutput(preset, OnErrorType.StopProcessingJob, Priority.Normal));
        string description = $"Transform with preset {ENCODING_PRESET_NAME}";

        // Create Transform
        transform = client.Transforms.CreateOrUpdate(amsConfig.ResourceGroup, amsConfig.AccountName, TRANSFORM_NAME, transformOutputs.ToArray(), description);

        log.LogInformation($"--- Created transform with name {TRANSFORM_NAME}");
    }
    else
    {
        log.LogInformation($"--- Got transform with name {TRANSFORM_NAME}");
    }

    return transform;
}

This method uses the client to retrieve a transform with a predefined name. If no existing transform is found, it creates a new one instead, using a predefined encoding preset.

The transform defines the outputs of the encoding process, the presets that are going to be used for it, and so on. This is a really simplified transform creation, but a more in-depth example can be found amongst the Azure samples.
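If the built-in AdaptiveStreaming preset is not flexible enough, a custom preset can be passed to the TransformOutput instead of the BuiltInStandardEncoderPreset above. A hedged sketch, loosely following the shape of the Azure samples; the bitrates, resolutions, and labels are purely illustrative:

// Could replace new BuiltInStandardEncoderPreset(ENCODING_PRESET_NAME) above
Preset customPreset = new StandardEncoderPreset
{
    Codecs = new Codec[]
    {
        new AacAudio { Channels = 2, SamplingRate = 48000, Bitrate = 128000, Profile = AacAudioProfile.AacLc },
        new H264Video
        {
            KeyFrameInterval = TimeSpan.FromSeconds(2),
            Layers = new H264Layer[]
            {
                new H264Layer { Bitrate = 1000000, Width = "1280", Height = "720", Label = "HD" },
                new H264Layer { Bitrate = 600000, Width = "640", Height = "360", Label = "SD" }
            }
        }
    },
    Formats = new Format[]
    {
        // One MP4 file per video layer, named after the source file, label and bitrate
        new Mp4Format { FilenamePattern = "Video-{Basename}-{Label}-{Bitrate}{Extension}" }
    }
};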

5. Submit the encoding job

Now, for the actual encoding – AMS handles this part in the background once a job is submitted.

public static void createMediaJobAndPublishOutputAsset(Asset inputAsset, Transform transform, MediaServicesConfigWrapper amsConfig, IAzureMediaServicesClient client, ILogger log)
{
    string guid = Guid.NewGuid().ToString();
    string jobName = inputAsset.Name + "-V3-ENCODING-JOB-" + guid;
    var jobOutputList = new List<JobOutput>();

    // preparing the output assets, where all the encoded files will be accessible from
    for (int i = 0; i < transform.Outputs.Count; i++)
    {
        Guid assetGuid = Guid.NewGuid();
        string outputAssetName = "ENCODED-" + inputAsset.Name.Substring("INPUT-ASSET-".Length);
        string outputAssetDescription = $"{outputAssetName}-{ENCODING_PRESET_NAME}";

        Asset assetParams = new Asset(null, outputAssetName, null, assetGuid, DateTime.Now, DateTime.Now, null, outputAssetDescription, null, STORAGE_ACCOUNT_NAME, AssetStorageEncryptionFormat.None);

        Asset outputAsset = client.Assets.CreateOrUpdate(amsConfig.ResourceGroup, amsConfig.AccountName, outputAssetName, assetParams);
        jobOutputList.Add(new JobOutputAsset(outputAssetName));

        string locatorName = "abel-streaming-locator-" + DateTime.Now.ToString("yyyyddMM-HHmmss-") + assetGuid;

        // publishes the asset, see next step
        createStreamingLocatorAsync(outputAssetName, locatorName, amsConfig, client, log);
    }

    // Use the name of the created input asset to create the job input
    log.LogInformation($"--- Initiating transform job with name {jobName}");

    JobInput jobInput = new JobInputAsset(assetName: inputAsset.Name);
    JobOutput[] jobOutputs = jobOutputList.ToArray();
    Job job = client.Jobs.Create(
        amsConfig.ResourceGroup,
        amsConfig.AccountName,
        transform.Name,
        jobName,
        new Job
        {
            Input = jobInput,
            Outputs = jobOutputs,
        }
    );
}

Once again, there is some basic shaping of the output names for convenience's sake, followed by the creation of a new asset, which will be used as the output. After this, a streaming locator is created for the newly created asset (more on that in the next step).

Once the encoding job completes, you will be able to find the newly generated files by going to the output asset (see next step) and selecting 'Storage container'. The files can also be found manually by going to the storage account. There should be a list of files differing in video and audio quality, each significantly smaller than the input file.
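The code above does not wait for the job to finish. If you need to know programmatically when the encoding is done, a simple option is to poll the job state. A hedged sketch (in a real setup, an Event Grid subscription on job state changes would scale better):

private static async Task waitForJobToFinishAsync(string jobName, Transform transform, MediaServicesConfigWrapper amsConfig, IAzureMediaServicesClient client, ILogger log)
{
    Job job;
    do
    {
        await Task.Delay(TimeSpan.FromSeconds(10));
        job = await client.Jobs.GetAsync(amsConfig.ResourceGroup, amsConfig.AccountName, transform.Name, jobName);
        log.LogInformation($"--- Job {jobName} is in state {job.State}");
    }
    while (job.State != JobState.Finished && job.State != JobState.Error && job.State != JobState.Canceled);
}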

6. Publish the encoded asset

At the end of step 5 the encoding job has only just started, and it might take a while before the actual files are available in the storage container. The output asset itself, however, is already available after the call to client.Assets.CreateOrUpdate. Since we need to publish the asset itself and not the files within it, this is a good time to create a streaming locator for the output.

private static void createStreamingLocatorAsync(string assetName, string locatorName, MediaServicesConfigWrapper amsconfig, IAzureMediaServicesClient client, ILogger log)
{
    log.LogInformation($"--- Creating streaming locator for asset {assetName}");
    client.StreamingLocators.CreateAsync(
        amsconfig.ResourceGroup,
        amsconfig.AccountName,
        locatorName,
        new StreamingLocator
        {
            AssetName = assetName,
            StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
        });
}

This function does not bother with returning the streaming locator to us in any way; the publishing process simply gets started and is left to its own devices in the background.

At this point the asset can be accessed at a generated URL, but the video will not play until the encoding job is complete.

The asset can be found by navigating to the Media Service Account -> Assets (under the Media Services section).

By selecting the ENCODED-.. asset, you will be taken to a screen where the streaming URL can be found. It points to the manifest of the newly encoded asset. Based on this manifest, video players can determine which of the encoded files to serve to the user.
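The streaming URL can also be assembled in code instead of looking it up in the portal. A hedged sketch, assuming the account's default streaming endpoint is used and running; the helper itself is mine, not part of the original flow:

private static async Task<string> getStreamingUrlAsync(string locatorName, MediaServicesConfigWrapper amsConfig, IAzureMediaServicesClient client)
{
    StreamingEndpoint endpoint = await client.StreamingEndpoints.GetAsync(amsConfig.ResourceGroup, amsConfig.AccountName, "default");
    ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(amsConfig.ResourceGroup, amsConfig.AccountName, locatorName);

    // Each StreamingPath carries the relative manifest path for one protocol (Smooth, HLS or DASH)
    string relativePath = paths.StreamingPaths.First().Paths.First();
    return $"https://{endpoint.HostName}{relativePath}";
}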

The rest of the code

The complete method from step 1 and the variables used across the functions look like this:

/// <summary>
/// The name of the storage account where the assets will be stored
/// </summary>
private const string STORAGE_ACCOUNT_NAME = "<your-storage-acc-name>";

/// <summary>
/// The name of the container whose uploads will trigger the function
/// </summary>
private const string SOURCE_CONTAINER_NAME = "raw-videos";

/// <summary>
/// The name of the Transform, which will be used to start the encoding jobs
/// </summary>
private const string TRANSFORM_NAME = "<your-transform-name>";

/// <summary>
/// The name of the Preset that will be used for the encoding process
/// </summary>
private const string ENCODING_PRESET_NAME = "AdaptiveStreaming";


[FunctionName("BlobTriggeredVideoEncoder")]
public static async Task Run([BlobTrigger(SOURCE_CONTAINER_NAME + "/{name}", Connection = "AzureWebJobsStorage")] ICloudBlob myBlob, string name, ILogger log)
{
    log.LogInformation($"Video encoding function triggered by blob upload: \n Name:{name} \n Size: {myBlob.Properties.Length} Bytes");

    try
    {
        MediaServicesConfigWrapper amsConfig = new MediaServicesConfigWrapper();
        IAzureMediaServicesClient client = await MediaServicesHelper.CreateMediaServicesClientAsync(amsConfig);

        Asset inputAsset = createEmptyInputAsset(name, amsConfig, client, log);

        Task<string> copyInputBlobToAssetTask = startCopyBlobContainerToAsset((CloudBlob) myBlob, inputAsset, amsConfig, client, log);

        Transform transform = getOrCreateTransform(amsConfig, client, log);

        await copyInputBlobToAssetTask;
        createMediaJobAndPublishOutputAsset(inputAsset, transform, amsConfig, client, log);
    }
    catch (ApiErrorException e)
    {
        log.LogError($"ERROR: AMS API call error: {e.Message}; \nError Code: {e.Body.Error.Code}; \nMessage: {e.Body.Error.Message}; \nStack trace: {e.StackTrace}");
        throw;
    }
    catch (Exception e)
    {
        log.LogError($"ERROR: Exception with message: {e.Message}; \nStack trace: {e.StackTrace}");
        throw;
    }
}

Notable parts are the basic error logging, as well as the configuration of the amsConfig and client objects, which were referred to in pretty much every other method.
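The MediaServicesHelper.CreateMediaServicesClientAsync method is not shown in this post either. A minimal sketch, assuming a service principal whose credentials live in the config wrapper sketched earlier and the Microsoft.Rest.Azure.Authentication package; the exact credential plumbing is my assumption:

public static class MediaServicesHelper
{
    public static async Task<IAzureMediaServicesClient> CreateMediaServicesClientAsync(MediaServicesConfigWrapper config)
    {
        // Log in as a service principal and wrap the token in ServiceClientCredentials
        var clientCredential = new ClientCredential(config.AadClientId, config.AadSecret);
        ServiceClientCredentials credentials = await ApplicationTokenProvider.LoginSilentAsync(
            config.AadTenantId, clientCredential, ActiveDirectoryServiceSettings.Azure);

        return new AzureMediaServicesClient(credentials)
        {
            SubscriptionId = config.SubscriptionId
        };
    }
}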

Using the encoded and published video file

As mentioned before, the streaming URL for a published asset points to a manifest file. Various HTML5 players support playback from such manifests and come equipped with support for manual selection of the video playback quality, or automatic selection based on the user's device and connection quality.

Azure Media Player is one such player and is also the player used to preview a published asset in AMS. It is available as a JavaScript library and is relatively easy to set up. It supports a variety of functionalities out of the box – watermarks and thumbnails, the ability to switch between different video quality settings and audio streams (pretty handy for multi-language clips, for example), and various others.

Should I keep anything particular in mind when opting for this approach?

This approach comes with one quirk that should definitely be kept in mind, especially if you are working with really big input files. If you opt for a Consumption plan for the Function App – which would make sense if you only need to upload videos once in a while and this is the only function you need – the function will not run for more than 10 minutes (and the default timeout is 5 minutes), as per the documentation.

The only ‘dangerous’ part of the function is the awaiting of the copy process from the blob to the input asset. It should not be an issue in most cases, but it is still something to keep in mind. If you face any issues related to this, I recommend breaking the example above into multiple functions: for example, one function that creates an empty asset and initiates the copy process, and another that gets triggered once the input asset has been prepared, as sketched below.
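A hedged sketch of what that second function could look like, assuming the first function drops the name of the prepared input asset on a storage queue; the queue name and message shape are my assumptions:

[FunctionName("QueueTriggeredJobSubmitter")]
public static async Task SubmitJob(
    [QueueTrigger("prepared-input-assets", Connection = "AzureWebJobsStorage")] string inputAssetName,
    ILogger log)
{
    MediaServicesConfigWrapper amsConfig = new MediaServicesConfigWrapper();
    IAzureMediaServicesClient client = await MediaServicesHelper.CreateMediaServicesClientAsync(amsConfig);

    // The input asset was created and filled by the first function
    Asset inputAsset = await client.Assets.GetAsync(amsConfig.ResourceGroup, amsConfig.AccountName, inputAssetName);
    Transform transform = getOrCreateTransform(amsConfig, client, log);

    createMediaJobAndPublishOutputAsset(inputAsset, transform, amsConfig, client, log);
}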

Another aspect that can be improved upon is that the code shown above does not provide any retry or notification mechanism in case something goes wrong; this could be incorporated within the flow without too much hassle, depending on the use case.