Blazor.MediaCaptureStreams

A Blazor wrapper for the Media Capture and Streams browser API.

The API standardizes ways to request access to local multimedia devices, such as microphones or video cameras. This also includes the MediaStream API, which provides the means to control where multimedia stream data is consumed, and provides some information and configuration options for the devices that produce the media. This project implements a wrapper around the API for Blazor so that we can easily and safely interact with the media streams of the browser.

Demo

The sample project can be demoed at https://kristofferstrube.github.io/Blazor.MediaCaptureStreams/

On each page, you can find the corresponding code for the example in the top right corner.

On the API Coverage Status page, you can see how much of the WebIDL specification this wrapper covers.

Getting Started

Prerequisites

You need to install .NET 7.0 or newer to use the library.

Download .NET 7

Installation

You can install the package via NuGet with the Package Manager in your IDE or alternatively using the command line:

dotnet add package KristofferStrube.Blazor.MediaCaptureStreams

Usage

The package can be used in Blazor WebAssembly and Blazor Server projects.

Import

You need to reference the package in order to use it in your pages. This can be done in _Imports.razor by adding the following:

@using KristofferStrube.Blazor.MediaCaptureStreams

Add to service collection

The library has a single service, IMediaDevicesService, which can be used to access the MediaDevices of the current browser context. An easy way to make the service available on all your pages is to register it in the IServiceCollection so that it can be dependency injected into the pages that need it. This is done in Program.cs by calling our AddMediaDevicesService() extension before building the host, as in the following code block. If you use Blazor WASM, you also need to invoke the SetupErrorHandlingJSInterop() extension on the IServiceProvider after building the host but before running it, like we do here:

var builder = WebAssemblyHostBuilder.CreateDefault(args);
builder.RootComponents.Add<App>("#app");
builder.RootComponents.Add<HeadOutlet>("head::after");

builder.Services.AddScoped(sp => new HttpClient { BaseAddress = new Uri(builder.HostEnvironment.BaseAddress) });

// Adding IMediaDevicesService to service collection.
builder.Services.AddMediaDevicesService();

var app = builder.Build();

// For Blazor WASM you need to call this to set up error handling JS interop.
await app.Services.SetupErrorHandlingJSInterop();

await app.RunAsync();
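
For a Blazor Server project, only the service registration itself is needed, since the error handling JS interop setup above is specific to Blazor WASM. A minimal sketch of the relevant lines in a standard .NET 7 Blazor Server Program.cs could look like this:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRazorPages();
builder.Services.AddServerSideBlazor();

// Adding IMediaDevicesService to service collection.
builder.Services.AddMediaDevicesService();

var app = builder.Build();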

Inject in page

Then the service can be injected into a page

@inject IMediaDevicesService MediaDevicesService;

and we can use it to get the MediaDevices object that is the primary access point for the API and open a MediaStream representing a microphone device like so:

@if (error is not null)
{
    <code>@error</code>
}
else if (deviceLabel is not null)
{
    <p>We opened the media device with label: <b>@deviceLabel</b></p>
}
else
{
    <button @onclick="Open">Open Microphone MediaStream</button>
}

@code {
    private string? deviceLabel;
    private string? error;

    private async Task Open()
    {
        try
        {
            MediaDevices mediaDevices = await MediaDevicesService.GetMediaDevicesAsync();
            MediaStream mediaStream = await mediaDevices.GetUserMediaAsync(new MediaStreamConstraints() { Audio = true });
            MediaStreamTrack[] audioTracks = await mediaStream.GetAudioTracksAsync();
            deviceLabel = await audioTracks.First().GetLabelAsync();
        }
        catch (WebIDLException ex)
        {
            error = $"{ex.GetType().Name}: {ex.Message}";
        }
    }
}

We can read, process, and record the MediaStream and MediaStreamTracks through other APIs. Among these are the Web Audio, WebRTC, MediaStream Image Capture, and MediaStream Recording APIs. We are currently working on a Blazor wrapper for the Web Audio API in the Blazor.WebAudio project. If you want an example of outputting the content of a video stream track, check out the live video demo sample.
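
As a rough idea of how that can look, here is a minimal sketch (not the approach used in the demo sample). It assumes that the wrapper objects expose their underlying JS object through the JSReference property from the Blazor.WebIDL package, and that you add a small, hypothetical setVideoSrcObject helper to your own script that assigns a stream to a video element's srcObject:

@inject IMediaDevicesService MediaDevicesService
@inject IJSRuntime JSRuntime

<video id="videoFeed" autoplay playsinline width="480"></video>
<button @onclick="StartVideo">Start camera video</button>

@code {
    private async Task StartVideo()
    {
        MediaDevices mediaDevices = await MediaDevicesService.GetMediaDevicesAsync();
        MediaStream mediaStream = await mediaDevices.GetUserMediaAsync(new MediaStreamConstraints() { Video = true });

        // setVideoSrcObject is a hypothetical helper in your own script that looks up
        // the element by id and assigns the underlying JS MediaStream to its srcObject.
        await JSRuntime.InvokeVoidAsync("setVideoSrcObject", "videoFeed", mediaStream.JSReference);
    }
}

The matching JS helper can be as simple as window.setVideoSrcObject = (id, stream) => { document.getElementById(id).srcObject = stream; } registered in your own script file.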

But before using the MediaStream and MediaStreamTracks in other APIs, we can further constrain the details of the streams. We can do this by passing a MediaTrackConstraints to the Audio property of the MediaStreamConstraints in our previous example instead of just true. Before making constraints, we can also check what the possible range of values is for our devices so that we don't try to overconstrain them. In the following sample, we open a MediaStream representing some camera, then check the capabilities of its first track and apply some new constraints to it based on the capabilities' max values for the resolution.

MediaDevices mediaDevices = await MediaDevicesService.GetMediaDevicesAsync();
MediaStream mediaStream = await mediaDevices.GetUserMediaAsync(new MediaStreamConstraints() { Video = true });

MediaStreamTrack[] videoTracks = await mediaStream.GetVideoTracksAsync();
MediaStreamTrack firstVideoTrack = videoTracks.First();

MediaTrackCapabilities capabilities = await firstVideoTrack.GetCapabilitiesAsync();
await firstVideoTrack.ApplyContraintsAsync(new MediaTrackConstraints()
    {
        Height = capabilities.Height?.Max,
        Width = capabilities.Width?.Max
    });

If we had not checked the capabilities of the track before setting its values, we could potentially have set some values outside the valid range. If we did that, we would get an OverConstrainedException, which we could catch just as we caught the more general WebIDLException in the previous sample.
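
As a minimal sketch of that (reusing the error field and the injected service from the microphone sample above), the constraint application can be wrapped in the same kind of try/catch:

try
{
    await firstVideoTrack.ApplyContraintsAsync(new MediaTrackConstraints()
    {
        Height = capabilities.Height?.Max,
        Width = capabilities.Width?.Max
    });
}
catch (WebIDLException ex)
{
    // If the requested values had been outside the ranges reported in the capabilities,
    // the overconstrained error would surface here as a WebIDLException.
    error = $"{ex.GetType().Name}: {ex.Message}";
}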

Issues

Feel free to open issues on the repository if you find any errors with the package or have feature requests.

Related repositories

This project uses the Blazor.WebIDL package for error handling JSInterop, and it uses the Blazor.DOM package to listen to events from the EventTargets in the package, like MediaDevices, MediaStream, and MediaStreamTrack.

The project is used in the Blazor.WebAudio library to create AudioNodes representing microphones that can be used together with other audio sources in many different ways. You can also record your audio and video MediaStreams using the Blazor.MediaStreamRecording library.

Related articles

This repository was built with inspiration and help from the following series of articles: