The Twilio Live Player SDK allows you to play back a live stream of a Video Room. Please refer to the SDK API documentation for details.
The Twilio Live Player SDK is distributed under the Twilio Terms of Service.
| | Chrome | Edge (Chromium) | Firefox | Safari |
| --- | --- | --- | --- | --- |
| Android | ✓ * | - | - | - |
| iOS | ✓ * | - | - | ✓ * |
| Linux | ✓ | ✓ | ✓ | ✓ |
| macOS | ✓ | ✓ | ✓ * | ✓ |
| Windows | ✓ | ✓ | ✓ * | ✓ |
You can install the SDK as a dependency of your app by running the following command:
npm install @twilio/live-player-sdk
You can now import the SDK into your project as shown below:
import { Player } from '@twilio/live-player-sdk';
Alternatively, using CommonJS:
const { Player } = require('@twilio/live-player-sdk');
You can deploy node_modules/@twilio/live-player-sdk/dist/build/twilio-live-player.min.js with your application. Once you include it in a <script> tag, you can access the SDK APIs in the window scope as shown below:
const { Player } = Twilio.Live;
Please refer to the Twilio Live docs for starting a live stream of a Video Room from your application server. At the end, you will have an AccessToken which you can use to join the live stream.
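For example, assuming your application server exposes an endpoint that returns this AccessToken, the client could retrieve it as shown in the following sketch (the /live-stream/token endpoint and its { token } response shape are hypothetical; adjust them to match your backend):

```ts
/**
 * Fetch a Player AccessToken from the application server.
 * The endpoint name and response shape are assumptions.
 */
async function getAccessToken(): Promise<string> {
  const response = await fetch('/live-stream/token');
  if (!response.ok) {
    throw new Error(`Failed to fetch the AccessToken: ${response.status}`);
  }
  const { token } = await response.json();
  return token;
}
```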
You can check whether the SDK supports the browser on which the application is running as shown below:
import { Player } from '@twilio/live-player-sdk';
if (Player.isSupported) {
  /**
   * Load your application.
   */
} else {
  /**
   * Inform the user that the browser is not supported.
   */
}
You can now join the live stream from your application using the AccessToken as shown below:
import { Player } from '@twilio/live-player-sdk';
const {
  host,
  protocol,
} = window.location;

/**
 * Join a live stream.
 */
const player = await Player.connect('$accessToken', {
  playerWasmAssetsPath: `${protocol}//${host}/path/to/hosted/player/assets`,
});
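Player.connect() returns a Promise, so connection failures can be handled with try/catch. A minimal sketch of the same call with basic error handling (getAccessToken() is the hypothetical helper sketched above):

```ts
try {
  const player = await Player.connect(await getAccessToken(), {
    playerWasmAssetsPath: `${protocol}//${host}/path/to/hosted/player/assets`,
  });
  /**
   * Proceed with playback setup (event listeners, video element, etc.).
   */
} catch (error) {
  /**
   * Inform the user that the live stream could not be joined.
   */
  console.error('Unable to join the live stream:', error);
}
```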
In order for the SDK to run, your application must host the following artifacts, which are available in node_modules/@twilio/live-player-sdk/dist/build:
twilio-live-player-wasmworker-x-y-z.min.wasm
twilio-live-player-wasmworker-x-y-z.min.js
where x.y.z is the version of the SDK assets.
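For example, if your application is bundled with webpack, copy-webpack-plugin can copy these assets into the build output. The sketch below assumes webpack and a player/assets destination path; any mechanism that serves the files over HTTP works:

```js
// webpack.config.js (sketch): copy the Player SDK assets into the build output.
// The "player/assets" destination is an assumption; use any path you like.
const CopyPlugin = require('copy-webpack-plugin');

module.exports = {
  // ...the rest of your webpack configuration...
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          from: 'node_modules/@twilio/live-player-sdk/dist/build',
          to: 'player/assets',
        },
      ],
    }),
  ],
};
```

With this in place, the playerWasmAssetsPath option passed to Player.connect() would point at the player/assets path on your host.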
After joining the live stream, you can listen to events on the Player as shown below:
player.on(Player.Event.StateChanged, (state: Player.State) => {
  switch (state) {
    case Player.State.Buffering:
      /**
       * The player is buffering content.
       */
      break;
    case Player.State.Ended:
      /**
       * The stream has ended.
       */
      break;
    case Player.State.Idle:
      /**
       * The player has successfully authenticated and is loading the stream. This
       * state is also reached as a result of calling player.pause().
       */
      break;
    case Player.State.Playing:
      /**
       * The player is now playing a stream. This state occurs as a result of calling
       * player.play().
       */
      break;
    case Player.State.Ready:
      /**
       * The player is ready to play back the stream.
       */
      break;
  }
});
You can perform the following playback actions on the live stream:
/**
 * Call this method after the Player transitions to the Player.State.Ready state.
 */
player.play();

/**
 * Pause playback.
 */
player.pause();

/**
 * Mute audio.
 */
player.isMuted = true;

/**
 * Unmute audio.
 */
player.isMuted = false;

/**
 * Set volume.
 */
player.setVolume(0.5);
If your application plays the live stream on page load without a user action, then the browser's autoplay policy may come into effect, in which case the audio will be muted. You can detect when this happens by listening to the Player.Event.VolumeChanged event on the Player as shown below:
player.on(Player.Event.VolumeChanged, () => {
  if (player.isMuted) {
    /**
     * Show the unmute button.
     */
  } else {
    /**
     * Hide the unmute button.
     */
  }
});
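Because autoplay policies typically allow unmuted audio only after a user gesture, the unmute button can simply set player.isMuted to false in its click handler. A minimal sketch, assuming a hypothetical button with the id unmute-button:

```ts
const unmuteButton = document.querySelector<HTMLButtonElement>('button#unmute-button');

unmuteButton?.addEventListener('click', () => {
  /**
   * Unmuting inside a click handler satisfies the browser's autoplay policy.
   */
  player.isMuted = false;
});
```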
In order to render the live stream, you can use the default HTMLVideoElement created by the Player in your application as shown below:
const container = document.querySelector('div#container');
container.appendChild(player.videoElement);
Alternatively, if you want to render the live stream in your own HTMLVideoElement, you can do so as shown below:
const videoElement = document.querySelector('div#container > video');
/**
 * Enable inline playback on iOS browsers.
 */
videoElement.playsInline = true;
player.attach(videoElement);
When a Media Extension inserts TimedMetadata into a stream, you can receive it by listening to the Player.Event.TimedMetadataReceived event as shown below:
player.on(Player.Event.TimedMetadataReceived, (metadata: Player.TimedMetadata) => {
  /**
   * Handle the metadata.
   */
});
You can disconnect from the live stream as shown below:
player.disconnect();
This is a terminal operation; once disconnected, the Player instance can no longer be used by the application.
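For example, you might disconnect when the user leaves the page; a minimal sketch, assuming the pagehide event is an appropriate teardown point for your application:

```ts
window.addEventListener('pagehide', () => {
  /**
   * Tear down the Player when the user navigates away. Since disconnect()
   * is terminal, a new Player must be created with Player.connect() to
   * join the stream again.
   */
  player.disconnect();
});
```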
Known issues:
The Player may not transition to the ended state after a live stream is stopped by ending a MediaProcessor.
Player.stats values are always either 0 or null.