@twilio/video-processors

Twilio Video Processors JavaScript Library

Twilio Video Processors

Twilio Video Processors is a collection of video processing tools that can be used with the Twilio Video JavaScript SDK to apply transformations and filters to a VideoTrack.

   See it live here!

Features

The following Video Processors are provided to apply transformations and filters to a person's background. You can also use them as a reference for creating your own Video Processors that can be used with the Twilio Video JavaScript SDK.

- Virtual Background
- Blurred Background
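If you want to build your own processor, the Twilio Video JavaScript SDK runs any object that implements a processFrame method once it is attached to a track with VideoTrack.addProcessor. The snippet below is a hypothetical grayscale processor, not part of this library, shown only as a minimal sketch of that interface:

// Hypothetical custom processor: draws each captured frame onto the
// output canvas with a grayscale CSS filter applied (Chromium only).
class GrayscaleProcessor {
  processFrame(inputFrameBuffer, outputFrameBuffer) {
    const ctx = outputFrameBuffer.getContext('2d');
    ctx.filter = 'grayscale(100%)';
    // Scale the input frame to fill the output canvas.
    ctx.drawImage(inputFrameBuffer, 0, 0, outputFrameBuffer.width, outputFrameBuffer.height);
  }
}

You would then attach it to a video track with videoTrack.addProcessor(new GrayscaleProcessor()).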

Prerequisites

Note

The Node.js and NPM requirements do not apply if you only want to use this library as a dependency of your project. They apply only if you want to check out the source code, build the artifacts, and/or run the tests.

Installation

NPM

You can install directly from npm.

npm install @twilio/video-processors --save

Using this method, you can import twilio-video-processors like so:

import * as VideoProcessors from '@twilio/video-processors';

Script tag

You can also copy twilio-video-processors.js from the dist/build folder and include it directly in your web app using a <script> tag.

<script src="https://my-server-path/twilio-video-processors.js"></script>

Using this method, twilio-video-processors.js will set a browser global:

const VideoProcessors = Twilio.VideoProcessors;

Assets

In order to achieve the best performance, the VideoProcessors use WebAssembly to run TensorFlow Lite for person segmentation. You need to serve the tflite model and binaries so they can be loaded properly. These files can be downloaded from the dist/build folder. Check the API docs for details and the examples folder for reference.
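For illustration, here is a minimal sketch of pointing a processor at those hosted assets. The GaussianBlurBackgroundProcessor class name, the assetsPath option, and the loadModel method are assumptions based on the current API docs and may differ between versions; the asset URL is a placeholder for your own hosting path.

import { GaussianBlurBackgroundProcessor } from '@twilio/video-processors';

// assetsPath (assumed option) points to wherever you host the tflite model
// and WebAssembly binaries copied from the dist/build folder.
const blurProcessor = new GaussianBlurBackgroundProcessor({
  assetsPath: 'https://my-server-path/twilio-video-processors/assets',
});

// loadModel (assumed method) fetches the model and binaries from assetsPath.
// It should complete before the processor is added to a track.
await blurProcessor.loadModel();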

Usage

These processors are currently supported only on Chromium-based desktop browsers and will not work on other browsers. For best performance and accuracy, we recommend setting the video capture constraints to a 24 fps frame rate with 640x480 capture dimensions when calling Video.createLocalVideoTrack. Higher resolutions can still be used for increased accuracy, but they may degrade performance and result in a lower output frame rate on low-powered devices.
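As a sketch, creating a track with the recommended constraints and applying a processor might look like the following. It uses the createLocalVideoTrack and addProcessor APIs of the Twilio Video SDK and assumes blurProcessor is the processor instance from the Assets sketch above.

import * as Video from 'twilio-video';

// Capture at the recommended 640x480 resolution and 24 fps frame rate.
const videoTrack = await Video.createLocalVideoTrack({
  width: 640,
  height: 480,
  frameRate: 24,
});

// Apply a processor, for example the blurProcessor created earlier.
videoTrack.addProcessor(blurProcessor);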

Additionally, these processors run TensorFlow Lite using the MediaPipe Selfie Segmentation Landscape Model and require Chrome's WebAssembly SIMD support to achieve the best performance. WebAssembly SIMD can be turned on by visiting chrome://flags on Chrome versions 84 through 90, and it is enabled by default on Chrome 91+. You can also enable it on versions 84-90 for your users without turning on the flag by registering your website for a Chrome Origin Trial.

Please check out the following pages for example usage, and refer to the API Docs for more information.