Server-Side Rendering
info
This documentation describes how server-side rendering works in Remotion v3 and above. To see how rendering worked in v2.0 and below, click here.
Remotion's rendering engine is built on Node.js, which makes it easy to render a video in the cloud.
Since Remotion is built with technologies that work well across platforms (Node.js, FFmpeg, Puppeteer), you can run it on a Linux-based system or even dockerize your video without much hassle.
Render a video on AWS Lambda
The easiest and fastest way to render videos in the cloud is to use @remotion/lambda. Click here to read its documentation.
Render a video using Node.js APIs
The NPM package @remotion/renderer provides an API for rendering videos programmatically. You can make a video in three steps: creating a Webpack bundle, rendering the frames, and stitching them together into an MP4. This gives you more independence and allows you, for example, to skip the stitching step if you only want a PNG sequence.
Follow this commented example to see how to render a video:
```tsx
import { bundle } from "@remotion/bundler";
import { getCompositions, renderMedia } from "@remotion/renderer";

const start = async () => {
  // The composition you want to render
  const compositionId = "HelloWorld";

  // Create a Webpack bundle of the video.
  // You only have to do this once; you can reuse the bundle.
  const bundleLocation = await bundle(require.resolve("./src/index"));

  // Parametrize the video by passing arbitrary props to your component.
  const inputProps = {
    custom: "data",
  };

  // Extract all the compositions you have defined in your project
  // from the Webpack bundle.
  const comps = await getCompositions(bundleLocation, {
    // You can pass custom input props that you can retrieve using getInputProps()
    // in the composition list. Use this if you want to dynamically set the
    // duration or dimensions of the video.
    inputProps,
  });

  // Select the composition you want to render.
  const composition = comps.find((c) => c.id === compositionId);

  // Ensure the composition exists
  if (!composition) {
    throw new Error(`No composition with the ID ${compositionId} found`);
  }

  await renderMedia({
    composition,
    serveUrl: bundleLocation,
    codec: "h264",
    outputLocation: "out/video.mp4",
    inputProps,
  });
};

start();
```
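The find-and-validate step in the example above is easy to get wrong when you render many compositions. It can be factored into a small reusable helper; the name `selectComposition` and the `CompositionLike` type below are illustrative, not part of Remotion's API:

```ts
// Hypothetical helper, not part of Remotion's API: picks a composition by ID
// from the list returned by getCompositions() and fails loudly if it is missing.
type CompositionLike = { id: string };

function selectComposition<T extends CompositionLike>(
  comps: T[],
  compositionId: string
): T {
  const composition = comps.find((c) => c.id === compositionId);
  if (!composition) {
    // Listing the available IDs makes typos much faster to diagnose.
    const available = comps.map((c) => c.id).join(", ");
    throw new Error(
      `No composition with the ID ${compositionId} found. Available: ${available}`
    );
  }
  return composition;
}
```

With this helper, the `find`/`if` block in the example collapses to a single call such as `const composition = selectComposition(comps, compositionId);`.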
This flow is highly customizable. Click on one of the SSR APIs to read about its options:
- getCompositions() - Get a list of available compositions from a Remotion project.
- renderMedia() - Render a video or audio.
- renderFrames() - Render an image sequence.
- renderStill() - Render a still image.
- stitchFramesToVideo() - Encode a video based on an image sequence.
- openBrowser() - Share a browser instance across function calls for even better performance.
Note that we only added a minimal example. For production, you should consider adding a queueing system and rate limiting.
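As a starting point for such a queue, a concurrency limiter around your render calls prevents too many Chromium instances from running at once. The sketch below is plain TypeScript with no Remotion dependency; `createRenderQueue` and `Job` are illustrative names, and the jobs you enqueue would wrap calls such as `renderMedia()`:

```ts
// A minimal fixed-concurrency queue: at most `concurrency` jobs run at once,
// the rest wait in FIFO order. Names are illustrative, not a Remotion API.
type Job<T> = () => Promise<T>;

const createRenderQueue = (concurrency: number) => {
  let active = 0;
  const waiting: Array<() => void> = [];

  // Resolves immediately if a slot is free, otherwise parks the caller.
  const acquire = () =>
    new Promise<void>((resolve) => {
      if (active < concurrency) {
        active++;
        resolve();
      } else {
        waiting.push(() => {
          active++;
          resolve();
        });
      }
    });

  // Frees a slot and wakes the next waiting job, if any.
  const release = () => {
    active--;
    const next = waiting.shift();
    if (next) {
      next();
    }
  };

  return async <T,>(job: Job<T>): Promise<T> => {
    await acquire();
    try {
      return await job();
    } finally {
      release();
    }
  };
};
```

You would create one queue per server, e.g. `const enqueue = createRenderQueue(2);`, and submit each incoming render as `enqueue(() => renderMedia({ ... }))`. Rate limiting (e.g. per user or per IP) is a separate concern and is not covered by this sketch.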
Render using GitHub Actions
The template includes a GitHub Actions workflow file under .github/workflows/render-video.yml. All you have to do is adjust the props that your root component accepts in the workflow file, and you can render a video right on GitHub.
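If you are writing such a workflow from scratch, it might look roughly like the sketch below. This is not the template's actual file: the input name `titleText`, the entry point `src/index.tsx`, the composition ID `HelloWorld`, and the output path are all assumptions to adapt to your project.

```yaml
# Sketch of a render workflow, not the template's actual file.
name: Render video
on:
  workflow_dispatch:
    inputs:
      titleText:
        description: "Text to show in the video"
        required: true
jobs:
  render:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
      - run: npm ci
      # Pass the workflow input through to the root component as a prop.
      - run: npx remotion render src/index.tsx HelloWorld out/video.mp4 --props='{"titleText": "${{ github.event.inputs.titleText }}"}'
      - uses: actions/upload-artifact@v3
        with:
          name: video
          path: out/video.mp4
```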
- Commit the template to a GitHub repository
- On GitHub, click the 'Actions' tab.
- Select the 'Render video' workflow on the left.
- A 'Run workflow' button should appear. Click it.
- Fill in the props of the root component and click 'Run workflow'.
- After the rendering is finished, you can download the video under 'Artifacts'.
Note that running the workflow may incur costs. However, the workflow will only run if you actively trigger it.
See also: Passing props in GitHub Actions