How to Render MP4 in Node.js Without FFmpeg
May 14, 2025 · By VideoFlow

Programmatic video in Node.js doesn't have to mean FFmpeg child processes. Learn how to use headless Chromium and WebCodecs for faster, more consistent Node.js video rendering.
For years, programmatic video in Node.js meant one thing: spawning an FFmpeg child process, piping raw frames over stdin, and praying your server had enough RAM and CPU to survive the encode. It’s brittle, hard to debug, and it forces you to manage heavy native binaries in your production environment.
But what if you could render production-grade MP4s using the same engine that powers your browser?
In this guide, we’ll look at a modern approach to Node.js video rendering that bypasses the FFmpeg bottleneck entirely by leveraging WebCodecs and headless Chromium.
The FFmpeg Bottleneck
Most video automation pipelines follow a predictable, painful pattern. You generate frames (often as JPEGs or PNGs) using a library like Canvas or a headless browser, then you pipe those images into an FFmpeg command.
This "screenshot-and-pipe" architecture has three major flaws:
- Overhead: Every frame is encoded to an image format (like JPEG), sent over a pipe, decoded by FFmpeg, and then re-encoded to H.264.
- Complexity: Managing FFmpeg versions across local dev, CI, and production (especially in Lambda or serverless environments) is a DevOps nightmare.
- Inconsistency: What you see in your preview (usually a browser) often differs from the FFmpeg output due to font rendering, CSS support, or color space shifts.
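To make the overhead concrete, here is a minimal sketch of the traditional pattern in Node.js. The function names (`buildFfmpegArgs`, `renderViaPipe`) are illustrative, not from any particular library; the flags are standard FFmpeg options, and an `ffmpeg` binary must be on `PATH` for the pipe to work:

```typescript
import { spawn } from "node:child_process";

// Argument list for the classic "pipe image frames over stdin" pattern.
// Every frame is PNG-encoded in Node, piped to FFmpeg, decoded there,
// then re-encoded to H.264 -- three codec passes per frame.
export function buildFfmpegArgs(fps: number, outputPath: string): string[] {
  return [
    "-y",                      // overwrite output if it exists
    "-f", "image2pipe",        // read a stream of images from stdin
    "-framerate", String(fps),
    "-i", "-",                 // "-" means stdin
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",
    outputPath,
  ];
}

// The pipeline itself: spawn FFmpeg and push every PNG frame through a pipe.
export function renderViaPipe(
  frames: Buffer[],
  fps: number,
  outputPath: string
): Promise<void> {
  const ffmpeg = spawn("ffmpeg", buildFfmpegArgs(fps, outputPath));
  for (const png of frames) ffmpeg.stdin.write(png);
  ffmpeg.stdin.end();
  return new Promise((resolve, reject) => {
    ffmpeg.on("close", (code) =>
      code === 0 ? resolve() : reject(new Error(`ffmpeg exited with ${code}`))
    );
  });
}
```

Every one of those flags is a place where local dev, CI, and production can drift apart, which is exactly the complexity point above.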

A New Path: Headless WebCodecs
The introduction of WebCodecs changed the game. Modern browsers can now encode raw video frames to H.264 through a high-performance, low-level API; pair the encoder's output with a lightweight JavaScript muxer and you get a valid MP4 container. By running a headless Chromium instance on your server, you can tap into this hardware-accelerated (or highly optimized software) encoding path.
This means your server-side rendering engine is literally the same engine that runs your web app. No more "it looks different on the server."
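The browser-side encoding step looks roughly like the sketch below. This is a hedged illustration of the WebCodecs API, not VideoFlow's actual internals: `makeEncoderConfig` and `encodeFrames` are hypothetical helper names, and the muxing step (wrapping the emitted chunks into an MP4) is omitted:

```typescript
// Codec configuration for a 1080x1920 @ 30fps encode.
// "avc1.420028" is H.264 Baseline profile, level 4.0.
export function makeEncoderConfig(width: number, height: number, fps: number) {
  return {
    codec: "avc1.420028",
    width,
    height,
    framerate: fps,
    bitrate: 5_000_000, // 5 Mbps
  };
}

// Minimal shape of a WebCodecs VideoFrame, so this compiles outside the DOM.
type Frame = { close(): void };

export async function encodeFrames(
  frames: Frame[],
  onChunk: (chunk: unknown) => void
): Promise<void> {
  // VideoEncoder is a global in Chromium (including headless); guard for Node.
  const Encoder = (globalThis as any).VideoEncoder;
  if (!Encoder) throw new Error("WebCodecs is not available in this runtime");

  const encoder = new Encoder({
    output: onChunk, // each EncodedVideoChunk goes to your muxer
    error: (e: Error) => { throw e; },
  });
  encoder.configure(makeEncoderConfig(1080, 1920, 30));

  for (const frame of frames) {
    encoder.encode(frame);
    frame.close(); // release the frame's memory immediately
  }
  await encoder.flush(); // resolves once all pending chunks are emitted
}
```

Note there is no image round-trip here: frames go straight from the compositor into the encoder, which is where the performance win over the screenshot-and-pipe pattern comes from.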
Building a Render Pipeline in 20 Lines
With VideoFlow, you don't have to manage the headless browser or the WebCodec muxing logic yourself. The @videoflow/renderer-server package handles the Playwright orchestration and returns a buffer or saves a file directly.
First, install the core and the server renderer:
npm install @videoflow/core @videoflow/renderer-server
npx playwright install chromium
Now, let's build a simple dynamic video:
import VideoFlow from '@videoflow/core';

async function generateSocialVideo(username: string) {
  const $ = new VideoFlow({
    width: 1080,
    height: 1920,
    fps: 30,
    backgroundColor: '#0b0b1f'
  });

  // Add a background image
  const bg = $.addImage({
    source: 'https://images.unsplash.com/photo-example',
    opacity: 0.4
  });

  // Add dynamic text
  const title = $.addText({
    text: `Hello, ${username}!`,
    fontSize: 8,
    color: '#FF5A1F', // VideoFlow Orange
    fontWeight: 900,
    position: [0.5, 0.4]
  });

  // Animate the entrance
  title.fadeIn('800ms');
  title.animate({ scale: 0.8 }, { scale: 1 }, { duration: '1s', easing: 'easeOut' });
  $.wait('3s');

  // Render to MP4
  await $.renderVideo({
    outputType: 'file',
    output: `./videos/${username}-welcome.mp4`,
    verbose: true
  });
}
This code doesn't just "take screenshots." It boots a headless browser, compiles your project into a VideoJSON document, and uses the browser's internal encoding pipeline to produce a valid MP4.
Why "JSON as the New MP4" Matters
When you treat video as code (specifically, as a portable JSON schema), your architecture becomes significantly more flexible. You can:
- Preview instantly: Use @videoflow/renderer-dom to show the user a 60fps live preview in the browser.
- Export locally: Use @videoflow/renderer-browser to let the user's own machine do the heavy lifting of rendering, saving you server costs.
- Render at scale: Send that same JSON to a Node.js worker running @videoflow/renderer-server for batch processing.
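To make the idea concrete, here is a hypothetical sketch of what such a portable video document might look like; the `VideoDoc` and `Layer` shapes below are illustrative assumptions, and the real VideoJSON schema produced by @videoflow/core may differ:

```typescript
// Hypothetical shape of a portable "video as JSON" document.
interface Layer {
  type: "image" | "text";
  source?: string;
  text?: string;
  opacity?: number;
  fontSize?: number;
  color?: string;
  position?: [number, number];
}

interface VideoDoc {
  width: number;
  height: number;
  fps: number;
  layers: Layer[];
}

// One document, renderable anywhere: DOM preview, in-browser export,
// or a Node.js worker farm.
const doc: VideoDoc = {
  width: 1080,
  height: 1920,
  fps: 30,
  layers: [
    { type: "image", source: "https://images.unsplash.com/photo-example", opacity: 0.4 },
    { type: "text", text: "Hello, Ada!", fontSize: 8, color: "#FF5A1F", position: [0.5, 0.4] },
  ],
};

// Batch rendering is just data transformation: stamp out one personalized
// document per user and hand them all to the server renderer.
const jobs: VideoDoc[] = ["ada", "grace"].map((user) => ({
  ...doc,
  layers: doc.layers.map((layer) =>
    layer.type === "text" ? { ...layer, text: `Hello, ${user}!` } : layer
  ),
}));
```

Because the document is plain JSON, it serializes over a queue or HTTP as-is, which is what makes the preview/export/scale split above possible.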

How VideoFlow Handles This
VideoFlow was built from the ground up to be an FFmpeg alternative for developers who want a purely TypeScript-driven workflow.
- @videoflow/core: The engine that manages the timeline, animations, and transitions. It produces a VideoJSON document that is platform-agnostic.
- @videoflow/renderer-server: The Node.js package we've discussed. It uses Playwright to drive Chromium. It can fall back to FFmpeg if you explicitly ask for it (e.g., for specific x264 flags), but the default path is purely browser-based.
- Cinematic presets: You get access to 27 transitions (like glitchResolve or blurResolve) and 42 GLSL effects (like bloom or frosted glass) that work exactly the same on your dev machine as they do on the server.
By moving the rendering logic into the browser engine, we eliminate the biggest source of "rendering drift" in programmatic video.
Conclusion
If you are still wrestling with FFmpeg command-line strings or managing binary dependencies in your Dockerfiles, it’s time to look at a WebCodecs-first approach. Node.js video rendering has evolved beyond the shell script.
Ready to start building?
- Explore the Playground to see the API in action.
- Check out the Docs for the full list of effects and transitions.
- Star the project on GitHub.