The producer package combines the engine’s frame capture with FFmpeg encoding to deliver a complete HTML-to-video rendering pipeline. It supports MP4 (h264) and WebM (VP9 with alpha transparency), and handles runtime injection, readiness gates, audio mixing, and optional Docker-based deterministic rendering.
npm install @hyperframes/producer

When to Use

Use @hyperframes/producer when you need to:
  • Render compositions to MP4 or WebM programmatically from Node.js (e.g., in a backend service or CI pipeline)
  • Build a custom rendering service with fine-grained control over the pipeline
  • Run visual regression tests against golden baselines
  • Benchmark render performance across different configurations
Use a different package if you want to:
  • Render from the command line without writing code — use the CLI (npx hyperframes render)
  • Preview compositions in the browser — use the CLI or studio
  • Capture frames without encoding — use the engine
  • Lint or parse composition HTML — use core
If you are building a web application or script that just needs to render a video, the CLI is the fastest path. The producer package is for when you need programmatic control inside Node.js.

What It Does

The producer orchestrates the full render pipeline:
1. Load the composition HTML: reads your index.html and any referenced sub-compositions.
2. Inject the Hyperframes runtime: adds the runtime script that manages timeline seeking, clip lifecycle, and media playback.
3. Wait for readiness gates: polls for window.__playerReady and window.__renderReady to ensure all assets (fonts, images, video) are loaded before capture begins.
4. Capture frames via the engine: uses the engine's BeginFrame pipeline to capture each frame as a pixel buffer.
5. Encode to MP4 or WebM via FFmpeg: pipes frame buffers into FFmpeg with the selected quality preset. MP4 uses h264; WebM uses VP9 with alpha transparency support.
6. Mix audio tracks: extracts audio from video clips and audio elements, applies data-volume and data-media-start offsets, and mixes them into the final output.
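The encoding step above can be sketched as a small argument builder. This is an illustrative sketch only; `ffmpegArgs` is not a producer export, and the exact flags the producer passes to FFmpeg may differ:

```typescript
// Sketch of FFmpeg arguments for step 5: pipe raw frames in on stdin and
// pick the codec/pixel format based on the output container.
function ffmpegArgs(format: 'mp4' | 'webm', fps: number, out: string): string[] {
  const input = ['-f', 'image2pipe', '-framerate', String(fps), '-i', '-'];
  const video = format === 'webm'
    ? ['-c:v', 'libvpx-vp9', '-pix_fmt', 'yuva420p'] // VP9 with alpha channel
    : ['-c:v', 'libx264', '-pix_fmt', 'yuv420p'];    // h264, no alpha
  return [...input, ...video, out];
}
```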

Programmatic Usage

The producer uses a two-step API: create a render job configuration, then execute it.
import { createRenderJob, executeRenderJob } from '@hyperframes/producer';

const job = createRenderJob({
  input: './my-video/index.html',
  output: './output.mp4',
  fps: 30,
  quality: 'standard',
});

const result = await executeRenderJob(job);

Render Configuration

import type { RenderConfig } from '@hyperframes/producer';

const config: RenderConfig = {
  fps: 30,                   // 24, 30, or 60
  quality: 'standard',       // 'draft', 'standard', or 'high'
  format: 'mp4',             // 'mp4' or 'webm' (WebM renders with transparency)
  workers: 4,                // Parallel render workers (1-8)
  useGpu: false,             // GPU-accelerated encoding
  debug: false,              // Debug logging
};
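A helper like the following shows how the documented ranges constrain a config. It is a hypothetical sketch, not a producer export; the fallback values are assumptions:

```typescript
// Normalize a partial config against the documented limits:
// fps must be 24, 30, or 60; workers is clamped to 1-8.
function normalizeConfig(cfg: { fps: number; workers: number }) {
  const fps = [24, 30, 60].includes(cfg.fps) ? cfg.fps : 30; // assume 30 as fallback
  const workers = Math.min(8, Math.max(1, Math.trunc(cfg.workers))); // clamp to 1-8
  return { fps, workers };
}
```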

WebM with Transparency

Set format: 'webm' to render with a transparent background using VP9 alpha:
const job = createRenderJob({
  input: './my-overlay/index.html',
  output: './overlay.webm',
  fps: 30,
  quality: 'standard',
  format: 'webm',
});

await executeRenderJob(job);
When format: 'webm':
  • Frames are captured as PNG (preserves alpha channel)
  • Chrome’s page background is set to transparent via CDP
  • FFmpeg encodes with VP9 + yuva420p pixel format
  • Audio is encoded as Opus (instead of AAC for MP4)

Progress Callbacks

import type { ProgressCallback, RenderStatus } from '@hyperframes/producer';

const onProgress: ProgressCallback = (status: RenderStatus) => {
  console.log(`Status: ${status}`);
  // Statuses: "queued" | "preprocessing" | "rendering" | "encoding"
  //           | "assembling" | "complete" | "failed" | "cancelled"
};
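If you drive a UI from these callbacks, one option is to map each status to a coarse percentage. The percentages below are arbitrary illustrations, not values reported by the producer:

```typescript
// Map render statuses to a rough progress percentage for a UI.
type Status = 'queued' | 'preprocessing' | 'rendering' | 'encoding'
  | 'assembling' | 'complete' | 'failed' | 'cancelled';

const STATUS_PERCENT: Record<Status, number> = {
  queued: 0, preprocessing: 10, rendering: 40, encoding: 75,
  assembling: 90, complete: 100, failed: 100, cancelled: 100,
};

function percentFor(status: Status): number {
  return STATUS_PERCENT[status];
}
```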

Cancellation

import { RenderCancelledError } from '@hyperframes/producer';

try {
  await executeRenderJob(job);
} catch (err) {
  if (err instanceof RenderCancelledError) {
    console.log(`Cancelled: ${err.reason}`);
    // reason: "user_cancelled" | "timeout" | "aborted"
  }
}
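If you want to impose your own deadline rather than rely on the producer's timeout handling, a generic wrapper works. This uses no producer API; it simply rejects when the wrapped promise takes too long:

```typescript
// Reject with an error if `promise` does not settle within `ms` milliseconds.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error('render timed out')), ms);
    promise.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); },
    );
  });
}
```

For example, `await withTimeout(executeRenderJob(job), 10 * 60_000)` would fail a render that runs longer than ten minutes.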

HTTP Server

The producer includes a built-in HTTP server for running as a rendering service:
import { startServer } from '@hyperframes/producer/server';

await startServer({ port: 8080 });

Server Endpoints

| Method | Path | Description |
| --- | --- | --- |
| POST | /render | Blocking render; returns a JSON result |
| POST | /render/stream | Streaming render with Server-Sent Events |
| POST | /lint | Lint a composition for issues |
| GET | /health | Health check |
| GET | /outputs/:token | Download a rendered MP4 |
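Consuming the /render/stream endpoint means parsing Server-Sent Events. A minimal message parser looks like this; note the payload shape inside `data:` is an assumption, not a documented schema:

```typescript
// Parse one SSE message (lines separated by '\n') into its event and data fields.
function parseSseMessage(raw: string): { event?: string; data?: string } {
  const msg: { event?: string; data?: string } = {};
  for (const line of raw.split('\n')) {
    if (line.startsWith('event:')) msg.event = line.slice('event:'.length).trim();
    else if (line.startsWith('data:')) msg.data = line.slice('data:'.length).trim();
  }
  return msg;
}
```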
For custom server integration, use the lower-level handlers:
import { createRenderHandlers, createProducerApp } from '@hyperframes/producer/server';

// Get individual request handlers
const handlers = createRenderHandlers(options);

// Or get a full Hono app
const app = createProducerApp(options);

Docker Rendering

For deterministic output, the producer can render inside a Docker container with a pinned Chrome version and font set. This guarantees identical output across machines — critical for CI pipelines and production services.
# Via the CLI (recommended)
npx hyperframes render --docker --output output.mp4
Docker mode requires Docker to be installed and running. Run npx hyperframes doctor to verify your environment. See Deterministic Rendering for details on what makes Docker mode deterministic.

Quality Presets

| Preset | Resolution | Encoding | Use Case |
| --- | --- | --- | --- |
| draft | Original | Fast CRF | Quick iteration, previewing edits |
| standard | Original | Balanced CRF | Production renders, sharing |
| high | Original | High-quality CRF | Final delivery, archival |

GPU Encoding

The producer supports hardware-accelerated encoding for faster renders:
| Platform | Encoder | Flag |
| --- | --- | --- |
| NVIDIA | NVENC | Auto-detected |
| macOS | VideoToolbox | Auto-detected |
| Linux | VAAPI | Auto-detected |
GPU encoding is automatically used when available. To check your system’s capabilities:
npx hyperframes doctor
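The selection logic behind the table above can be pictured as a simple fallback chain. This is an illustrative sketch; the producer's real auto-detection presumably probes FFmpeg and the hardware rather than just the platform name:

```typescript
// Pick an FFmpeg h264 encoder name based on platform and GPU availability.
function pickEncoder(platform: string, hasNvidiaGpu: boolean): string {
  if (hasNvidiaGpu) return 'h264_nvenc';                 // NVIDIA NVENC
  if (platform === 'darwin') return 'h264_videotoolbox'; // macOS VideoToolbox
  if (platform === 'linux') return 'h264_vaapi';         // Linux VAAPI
  return 'libx264';                                      // software fallback
}
```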

Additional Exports

The producer also re-exports key engine functionality for convenience:
| Export | Description |
| --- | --- |
| createCaptureSession() | Create a frame capture session |
| initializeSession() | Initialize a session with a composition |
| captureFrame() / captureFrameToBuffer() | Capture individual frames |
| closeCaptureSession() | Clean up a capture session |
| getCompositionDuration() | Get total composition duration |
| getCapturePerfSummary() | Get capture performance metrics |
| createFileServer() | Create an HTTP file server for serving assets |
| createVideoFrameInjector() | Create a video frame injector for a page |
| resolveConfig() / DEFAULT_CONFIG | Producer configuration |
| createConsoleLogger() / defaultLogger | Logging utilities |
| quantizeTimeToFrame() | Convert a time to a frame boundary |
| resolveRenderPaths() | Resolve render directory paths |
| prepareHyperframeLintBody() / runHyperframeLint() | Linting utilities |
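To illustrate what frame quantization means, here is a standalone sketch of the idea behind quantizeTimeToFrame(). The rounding mode is an assumption; the real export may floor instead:

```typescript
// Snap a timestamp (seconds) to the boundary of its nearest frame at `fps`.
function quantizeTime(timeSec: number, fps: number): number {
  return Math.round(timeSec * fps) / fps;
}
```

For example, at 30 fps a timestamp of 0.034s snaps to 1/30s (frame 1), so captures land exactly on frame boundaries.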

Regression Testing

The producer includes a regression harness for comparing render output against golden baselines. This is useful for catching visual regressions when changing the runtime, engine, or rendering pipeline.
cd packages/producer

# Build the test Docker image
bun run docker:build:test

# Run regression tests (compares output against golden baselines)
bun run docker:test

# Regenerate golden baselines after intentional changes
bun run docker:test:update

Benchmarking

Find optimal render settings for your hardware:
# Via the CLI
npx hyperframes benchmark

# Directly from the producer package
cd packages/producer
bun run benchmark
The benchmark runs several compositions with different quality and FPS settings and reports timing for each combination.

CLI

Command-line interface that wraps the producer for rendering, previewing, and more.

Engine

The low-level capture pipeline that the producer uses to grab frames.

Core

Types, runtime, and linter that the producer depends on.

Studio

Visual editor for building compositions before rendering with the producer.