First of all, I’ve found this repository, which hosts a JavaScript/WebAssembly port of FFmpeg:
https://github.com/ffmpegwasm/ffmpeg.wasm
I’m curious whether I can somehow feed my canvas output, together with the frame times, into it to produce a video file (for example, to visualize a physics object).
So far, I’ve set up a basic physics simulator in JS. I have a bunch of squares being rendered based on their x and y coordinates.
class PhysicsObject {
  // ...
  render(canvas, ctx) {
    ctx.fillStyle = this.color;
    ctx.fillRect(this.x - this.w / 2, this.y - this.h / 2, this.w, this.h);
  }
  // ...
}
let timer = performance.now();
// ...
function draw() {
  // ...
  let now = performance.now();
  let dt = (now - timer) / 1000;
  timer = now;
  // ...
  for (let object of physicsObjects) {
    // ...
    object.update(dt);
    object.render(canvas, ctx);
    // ...
  }
  requestAnimationFrame(draw);
}
I now need a way to get my canvas output into ffmpeg, along with some other parameters, but I have no idea where to even start.
If there is a way to pipe the canvas output into ffmpeg.wasm, I’d like to dig deeper into its documentation.
2 Answers
You can't render in real time because the browser can lag. You would have to send each frame to a server via AJAX and wait for the response; on the server side, store the image and append a line to an ffmpeg concat file.
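A minimal client-side sketch of that idea follows. The /frames endpoint, the file naming, and the server that maintains the concat list are assumptions for illustration, not part of ffmpeg.wasm:

// Assumed setup: a server endpoint POST /frames that stores each image
// and appends a line like "file 'frame-<n>.png'" to an ffmpeg concat list.
let frameIndex = 0;

async function uploadFrame(canvas) {
  // Serialize the current canvas contents to a PNG blob.
  const blob = await new Promise(resolve => canvas.toBlob(resolve, 'image/png'));
  const form = new FormData();
  form.append('frame', blob, `frame-${frameIndex++}.png`);
  // Wait for the server before rendering the next frame,
  // so frames arrive in order even if the browser lags.
  await fetch('/frames', { method: 'POST', body: form });
}

The server would eventually assemble the video with ffmpeg's concat demuxer, e.g. ffmpeg -f concat -i list.txt out.mp4.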
A similar request, with a solution and example code, is here. The ffmpeg.wasm code in that answer looks a little old, but the basic technique should be what you are after.
You record the canvas to a .webm video using MediaRecorder (see the MDN docs, which include an example) and then (optionally) use ffmpeg.wasm to transcode the .webm video to .mp4. All of it is done in the browser.
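A rough sketch of that approach, assuming the 0.11.x createFFmpeg API of ffmpeg.wasm (newer releases use new FFmpeg() with writeFile/exec instead), a 60 fps capture rate, a 10-second recording window, and the canvas and draw() from the question:

import { createFFmpeg, fetchFile } from '@ffmpeg/ffmpeg';

// 1. Capture the canvas as a live MediaStream and record it to WebM.
const stream = canvas.captureStream(60); // 60 fps is an assumption
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
const chunks = [];
recorder.ondataavailable = e => chunks.push(e.data);

recorder.start();
requestAnimationFrame(draw); // run the simulation while recording

// 2. When recording stops, transcode the WebM to MP4 entirely in the browser.
recorder.onstop = async () => {
  const webmBlob = new Blob(chunks, { type: 'video/webm' });

  const ffmpeg = createFFmpeg({ log: true });
  await ffmpeg.load();
  ffmpeg.FS('writeFile', 'in.webm', await fetchFile(webmBlob));
  await ffmpeg.run('-i', 'in.webm', 'out.mp4');

  const mp4 = new Blob([ffmpeg.FS('readFile', 'out.mp4').buffer], { type: 'video/mp4' });
  // Offer the result for download.
  const a = document.createElement('a');
  a.href = URL.createObjectURL(mp4);
  a.download = 'simulation.mp4';
  a.click();
};

setTimeout(() => recorder.stop(), 10000); // record for 10 seconds (arbitrary)

Note that MediaRecorder captures in real time, so a lagging tab will drop frames in the recording; only the transcode step runs offline.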