How can I stream a canvas generated in Node.js through ffmpeg to YouTube/any other RTMP server?


I wanted to generate some images in Node.js, compile them into a video, and stream them to YouTube. To generate the images I'm using the node-canvas module. This sounds simple enough, but I wanted to generate the images continuously and stream the result in real time. I'm very new to this whole thing, and after reading a bunch of resources on the internet, what I was thinking of doing was:

  1. Open ffmpeg with spawn('ffmpeg', ...args), setting the output to the destination RTMP server
  2. Generate the image in the canvas
  3. Convert the content of the canvas to a buffer, and write it to the ffmpeg process through stdin
  4. Enjoy the result on YouTube

But it's not as simple as that, is it? I saw people sharing code that involves client-side JS running in the browser, but I wanted this to be a Node app so that I could run it from a remote VPS. Is there a way to do this without using something like p5 in my browser and capturing the window to restream it? Is my thought process even remotely adequate? For now I don't really care about performance/resource usage. Thanks in advance.

EDIT:

I worked on it for a bit, and I couldn't get it to work... This is my code:

const { spawn } = require('child_process');
const { createCanvas } = require('canvas');


const canvas = createCanvas(1920, 1080);
const ctx = canvas.getContext('2d');
// -re paces reading of stdin at the input's native frame rate
const ffmpeg = spawn("ffmpeg",
    ["-re", "-f", "png_pipe", "-vcodec", "png", "-i", "pipe:0", "-vcodec", "h264", "-f", "flv", "rtmp://a.rtmp.youtube.com/live2/key-i-think"],
    { stdio: 'pipe' })

const randomColor = (depth) => Math.floor(Math.random() * depth)
const random = (min, max) => (Math.random() * (max - min)) + min;

let i = 0;
let drawSomething = function () {
    ctx.strokeStyle = `rgb(${randomColor(255)}, ${randomColor(255)}, ${randomColor(255)})`
    let x1 = random(0, canvas.width);
    let x2 = random(0, canvas.width);
    let y1 = random(0, canvas.height);
    let y2 = random(0, canvas.height);
    ctx.moveTo(x1, y1);
    ctx.lineTo(x2, y2);
    ctx.stroke();

    let out = canvas.toBuffer();
    ffmpeg.stdin.write(out)
    i++;
    if (i >= 30) {
        ffmpeg.stdin.end();
        clearInterval(int)
    };
}

drawSomething();
let int = setInterval(drawSomething, 1000);

I'm not getting any errors, but I'm not getting any video data either. I have set up an RTMP server that I can connect to and then play the stream with VLC, but no video data arrives. Am I doing something wrong? I looked around for a while and can't seem to find anyone who has tried this, so I don't really have a clue...

EDIT 2: Apparently I was on the right track, but my approach only gave me about 2 seconds of "good" video before it started becoming blocky and messy. I think that, most likely, my method of generating images is just too slow. I'll try to use some GPU-accelerated code to generate the images instead of using the canvas, which means I'll be doing fractals all the time, since I don't know how to do anything else with that. A bigger buffer in ffmpeg might help too.

1 Answer

Your approach is fine. Just keep in mind that whatever you're streaming to is going to expect the source end to keep up. If you can't generate frames fast enough, all sorts of things will go wrong. (I think this is mostly due to YouTube dropping a source that's too slow, then picking it up again once it receives frames, only to fail again.)

You can test by outputting to a file first before trying to stream to an RTMP server.