ffmpeg: how to reduce CPU and bandwidth usage when grabbing still frames from live video stream?


I am using ffmpeg to grab still images from live camera feeds (rtsp://192.168.1.88:554/11). The input feeds are 1920x1080 H.264 video.

We may need to grab frames from as many as 10-15 cameras simultaneously.

All cameras are on the local network (100Mbit Ethernet).

I am executing ffmpeg with options to grab one frame every 10 seconds, or 1/10 FPS, and convert to 640x360 JPG output, written to a file in a temporary directory.
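For reference, a minimal sketch of that invocation (the stream URL is the one from the question; the exact flags and the output path are assumptions, since the original command isn't shown):

```shell
# Continuous-grab sketch: stay connected to the stream, emit one JPG
# every 10 seconds at 640x360. Built as a string here so it can be
# inspected without a camera; run it directly on the real network.
STREAM="rtsp://192.168.1.88:554/11"
OUT_DIR="/tmp/grabs"
mkdir -p "$OUT_DIR"

# fps=1/10 keeps one frame per 10 s; scale=640:360 downsizes it;
# -an drops audio; -strftime 1 timestamps each output filename.
CMD="ffmpeg -rtsp_transport tcp -i $STREAM -an -vf fps=1/10,scale=640:360 -q:v 2 -strftime 1 $OUT_DIR/cam1-%s.jpg"
echo "$CMD"
# $CMD   # uncomment to actually run against the camera
```

Note that with this form ffmpeg holds the RTSP session open and decodes continuously, which matches the bandwidth/CPU behavior described above.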

I notice that when a single instance of ffmpeg is running, the system's bandwidth usage increases to over 100kbps. So I'm assuming ffmpeg is streaming the video live all the time, instead of reconnecting to the stream every 10 seconds to grab a new image.

Is there any way to prevent this, and possibly force ffmpeg to only request data when it needs a new grab? I realize it would have to stop and restart the stream and wait for a keyframe to ensure a valid still frame is captured, but I'm wondering if this is possible and/or worth it.
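The reconnect-per-grab idea can be sketched as a loop around a one-shot ffmpeg run: connect, take a single frame, exit, sleep, repeat. Between grabs no RTSP session is held open, so idle bandwidth drops to zero, at the cost of connection setup and keyframe wait on every grab. The `DRY_RUN` guard and paths below are illustrative assumptions for testing without a camera:

```shell
DRY_RUN=1   # set to 0 on a machine where the camera is reachable

grab_once() {
  # -frames:v 1 makes ffmpeg exit after writing one decoded frame,
  # which tears down the RTSP session until the next iteration.
  local cmd=(ffmpeg -rtsp_transport tcp -i "rtsp://192.168.1.88:554/11"
             -an -frames:v 1 -vf scale=640:360 -q:v 2
             -y /tmp/grabs/cam1-latest.jpg)
  if [ "$DRY_RUN" -eq 1 ]; then
    echo "${cmd[*]}"          # just show the command
  else
    "${cmd[@]}"               # actually grab a frame
  fi
}

grab_once
# One grab every 10 seconds; interrupt with Ctrl-C:
# while true; do grab_once; sleep 10; done
```

Whether this is a net win depends on the camera's keyframe interval: if keyframes are sparse, each one-shot run may have to buffer several seconds of stream before it can decode a frame.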

Also, CPU usage sits at about 12-15% for a single ffmpeg process (running on a Raspberry Pi 3). I'm concerned that 1) the Pi will overheat as more ffmpeg processes are added, and 2) this amount of CPU usage per camera feed will limit the number of simultaneous feeds to a less-than-optimal number.
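For the CPU side, two levers seem worth benchmarking separately (both are assumptions about the ffmpeg build on the Pi, not guaranteed to apply): `-skip_frame nokey` tells the decoder to decode only keyframes, skipping the expensive inter-frame decoding entirely, and `-c:v h264_mmal` selects the Pi's hardware H.264 decoder if ffmpeg was built with MMAL support. A sketch of the keyframe-only variant:

```shell
# Keyframe-only decode sketch. -skip_frame nokey is an input/decoder
# option, so it goes before -i. With sparse keyframes the fps=1/10
# filter may then repeat or drop frames, so verify output timing.
# Built as a string so it can be inspected without a camera.
CMD="ffmpeg -skip_frame nokey -rtsp_transport tcp \
 -i rtsp://192.168.1.88:554/11 -an -vf fps=1/10,scale=640:360 \
 -q:v 2 -strftime 1 /tmp/grabs/cam1-%s.jpg"
echo "$CMD"
# $CMD   # run on the Pi once the flags are verified for your build
# For hardware decode, try adding: -c:v h264_mmal (before -i); it may
# not combine with -skip_frame, so test the two options independently.
```

Note this only reduces decode CPU; the full stream is still received, so it doesn't help bandwidth on its own.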

TL;DR What are your suggestions for the most CPU- and bandwidth-optimized ffmpeg video-to-JPG conversion at a 1/10fps (or possibly 1/5fps) rate?
