I have a Raspberry Pi that is streaming the camera feed over TCP using the libcamera module. I want to write a Python application on another device that receives this stream and displays it in a GUI window. The most important requirement is low latency (under 200 ms).
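For reference, the stream is started on the Pi with something like the following (simplified; my exact libcamera-vid options may differ, but the port matches the one used below):
libcamera-vid -t 0 --width 1024 --height 600 --inline --listen -o tcp://0.0.0.0:5000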
If I simply run the following ffplay command on the receiving machine:
ffplay tcp://192.168.0.10:5000 -fflags nobuffer -flags low_delay -framedrop -vf "setpts=N/30" -vcodec h264_v4l2m2m
then my latency is 160-180 ms, which is very good. The problem is that I cannot find any way to embed the ffplay output in a Python application window (I tried pygame and tkinter for creating the window). I also tried other libraries to capture the stream, such as OpenCV and GStreamer, but even with what seem to be optimal settings they all introduce a 2-5 second delay.
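For context, my OpenCV attempt looked roughly like this (simplified sketch, reading the same TCP address as the ffplay example above); it displays the stream but lags by several seconds:
import cv2

# Open the TCP stream through OpenCV's FFmpeg backend
cap = cv2.VideoCapture("tcp://192.168.0.10:5000", cv2.CAP_FFMPEG)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("stream", frame)  # frames arrive, but several seconds behind the camera
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()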
Can anyone advise what I should use to capture this stream and display it in an application window with a delay similar to what ffplay provides? Or did I miss some way to embed ffplay in the application window?
EDIT: here is my current code, which tries to pipe the ffplay output into a tkinter window:
import subprocess
import numpy as np
from tkinter import *
from PIL import ImageTk, Image
import threading


class Stream:
    def __init__(self):
        # ffplay runs without its own window (-nodisp); the intent is to read
        # the decoded frames from its stdout pipe.
        self.ffplay_cmd = "ffplay tcp://192.168.1.216:5000 -nodisp -fflags nobuffer -flags low_delay -framedrop -tune zerolatency -vf setpts=0".split(" ")
        self.WIDTH = 1024
        self.HEIGHT = 600

    def main(self):
        self.process = subprocess.Popen(self.ffplay_cmd, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

        # Reusable buffer sized for one raw RGB frame
        self.raw_frame = np.empty((self.HEIGHT, self.WIDTH, 3), np.uint8)
        self.frame_bytes = memoryview(self.raw_frame).cast("B")

        self.window = Tk()
        self.window.configure(width=self.WIDTH, height=self.HEIGHT)
        self.window.config(background="#FFFFFF")
        self.lmain = Label(self.window)
        self.lmain.place(relwidth=1.0, relheight=1.0)

        # Read frames on a background thread so mainloop() stays responsive
        threading.Thread(target=self.showFrame, daemon=True).start()
        self.window.mainloop()

    def showFrame(self):
        while self.process.poll() is None:
            self.process.stdout.readinto(self.frame_bytes)
            frame = self.raw_frame.copy()
            image = Image.fromarray(frame)
            image = image.resize((self.WIDTH, self.HEIGHT))
            imgtk = ImageTk.PhotoImage(image=image)
            self.lmain.configure(image=imgtk)
            self.lmain.imgtk = imgtk  # keep a reference so the image isn't garbage-collected

    def getSize(self):
        return self.WIDTH, self.HEIGHT


stream = Stream()
stream.main()
The tkinter window opens, and the streaming service recognizes the connection and starts streaming, but the tkinter window just turns black and never displays the stream. Any suggestions?