I'm having problems with ffmpeg, probably due to my inexperience with this software.
My basic need is the following: I have a series of videos with material that I want to protect from plagiarism. For this I want to add a watermark so that when a user views a video, they also see some personal data that deters them from downloading and sharing it without permission.
What I would like is to create a small Angular + Java application that does this task (invoking ffmpeg via Runtime#exec).
I have seen that ffmpeg can stream to a server such as ffserver, but I wonder if there is a simpler way: something like launching the ffmpeg command from my Java application with the necessary configuration, and having ffmpeg serve the watermarked video over some port/protocol.
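For the watermarking part on its own, a minimal sketch of invoking ffmpeg from Java might look like the following. It assumes ffmpeg is on the PATH and uses the `drawtext` filter to burn text into the frames; the file names and the user data are placeholders, and `ProcessBuilder` is used instead of `Runtime#exec` since it handles arguments more safely:

```java
import java.util.Arrays;
import java.util.List;

public class WatermarkCommand {

    // Builds the ffmpeg argument list; input/output paths and the
    // watermark text are placeholders for illustration.
    static List<String> buildCommand(String input, String output, String userData) {
        return Arrays.asList(
            "ffmpeg", "-y",
            "-i", input,
            // drawtext burns the viewer's personal data into every frame
            "-vf", "drawtext=text='" + userData + "':x=10:y=10:fontsize=24:fontcolor=white",
            "-codec:a", "copy",   // leave the audio stream untouched
            output);
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("input.mp4", "output.mp4", "user@example.com");
        System.out.println(String.join(" ", cmd));
        // To actually run it (requires ffmpeg on the PATH):
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

The same argument list works whether you write the result to a file, as here, or point the output at a streaming target instead.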
EDIT
I have continued investigating and have seen that ffmpeg can stream to WebRTC, but an adapter is needed. What I would like, and I don't know if it is possible, is to launch ffmpeg so that it acts as a server that can be consumed from the web.
I don't have a Java example, but we do something similar in our WebRTC .NET application. The code should be fairly straightforward to port to Java.
The RTP packets received in the reader can be streamed over most Java WebRTC libraries.
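To give an idea of what handling those packets involves in Java, here is a sketch that parses the fixed 12-byte RTP header (RFC 3550) from a raw packet. In a real setup the bytes would arrive on a `DatagramSocket` bound to the port ffmpeg streams to; the sample packet below is synthetic:

```java
import java.nio.ByteBuffer;

public class RtpHeader {
    public final int version, payloadType, sequenceNumber;
    public final long timestamp, ssrc;

    // Parses the fixed 12-byte RTP header (RFC 3550).
    public RtpHeader(byte[] packet) {
        ByteBuffer buf = ByteBuffer.wrap(packet);
        int b0 = buf.get() & 0xFF;
        version = b0 >> 6;                       // top two bits: RTP version (should be 2)
        int b1 = buf.get() & 0xFF;
        payloadType = b1 & 0x7F;                 // low 7 bits: payload type
        sequenceNumber = buf.getShort() & 0xFFFF;
        timestamp = buf.getInt() & 0xFFFFFFFFL;
        ssrc = buf.getInt() & 0xFFFFFFFFL;
    }

    public static void main(String[] args) {
        // Synthetic packet: version 2, payload type 96 (dynamic, typical for H.264), seq 1
        byte[] pkt = {(byte) 0x80, 96, 0, 1, 0, 0, 0, 100, 0, 0, 0, 42};
        RtpHeader h = new RtpHeader(pkt);
        System.out.println("v=" + h.version + " pt=" + h.payloadType
                + " seq=" + h.sequenceNumber);  // prints "v=2 pt=96 seq=1"
    }
}
```

Once parsed, the payload after the header is what a WebRTC library would repackage and deliver to the browser.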