Why is what Print Screen captures different from what is actually displayed on the monitor?

I'm working on an application that screen-captures a monitor in real time, encodes the frames, sends them over Ethernet, decodes them, and then displays that monitor's contents in an application window.
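
For reference, the capture/encode/send leg of the pipeline is roughly shaped like the sketch below. This is a simplification rather than my actual code: it assumes Pillow is installed, uses JPEG as a stand-in for the real video encoder, and the receiver address is a placeholder.

```python
import io
import socket
import struct
import time

from PIL import ImageGrab

HOST, PORT = "192.168.0.10", 9999  # placeholder address of the decoder machine


def stream_frames(sock: socket.socket, fps: int = 30) -> None:
    """Capture the screen and push length-prefixed JPEG frames down the socket."""
    frame_interval = 1.0 / fps
    while True:
        start = time.perf_counter()

        frame = ImageGrab.grab().convert("RGB")  # read the current framebuffer
        buf = io.BytesIO()
        frame.save(buf, format="JPEG")           # stand-in for a real video codec
        payload = buf.getvalue()

        # Length-prefix each frame so the decoder can re-frame the byte stream.
        sock.sendall(struct.pack("!I", len(payload)) + payload)

        # Pace the loop at roughly the target frame rate.
        time.sleep(max(0.0, frame_interval - (time.perf_counter() - start)))


with socket.create_connection((HOST, PORT)) as sock:
    stream_frames(sock)
```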

So I put the decoder application on the same monitor that is being captured, open a timer application next to it, and start the timer. The difference between the reading on the main instance of the timer and the reading on the copy shown inside the decoder application is the end-to-end latency.
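
The Print Screen variant of this measurement can also be done programmatically. A rough sketch (assuming Pillow's ImageGrab is available on the platform; the output filename is arbitrary):

```python
import time

from PIL import ImageGrab

t0 = time.perf_counter()
shot = ImageGrab.grab()   # the same framebuffer read a Print Screen performs
t1 = time.perf_counter()

shot.save("latency_probe.png")
print(f"grab itself took {(t1 - t0) * 1000:.1f} ms")

# Comparing the reference timer and the in-app timer visible in
# latency_probe.png gives the pipeline latency at the moment of the grab.
```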

What's weird is that if I take a picture of the monitor with a camera, I get one latency measurement (almost always ~100 ms), but if I take a Print Screen of the monitor, the measured latency is much lower (~30-60 ms).

Why is that? How does Print Screen work? Why would it produce a 40+ ms difference? Which latency measurement should I trust?

There is 1 answer below.

Print Screen copies the screenshot to the clipboard, which lives in RAM (the fastest general-purpose storage in your computer), whereas your pipeline probably writes the screenshot data to your HDD/SSD and then reads it back before sending it over the network, which takes much longer.
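
You can get a feel for the gap with a quick timing sketch that compares an in-memory round trip against a via-disk round trip of the same frame. This assumes Pillow; the temporary filename is arbitrary:

```python
import io
import time

from PIL import ImageGrab

frame = ImageGrab.grab()

# Path 1: keep everything in RAM, roughly what the clipboard does.
t0 = time.perf_counter()
buf = io.BytesIO()
frame.save(buf, format="PNG")
in_memory = buf.getvalue()
t_ram = time.perf_counter() - t0

# Path 2: write the frame to disk, then read it back before "sending" it.
t0 = time.perf_counter()
frame.save("frame_tmp.png")
with open("frame_tmp.png", "rb") as f:
    via_disk = f.read()
t_disk = time.perf_counter() - t0

print(f"in-memory round trip: {t_ram * 1000:.1f} ms")
print(f"via-disk round trip:  {t_disk * 1000:.1f} ms")
```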