How do I time streaming messages with Python gRPC?


In the answer to the question how-do-i-handle-streaming-messages-with-python-grpc, @Nathaniel provides a solution for handling requests and responses.

But when I try to measure the processing time of every response, the numbers don't look right. For example, I sleep 200 ms in my stream_iter, yet tr sometimes comes out at less than 200. My code:

import time

t0 = time.time()
for rsp in stub.Process(stream_iter()):
    tr = (time.time() - t0) * 1000  # elapsed ms since the previous response
    print(tr)
    t0 = time.time()
...
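For context, stream_iter is essentially a request generator that sleeps 200 ms before yielding each message, something like the sketch below (the Request type name is just a placeholder, not my real protobuf type):

import time

def stream_iter():
    for i in range(10):
        time.sleep(0.2)               # the 200 ms delay mentioned above
        yield Request(data=str(i))    # Request stands in for the real request message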

So I want to know: how should I do this timing correctly?


There is 1 answer below.


It's hard to tell what is wrong from the given snippet. I would recommend creating a complete reproduction case and filing it as an issue at https://github.com/grpc/grpc/issues.

Here are some possibilities:

  1. Flow control: the client or the server may buffer the messages it receives, so you might see small bursts of responses in certain scenarios.
  2. A bug in the iterator: for example, the sleep is meant to go in __next__ but might accidentally have been placed in __iter__ (see the sketch after this list).
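To illustrate the second point, here is a minimal sketch (not taken from the question) of a class-based request iterator where it matters which method contains the sleep; Request is again a placeholder type:

import time

class RequestIter:
    """Yields a fixed number of requests with a per-message delay."""

    def __init__(self, n=10):
        self.n = n
        self.i = 0

    def __iter__(self):
        # A sleep placed here would run only once, when iteration starts,
        # not once per request, which would skew per-response timings.
        return self

    def __next__(self):
        if self.i >= self.n:
            raise StopIteration
        self.i += 1
        time.sleep(0.2)  # the per-request delay belongs here, in __next__
        return Request(data=str(self.i))  # Request is a placeholder type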

Generally, if you scale up the number of messages and the throughput, the noise should average out. The pattern you already have for measuring the latency of each response is fine.
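For example, one way to reduce the noise is to time a larger run with time.perf_counter (which is better suited to interval measurements than time.time) and look at aggregate statistics. This sketch assumes stub and stream_iter are set up as in the question:

import time

latencies = []
t0 = time.perf_counter()
for rsp in stub.Process(stream_iter()):
    now = time.perf_counter()
    latencies.append((now - t0) * 1000)  # ms between consecutive responses
    t0 = now

print(f"count={len(latencies)} "
      f"mean={sum(latencies) / len(latencies):.1f} ms "
      f"min={min(latencies):.1f} ms "
      f"max={max(latencies):.1f} ms")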