In the answer to the question how-do-i-handle-streaming-messages-with-python-grpc, @Nathaniel provides a solution for handling requests and responses.
But when I try to measure the processing time of every response, the numbers don't look right. For example, I sleep 200 ms in my stream_iter, but the measured tr is sometimes even less than 200 ms. My code:
t0 = time.time()
for rsp in stub.Process(stream_iter()):
    tr = (time.time() - t0) * 1000
    print(tr)
    t0 = time.time()
    ...
So I want to know: how should I time this correctly?
It's hard to tell what is wrong from the given snippet. I would recommend creating a complete reproduction case as an issue at https://github.com/grpc/grpc/issues.
Here are some possibilities:
- The sleep is intended to happen in __next__, but might accidentally be injected in __iter__ instead, in which case it runs only once when iteration starts rather than before each request.
- Generally, if you tune up the scale and throughput, the noise should go away.

What you have for measuring the latency of each response is good.
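To illustrate the first possibility, here is a minimal, gRPC-free sketch (the StreamIter class, the item count, and the 200 ms delay are made up for illustration): a sleep placed in __next__ paces every request, while the same sleep placed in __iter__ would fire only once, when iteration begins. It also uses time.perf_counter, which is better suited than time.time for measuring short intervals.

```python
import time

class StreamIter:
    """Request iterator with the delay in __next__ (runs once per item)."""

    def __init__(self, n, delay=0.2):
        self.n = n
        self.delay = delay
        self.sent = 0

    def __iter__(self):
        # If the sleep were placed here instead, it would run only once,
        # when iteration starts -- not before every item.
        return self

    def __next__(self):
        if self.sent >= self.n:
            raise StopIteration
        self.sent += 1
        time.sleep(self.delay)  # per-item pacing
        return self.sent

# Measure the gap between consecutive items, as in the question.
t0 = time.perf_counter()
for item in StreamIter(3):
    tr = (time.perf_counter() - t0) * 1000
    print(f"item {item}: {tr:.1f} ms")  # roughly the delay, plus overhead
    t0 = time.perf_counter()
```

With the sleep in __next__, every measured gap is at least the configured delay; if you move it to __iter__, all gaps after the first collapse to near zero, which could produce readings well under 200 ms like those in the question.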