How does between-graph replication with asynchronous updates work in TensorFlow?


We focus on this situation: between-graph replication with asynchronous updates.

The following figure shows how it works. As I understand it, each worker sends its gradients to every ps and receives the updated parameters back from every ps. (Is that correct? Are those paths 1 and 2?)
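To make my current understanding concrete, here is a minimal pure-Python sketch of the pattern I think the figure describes (this is not TensorFlow API; the class and function names are illustrative). Each worker runs its own training loop, pulls the current parameters from every parameter-server shard, computes gradients locally, and pushes them back immediately. No worker waits for the others, which is what makes the updates asynchronous:

```python
import threading

class ParameterServer:
    """Hypothetical stand-in for one ps task holding one parameter shard."""
    def __init__(self, value):
        self.value = value
        self.lock = threading.Lock()  # guards a single apply, not a whole global step

    def pull(self):
        # worker fetches the current (possibly stale) parameter value
        return self.value

    def push_gradient(self, grad, lr=0.1):
        # worker sends a gradient; the ps applies it immediately,
        # without waiting for gradients from any other worker
        with self.lock:
            self.value -= lr * grad

def worker(ps_shards, steps):
    for _ in range(steps):
        params = [ps.pull() for ps in ps_shards]   # receive from every ps
        grads = [2.0 * p for p in params]          # toy gradient of loss p**2
        for ps, g in zip(ps_shards, grads):
            ps.push_gradient(g)                    # send to every ps

shards = [ParameterServer(5.0), ParameterServer(-3.0)]
threads = [threading.Thread(target=worker, args=(shards, 100)) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print([round(ps.value, 6) for ps in shards])  # both shards driven toward 0
```

Because every worker applies its gradient the moment it is ready, a worker may compute a gradient against parameters that another worker has already overwritten (stale gradients); that trade-off is, as far as I understand, the defining property of the asynchronous variant.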

But can anyone explain in detail what the numbers 1, 2, 3, 4, 5, and 6 in the figure mean, respectively?

Thank you in advance!

[Figure: between-graph replication with asynchronous updates]
