Is it possible to set a maximum number of retries shared between all pool connections in the same session?


Currently I do the following to set a maximum number of connection retries in my grequests wrapper:

from requests import Session
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

self._s = Session()
retries = Retry(total=5, status_forcelist=[500, 502, 503, 504])
self._s.mount('http://', HTTPAdapter(max_retries=retries))
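For completeness, the same adapter can also be mounted for `https://`, and the setup can be checked without making any requests. This is just a sketch of my configuration; the `example.com` URLs are placeholders:

```python
from requests import Session
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = Session()
retries = Retry(total=5, status_forcelist=[500, 502, 503, 504])
adapter = HTTPAdapter(max_retries=retries)

# One adapter instance mounted for both schemes: every request made
# through this session resolves to it via get_adapter().
session.mount('http://', adapter)
session.mount('https://', adapter)

print(session.get_adapter('http://example.com') is adapter)          # True
print(session.get_adapter('https://example.com').max_retries.total)  # 5
```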

I then create a bunch of grequests objects with the session self._s as one of the arguments. For example, a set of GET requests would be created with something like this:

requests = [grequests.get(url, ..., session=self._s)]

Finally, these are all eventually issued using grequests.map(requests, ...).

The problem is that I want the maximum-retry budget to persist and be shared across all connections of a connection pool. The retries still seem to be applied on a per-connection basis only. Is there any way of doing this? Or is it impossible, given that a new Retry() object seems to be created each time total is decremented?

There is 1 answer below

I think you are out of luck. The urllib3 Retry docstring says (excerpt):

Each retry attempt will create a new Retry object with updated values, so they can be safely reused.

So a new object is made on every retry, as you said, and that is by design.
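You can see this directly with urllib3, without making any network calls. `increment()` is what urllib3 invokes internally after a failed attempt; it leaves the original Retry untouched and hands back a fresh one with a smaller budget:

```python
from urllib3.util.retry import Retry

retries = Retry(total=5, status_forcelist=[500, 502, 503, 504])

# increment() does not mutate `retries`; it returns a brand-new Retry
# object with the budget decremented, so the original stays reusable
# as a template for the next request.
next_retries = retries.increment(method='GET', url='/')

print(retries.total)            # still 5 -- the original is untouched
print(next_retries.total)       # 4
print(next_retries is retries)  # False
```

This is why the budget cannot be shared across connections: each request starts again from the template mounted on the adapter.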

Furthermore, the Retry object itself is what handles the waiting: it sleeps between connection attempts. So by this design, one Retry object is tied to one request's retry sequence. Sorry. The relevant module is urllib3.util.retry, in case that helps.