Is there a way to increase the number of requests I can send to Docker?


Currently I am using Python to build a small application that will allow me to see the travel times from one location to a spread of roughly 50k other locations. I am using Docker to set up an API sandbox that runs the OSRM calculations and returns the estimated travel times.

I've never used Docker before, so this is completely new to me. I am able to send about 12-14k requests before Docker stops me.

Now I can easily split my data and send my requests in batches of 10k. However, I was wondering if there is a way to let Docker accept the full 50k requests without kicking me out.
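
For reference, the batching I have in mind would look roughly like this (just a sketch; it assumes the df coordinate array and get_route helper shown further down):

import numpy as np

# Sketch of the workaround: split the ~50k coordinate pairs into
# chunks of at most 10k and send each chunk as a separate batch.
for chunk in np.array_split(df, 5):          # 5 chunks of ~10k rows
    for lat_lon in chunk:
        get_route(*lat_lon, 40.0847055, -104.8130275)
    # ...pause or restart here before moving on to the next chunk...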

This is the error I get. Hopefully this clarifies the question, because I am not sure if I am asking this correctly.

HTTPConnectionPool(host='127.0.0.1', port=5000): Max retries exceeded with url: /route/v1/driving/-104.7783563744188,40.705742844393505;-104.8130275,40.0847055 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000001786C29E708>: Failed to establish a new connection: [WinError 10048] Only one usage of each socket address (protocol/network address/port) is normally permitted'))

Example:

import polyline
import requests
import numpy as np

def get_route(pickup_lat, pickup_lon, dropoff_lat, dropoff_lon):
    '''Outputs the respecting route characteristics.
       distance, duration, route'''
    
    # Configure URL request for docker and correct port
    loc = "{},{};{},{}".format(pickup_lon, pickup_lat, dropoff_lon, dropoff_lat)
    url = "http://127.0.0.1:5000/route/v1/driving/"
    
    with requests.Session() as s:
        r = s.get(url + loc) 
        if r.status_code != 200:
            return {}

        # Decoding the Json file
        res = r.json()   
        routes = polyline.decode(res['routes'][0]['geometry'])
        start_point = [res['waypoints'][0]['location'][1], res['waypoints'][0]['location'][0]]
        end_point = [res['waypoints'][1]['location'][1], res['waypoints'][1]['location'][0]]
        distance = res['routes'][0]['distance']
        duration = res['routes'][0]['duration']

        # Output the relative metrics
        out = {'distance':distance,
               'duration':duration}

    return out

lon = np.random.uniform(low=-105, high=-104, size=(50000))
lat = np.random.uniform(low=40, high=41, size=(50000))
df = np.vstack((lat, lon)).T

index = 0
for lat_lon in df:
    get_route(*lat_lon, 40.0847055, -104.8130275)
    print(index)
    index += 1

Using with requests.Session() seems to work worse, or at least inconsistently; it fails somewhere between 3k and 15k requests.

edit: kthompso's suggestion solved it. Moving with requests.Session() outside of the function, so that a single session is reused for every request, was the solution (get_route now takes the session as a parameter instead of opening its own):

index = 0
with requests.Session() as s:
    for lat_lon in df:
        get_route(s, *lat_lon, 40.0847055, -104.8130275)
        print(index)
        index += 1
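
If the shared session alone is not enough, one further option (not part of kthompso's suggestion, just a sketch) is to mount an HTTPAdapter on the session with a keep-alive connection pool and an automatic retry policy, so transient connection errors are retried with backoff:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Hypothetical hardening of the shared session: keep-alive connection pool
# plus retries with exponential backoff for transient failures.
session = requests.Session()
adapter = HTTPAdapter(
    pool_connections=10,   # connection pools to cache
    pool_maxsize=10,       # keep-alive connections per pool
    max_retries=Retry(total=5, backoff_factor=0.5,
                      status_forcelist=[429, 500, 502, 503, 504]),
)
session.mount("http://", adapter)

# session is then passed to get_route exactly like s in the loop above.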