In Python, is it possible to use 'exponential backoff' per request for batched HTTP requests?


So, here I have written a script that adds students to the courses (Google Classroom API).

import time

# Each batch is created from the same Classroom service before requests are added
batch1_1 = service.new_batch_http_request()
batch1_2 = service.new_batch_http_request()
batch1_3 = service.new_batch_http_request()

students = getStudents('Year10', '10A')  # VAR

for student in students:
    newStudent = {
        # Student Identifier
        'userId': student
    }
    batch1_1.add(service.courses().students().create(courseId=arCourseId, body=newStudent))
    batch1_1.add(service.courses().students().create(courseId=ciCourseId, body=newStudent))
    batch1_1.add(service.courses().students().create(courseId=dtCourseId, body=newStudent))
    batch1_1.add(service.courses().students().create(courseId=drCourseId, body=newStudent))
    batch1_1.add(service.courses().students().create(courseId=enCourseId, body=newStudent))
    batch1_2.add(service.courses().students().create(courseId=geCourseId, body=newStudent))
    batch1_2.add(service.courses().students().create(courseId=hiCourseId, body=newStudent))
    batch1_2.add(service.courses().students().create(courseId=icCourseId, body=newStudent))
    batch1_2.add(service.courses().students().create(courseId=laCourseId, body=newStudent))
    batch1_2.add(service.courses().students().create(courseId=maCourseId, body=newStudent))
    batch1_3.add(service.courses().students().create(courseId=muCourseId, body=newStudent))
    batch1_3.add(service.courses().students().create(courseId=peCourseId, body=newStudent))
    batch1_3.add(service.courses().students().create(courseId=reCourseId, body=newStudent))
    batch1_3.add(service.courses().students().create(courseId=scCourseId, body=newStudent))
batch1_1.execute()
time.sleep(1)
batch1_2.execute()
time.sleep(1)
batch1_3.execute()
time.sleep(1)

It does work; however, individual requests sometimes return:

"HttpError 500 when requesting https://classroom.googleapis.com/v1/courses/[COURSE ID]/students?alt=json returned "Internal Error""

For these individual requests, I'd like the code to retry just the failed request when it receives a 5xx error, but I am unsure how to implement this.

At the moment, I have to re-run the whole script if even one student hasn't made it onto a course, which, of course, is a waste of resources.

BEST ANSWER

When you create a batch, you can provide a callback function that will be called for each of the requests you add to the batch.

The callback takes three parameters:

  • request_id: an id of your choosing that identifies the request within the batch (you pass it when you call the batch's add() method)
  • response: the response of that individual API call
  • exception: the exception object, if that request in the batch threw an error

Below is some pseudo-code to illustrate the logic.

# sample callback function
def my_batch_callback(request_id, response, exception):
    if exception is not None:
        # Do something with the exception
        print(exception)
    else:
        # Do something with the response
        print("Request is successful: {}".format(response))

# creation of batch passing in the call back
batch = service.new_batch_http_request(callback=my_batch_callback)

# addition to batch with a specific id
batch.add(service.object().insert(name="test-1"), request_id="id-1")
batch.add(service.object().insert(name="test-2"), request_id="id-2")
batch.add(service.object().insert(name="test-3"), request_id="id-3")

Using the callback, you can record all the erroneous requests by their id and retry them later. There are different ways to do this: you can use a simple list and check it after you run the batch (as sketched below), or you can create a dedicated class and take advantage of the persistence it offers.
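Here is a rough sketch of that list-based approach (a set plus a dict), adapted to the Classroom calls from the question. The helper names add_student and retry_failed and the pending/failed containers are my own, and service is assumed to be the authorised Classroom service from the question; the only library pieces used are new_batch_http_request(), add() with a request_id, and the callback signature described above.

import random
import time

# request_id -> (courseId, body), recorded as each request is added to a batch
pending = {}
# request_ids that came back with an error from the last execute()
failed = set()

def callback(request_id, response, exception):
    if exception is not None:
        failed.add(request_id)  # remember the request so it can be retried
    else:
        print("Request {} is successful".format(request_id))

def add_student(batch, course_id, body, request_id):
    # Add a 'create student' call to a batch and remember its inputs
    pending[request_id] = (course_id, body)
    batch.add(service.courses().students().create(courseId=course_id, body=body),
              request_id=request_id)

def retry_failed(max_attempts=5):
    # Re-run only the failed requests, waiting exponentially longer each round
    for attempt in range(1, max_attempts + 1):
        if not failed:
            return
        time.sleep(2 ** attempt + random.random())  # exponential backoff with jitter
        to_retry = list(failed)
        failed.clear()
        retry_batch = service.new_batch_http_request(callback=callback)
        for request_id in to_retry:
            course_id, body = pending[request_id]
            retry_batch.add(
                service.courses().students().create(courseId=course_id, body=body),
                request_id=request_id)
        retry_batch.execute()

After building the initial batches with add_student(...) and calling execute() on each of them, a single retry_failed() call re-submits only the students that hit a 5xx error, doubling the wait before each retry round instead of re-running the whole script.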

I also suggest having a look at the official documentation here.