AWS Lambda + DynamoDB: handling a large amount of data times out


First of all: I am receiving about 50,000 products from a supplier via API. The API has no pagination and therefore sends all 50k products in a single GET request.

I tried to handle this by fetching the data and storing it in DynamoDB using an AWS Lambda function.

Currently the DynamoDB table auto-scales up to 25 write capacity units, but throttling on the table still runs high (up to 40-50). As a result the Lambda function takes very long to execute and runs into the 15-minute limit.
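For scale (assuming each product item is at most 1 KB, so one write unit per item): 50,000 items at 25 write units per second need at least 50,000 / 25 = 2,000 seconds, roughly 33 minutes of sustained writes, which is already past the 15-minute Lambda limit before any throttling retries are counted.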

Because the API has no pagination, I also need to give the Lambda 1 GB of memory to hold the entire response.

I am now wondering what the best way to go is for my case. Of course I could keep increasing the DynamoDB write capacity limit, but I am looking for a cost-effective way of handling this.

The programming language is Go, and aws-sdk-go-v2 is used for all the DynamoDB calls in the code.
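To give an idea of the write path, here is a simplified sketch of the kind of code I mean (the Product struct, the "Products" table name, and the use of BatchWriteItem in batches of 25 are illustrative placeholders, not my exact production code):

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/feature/dynamodb/attributevalue"
	"github.com/aws/aws-sdk-go-v2/service/dynamodb"
	"github.com/aws/aws-sdk-go-v2/service/dynamodb/types"
)

// Placeholder item; the real products have more fields.
type Product struct {
	SKU   string  `dynamodbav:"sku"` // partition key (placeholder)
	Name  string  `dynamodbav:"name"`
	Price float64 `dynamodbav:"price"`
}

// writeProducts writes the products in chunks of 25 (the BatchWriteItem limit)
// and retries items that come back as unprocessed, which is what happens when
// the table throttles.
func writeProducts(ctx context.Context, client *dynamodb.Client, products []Product) error {
	const tableName = "Products" // placeholder
	const batchSize = 25

	for start := 0; start < len(products); start += batchSize {
		end := start + batchSize
		if end > len(products) {
			end = len(products)
		}

		var requests []types.WriteRequest
		for _, p := range products[start:end] {
			item, err := attributevalue.MarshalMap(p)
			if err != nil {
				return err
			}
			requests = append(requests, types.WriteRequest{
				PutRequest: &types.PutRequest{Item: item},
			})
		}

		unprocessed := map[string][]types.WriteRequest{tableName: requests}
		for len(unprocessed[tableName]) > 0 {
			out, err := client.BatchWriteItem(ctx, &dynamodb.BatchWriteItemInput{
				RequestItems: unprocessed,
			})
			if err != nil {
				return err
			}
			unprocessed = out.UnprocessedItems
			if len(unprocessed[tableName]) > 0 {
				// Throttled items are returned as unprocessed; back off before retrying.
				time.Sleep(500 * time.Millisecond)
			}
		}
	}
	return nil
}

func main() {
	ctx := context.Background()
	cfg, err := config.LoadDefaultConfig(ctx)
	if err != nil {
		log.Fatal(err)
	}
	client := dynamodb.NewFromConfig(cfg)

	// In the real Lambda handler the products slice is the parsed API response.
	var products []Product
	if err := writeProducts(ctx, client, products); err != nil {
		log.Fatal(err)
	}
}
```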

Hopefully someone here can help me out.
