batchWriteItem() returns inconsistent results


In Node.js I am trying to write records to a DynamoDB table using batchWriteItem().

On my first call to the function insertTransactionDetails() I am sending 9 records to insert, while on the 2nd call to the same function I am sending 2 records.

The records that end up in the table are different on each run.

Test 1: I see 4 records from the 1st call, followed by 2 records from the 2nd call, followed by 3 more records from the 1st call.

Test 2: I see 2 records from the 1st call, followed by 2 records from the 2nd call, followed by 5 more records from the 1st call.

The expected result was to see 9 records from the 1st function call followed by 2 records from the 2nd function call.

I also noticed that for some reason only 9 records are getting inserted each time, instead of the expected total of 11 (9 + 2).

I have tried a lot of debugging and searching online but am unable to find the root cause. I would really appreciate help finding the issue here. Thanks a lot.

The function is called both times as insertTransactionDetails(dataToInsert, 0, bulkSearchParams). dataToInsert[][] is a two-dimensional array; for this test data its size is dataToInsert[0][9] and dataToInsert[0][2] for the two calls respectively.
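For context, BatchWriteItem accepts at most 25 put/delete requests per call, which is why the requests have to be pre-split into the per-index batches of dataToInsert. A minimal sketch of such a splitting step (the helper name chunkRequests is hypothetical, not from the code above):

```javascript
// Split an array of PutRequest objects into batches of at most 25 entries,
// the per-call request limit of DynamoDB's BatchWriteItem API.
function chunkRequests(requests, batchSize = 25) {
  const batches = [];
  for (let i = 0; i < requests.length; i += batchSize) {
    batches.push(requests.slice(i, i + batchSize));
  }
  return batches;
}
```

With the 9-record test data this produces a single batch, so dataToInsert[0] holds all 9 PutRequests.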

The data returned by batchWriteItem() is pasted below and shows that UnprocessedItems is empty ({}), which makes me believe that table provisioning is not the issue.

Below is the code.

Logs: Bulk Search - insertTransactionDetails() - Success path: {"UnprocessedItems":{},"ConsumedCapacity":[{"TableName":"RaptorBulkSearchRequestTransactionDetails","CapacityUnits":18}]}

dataToInsert[i][j] = 
        {
          PutRequest: { 
              Item: {
                'RequestID' : {S: bulkSearchParams.operationId.concat('-',bulkSearchParams.sourceID)},
                'TimeStamp': {N: epochTime.toString()},
                'TransactionID': {S: bulkSearchParams.cardNumber.toString().concat('-',data.response.transactionRecords[j].GUID)},
                'TransactionItem': {S: JSON.stringify(data.response.transactionRecords[j])}
              }
          }
        };

function insertTransactionDetails(dataToInsert, index, bulkSearchParams) {
    if (index < dataToInsert.length) {
      // Call DynamoDB to add the items of the current batch to the table
      var batchRequest = {
        RequestItems: {
          "RaptorBulkSearchRequestTransactionDetails": dataToInsert[index]
        },
        "ReturnConsumedCapacity": "TOTAL"
      };
      dynamodb.batchWriteItem(batchRequest, function(err, data) {
        if (err) {
          logErrorMessage(`Bulk Search - insertTransactionDetails() - Failure path: ${err}`, "routes.bulkSearch", bulkSearchParams.operationId, "NA");
        } else {
          logInfoMessage(`Bulk Search - insertTransactionDetails() - Success path: ${JSON.stringify(data)}`, "routes.bulkSearch", bulkSearchParams.operationId, "NA");
        }
        index++; // increment index for dataToInsert and insert the next batch
        insertTransactionDetails(dataToInsert, index, bulkSearchParams);
      });
    }
    else {
      // proceed to the next card if available
      bulkSearchParams.index++; // incrementing the index for cardNumber
      processBulkSearch(bulkSearchParams);
    }
  }
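As an aside, the recursive callback chain above can be flattened with async/await. A sketch under the assumption that the injected client's batchWriteItem returns a promise; with AWS SDK v2 you would wrap the real call, e.g. (params) => dynamodb.batchWriteItem(params).promise(). The function name insertAllBatches is hypothetical:

```javascript
// Write each batch sequentially and collect the responses.
// `client.batchWriteItem` is expected to return a promise resolving to a
// BatchWriteItem-style response object; passing the client in makes the
// control flow testable with a stub instead of a live DynamoDB connection.
async function insertAllBatches(client, tableName, batches) {
  const responses = [];
  for (const batch of batches) {
    const response = await client.batchWriteItem({
      RequestItems: { [tableName]: batch },
      ReturnConsumedCapacity: 'TOTAL',
    });
    responses.push(response);
  }
  return responses;
}
```

This keeps the writes strictly sequential, like the recursion above, but without nesting a new callback per batch.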

There is 1 best solution below


It was not a DynamoDB issue. It was an issue with my sort key, i.e. TimeStamp.

The 2nd request was overwriting 2 records from the 1st request: items in the 2nd batch ended up with the same partition key (RequestID) and the same TimeStamp sort key as items already written, and a put on an existing primary key replaces the stored item instead of adding a new one. That is why only 9 rows existed instead of 11.
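The overwrite can be reproduced without DynamoDB at all. The in-memory Map below is only an illustration of put semantics (a put on an existing partition-key/sort-key pair replaces the item), not the AWS API; the key shape assumes RequestID as partition key and TimeStamp as sort key, as in the item layout above:

```javascript
// Simulate DynamoDB put semantics: items are addressed by their full primary
// key (partition key + sort key), and a put on an existing key replaces the
// stored item rather than adding a new row.
function putItems(table, items) {
  for (const item of items) {
    table.set(`${item.RequestID}|${item.TimeStamp}`, item);
  }
}

const table = new Map();

// 1st call: nine items sharing RequestID "req-1" with timestamps 1000..1008.
const firstCall = Array.from({ length: 9 }, (_, j) => ({
  RequestID: 'req-1', TimeStamp: 1000 + j, TransactionItem: `call1-${j}`,
}));
putItems(table, firstCall);

// 2nd call: two items that reuse timestamps 1000 and 1001 — they silently
// replace two rows from the 1st call instead of adding new ones.
const secondCall = [
  { RequestID: 'req-1', TimeStamp: 1000, TransactionItem: 'call2-0' },
  { RequestID: 'req-1', TimeStamp: 1001, TransactionItem: 'call2-1' },
];
putItems(table, secondCall);

console.log(table.size); // 9, not 11
```

Giving each item a unique sort key (for example, one that incorporates the transaction GUID already present in TransactionID) avoids the collision.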