I am currently working on CSV imports with MongoDB and C#.
var contractModels = listContracts.Select(item =>
    new ReplaceOneModel<UsaSpendingsContract>(
        new ExpressionFilterDefinition<UsaSpendingsContract>(doc =>
            doc.PageUrl == item.PageUrl),
        item)
    { IsUpsert = true });

var bulkWriteResult = await _contractsCollection.BulkWriteAsync(
    contractModels,
    new BulkWriteOptions { IsOrdered = false });
The code above runs each time I read a batch of 1,000 records from the CSV file and upsert them into MongoDB.
I have about 30 million records to insert.
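For context, the surrounding import loop looks roughly like this. This is a simplified sketch: `ReadCsvRecords` is a placeholder for my actual CSV reader (e.g. CsvHelper's `GetRecords<T>()`), and `WriteBatchAsync` wraps the `BulkWriteAsync` call shown above.

```csharp
// Sketch of the import loop; ReadCsvRecords and WriteBatchAsync are
// placeholders for my CSV reader and the BulkWriteAsync call above.
const int BatchSize = 1000;
var batch = new List<UsaSpendingsContract>(BatchSize);

foreach (var record in ReadCsvRecords(csvPath))
{
    batch.Add(record);
    if (batch.Count == BatchSize)
    {
        await WriteBatchAsync(batch); // builds ReplaceOneModels and calls BulkWriteAsync
        batch.Clear();
    }
}

if (batch.Count > 0)
    await WriteBatchAsync(batch); // flush the final partial batch
```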
Once the collection grows beyond 500k records, the writes slow down tremendously: a single batch of 1,000 records can take up to 20 minutes to write!
To work around this, I drain the collection once it reaches 200k records; doing so keeps each 1,000-record write under 5 minutes. Even so, at this rate it takes 16 hours to insert 1 million records.
Can someone please point me in the right direction here?
I am using C# with ASP.NET Core 2.2 and the MongoDB C# driver v2.9.2.
Thanks in advance!