We have a requirement to quickly store large volumes of data, on the order of 100k records at once. I'm evaluating RavenDB, and the numbers I'm getting seem rather low - it takes about 2.5-3 seconds to store 10k records.
The code is pretty much straight from the documentation:
using (IDocumentStore store = new DocumentStore { Urls = new[] { "http://localhost:8080" }, Database = "xxx" })
{
    store.Initialize();
    for (int i = 0; i < 10; i++)
    {
        using (IDocumentSession session = store.OpenSession())
        {
            var things = DataGenerator.GenerateListOfThings(); // generates 10k objects
            foreach (var thing in things)
            {
                session.Store(thing);
            }
            session.SaveChanges();
        }
    }
}
"Thing" objects is a flat object with about 20 properties, nothing special about it.
I've also tried saving in chunks of 1000 entities, which improved the run times by about 10%, and bulk insert, with similar results.
Raven runs in a Docker container with the default config on a machine with an i7, an SSD and 16 GB of RAM. The documentation claims "150k writes on commodity hardware", but I'm not seeing anything close to that. Am I missing something?
You'll get better performance with Bulk Insert; see here: https://ravendb.net/docs/article-page/4.1/Csharp/client-api/bulk-insert/how-to-work-with-bulk-insert-operation#example
Parallelizing the inserts will improve performance even further.
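A sketch of what that could look like, reusing the question's `DataGenerator` and store settings (the `Thing` type and generator are assumptions taken from the question; tune the degree of parallelism to your hardware):

```csharp
using System.Threading.Tasks;
using Raven.Client.Documents;
using Raven.Client.Documents.BulkInsert;

using (IDocumentStore store = new DocumentStore
{
    Urls = new[] { "http://localhost:8080" },
    Database = "xxx"
}.Initialize())
{
    // One streaming bulk-insert operation instead of per-session
    // batches; documents are pushed to the server as they are stored.
    using (BulkInsertOperation bulkInsert = store.BulkInsert())
    {
        foreach (var thing in DataGenerator.GenerateListOfThings())
        {
            bulkInsert.Store(thing);
        }
    }

    // Parallel variant: each worker opens its own bulk-insert
    // operation, so the ten batches from the question run concurrently.
    Parallel.For(0, 10, _ =>
    {
        using (BulkInsertOperation bulk = store.BulkInsert())
        {
            foreach (var thing in DataGenerator.GenerateListOfThings())
            {
                bulk.Store(thing);
            }
        }
    });
}
```

Unlike `session.Store` + `SaveChanges`, which buffers everything and sends it as a single transactional batch, `BulkInsertOperation` streams documents over one long-lived request, which is what the "150k writes" figure in the docs refers to.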