Degraded performance when inserting records in batches into a table with columnstore index


With reference to this article, https://learn.microsoft.com/en-us/sql/relational-databases/indexes/columnstore-indexes-data-loading-guidance?view=sql-server-2016, bulk load sizes between 102,400 and 1,048,576 rows are recommended. Based on this, I did some testing by increasing the number of rows per batch from 100,000 to 250,000 in an SSIS package. Surprisingly, the package took 85 minutes to complete instead of the 65 minutes it took for the same volume of data (50 million rows) before the change. Another execution now took 60 minutes instead of the earlier 25 minutes for the same volume of 20 million rows.
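To see where the rows actually land, I can run a query like the one below against sys.dm_db_column_store_row_group_physical_stats after the load (dbo.FactTable is just a placeholder for my target table). It should show whether each rowgroup ends up COMPRESSED or stays in an OPEN/CLOSED delta-store rowgroup, and why any compressed rowgroup was trimmed below the 1,048,576-row maximum:

    -- Rowgroup state after the load: COMPRESSED vs. delta store, plus trim reason
    SELECT OBJECT_NAME(object_id) AS table_name,
           row_group_id,
           state_desc,
           total_rows,
           trim_reason_desc
    FROM   sys.dm_db_column_store_row_group_physical_stats
    WHERE  object_id = OBJECT_ID('dbo.FactTable')   -- placeholder table name
    ORDER  BY row_group_id;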

The following properties were changed in the package from 100,000 to 250,000 (the values after the change are summarized below the list):

DefaultBufferMaxRows
FastLoadOptions - ROWS_PER_BATCH
FastLoadMaxInsertCommitSize
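For clarity, after the change the relevant settings look like this (any other options in the FastLoadOptions string are unchanged; these three values are the only edits):

    DefaultBufferMaxRows        = 250000
    FastLoadOptions             = ROWS_PER_BATCH = 250000
    FastLoadMaxInsertCommitSize = 250000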

I am not able to understand why there is a performance degradation while following Microsoft's guidance. Any clues? Or any other suggestions to improve load performance via SSIS?

Note: there were no OnError, OnWarning, or OnInformation messages about a lack of buffer memory or virtual memory during the package execution. This is also confirmed by the Azure graphs, which show available memory never dropping below 4.5 GB during the period of the package execution.
