Should I use SqlBulkCopy or Stored Procedure to import data


I've got a log file that is 248 MB and can grow up to a GB, so you can imagine how many rows there are. I need to import all of them into a table in a SQL Server database. To do that, I first create a DataTable and add every line of the log file to it as a new row. This part is fast: more than a million records get added in about 30 seconds. After the DataTable is filled, I import its records into the database using a stored procedure, but this phase runs very slowly.

Now I wonder: should I use the SqlBulkCopy WriteToServer method instead, or keep my current approach? And if SqlBulkCopy is the better choice, should I use the DataTable or the IDataReader version? Thanks in advance.


BEST ANSWER

I would go with SqlBulkCopy for data imports of any real volume like this. The performance difference versus SqlDataAdapter can be large; for example, I blogged a performance comparison of importing 100K rows:

SqlBulkCopy: 1.5885s
SqlDataAdapter: 25.0729s

You can get even greater throughput by using the TableLock option with SqlBulkCopy, which in my test brought the import down to 0.8229s.
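As a rough sketch of that setup (connection string, destination table name, and batch size here are placeholders, not from your schema), the TableLock option is passed through the SqlBulkCopy constructor:

```csharp
using System.Data;
using Microsoft.Data.SqlClient; // System.Data.SqlClient on older frameworks

class BulkImporter
{
    // Sketch: bulk-copy a pre-filled DataTable into a hypothetical
    // dbo.LogEntries table using the TableLock option.
    static void Import(DataTable logRows, string connectionString)
    {
        using var connection = new SqlConnection(connectionString);
        connection.Open();

        // TableLock takes a bulk update lock for the duration of the copy,
        // which is what produced the extra speedup in the timings above.
        using var bulkCopy = new SqlBulkCopy(
            connection, SqlBulkCopyOptions.TableLock, null)
        {
            DestinationTableName = "dbo.LogEntries",
            BatchSize = 10_000,   // commit in batches, not one huge transaction
            BulkCopyTimeout = 0   // no timeout for very large files
        };

        // Column names in the DataTable should match the destination table,
        // or be mapped explicitly via bulkCopy.ColumnMappings.
        bulkCopy.WriteToServer(logRows);
    }
}
```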

It's also worth noting that with SqlBulkCopy you can have multiple instances bulk-loading segments of data into the same destination table in parallel without contending with each other. Apologies for another external link, but I think it's relevant: it covers loading into a heap table (no indexes) for optimal performance, which may not be an option in your scenario but is definitely worth knowing about.
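On your DataTable-vs-IDataReader question: the IDataReader overload of WriteToServer lets you stream rows without materializing the whole file in memory first. A simpler middle ground, sketched below under the same placeholder table name as above (and assuming one log line maps to one column, which may not match your format), is to fill a small DataTable in chunks, write each chunk, and clear it, so memory stays bounded even for a 1 GB file:

```csharp
using System.Data;
using System.IO;
using Microsoft.Data.SqlClient;

class ChunkedImporter
{
    // Sketch: instead of building one giant DataTable for the whole log,
    // fill a small one, WriteToServer, clear it, and repeat.
    static void ImportInChunks(string logPath, string connectionString)
    {
        var chunk = new DataTable();
        chunk.Columns.Add("Line", typeof(string));

        using var connection = new SqlConnection(connectionString);
        connection.Open();
        using var bulkCopy = new SqlBulkCopy(
            connection, SqlBulkCopyOptions.TableLock, null)
        {
            DestinationTableName = "dbo.LogEntries" // placeholder name
        };

        // File.ReadLines enumerates lazily, so the file is never
        // loaded into memory all at once.
        foreach (var line in File.ReadLines(logPath))
        {
            chunk.Rows.Add(line);
            if (chunk.Rows.Count == 50_000) // keep memory bounded
            {
                bulkCopy.WriteToServer(chunk);
                chunk.Clear();
            }
        }
        if (chunk.Rows.Count > 0)
            bulkCopy.WriteToServer(chunk); // flush the final partial chunk
    }
}
```

If you want true streaming with no intermediate DataTable at all, implementing IDataReader over the file gives the same effect, at the cost of more boilerplate.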