Should I use SqlBulkCopy or a stored procedure to import data?

I have a log file that is currently 248 MB and can grow to a GB or more, so you can imagine how many rows it contains. I need to import all of those rows into a table in a SQL Server database. To do that, I first create a DataTable and add every line of the log file to it as a new row. This part is fast: more than a million records are added in about 30 seconds. Once the DataTable is filled, I import its records into the database using a stored procedure, but this phase runs very slowly. Should I switch to SqlBulkCopy's WriteToServer method instead, or stick with the current approach? If SqlBulkCopy is the better choice, should I use the DataTable or the IDataReader overload? Thanks in advance.
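For illustration, here is a rough sketch of the kind of pipeline described in the question; the schema, connection string, and the stored procedure name (dbo.InsertLogEntry) and its per-row call pattern are assumptions, not taken from the original code:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class LogImportSketch
{
    static void Main()
    {
        var table = new DataTable("LogEntries");
        table.Columns.Add("Line", typeof(string));

        // Phase 1: load every line of the log file into the DataTable (fast).
        foreach (string line in File.ReadLines(@"C:\logs\app.log"))
            table.Rows.Add(line);

        // Phase 2: push the rows to SQL Server, one stored-procedure call per row
        // (the slow part the question is asking about).
        using (var conn = new SqlConnection("Server=.;Database=Logs;Integrated Security=true"))
        using (var cmd = new SqlCommand("dbo.InsertLogEntry", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            var p = cmd.Parameters.Add("@Line", SqlDbType.NVarChar, -1);

            conn.Open();
            foreach (DataRow row in table.Rows)
            {
                p.Value = row["Line"];
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```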
I would go with SqlBulkCopy for data imports of any real volume like this. The performance difference compared with SqlDataAdapter can be large; for example, I blogged a performance comparison for importing 100K rows:
SqlBulkCopy: 1.5885s
SqlDataAdapter: 25.0729s
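A minimal sketch of the SqlBulkCopy route, assuming the rows are already in a single-column DataTable and the destination table is called dbo.LogEntries (both names are assumptions for illustration):

```csharp
using System.Data;
using System.Data.SqlClient;

// Copies an already-filled DataTable into the destination table in one bulk operation.
static void BulkInsert(DataTable table, string connectionString)
{
    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.LogEntries";
        bulk.BatchSize = 10000;          // send rows in batches rather than one giant batch
        bulk.BulkCopyTimeout = 0;        // disable the timeout for a large load
        bulk.ColumnMappings.Add("Line", "Line");
        bulk.WriteToServer(table);
    }
}
```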
You can get even greater throughput by using the TableLock option with SqlBulkCopy, which in my test brought the import down to 0.8229s.
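If you want to try that, the option is passed to the SqlBulkCopy constructor, roughly like this (table name again an assumption):

```csharp
using System.Data;
using System.Data.SqlClient;

// Same bulk copy, but taking a bulk-update lock on the destination table,
// which avoids per-row locking overhead during the load.
static void BulkInsertWithTableLock(DataTable table, string connectionString)
{
    using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock, null))
    {
        bulk.DestinationTableName = "dbo.LogEntries";
        bulk.WriteToServer(table);
    }
}
```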
It's also worth noting that with SqlBulkCopy you can have multiple instances bulk loading segments of data into the same destination table in parallel without them contending with each other. Apologies for another external link, but I think it's relevant: that one is about loading into a heap table (no indexes) for optimal performance, which may not be an option for your current scenario but is definitely worth knowing about.
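A sketch of what that parallel load could look like, assuming the destination is a heap and the data has already been split into two DataTables (splitting logic omitted); with TableLock each loader takes a bulk-update (BU) lock, which multiple loaders can hold concurrently on a heap:

```csharp
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;

// Loads two segments of data in parallel into the same heap table
// (no clustered or nonclustered indexes). Table name is an assumption.
static void ParallelBulkInsert(DataTable firstHalf, DataTable secondHalf, string connectionString)
{
    Parallel.Invoke(
        () => LoadSegment(firstHalf, connectionString),
        () => LoadSegment(secondHalf, connectionString));
}

static void LoadSegment(DataTable segment, string connectionString)
{
    using (var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock, null))
    {
        bulk.DestinationTableName = "dbo.LogEntries";
        bulk.WriteToServer(segment);
    }
}
```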