Error while using BulkCopy (Failed to read the result set)


I developed a VB.NET application that fills DataTables from API requests to another server and transfers those tables to a MySQL database using MySqlBulkCopy.

Here is the helper class (initialized only once) that I use to bulk copy multiple tables. Executing WriteToServer() fires the following error: "Failed to read the result set." without any further information.

    Public Class BulkInsert
        Private connStr As String
        Private conn As MySqlConnection
        Private blkInsert As MySqlBulkCopy
        Public ErrorMessage As String
        Public ErrorTableName As String

        Public Sub New()
            connStr = My.MySettings.Default.MyBaseConnectionString
            conn = New MySqlConnection(connStr)
            conn.Open() ' MySqlBulkCopy requires an open connection

            blkInsert = New MySqlBulkCopy(conn)
        End Sub

        Public Function WriteToMySql(destinationTableName As String, localTable As DataTable) As Boolean
            blkInsert.DestinationTableName = destinationTableName

            Try
                blkInsert.WriteToServer(localTable)
            Catch ex As Exception
                ErrorMessage = ex.Message
                ErrorTableName = destinationTableName
                Return False
            End Try

            Return True
        End Function
    End Class

To WriteToMySql I pass the name of the destination table and the DataTable that contains the data.

    If Not (blkInsert.WriteToMySql("tb_vendor", Ds.Vendor) AndAlso
            blkInsert.WriteToMySql("tb_group", Ds.Group) AndAlso
            blkInsert.WriteToMySql("tb_subgroup", Ds.SubGroup) AndAlso
            blkInsert.WriteToMySql("tb_product", Ds.Product) AndAlso
            blkInsert.WriteToMySql("tb_product_attribute", Ds.Characteristic) AndAlso
            blkInsert.WriteToMySql("tb_product_attributes", productAttributes) AndAlso
            blkInsert.WriteToMySql("tb_encoded_names", productNamesEncoded) AndAlso
            blkInsert.WriteToMySql("tb_pictures", Ds.PictureNew) AndAlso
            blkInsert.WriteToMySql("tb_group_category", groupCategories)) Then
        ...

The error appears inconsistently, and when it does, re-executing with the same tables does not help; it triggers the same error, as if something is wrong with the table itself. I am inserting several tables, and the one that contains the most rows is always the one that fires the error, while every other table is copied successfully.

The only way to fix the error is to reload the data from the APIs (reinitialize and fill the tables all over again) and retry the insert. Most of the time this works without causing the error; it fires seemingly at random.

Any help would be appreciated, I can provide more info if needed. Thanks in advance.

Edit (answers to questions): The DataTable that causes the problem contains no more than 200k rows, but I have already tried inserting much larger tables (2-3 million rows) to check whether the size matters, and no errors occurred.

When the error occurs, I can see that BulkCopy inserted a portion of the rows before throwing. In other words, some rows end up in the DB, but not all of them.

Ds is the name of my dataset variable, which I fill from the APIs. The other standalone tables are generated by me afterwards for processing the data and computing needed values.

I have not set any timeout, but while debugging I can see that it equals 0, which means it should wait no matter how long the operation takes.

I am trying to trigger the error while debugging; lately it has appeared only in the release build. When I manage to reproduce it, I will provide more info.

Edit 2 (stack trace and inner exception):

Failed to read the result set.

StackTrace:

   at MySqlConnector.MySqlDataReader.ActivateResultSet()
   at MySqlConnector.MySqlDataReader.<CreateAsync>d__97.MoveNext()

--- End of stack trace from previous location where exception was thrown ---

   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
   at MySqlConnector.Core.CommandExecutor.<ExecuteReaderAsync>d__0.MoveNext()

--- End of stack trace from previous location where exception was thrown ---

   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at MySqlConnector.MySqlCommand.<ExecuteNonQueryAsync>d__69.MoveNext()

--- End of stack trace from previous location where exception was thrown ---

   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at MySqlConnector.MySqlBulkLoader.<LoadAsync>d__73.MoveNext()

--- End of stack trace from previous location where exception was thrown ---

   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.ConfiguredValueTaskAwaitable`1.ConfiguredValueTaskAwaiter.GetResult()
   at MySqlConnector.MySqlBulkCopy.<WriteToServerAsync>d__29.MoveNext()

--- End of stack trace from previous location where exception was thrown ---

   at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at System.Runtime.CompilerServices.ValueTaskAwaiter`1.GetResult()
   at MySqlConnector.MySqlBulkCopy.WriteToServer(DataTable dataTable)
   at PolyBridge.BulkInsert.WriteToMySql(String destinationTableName, DataTable localTable)

InnerException:

Error during LOAD DATA LOCAL INFILE

There are 2 best solutions below

Try setting the bulk copy timeout to a large number, for example 3600 seconds for one hour:

    blkInsert = New MySqlBulkCopy(conn)
    blkInsert.BulkCopyTimeout = 3600

You can also disable the timeout altogether:

    blkInsert.BulkCopyTimeout = 0

I know SQL Server and Oracle pretty well, but MySQL is not my particular area of expertise. However, searching for similar cases, I did find something that might explain it: exceeding the max_allowed_packet size.

This would only happen on the largest tables, and the apparent solution is to raise this setting, for example to 256 MB (the server maximum is 1 GB):

    SET GLOBAL max_allowed_packet = 268435456;
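If it helps, you can inspect the current value before changing it. A minimal sketch, assuming a MySQL client session with sufficient privileges (note that `SET GLOBAL` only affects the running server and new connections; it does not persist across a restart unless you also update the config file):

```sql
-- Check the current limit (in bytes)
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it for the running server (takes effect for new connections)
SET GLOBAL max_allowed_packet = 268435456;  -- 256 MB

-- To persist across restarts, add this under [mysqld] in my.cnf / my.ini:
-- max_allowed_packet=268435456
```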