TSQL ADO.NET Incorrect Row Count Returned


In C# .NET 4.0 I have a BlockingCollection producer/consumer setup that is taken from the BlockingCollection sample BC_AddTakeCompleteAdding.
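
For context, a minimal sketch of that producer/consumer shape (the item type, capacity, and names here are assumptions for illustration, not my actual code):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerSketch
{
    static void Main()
    {
        // Bounded so the producer blocks if the consumer falls behind.
        var docs = new BlockingCollection<int>(boundedCapacity: 100);

        var producer = Task.Factory.StartNew(() =>
        {
            for (int sID = 1; sID <= 1000; sID++)
                docs.Add(sID);         // the "mark in progress" TSQL runs in this loop
            docs.CompleteAdding();     // tell the consumer no more items are coming
        });

        var consumer = Task.Factory.StartNew(() =>
        {
            foreach (int sID in docs.GetConsumingEnumerable())
            {
                // process the document, then run the UPDATE shown below
            }
        });

        Task.WaitAll(producer, consumer);
    }
}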

My problem is that SqlCommand.ExecuteNonQuery in .NET is returning the wrong row count.
The update is on the primary key, so it should affect exactly one row.
Sometimes I get the correct number.
Often I get a number greater than 1 (100-10000) in .NET.
And it is not always the same wrong number, even when running the exact same TSQL against the same PK.
I can copy and paste the TSQL into SSMS and get the correct answer (1) every time.

update [docSVsys]  set [FTSstatus] = '1', [FTSdate] = '10/23/2012 8:51:32 AM' , 
textHash = '5b4d553360fbe733a66eebf36fa666f7', [textSize] = '39504' 
where [sID] = '1525850'

The variable is declared at the point of use, and no other variable is named rowsRet5.

int rowsRet5 = sqlCmdUpate.ExecuteNonQuery();
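
For reference, roughly how the update is issued; the connection string, parameterization, and surrounding using blocks here are illustrative assumptions rather than the actual code:

// Requires: using System.Data.SqlClient;
// connectionString is assumed to be defined elsewhere.
using (var conn = new SqlConnection(connectionString))
using (var sqlCmdUpate = new SqlCommand(
    @"update [docSVsys]
         set [FTSstatus] = @status, [FTSdate] = @date,
             textHash = @hash, [textSize] = @size
       where [sID] = @sID", conn))
{
    sqlCmdUpate.Parameters.AddWithValue("@status", "1");
    sqlCmdUpate.Parameters.AddWithValue("@date", new DateTime(2012, 10, 23, 8, 51, 32));
    sqlCmdUpate.Parameters.AddWithValue("@hash", "5b4d553360fbe733a66eebf36fa666f7");
    sqlCmdUpate.Parameters.AddWithValue("@size", 39504);
    sqlCmdUpate.Parameters.AddWithValue("@sID", 1525850);

    conn.Open();
    // ExecuteNonQuery returns the number of rows affected by this command,
    // so a single-row PK update should return 1.
    int rowsRet5 = sqlCmdUpate.ExecuteNonQuery();
}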

I checked for that textHash value and only the one row was updated.
It appears to perform the proper update but reports the wrong count.
Given that the count is wrong, I am not willing to use this on production data.

This command is at the end of the consumer side.
There are two .BeginExecuteNonQuery calls above this update.
Those updates are to different tables and do not reference docSVsys.
Those tables do have FK references to docSVsys.
In debug, if I stop in the callbacks (slowing it down), I don't get this error.
I am wondering if BeginExecuteNonQuery in a Task is the problem.
This wrong row count does not match either of the async row counts, but it is in the same range.
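
A rough sketch of how one of those async child-table updates is typically wired up (the command name and callback structure are assumptions, not my actual code); the row count for the async call only comes back from EndExecuteNonQuery in the callback:

// Requires: using System.Data.SqlClient;
// On .NET 4.0 the connection string needs "Asynchronous Processing=true"
// for Begin/EndExecuteNonQuery to work.
// sqlCmdChild is assumed to be a SqlCommand for one of the child tables.
IAsyncResult ar = sqlCmdChild.BeginExecuteNonQuery(result =>
{
    // EndExecuteNonQuery returns the rows affected by this command only.
    // If the same SqlCommand/SqlConnection is reused elsewhere while this
    // call is still in flight, the counts can get crossed.
    int childRows = sqlCmdChild.EndExecuteNonQuery(result);
}, null);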

This base code has processed millions of rows.
I did not change any of the TSQL.
It went bad when I converted it to producer/consumer.

To mark a document as in progress I use very similar TSQL on the producer side, and it has no problems. That loop also has a BeginExecuteNonQuery.

BEST ANSWER

The problem appears to be sharing a connection and command on both the producer and consumer sides.

Yes, I realize that is obviously a bad thing to do.
When it was a single loop (no producer/consumer sides) it was OK to share that command.
When I converted to producer/consumer I did not think to split the command.
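
In other words, each side gets its own SqlConnection and SqlCommand. A minimal sketch of the split, with illustrative names:

// Producer thread: its own connection + command for the "mark in progress" update.
using (var producerConn = new SqlConnection(connectionString))
using (var producerCmd = new SqlCommand(markInProgressSql, producerConn))
{
    producerConn.Open();
    // ... producer loop sets parameters and calls producerCmd.ExecuteNonQuery() ...
}

// Consumer thread: a separate connection + command for the docSVsys update.
using (var consumerConn = new SqlConnection(connectionString))
using (var consumerCmd = new SqlCommand(updateDocSVsysSql, consumerConn))
{
    consumerConn.Open();
    // Now the returned count belongs to this command alone.
    int rowsRet5 = consumerCmd.ExecuteNonQuery();
}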