I am getting an ObjectOptimisticLockingFailureException
during a batch update when I use the Microsoft SQL Server driver (sqljdbc4.jar). With the same code, when I switch to the jTDS driver, the batch update and insert succeed. We are using a SQL Server database and JPA with Hibernate.
Below is what I am doing.
1. Get the data from the database with status_code SUBMITTED and current flag Y.
2. Batch update those records to current flag N, and batch insert new records with status BATCH_LOCKED.
3. Batch update the above records again to current flag N, and batch insert new records with status COMPLETED (a rough sketch of this flow follows the list).
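For context, here is a rough sketch of what the flow looks like in code. The entity, column, and class names below are simplified placeholders rather than our actual classes, and note there is no @Version field anywhere:

```java
import java.util.List;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical, simplified entity -- no @Version field is defined.
@Entity
class StatusRecord {
    @Id @GeneratedValue(strategy = GenerationType.IDENTITY)
    Long id;

    @Column(name = "status_code")
    String statusCode;

    @Column(name = "current_flag")
    String currentFlag;

    protected StatusRecord() { }

    StatusRecord(String statusCode, String currentFlag) {
        this.statusCode = statusCode;
        this.currentFlag = currentFlag;
    }
}

@Service
class StatusFlowService {

    @PersistenceContext
    private EntityManager em;

    @Transactional
    public void process() {
        // Step 1: load the current SUBMITTED records.
        List<StatusRecord> submitted = em.createQuery(
                "select r from StatusRecord r "
              + "where r.statusCode = 'SUBMITTED' and r.currentFlag = 'Y'",
                StatusRecord.class)
            .getResultList();

        // Step 2: batch update the current flag to N, batch insert BATCH_LOCKED rows.
        for (StatusRecord r : submitted) {
            r.currentFlag = "N";                                // UPDATE
            em.persist(new StatusRecord("BATCH_LOCKED", "Y"));  // INSERT
        }

        // Step 3: update the records again and insert COMPLETED rows.
        // This is the step that, when commented out, makes the exception go away.
        for (StatusRecord r : submitted) {
            r.currentFlag = "N";                                // UPDATE
            em.persist(new StatusRecord("COMPLETED", "Y"));     // INSERT
        }
    }
}
```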
All these updates go through fine. When the transaction is committed, I get the exception below.
I am not explicitly configuring optimistic locking, nor have I defined a @Version annotation on any column. How is optimistic locking happening by default?
If I comment out Step #3, I do not get the exception.
Why do I not get the exception with the jTDS driver?
org.springframework.orm.ObjectOptimisticLockingFailureException: Batch update returned unexpected row count from update [1]; actual row count: 0; expected: 1; nested exception is org.hibernate.StaleStateException: Batch update returned unexpected row count from update [1]; actual row count: 0; expected: 1
In Step #3, when you try to update the same entries, their values have already been changed in Steps #1 and #2, so the UPDATE no longer affects the rows it expects. Even without @Version, Hibernate compares the JDBC-reported row count of every UPDATE against the expected count; a count of 0 is reported as a StaleStateException. You either need to re-read the updated data before updating it again, or handle the exception as described below.
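For example, a minimal sketch of the re-read, reusing the hypothetical StatusRecord entity from the sketch above: flush the pending Step #1/#2 statements, then refresh each entity before applying the Step #3 changes.

```java
import java.util.List;
import javax.persistence.EntityManager;

// Hypothetical helper illustrating the re-read before the second update.
class RefreshBeforeUpdate {

    static void completeRecords(EntityManager em, List<StatusRecord> submitted) {
        em.flush();                   // push the pending Step #1/#2 UPDATE/INSERT statements
        for (StatusRecord r : submitted) {
            em.refresh(r);            // reload the current row state from the database
            r.currentFlag = "N";      // the second UPDATE now starts from fresh data
            em.persist(new StatusRecord("COMPLETED", "Y"));
        }
    }
}
```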
The optimistic locking exception prevents lost updates, and you shouldn't ignore it. You can catch it in a common exception handler and redirect the user to the current workflow's starting point, indicating there was a concurrent change they were not aware of.
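For example, a minimal Spring MVC sketch of such a common handler; the redirect target and model attribute are placeholders, not part of the original code:

```java
import org.springframework.orm.ObjectOptimisticLockingFailureException;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.servlet.ModelAndView;

// Hypothetical common handler: send the user back to the start of the workflow
// and flag that a concurrent change happened.
@ControllerAdvice
public class ConcurrencyExceptionHandler {

    @ExceptionHandler(ObjectOptimisticLockingFailureException.class)
    public ModelAndView handleConcurrentModification(ObjectOptimisticLockingFailureException ex) {
        ModelAndView mav = new ModelAndView("redirect:/workflow/start");
        mav.addObject("concurrentModification", true);
        return mav;
    }
}
```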