I am working on a workflow-like system. I have one task table with a status field. The status can be one of: New, Ready, Processing, Error, Abort, Done.
I have about 7 processes that are triggered in different situations to change a task's status. Most of the time each process works on its own subset of the data, and each run processes at most 5,000 records. But I still see occasional deadlocks once the table reaches around 2 million records. Checking with SQL Profiler, the deadlocks look page-resource related. I am not good at SQL Server performance tuning and do not understand it very well.
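To make the pattern concrete, here is a simplified sketch of the kind of table and batched update I mean (table and column names are made up for illustration, and the index and the ROWLOCK/READPAST hints are ideas I have read about, not something I have verified fixes the deadlocks):

```sql
-- Hypothetical simplified schema; the real table has more columns.
CREATE TABLE dbo.Task
(
    TaskId    BIGINT IDENTITY PRIMARY KEY,
    Status    VARCHAR(10) NOT NULL,   -- New/Ready/Processing/Error/Abort/Done
    Payload   NVARCHAR(MAX) NULL,
    UpdatedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Index so each process can seek straight to its own status slice
CREATE INDEX IX_Task_Status ON dbo.Task (Status) INCLUDE (UpdatedAt);

-- Each process claims up to 5000 rows per batch, e.g. Ready -> Processing.
-- READPAST skips rows another process has locked instead of blocking on them.
UPDATE TOP (5000) T
SET    Status    = 'Processing',
       UpdatedAt = SYSUTCDATETIME()
FROM   dbo.Task AS T WITH (ROWLOCK, READPAST)
WHERE  T.Status = 'Ready';
```

If the 5,000-row batches escalate from row to page locks, that could explain the page-resource deadlocks I see in Profiler, but I am guessing here.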
Since inactive tasks are archived every day, I am thinking of redesigning the table to support around 10 million records.
A few choices may be:
- Split the table into separate tables, one per status.
- Keep a master table with the static data, plus supporting tables keyed by status.
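For the first option, I was also wondering whether SQL Server table partitioning on the status column would be better than physically separate tables. A rough sketch of what I had in mind (again simplified and untested; the object names are made up):

```sql
-- One partition per status value (RANGE LEFT boundaries must be sorted).
CREATE PARTITION FUNCTION pfTaskStatus (VARCHAR(10))
AS RANGE LEFT FOR VALUES ('Abort', 'Done', 'Error', 'New', 'Processing');

CREATE PARTITION SCHEME psTaskStatus
AS PARTITION pfTaskStatus ALL TO ([PRIMARY]);

-- The partitioning column has to be part of the clustered key.
CREATE TABLE dbo.Task
(
    TaskId  BIGINT       NOT NULL,
    Status  VARCHAR(10)  NOT NULL,
    Payload NVARCHAR(MAX) NULL,
    CONSTRAINT PK_Task PRIMARY KEY CLUSTERED (Status, TaskId)
) ON psTaskStatus (Status);
```

My hope is that each of the 7 processes would then touch only its own partition, but I do not know whether a status change (which moves a row between partitions) makes this worse rather than better.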
Is there any good practice for this kind of situation?
Thanks!