We have a table with 800M rows and 93 columns that needs historical data removed. The data to be removed accounts for more than 95% of the table. So far, attempts to delete in batches have failed, with estimates of at least one week to complete. The existing indexes are the main cause of the slowness, but dropping the indexes and rebuilding them after the delete has also proved too slow in testing.
It was decided that we should create a new table containing only the data we need, then rename the old table as a backup and rename the new table to the original name.
The problem is that there are at least a hundred keys, constraints and indexes.
We are considering generating a script from Object Browser, renaming all the object names using find/replace, creating the identically structured table, populating it with the data we need, renaming the original table as a backup, then renaming the new table and all its objects back to the names used by the original table.
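In outline, the plan looks like this (all table, column, and date values here are illustrative placeholders, not our real schema):

```sql
-- Illustrative names only: dbo.Trades stands in for the real 800M-row table.
-- 1. Create the new table with the same structure (scripted out, abbreviated here).
CREATE TABLE dbo.Trades_new
(
    TradeId INT NOT NULL
    /* ...same 93 columns as dbo.Trades... */
);

-- 2. Copy only the rows we need. TABLOCK allows minimal logging
--    under the SIMPLE or BULK_LOGGED recovery model.
INSERT INTO dbo.Trades_new WITH (TABLOCK)
SELECT *
FROM dbo.Trades
WHERE TradeDate >= '2020-01-01';  -- placeholder retention cutoff

-- 3. Swap the names during the maintenance window.
EXEC sp_rename 'dbo.Trades', 'Trades_backup';
EXEC sp_rename 'dbo.Trades_new', 'Trades';
-- ...followed by one sp_rename per key, constraint, and index,
-- which is the part that makes us nervous.
```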
This is a critical table on a live database, with only a few overnight hours to complete the process. So many manual renames make us nervous, because nothing can go wrong before the client begins trading.
What is the most reliable method? Is there an automated process to undertake this procedure? Any recommendations appreciated.
I would do the following during your downtime (i.e. while the database is not used by anyone): switch the recovery model to SIMPLE, and/or disable triggers, non-clustered indexes, etc. Once the work is done, restore the original settings.

The final solution, though, would be to upgrade your SQL Server 2008 to SQL Server 2016+. As of SQL Server 2016 SP1, even Standard Edition has partitioning, and the partitioning approach would make your life much easier (another topic, though).
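A minimal sketch of those downtime steps, assuming a database named [Trading] and a table dbo.Trades with one non-clustered index (both names are placeholders):

```sql
-- Minimize logging during the bulk copy/delete.
ALTER DATABASE [Trading] SET RECOVERY SIMPLE;

-- Disable triggers and non-clustered indexes on the big table.
DISABLE TRIGGER ALL ON dbo.Trades;
ALTER INDEX IX_Trades_TradeDate ON dbo.Trades DISABLE;  -- repeat per NC index

-- ...do the bulk copy or delete here...

-- Restore the original settings once the work is done.
ALTER INDEX IX_Trades_TradeDate ON dbo.Trades REBUILD;  -- re-enables the index
ENABLE TRIGGER ALL ON dbo.Trades;
ALTER DATABASE [Trading] SET RECOVERY FULL;
```

Note that disabling an index drops its data, so the REBUILD at the end is mandatory, and switching away from FULL recovery breaks the log backup chain, so take a full or differential backup afterwards.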