After the frustration of waiting for data to be deleted from a table with 50+ million rows, I finally did some research and found an interesting article on SQLPerformance.com where Aaron Bertrand explains and benchmarks different ways to delete data from big tables.
Using a while loop helped a lot: I could easily track the progress of the delete and didn't have to worry about rollbacks when stopping a huge delete script.
Here's a sample below showing the script used to delete rows from a huge table. The CHECKPOINT and BACKUP LOG steps are important.
Make sure you know your database's recovery model first. See the screenshot below the script to find out what your database's recovery model is.
SET NOCOUNT ON;

DECLARE @R INT;
SET @R = 1;

WHILE @R > 0
BEGIN
  DELETE TOP (10000) -- change this to your needs
    dbo.HugeTable
  WHERE Criteria = 'XXX';

  SET @R = @@ROWCOUNT;

  -- CHECKPOINT;      -- if simple recovery model
  -- BACKUP LOG ...   -- if full recovery model
END;
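As a rough sketch of how the commented lines might be filled in for a database in the FULL recovery model, the log can be backed up between batches so the space is reusable and the log file doesn't balloon. The database name and backup path here are placeholders for illustration, not values from the original script:

  SET NOCOUNT ON;

  DECLARE @R INT;
  SET @R = 1;

  WHILE @R > 0
  BEGIN
    DELETE TOP (10000)
      dbo.HugeTable
    WHERE Criteria = 'XXX';

    SET @R = @@ROWCOUNT;

    -- back up the log after each batch so it can be truncated
    -- (path and database name are example values)
    BACKUP LOG MyDatabase
      TO DISK = N'D:\Backups\MyDatabase_log.trn';
  END;

For a database in the SIMPLE recovery model you would issue CHECKPOINT; in that spot instead, since log backups aren't possible there.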
Right-click your database, select Properties, and navigate to the "Options" page to view your database's recovery model.
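If you prefer T-SQL over the GUI, the recovery model is also exposed through the sys.databases catalog view:

  SELECT name, recovery_model_desc
  FROM sys.databases
  WHERE name = DB_NAME();

Run this while connected to the database in question, or drop the WHERE clause to list the recovery model of every database on the instance.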
I suggest that everyone who has had to delete (not truncate) large amounts of data from a table read Aaron Bertrand's article "Break large delete operations into chunks".