Deleting large amounts of data from a SQL table

After the frustration of waiting for data to be deleted from a table with 50+ million rows, I finally did some research and found an interesting article where Aaron Bertrand explains and benchmarks different ways to delete data from big tables.

Using a while loop helped a lot: I could easily track the progress of the delete and didn't have to worry about long rollbacks when stopping a huge delete script.

Here’s a sample script below that deletes rows from a huge table in batches. The CHECKPOINT and BACKUP LOG steps are important.

Make sure you know what your database’s recovery model is first. See the screenshot below the script to find out what your database’s recovery model is.



DECLARE @R INT = 1;

WHILE @R > 0
BEGIN
	DELETE TOP (10000) -- change the batch size to your needs
	  FROM dbo.BigTable -- placeholder: your table name
	  WHERE Criteria = 'XXX'; -- placeholder: your filter

	SET @R = @@ROWCOUNT; -- loop ends when no more rows match

	-- CHECKPOINT; -- if simple recovery model
	-- BACKUP LOG ... -- if full recovery model
END
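If your database is in the full recovery model, backing up the log between batches keeps the transaction log from growing out of control during the delete. A minimal sketch, where the database name BigDb and the backup path are assumptions you would replace with your own:

```sql
-- Placeholder names: BigDb and the disk path are assumptions; adjust for your environment.
-- Run this between delete batches to allow log truncation under the full recovery model.
BACKUP LOG BigDb
  TO DISK = N'D:\Backups\BigDb_log.trn';
```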

Right-click your database, select Properties, and navigate to the “Options” page to view your database’s recovery model.
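If you prefer T-SQL over the GUI, the recovery model can also be read from the sys.databases catalog view:

```sql
-- Lists each database with its recovery model (SIMPLE, FULL, or BULK_LOGGED)
SELECT name, recovery_model_desc
FROM sys.databases;
```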


I suggest that everyone who has had to delete (not truncate) large amounts of data from a table read Aaron Bertrand’s article, “Break large delete operations into chunks”.

