How to delete records in SQL Server 2005 while keeping the transaction log under control

I use the following stored procedure to delete a large number of records. I understand that the DELETE statement writes every deleted row to the transaction log, so deleting many rows will make the log grow.

I considered other options, such as copying the records I want to keep into a new table and then truncating the source table, but that approach will not work for me.

How can I make my stored procedure below more efficient while avoiding unnecessary transaction log growth?

 CREATE PROCEDURE [dbo].[ClearLog]
     ( @Age int = 30 )
 AS
 BEGIN
     -- SET NOCOUNT ON added to prevent extra result sets from
     -- interfering with SELECT statements.
     SET NOCOUNT ON;

     -- DELETE ERRORLOG
     WHILE EXISTS ( SELECT [LogId] FROM [dbo].[Error_Log]
                    WHERE DATEDIFF( dd, [TimeStamp], GETDATE() ) > @Age )
     BEGIN
         SET ROWCOUNT 10000
         DELETE [dbo].[Error_Log]
         WHERE DATEDIFF( dd, [TimeStamp], GETDATE() ) > @Age
         WAITFOR DELAY '00:00:01'
         SET ROWCOUNT 0
     END
 END
+4
5 answers

Here's how I do it:

 CREATE PROCEDURE [dbo].[ClearLog]
     ( @Age int = 30 )
 AS
 BEGIN
     SET NOCOUNT ON;

     DECLARE @d DATETIME, @batch INT;
     SET @batch = 10000;
     SET @d = DATEADD( dd, -@Age, GETDATE() );

     WHILE (1 = 1)
     BEGIN
         DELETE TOP (@batch) [dbo].[Error_Log]
         WHERE [TimeStamp] < @d;
         IF (0 = @@ROWCOUNT)
             BREAK;
     END
 END
  • Make the [TimeStamp] comparison SARGable (compare the column directly against a precomputed date instead of wrapping it in DATEDIFF).
  • Evaluate GETDATE() once at the start, so the cutoff stays consistent across the whole run (otherwise the loop could run indefinitely as new records age past the threshold while old ones are deleted).
  • Use TOP instead of SET ROWCOUNT, which is deprecated: "Using SET ROWCOUNT will not affect DELETE, INSERT, and UPDATE statements in a future release of SQL Server."
  • Check @@ROWCOUNT to break the loop instead of running a redundant SELECT.
+4

Assuming you have the ability to rebuild the error log table on a partition scheme, one option is to partition the table by date and switch partitions out. Do a web search for "ALTER TABLE ... SWITCH" to learn more.
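A rough sketch of the idea (the partition function/scheme names and the staging table are hypothetical, and table partitioning requires SQL Server 2005 Enterprise Edition). Switching a partition out is a metadata-only operation, so it writes almost nothing to the transaction log:

```sql
-- 1. Partition the table by month on [TimeStamp]
--    (boundary dates here are illustrative):
CREATE PARTITION FUNCTION pf_ErrorLogByMonth (datetime)
    AS RANGE RIGHT FOR VALUES ('2010-01-01', '2010-02-01', '2010-03-01');

CREATE PARTITION SCHEME ps_ErrorLogByMonth
    AS PARTITION pf_ErrorLogByMonth ALL TO ([PRIMARY]);

-- 2. To purge an old month, switch its partition into an empty staging
--    table with an identical schema on the same filegroup, then clear
--    the staging table:
ALTER TABLE dbo.Error_Log SWITCH PARTITION 1 TO dbo.Error_Log_Staging;
TRUNCATE TABLE dbo.Error_Log_Staging;
```

You would then merge the emptied boundary out of the partition function as part of a rolling-window maintenance job.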

+1

How about running it more often and deleting fewer rows each time? Run this every 30 minutes:

 CREATE PROCEDURE [dbo].[ClearLog]
     ( @Age int = 30 )
 AS
 BEGIN
     SET NOCOUNT ON;

     SET ROWCOUNT 10000  -- I assume you are on an old version of SQL Server and can't use TOP
     DELETE dbo.Error_Log
     WHERE [TimeStamp] < GETDATE() - @Age  -- delete rows older than @Age days
     WAITFOR DELAY '00:00:01'  -- why???
     SET ROWCOUNT 0
 END

This way each run stays short, and every execution only has to delete about 30 minutes' worth of data.

+1

If your database is in FULL recovery mode, the only way to minimize the impact of your deletes is to pace them: delete only as many rows as fit into one transaction-log backup interval. For example, if you back up the t-log every hour, delete, say, 20,000 rows per hour. It may not remove everything you want all at once, but does it really matter whether the cleanup takes 24 hours or a week?
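The pacing idea above can be sketched as a small procedure scheduled to run once per log-backup interval (e.g. an hourly SQL Agent job right after the log backup). The procedure name and the @BatchSize parameter are assumptions for illustration, not from the original post; DELETE TOP requires SQL Server 2005 or later:

```sql
CREATE PROCEDURE dbo.ClearLog_Paced
    ( @Age int = 30, @BatchSize int = 20000 )
AS
BEGIN
    SET NOCOUNT ON;

    -- One small, fully-logged batch per run; the hourly log backup
    -- then truncates the log space this batch consumed.
    DELETE TOP (@BatchSize) dbo.Error_Log
    WHERE [TimeStamp] < DATEADD( dd, -@Age, GETDATE() );
END
```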

If your database is in SIMPLE or BULK_LOGGED mode, deleting in chunks should already keep the log in check. But since you are doing that and still seeing growth, I have to guess your database is in FULL recovery mode. (That, or the connection calling the procedure is part of an open transaction.)

+1

A solution I have used in the past is to temporarily set the recovery model to BULK_LOGGED, then switch back to FULL at the end of the stored procedure:

 DECLARE @dbName NVARCHAR(128);
 SELECT @dbName = DB_NAME();

 EXEC('ALTER DATABASE ' + @dbName + ' SET RECOVERY BULK_LOGGED');

 WHILE EXISTS (...)
 BEGIN
     -- Delete a batch of rows, then WAITFOR here
 END

 EXEC('ALTER DATABASE ' + @dbName + ' SET RECOVERY FULL');

This significantly reduced transaction log consumption for large batches. I don't like that it changes the recovery model for the entire database (not just for this session), but it is the best solution I could find.

0
