Overall, the process is likely to be much slower for anything more than a trivial amount of data, but if you only care about the size of the transaction log, you can try the following.
- Add a new nullable INT column without the IDENTITY property (a metadata-only change).
- Write code to update it with unique sequential integers. Doing this in batches keeps the size of each individual transaction small and so keeps overall log growth down (assuming the simple recovery model). My code below does this in batches of 100; ideally you would have a usable PK to restart from where you left off, rather than the "WHERE id IS NULL" rescans used here, which take progressively longer. A keyset-based variant is sketched just after this list.
- Use ALTER TABLE ... ALTER COLUMN to mark the column as NOT NULL. This requires the whole table to be locked and scanned to validate the change, but needs very little logging.
- Use ALTER TABLE ... SWITCH to make the column an IDENTITY column. This is a metadata-only change. (The NOT NULL step is needed first because SWITCH requires identical schemas on both sides, and an IDENTITY column cannot be nullable.)
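As an aside, a keyset-driven version of the batch loop in step 2 might look like the sketch below. This is my illustration, not part of the original script: it assumes the table already has an indexed integer primary key, here given the placeholder name pk_col, so each batch can seek past the last processed key instead of rescanning for unnumbered rows.

    -- Hypothetical variant of the step-2 loop, assuming an indexed,
    -- positive integer primary key named pk_col (a placeholder name,
    -- not in the original; adjust the starting watermark otherwise).
    DECLARE @LastPk  INT = 0,
            @Counter INT = 0;

    DECLARE @Batch TABLE (pk_col INT);

    WHILE 1 = 1
    BEGIN
        DELETE FROM @Batch;

        -- Seek past the last processed key instead of rescanning the
        -- whole table for rows where id IS NULL.
        WITH T
             AS (SELECT TOP (100) pk_col,
                        id,
                        ROW_NUMBER() OVER (ORDER BY pk_col) + @Counter AS new_id
                 FROM   dbo.table_1
                 WHERE  pk_col > @LastPk
                 ORDER  BY pk_col)
        UPDATE T
        SET    id = new_id
        OUTPUT inserted.pk_col INTO @Batch;

        IF @@ROWCOUNT = 0
            BREAK;

        -- Advance the watermark; @Counter ends up as the highest id
        -- assigned, which the main script needs for the IDENTITY seed.
        SELECT @LastPk  = MAX(pk_col),
               @Counter = @Counter + COUNT(*)
        FROM   @Batch;
    END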
Full code example below:
    CREATE TABLE table_1
    (
        original_column INT
    )

    INSERT INTO table_1
    SELECT DISTINCT number
    FROM   master..spt_values

    -- Step 1: add a nullable column without the IDENTITY property
    -- (metadata-only change).
    ALTER TABLE table_1
        ADD id INT NULL

    -- Step 2: fill the column with unique sequential integers in
    -- batches of 100 to keep each transaction (and the log) small.
    -- ORDER BY @@SPID is a constant, so no real sort is needed.
    DECLARE @Counter INT = 0,
            @PrevCounter INT = -1

    WHILE @PrevCounter <> @Counter
    BEGIN
        SET @PrevCounter = @Counter;

        WITH T
             AS (SELECT TOP 100 *,
                        ROW_NUMBER() OVER (ORDER BY @@SPID) + @Counter AS new_id
                 FROM   table_1
                 WHERE  id IS NULL)
        UPDATE T
        SET    id = new_id

        SET @Counter = @Counter + @@ROWCOUNT
    END

    BEGIN TRY
        BEGIN TRANSACTION;

        -- Step 3: requires a table lock and a full scan to validate,
        -- but very little logging.
        ALTER TABLE table_1
            ALTER COLUMN id INT NOT NULL

        -- Step 4: switch into a table with an identical schema whose id
        -- column has the IDENTITY property, seeded just past the values
        -- already assigned. Metadata-only change.
        DECLARE @TableScript NVARCHAR(MAX) = '
        CREATE TABLE dbo.Destination
        (
            original_column INT,
            id              INT IDENTITY(' + CAST(@Counter + 1 AS VARCHAR) + ',1)
        )

        ALTER TABLE dbo.table_1 SWITCH TO dbo.Destination;
        '

        EXEC(@TableScript)

        DROP TABLE table_1;

        EXECUTE sp_rename N'dbo.Destination', N'table_1', 'OBJECT';

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0
            ROLLBACK TRANSACTION;

        PRINT ERROR_MESSAGE();
    END CATCH;
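As a quick sanity check (my addition, not part of the original answer), after the script above completes, the switched-in table should continue numbering from where the batch loop stopped:

    -- New rows should pick up the sequence at @Counter + 1, so the
    -- current identity value should match the highest assigned id.
    INSERT INTO dbo.table_1 (original_column)
    VALUES (999);

    SELECT IDENT_CURRENT('dbo.table_1') AS last_identity_value,
           MAX(id)                      AS max_id
    FROM   dbo.table_1;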
Martin Smith