We need to change the data type of about 10 primary keys in our database from numeric(19,0) to bigint. On small tables a simple data type change works very well, but on large tables (60-70 million rows) it takes a significant amount of time.
What is the fastest way to achieve this, preferably without locking the database?
I wrote a script that generates the following (which I believe I adapted from another SO post):
-- Add a nullable shadow column, and index it so the batched
-- backfill below can find unprocessed rows quickly.
ALTER TABLE query_log ADD id_bigint bigint NULL;
GO
CREATE NONCLUSTERED INDEX IX_query_log_id_bigint ON query_log (id_bigint)
INCLUDE (id);
GO
DECLARE @count int;
DECLARE @iteration int;
DECLARE @progress int;
SET @iteration = 0;
SET @progress = 0;
SELECT @count = COUNT(*) FROM query_log;
RAISERROR ('Processing %d records', 0, 1, @count) WITH NOWAIT;

-- Backfill the shadow column in 10,000-row batches to keep each
-- transaction short and avoid holding long locks on the table.
WHILE 1 = 1
BEGIN
    UPDATE X
    SET X.id_bigint = id
    FROM (
        SELECT TOP 10000 * FROM query_log WHERE id_bigint IS NULL
    ) X;

    IF @@ROWCOUNT = 0 BREAK;

    SET @iteration = @iteration + 1;
    SET @progress = @iteration * 10000;
    RAISERROR ('Processed %d of %d records', 0, 1, @progress, @count) WITH NOWAIT;
END;
GO
-- Drop the existing primary key (the auto-generated constraint
-- name below will differ per database).
ALTER TABLE query_log
DROP CONSTRAINT PK__query_log__53833672;
GO
BEGIN TRAN;
-- Catch-up pass under an exclusive table lock: copy any rows
-- inserted after the batched backfill (their id_bigint is NULL).
UPDATE T
SET T.id_bigint = T.id
FROM query_log T WITH (TABLOCKX, HOLDLOCK)
WHERE T.id_bigint IS NULL OR T.id_bigint <> T.id;

-- Swap the columns by renaming.
EXEC sp_rename 'query_log.id', 'id_numeric', 'COLUMN';
EXEC sp_rename 'query_log.id_bigint', 'id', 'COLUMN';
COMMIT TRAN;
GO
DROP INDEX IX_query_log_id_bigint ON query_log;
GO
-- The column is already bigint at this point; this only adds the
-- NOT NULL constraint (which still requires a full scan to validate).
ALTER TABLE query_log ALTER COLUMN id bigint NOT NULL;
GO
ALTER TABLE query_log
ADD CONSTRAINT PK_query_log PRIMARY KEY (id);
GO
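For completeness, a quick check along these lines (a sketch; it assumes the old `id_numeric` column has not yet been dropped) can confirm the backfill is complete before cleaning up:

```sql
-- Expect 0 mismatches before dropping the old column.
SELECT COUNT(*) AS mismatches
FROM query_log
WHERE id IS NULL OR id <> id_numeric;

-- Then, once satisfied:
-- ALTER TABLE query_log DROP COLUMN id_numeric;
```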
This works very well for small tables, but it is very slow on large ones.
Note that this is in preparation for a migration to Postgres: the EnterpriseDB Migration Toolkit does not seem to understand the numeric(19,0) data type.
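One thing worth verifying first (a sketch): numeric(19,0) can hold 19-digit values that fall just outside the bigint range (bigint tops out at 9,223,372,036,854,775,807), so it may be worth confirming that no existing ids would overflow before converting:

```sql
-- Any rows returned here would overflow bigint after conversion.
SELECT COUNT(*) AS out_of_range
FROM query_log
WHERE id > 9223372036854775807 OR id < -9223372036854775808;
```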