I finally got my batch insert working, and now I am tuning the batch size, but I don't see any performance difference between a value of 50 and a value of 10000. That seems very strange to me, but I don't know what is going on behind the scenes, so it may be normal behavior.
I insert 160,000 rows into the table, and the average time for the values I tested is 115 +/- 2 seconds. Without batching it takes 210 seconds, so I'm quite happy with the improvement. Target table:
CREATE TABLE [dbo].[p_DataIdeas](
    [wave] [int] NOT NULL,
    [idnumber] [int] NOT NULL,
    [ideaID] [int] NOT NULL,
    [haveSeen] [bit] NOT NULL CONSTRAINT [DF_p_DataIdeas_haveSeen] DEFAULT ((0)),
    CONSTRAINT [PK_p_DataIdeas] PRIMARY KEY CLUSTERED
    (
        [wave] ASC,
        [idnumber] ASC,
        [ideaID] ASC
    ) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
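For context, this is roughly how the insert is set up (a simplified sketch, not my exact code: I use a SqlDataAdapter with an InsertCommand and UpdateBatchSize; the connection string and the way the DataTable gets filled are left out):

    // Simplified sketch: batch-inserting the rows of a DataTable whose rows
    // all have RowState = Added. "<connection string>" is a placeholder.
    using System.Data;
    using System.Data.SqlClient;

    class BatchInsertSketch
    {
        static void InsertIdeas(DataTable rows, int batchSize)
        {
            using (var connection = new SqlConnection("<connection string>"))
            using (var adapter = new SqlDataAdapter())
            {
                var insert = new SqlCommand(
                    "INSERT INTO dbo.p_DataIdeas (wave, idnumber, ideaID, haveSeen) " +
                    "VALUES (@wave, @idnumber, @ideaID, @haveSeen)", connection);
                insert.Parameters.Add("@wave", SqlDbType.Int, 4, "wave");
                insert.Parameters.Add("@idnumber", SqlDbType.Int, 4, "idnumber");
                insert.Parameters.Add("@ideaID", SqlDbType.Int, 4, "ideaID");
                insert.Parameters.Add("@haveSeen", SqlDbType.Bit, 1, "haveSeen");

                // Batching requires UpdatedRowSource = None (or OutputParameters).
                insert.UpdatedRowSource = UpdateRowSource.None;

                adapter.InsertCommand = insert;
                adapter.UpdateBatchSize = batchSize; // the value I am experimenting with

                connection.Open();
                adapter.Update(rows); // sends the inserts in batches of batchSize
            }
        }
    }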
I have read What to look for when setting up UpdateBatchSize, and the answer there was simply to test several different values. I can understand that, but shouldn't it be possible to calculate, or at least estimate, a good value when you know the table design, the SQL statement, and the data that is going to be inserted?
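What I am doing at the moment is basically just timing a handful of candidate values, something like this (a rough sketch that reuses the InsertIdeas method above; I truncate the target table between runs so the timings are comparable):

    // Time a few candidate UpdateBatchSize values against the same data.
    using System;
    using System.Data;
    using System.Diagnostics;

    static void CompareBatchSizes(DataTable rows)
    {
        foreach (int size in new[] { 50, 100, 500, 1000, 5000, 10000 })
        {
            // Copy() preserves the Added row state, so each run inserts the same rows.
            DataTable copy = rows.Copy();

            Stopwatch sw = Stopwatch.StartNew();
            InsertIdeas(copy, size);
            sw.Stop();

            Console.WriteLine("UpdateBatchSize={0}: {1:F1} s", size, sw.Elapsed.TotalSeconds);
        }
    }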
Are there any best practices that anyone can recommend?
Fredrik norlin