I've run into a performance problem in my application: one routine prepares a large amount of data that is later inserted into my local database through a LINQ-to-SQL DataContext. Even a relatively modest batch of new data (around 100,000 rows) takes a very long time to store when I call SubmitChanges(), and in typical use the application needs to save 200,000 to 300,000 rows.
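For reference, the insert code follows the usual LINQ-to-SQL pattern, roughly like this (a simplified sketch; `MyDataContext` and `preparedData` stand in for my actual generated context and the prepared rows):

```csharp
// Sketch of the current approach, not the literal code:
using (var context = new MyDataContext()) // generated DataContext subclass
{
    // preparedData holds the ~200,000-300,000 AdjectivesExpanded entities
    foreach (var item in preparedData)
    {
        context.AdjectivesExpanded.InsertOnSubmit(item);
    }

    // One call, but LINQ-to-SQL still emits a separate parameterized
    // INSERT (plus SCOPE_IDENTITY() fetch) for every single entity:
    context.SubmitChanges();
}
```

The profiler output below is what SubmitChanges() produces for each of those entities.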
According to SQL Server Profiler, all the generated queries look like the one below, and there is one for each element the application inserts:
exec sp_executesql N'INSERT INTO [dbo].[AdjectivesExpanded]([Adjective], [Genus], [Casus], [SingularOrPlural], [Kind], [Form]) VALUES (@p0, @p1, @p2, @p3, @p4, @p5) SELECT CONVERT(BigInt,SCOPE_IDENTITY()) AS [value]',N'@p0 bigint,@p1 char(1),@p2 tinyint,@p3 bit,@p4 tinyint,@p5 nvarchar(4000)',@p0=2777,@p1='n',@p2=4,@p3=0,@p4=3,@p5=N'neugeborener'
Does anyone have an idea how to improve the performance of bulk inserts through LINQ-to-SQL data contexts, ideally without giving up the typed DataContext and resorting to hand-written queries? There is also some room to configure the underlying database; for instance, I could disable integrity constraints if that would help.