I suggest using the XML functionality in SQL Server 2005/2008, which lets you perform bulk inserts and bulk updates. The approach I would take:
- Parse the entire file into a data structure in memory.
- Build a single XML document from that structure to pass to a stored procedure.
- Write a stored procedure that loads the data from the XML document into a temporary table and then performs the inserts and updates. A guide to writing that stored procedure follows below.
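Purely as an illustration of what that single call could look like (the procedure name, element name, and attributes here are placeholders that match the sketches further down, not anything from your actual schema):

-- Hypothetical invocation: the whole batch of rows travels in one XML string
EXEC MyBulkUpdater @p_XmlData =
    '<ROOT>
        <MyRealTable Field1="abc" Field2="123" />
        <MyRealTable Field1="def" Field2="456" />
     </ROOT>'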
There are many advantages to this approach:
- The whole operation completes in a single database call, although if your data set is really large you may want to batch it.
- You can easily wrap all of the database changes in a single transaction and roll back if anything fails.
- You are not using dynamic SQL, which could pose a security risk.
- You can return the identifiers of the inserted, updated, and/or deleted records using the OUTPUT clause (a sketch of these last two points follows this list).
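As a rough sketch of those last two points (using the illustrative table and temp table names from the stored-procedure example below, and assuming MyRealTable has an ID column), the statements inside the procedure could be wrapped like this:

BEGIN TRY
    BEGIN TRANSACTION

    -- OUTPUT returns the identifiers of the rows just inserted
    INSERT INTO MyRealTable (Field1, Field2)
    OUTPUT inserted.ID
    SELECT Field1, Field2
    FROM #MyTempTable

    COMMIT TRANSACTION
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION

    -- Re-raise the error; RAISERROR is available on SQL Server 2005/2008
    DECLARE @ErrMsg NVARCHAR(2048)
    SET @ErrMsg = ERROR_MESSAGE()
    RAISERROR(@ErrMsg, 16, 1)
END CATCH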
For the stored procedure, you will need something like the following:
CREATE PROCEDURE MyBulkUpdater (@p_XmlData VARCHAR(MAX))
AS

DECLARE @hDoc INT
EXEC sp_xml_preparedocument @hDoc OUTPUT, @p_XmlData

-- Load the XML into a temporary table (the name #MyTempTable is illustrative);
-- WITH MyRealTable reuses the real table's schema for the column mapping
SELECT *
INTO #MyTempTable
FROM OPENXML(@hDoc, '/ROOT/MyRealTable')
WITH MyRealTable

EXEC sp_xml_removedocument @hDoc
Now you can simply insert into, update, and delete from the real table using your temporary table as required, for example:
INSERT INTO MyRealTable (Field1, Field2)
SELECT Field1, Field2
FROM #MyTempTable
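An update from the same temporary table could look like this (the ID join column is an assumption about your schema):

UPDATE rt
SET rt.Field1 = tt.Field1,
    rt.Field2 = tt.Field2
FROM MyRealTable rt
INNER JOIN #MyTempTable tt ON rt.ID = tt.ID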
For an example of the XML to pass in, you can run:
SELECT TOP 1 *, 0 AS __ORDERBY FROM MyRealTable AS MyRealTable FOR XML AUTO, ROOT('ROOT')
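For a hypothetical MyRealTable with columns ID, Field1, and Field2, that query produces attribute-centric XML along these lines, which is the same shape the OPENXML call above expects:

<ROOT>
  <MyRealTable ID="1" Field1="abc" Field2="123" __ORDERBY="0" />
</ROOT>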
See OPENXML, sp_xml_preparedocument, and sp_xml_removedocument for more information.