T-SQL: best way to update a db table from a CSV file

I have a product table in SQL Server 2005 that needs to be updated with some fields from a CSV file. Both files contain a vendor part number that I can join on, so I can update the products.discontinued field from the corresponding value in the CSV file.

My question is: what is the best way to approach this?

Options I have considered: creating an ODBC connection to an Excel file and working out how to drive the update of the two columns from there; importing the entire CSV file (~60 MB) into a temp table in SQL Server and then writing a T-SQL procedure to search, compare, and update; or running OPENROWSET from Query Analyzer and writing a procedure that reads the CSV file and updates the table that way.
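For reference, the OPENROWSET route might look something like this on SQL Server 2005. This is only a sketch: the file name and directory are hypothetical, and it assumes the 'Ad Hoc Distributed Queries' option has been enabled via sp_configure.

```
-- Hypothetical sketch: read mycsv.csv from c:\data via the Microsoft Text ODBC driver.
-- Requires 'Ad Hoc Distributed Queries' to be enabled on the server.
SELECT *
FROM OPENROWSET(
    'MSDASQL',
    'Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir=c:\data;',
    'SELECT * FROM mycsv.csv');
```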

Thanks in advance.

+5
1 answer

If it is available to you, the best option with SQL Server 2005 is SQL Server Integration Services (SSIS, formerly Data Transformation Services, or DTS). It lets you create a package that can live anywhere and be run on a schedule or called whenever you decide, and it can run asynchronously with respect to the application calling it.

update:
You can also run and debug the package until it works correctly before you decide to deploy it. This lets you handle things like invalid rows, etc.

If SSIS is not an option, you can load the CSV with BULK INSERT (see: http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/)

Assuming you have a staging table whose columns match the CSV layout, it would look something like this:

BULK INSERT MyTable
FROM 'c:\mycsv.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)

Then you can write T-SQL to join against the imported data and update the db.
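That update step could be sketched as follows. The table and column names here are assumptions; substitute your actual staging table and the real column names for the vendor part number and discontinued flag.

```
-- Hypothetical sketch: #csv_import is the staging table loaded by BULK INSERT,
-- with columns vendor_part_number and discontinued matching the CSV.
UPDATE p
SET p.discontinued = c.discontinued
FROM dbo.products AS p
INNER JOIN #csv_import AS c
    ON p.vendor_part_number = c.vendor_part_number;
```

The UPDATE ... FROM join syntax avoids a row-by-row cursor and handles the whole 60 MB import in one set-based statement.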

Hope that helps.

+3
