Bulk copy of DataTable in MySQL (similar to System.Data.SqlClient.SqlBulkCopy)

I am porting my program from Microsoft SQL Server to MySQL. Everything works well, except for one problem with bulk copying.

In the solution with MS SQL, the code is as follows:

 connection.Open();
 SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
 bulkCopy.DestinationTableName = "testTable";
 bulkCopy.WriteToServer(rawData);

Now I am trying to do something similar for MySQL. Since I suspect the performance would be poor, I do not want to write the DataTable to a CSV file and then do an insert with the MySqlBulkLoader class.

Any help would be greatly appreciated.

+11
c# database mysql
2 answers

Since I suspect the performance would be poor, I do not want to write the DataTable to a CSV file and then do an insert with the MySqlBulkLoader class.

Do not rule out a possible solution based on unfounded assumptions. I just tested inserting 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It took about 30 seconds to complete:

 using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
 {
     using (MySqlCommand cmd = new MySqlCommand())
     {
         cmd.Connection = conn;
         cmd.Transaction = tran;
         cmd.CommandText = "SELECT * FROM testtable";
         using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
         {
             da.UpdateBatchSize = 1000;
             using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
             {
                 da.Update(rawData);
                 tran.Commit();
             }
         }
     }
 }

(I tried several different values for UpdateBatchSize, but they did not have a significant effect on the elapsed time.)

In contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds ...

 string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
 using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
 {
     Rfc4180Writer.WriteDataTable(rawData, writer, false);
 }
 var msbl = new MySqlBulkLoader(conn);
 msbl.TableName = "testtable";
 msbl.FileName = tempCsvFileSpec;
 msbl.FieldTerminator = ",";
 msbl.FieldQuotationCharacter = '"';
 msbl.Load();
 System.IO.File.Delete(tempCsvFileSpec);

... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this one), bulk-load from that file, and delete the file afterwards.
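The linked CSV-writing code is not reproduced here. As a rough illustration only (the Rfc4180Writer.WriteDataTable name comes from the answer's snippet, but this body is my sketch, not the linked original), a minimal RFC 4180-style DataTable-to-CSV writer could look like this: quote any field containing a comma, quote, or line break, and double embedded quotes.

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;

public static class Rfc4180Writer
{
    // Writes each DataRow as one CSV line, optionally preceded by a header row.
    public static void WriteDataTable(DataTable table, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
        {
            writer.WriteLine(string.Join(",",
                table.Columns.Cast<DataColumn>().Select(c => Quote(c.ColumnName))));
        }
        foreach (DataRow row in table.Rows)
        {
            writer.WriteLine(string.Join(",",
                row.ItemArray.Select(field => Quote(Convert.ToString(field)))));
        }
    }

    // Quotes a field only when RFC 4180 requires it; embedded quotes are doubled.
    private static string Quote(string value)
    {
        if (value == null) return "";
        if (value.IndexOfAny(new[] { ',', '"', '\r', '\n' }) >= 0)
            return "\"" + value.Replace("\"", "\"\"") + "\"";
        return value;
    }
}
```

With FieldQuotationCharacter set to '"' on the MySqlBulkLoader, as in the answer's code, this quoting style round-trips fields that contain the field terminator.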

+11

You can do this easily using the BulkOperation NuGet package.

Below is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/

 using Z.BulkOperations; 

......

 MySqlConnection conn = DbConnection.OpenConnection();
 DataTable dt = new DataTable("testtable");
 MySqlDataAdapter da = new MySqlDataAdapter("SELECT * FROM testtable", conn);
 MySqlCommandBuilder cb = new MySqlCommandBuilder(da);
 da.Fill(dt);

Instead of using

 ......
 da.UpdateBatchSize = 1000;
 ......
 da.Update(dt);

just use the following two lines:

 var bulk = new BulkOperation(conn);
 bulk.BulkInsert(dt);

It takes only 5 seconds to copy the whole DataTable into MySQL, without first dumping the 100,000 rows from the DataTable into a temporary CSV file.

0
