I do not want to write the DataTable to a CSV file and then do an insert with the MySqlBulkLoader class, because I think the performance would be poor.
Don't rule out a possible solution based on unfounded assumptions. I just tested inserting 100,000 rows from a System.Data.DataTable into a MySQL table using the standard MySqlDataAdapter#Update() inside a Transaction. It took about 30 seconds to complete:
using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
                da.Update(rawData);
                tran.Commit();
            }
        }
    }
}
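For context, here is a minimal sketch of the kind of setup the snippet above assumes. The connection string, column names, and generated rows are illustrative placeholders, not part of the original test:

using System;
using System.Data;
using MySql.Data.MySqlClient;

// Placeholder connection string - adjust server, database, and credentials as needed.
string connStr = "Server=localhost;Database=testdb;Uid=user;Pwd=password;";
using (MySqlConnection conn = new MySqlConnection(connStr))
{
    conn.Open();

    // rawData holds the rows to insert; its columns must match the schema of "testtable".
    DataTable rawData = new DataTable("testtable");
    rawData.Columns.Add("id", typeof(int));
    rawData.Columns.Add("name", typeof(string));
    for (int i = 0; i < 100000; i++)
    {
        // Newly added rows have RowState = Added, so da.Update() generates INSERTs for them.
        rawData.Rows.Add(i, "row " + i);
    }

    // ... transaction / MySqlDataAdapter.Update() code shown above goes here ...
}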
(I tried several different values for UpdateBatchSize, but they did not have a significant effect on the elapsed time.)
In contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds ...
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
    Rfc4180Writer.WriteDataTable(rawData, writer, false);
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);
... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this one), bulk-loading from that file, and then deleting the file.
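The CSV-writer code referenced above is not reproduced here. As a rough, hypothetical sketch (not the exact code from the linked answer), an RFC 4180-style DataTable-to-CSV writer with the WriteDataTable(table, writer, includeHeaders) shape used above could look something like this:

using System;
using System.Collections.Generic;
using System.Data;
using System.IO;

public static class Rfc4180Writer
{
    // Write every row of the DataTable as a CSV line, optionally preceded by a header row.
    public static void WriteDataTable(DataTable source, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
        {
            var headerFields = new List<string>();
            foreach (DataColumn column in source.Columns)
            {
                headerFields.Add(QuoteValue(column.ColumnName));
            }
            writer.WriteLine(string.Join(",", headerFields));
        }

        foreach (DataRow row in source.Rows)
        {
            var fields = new List<string>();
            foreach (DataColumn column in source.Columns)
            {
                fields.Add(QuoteValue(Convert.ToString(row[column])));
            }
            writer.WriteLine(string.Join(",", fields));
        }
        writer.Flush();
    }

    // Wrap each field in double quotes and escape embedded quotes by doubling them,
    // as RFC 4180 requires.
    private static string QuoteValue(string value)
    {
        return "\"" + value.Replace("\"", "\"\"") + "\"";
    }
}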
Gord Thompson