Can a C# program read a text file into memory and then pass that data to a method that requires a file name?

In a C# program, I import a large text file (300 MB) into a MySQL database using the MySqlBulkLoader class from MySQL Connector/NET.

The import takes a long time and drives disk usage to nearly 100% on the Windows 2003 server it runs on. In an attempt to speed up the import, the program splits the large file into smaller chunks.

Is it possible to read a small fragment of the file (8 MB) into memory (i.e. an array) and then pass it to MySqlBulkLoader as if it were a file?

The bulk loader expects a path to the file in its FileName property:

    MySql.Data.MySqlClient.MySqlBulkLoader myBulk = new MySql.Data.MySqlClient.MySqlBulkLoader(connection);
    myBulk.Timeout = 10 * 60;
    myBulk.TableName = "some_table";
    myBulk.Local = true;
    myBulk.LineTerminator = @"\n";
    myBulk.FileName = aFile.FullName;
    myBulk.FieldTerminator = "";
c# import mysql text-files
2 answers

Memory is not a file, so the short answer is no. Alternatives:

  • Write the chunk out as a temporary file (System.IO.Path.GetTempFileName() is your friend here for generating a scratch file name) and pass that file name to MySqlBulkLoader; see the sketch after this list.
  • Use a RAM disk tool to create an in-memory drive, place a copy of the complete 300 MB file on it, and then pass that path to MySqlBulkLoader.

The following class accepts that this is not possible and writes a DataTable to disk before loading it.

It may not suit all circumstances, but it met my needs at the time.

    using MySql.Data.MySqlClient;
    using System.Data;
    using System.IO;
    using System.Text;

    namespace ImportDatabase
    {
        class DataTableToMySql
        {
            public MySqlConnection Connection { get; set; }
            public DataTable SourceDataTable { get; set; }
            public string FieldTerminator { get; set; }
            public string LineTerminator { get; set; }

            public DataTableToMySql(MySqlConnection conn, DataTable table)
            {
                FieldTerminator = "\t";
                LineTerminator = "\n";
                Connection = conn;
                SourceDataTable = table;
            }

            // Writes the DataTable to a temp file, bulk-loads it, then deletes the file.
            public void Execute()
            {
                string fileName = Path.GetTempFileName();
                try
                {
                    byte[] fieldTerm = Encoding.UTF8.GetBytes(FieldTerminator);
                    byte[] lineTerm = Encoding.UTF8.GetBytes(LineTerminator);
                    PrepareFile(fileName, fieldTerm, lineTerm);
                    LoadData(fileName);
                }
                finally
                {
                    File.Delete(fileName);
                }
            }

            // Points MySqlBulkLoader at the temp file and runs the load.
            private void LoadData(string fileName)
            {
                MySqlBulkLoader bl = new MySqlBulkLoader(Connection);
                bl.FieldTerminator = FieldTerminator;
                bl.LineTerminator = LineTerminator;
                bl.TableName = SourceDataTable.TableName;
                bl.FileName = fileName;
                bl.Load();
            }

            // Serializes each row as UTF-8 text, separating fields and lines
            // with the configured terminators.
            private void PrepareFile(string fileName, byte[] fieldTerm, byte[] lineTerm)
            {
                using (FileStream fs = new FileStream(fileName, FileMode.Append))
                {
                    foreach (DataRow row in SourceDataTable.Rows)
                    {
                        int i = 0;
                        foreach (object val in row.ItemArray)
                        {
                            byte[] bytes;
                            if (val is DateTime)
                            {
                                // Format dates the way MySQL expects them.
                                DateTime theDate = (DateTime)val;
                                string dateStr = theDate.ToString("yyyy-MM-dd HH:mm:ss");
                                bytes = Encoding.UTF8.GetBytes(dateStr);
                            }
                            else
                            {
                                bytes = Encoding.UTF8.GetBytes(val.ToString());
                            }
                            fs.Write(bytes, 0, bytes.Length);
                            i++;
                            if (i < row.ItemArray.Length)
                                fs.Write(fieldTerm, 0, fieldTerm.Length);
                        }
                        fs.Write(lineTerm, 0, lineTerm.Length);
                    }
                }
            }
        }
    }
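A usage sketch, assuming a populated DataTable whose TableName matches the target MySQL table; the column names and connection string below are placeholders, not part of the original answer:

    // Assumes: using System.Data; using MySql.Data.MySqlClient;
    // Build a small DataTable whose TableName matches the target MySQL table.
    DataTable chunk = new DataTable("some_table");
    chunk.Columns.Add("id", typeof(int));
    chunk.Columns.Add("name", typeof(string));
    chunk.Rows.Add(1, "example");

    using (MySqlConnection conn = new MySqlConnection("server=localhost;database=test;uid=user;pwd=pass"))
    {
        conn.Open();
        DataTableToMySql loader = new DataTableToMySql(conn, chunk);
        loader.Execute();   // writes rows to a temp file, bulk-loads them, then deletes the file
    }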
