C# SQL export (OutOfMemory problem)

I am trying to do a bulk export from a SQL table, and the code works fine on small data. The problem is exporting 50 columns across 2.7 million rows: I get an OutOfMemory exception. I know why it happens — the code reads everything into memory and only then writes — but I am not sure how to restructure it so it reads row by row and writes as it goes, without taking forever.

    string contents = File.ReadAllText(textBox3.Text);
    string tableid = Regex.Match(contents, @"CoreDBCaseID=(?<id>\d+)").Groups["id"].Value;
    string server = Regex.Match(contents, @"Server=(?<Server>[^;]+)").Groups["Server"].Value;
    string security = Regex.Match(contents, "Security=(?<Security>[^;]+)").Groups["Security"].Value;
    string database = Regex.Match(contents, "Database=(?<Database>[^\r]+)").Groups["Database"].Value;

    string[] data = new string[] {
        string.Format("Table={0}", tableid),
        string.Format("Server={0}", server),
        string.Format("Security={0}", security),
        string.Format("Database={0}", database),
    };

    string sqltable = "TDS_C" + tableid + "_table";
    string cols = string.Join(",", listBox1.Items.Cast<string>().ToArray());
    string sql = "select " + cols + " from " + sqltable;

    SqlConnection con = new SqlConnection("Data Source=" + server + ";Initial Catalog=" + database + ";Integrated Security=" + security);
    con.Open();

    SqlDataAdapter tabadapter = new SqlDataAdapter(sql, con);
    DataSet dataset = new DataSet();
    tabadapter.FillSchema(dataset, SchemaType.Source, sqltable);
    tabadapter.Fill(dataset, sqltable);
    DataTable tbltarget = dataset.Tables[0];

    string output_text =
        tbltarget.Columns.Cast<DataColumn>()
            .Select(col => col.ColumnName)
            .Aggregate((current, next) => current + "|" + next)
        + "\r\n"
        + tbltarget.Rows.Cast<DataRow>()
            .Select(row => row.ItemArray
                .Aggregate((current, next) => current.ToString() + "|" + next.ToString().Replace("\n", "")))
            .Aggregate((current, next) => current + "\r\n" + next);

    File.WriteAllText(@"C:\Export.txt", output_text);
    con.Close();
3 answers

Don't fill a DataSet. Use a data reader instead; that way you can stream the data out as you read it, which uses far less memory.

You can also look at reading asynchronously from the database and writing asynchronously to the file, so that neither the read nor the write is a fully blocking operation.


A simple example:

    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (FileStream strm = new FileStream(filePath, FileMode.Create))
        using (StreamWriter wrt = new StreamWriter(strm))
        using (SqlCommand cmd = new SqlCommand(sql, connection))
        using (IDataReader rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // change for your manipulation of the columns
                wrt.WriteLine(rdr[0].ToString() + "|" + rdr[1].ToString());
            }
        }
    }
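If you want the asynchronous reads and writes mentioned above, here is a sketch of how the streaming loop could look. It is written against the provider-agnostic `System.Data.Common` types so any ADO.NET connection can be passed in; the names `ExportAsync` and `FormatRow`, and the pipe/newline stripping, are my own illustrative choices, not from the original code.

```csharp
using System;
using System.Data.Common;
using System.IO;
using System.Linq;
using System.Threading.Tasks;

public static class Exporter
{
    // Pure helper: join one row's values with '|', stripping embedded
    // newlines and pipes so the delimiter stays unambiguous.
    public static string FormatRow(object[] values)
    {
        return string.Join("|", values.Select(v =>
            (v == null ? "" : v.ToString())
                .Replace("\r", "").Replace("\n", " ")
                .Replace("|", " ")));
    }

    // Streams rows to disk asynchronously, holding one row in memory
    // at a time. Pass an already-opened connection (e.g. SqlConnection).
    public static async Task ExportAsync(DbConnection connection, string sql, string path)
    {
        using (DbCommand cmd = connection.CreateCommand())
        {
            cmd.CommandText = sql;
            using (DbDataReader rdr = await cmd.ExecuteReaderAsync())
            using (StreamWriter wrt = new StreamWriter(path))
            {
                object[] buffer = null;
                while (await rdr.ReadAsync())
                {
                    if (buffer == null)
                        buffer = new object[rdr.FieldCount];
                    rdr.GetValues(buffer);
                    await wrt.WriteLineAsync(FormatRow(buffer));
                }
            }
        }
    }
}
```

The single reusable `buffer` array keeps allocation per row near zero, which matters over 2.7 million rows.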

This is best done with SSIS (if you are on SQL Server); you can then invoke the SSIS package from your code. You shouldn't have to move that amount of data from the database server to the application server just to process it — producing a large export file is best done at the database itself.


How about selecting 10 thousand rows at a time, and writing them out between the selects? A sort of hybrid of the two options you've been given.
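One way to drive that batching is keyset pagination: each query fetches the next batch after the last key seen, e.g. `SELECT TOP (10000) ... WHERE Id > @lastId ORDER BY Id`. A sketch of the loop logic, decoupled from the database so it can be shown self-contained (the names `Batches`, `fetchAfter`, and `keyOf` are hypothetical, and the table is assumed to have an ordered numeric key):

```csharp
using System;
using System.Collections.Generic;

public static class BatchedExport
{
    // Calls fetchAfter(lastKey) repeatedly; each call is expected to run
    // something like:
    //   SELECT TOP (batchSize) ... WHERE Id > @lastKey ORDER BY Id
    // Stops when a batch comes back empty or shorter than batchSize.
    public static IEnumerable<List<T>> Batches<T>(
        Func<long, List<T>> fetchAfter, Func<T, long> keyOf, int batchSize)
    {
        long lastKey = 0;
        while (true)
        {
            List<T> batch = fetchAfter(lastKey);
            if (batch.Count == 0) yield break;
            yield return batch;
            if (batch.Count < batchSize) yield break;
            lastKey = keyOf(batch[batch.Count - 1]);
        }
    }
}
```

Each batch can be written to the file before the next select runs, so memory stays bounded by the batch size rather than the full 2.7 million rows. A `WHERE Id > @lastKey` predicate also stays fast on later pages, unlike `OFFSET`-style paging which rescans skipped rows.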

