Export large amounts of data

Here is my problem.

We have two types of reports on our website: data displayed in a grid, and data downloaded directly as a report.

These reports may contain several years of data (1+ million rows). We let customers run reports over a date range, and we started limiting how much data they can view at once to prevent performance problems on the website. But even over a small date range the result set can get quite large, and when too much of it loads, memory climbs to several gigabytes and we run out of memory.

Question: I'd rather not limit their data, so I'm trying to find a good solution that lets them download as much as they want.

I can limit what they see on screen by paging the data returned to the page, so that's not a performance issue; the download, however, is still a problem.

I looked at async but couldn't get it to work; it still accumulates memory as the data loads.

Ideas? Thoughts? Suggestions?

Code example:

 // Get the data: this pulls the entire result set into a DataSet in memory
 SqlConnection con = new SqlConnection();
 SqlCommand cmd = new SqlCommand();
 SqlDataAdapter da;
 DataSet ds = new DataSet();

 con.ConnectionString = "MyConnectionString";
 con.Open();

 cmd.Connection = con;
 cmd.CommandType = CommandType.StoredProcedure;
 cmd.CommandText = "MyStoredProc";
 da = new SqlDataAdapter(cmd);
 da.Fill(ds);

 con.Close();

 // Render the whole table to a string, then write it to the response
 StringWriter sw = new StringWriter();
 HtmlTextWriter htw = new HtmlTextWriter(sw);
 DataGrid dg = new DataGrid();
 dg.DataSource = ds.Tables[0];
 dg.DataBind();
 dg.RenderControl(htw);

 Response.ClearContent();
 Response.ContentType = "application/vnd.ms-excel";
 Response.AddHeader("Content-Disposition", "attachment; filename=Report.xls");
 Response.Write(sw.ToString());
 Response.End();

When I run this with my data, roughly 800 thousand rows, memory balloons and I get an out-of-memory error. To make matters worse, it always hangs in RenderControl until it finishes.

4 answers

I assume the data comes from a database. If so, you should not make the user wait for this operation to complete. That is poor UI design, especially when memory usage can climb to 4 GB.

I agree with the other suggestions that you should look into improving your code and design, which will probably help reduce the footprint. But regardless, you should have some kind of scheduled-job architecture.

Let the user request the download from the search/report page, and add the request to a queue in a database table. A database/.NET process runs, picks up these jobs, and generates the file in the appropriate format on the server. You may even be able to reuse a file for many users if the data is the same and you use sensible naming conventions. The user then goes to a download-queue page to see all the downloads they have scheduled, and once a job completes they can download the file.

If you have a requirement that rules this out, post a comment explaining it.
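A minimal sketch of that approach, assuming a hypothetical ReportJobs table (Id, UserId, Criteria, Status, FilePath) and hypothetical GetNextPendingJob / GenerateFile / MarkJobComplete helpers:

 // Web page: queue the request instead of generating the file inline.
 using (var con = new SqlConnection("MyConnectionString"))
 using (var cmd = new SqlCommand(
     "INSERT INTO ReportJobs (UserId, Criteria, Status) VALUES (@u, @c, 'Pending')", con))
 {
     cmd.Parameters.AddWithValue("@u", userId);
     cmd.Parameters.AddWithValue("@c", criteria);
     con.Open();
     cmd.ExecuteNonQuery();
 }

 // Background process (Windows service / scheduled task): work the queue.
 while (true)
 {
     ReportJob job = GetNextPendingJob();   // SELECT TOP 1 ... WHERE Status = 'Pending'
     if (job == null) { Thread.Sleep(5000); continue; }

     string path = GenerateFile(job);       // streams rows to a file, never all in memory
     MarkJobComplete(job.Id, path);         // user picks it up from the download-queue page
 }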


Ok, here we go:

  • DO NOT use a DataSet
  • DO NOT use a DataGrid

Done.

Get a data reader and write the HTML as you go, so you never hold all the data in memory. Your current approach will never scale.
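For example, a sketch of the reader-based version of your code (same stored procedure and connection string, with HtmlEncode added for safety):

 Response.ClearContent();
 Response.Buffer = false;   // stream output instead of accumulating it in memory
 Response.ContentType = "application/vnd.ms-excel";
 Response.AddHeader("Content-Disposition", "attachment; filename=Report.xls");

 using (var con = new SqlConnection("MyConnectionString"))
 using (var cmd = new SqlCommand("MyStoredProc", con))
 {
     cmd.CommandType = CommandType.StoredProcedure;
     con.Open();

     using (SqlDataReader reader = cmd.ExecuteReader())
     {
         Response.Write("<table>");
         while (reader.Read())   // one row in memory at a time
         {
             Response.Write("<tr>");
             for (int i = 0; i < reader.FieldCount; i++)
                 Response.Write("<td>" + HttpUtility.HtmlEncode(reader[i].ToString()) + "</td>");
             Response.Write("</tr>");
         }
         Response.Write("</table>");
     }
 }
 Response.End();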


Can you rewrite the stored procedure to page the data and loop through the result set page by page? Then rework the output to stream the file rather than writing everything at once (your current method basically just writes out one huge HTML table).

Paging the data will keep the download process from holding all of it in memory.
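Something like this, assuming a hypothetical paged variant of the procedure that takes @PageNumber and @PageSize (e.g. implemented with OFFSET ... FETCH), and a hypothetical WriteRow helper that streams each row to the response:

 const int pageSize = 10000;
 int page = 0;
 bool more = true;

 while (more)
 {
     using (var con = new SqlConnection("MyConnectionString"))
     using (var cmd = new SqlCommand("MyStoredProc_Paged", con))
     {
         cmd.CommandType = CommandType.StoredProcedure;
         cmd.Parameters.AddWithValue("@PageNumber", page);
         cmd.Parameters.AddWithValue("@PageSize", pageSize);
         con.Open();

         using (SqlDataReader reader = cmd.ExecuteReader())
         {
             more = false;
             while (reader.Read())
             {
                 more = true;       // got rows, so try the next page too
                 WriteRow(reader);  // hypothetical: writes one row to the response
             }
         }
     }
     page++;
 }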


Solved!!!

I had the same problem when I was working on exporting large amounts of data to Excel.

You can use the Open XML library to solve your problem. With this DLL you can export a large amount of data to Excel, and memory consumption will be lower too.

You can find more information here: https://msdn.microsoft.com/en-us/library/office/hh180830(v=office.14).aspx
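A minimal sketch of the low-memory path, using the SDK's OpenXmlWriter so rows are written to the package as they are produced; `rows` here stands in for whatever row source you have (for example, values read off a SqlDataReader):

 using DocumentFormat.OpenXml;
 using DocumentFormat.OpenXml.Packaging;
 using DocumentFormat.OpenXml.Spreadsheet;

 using (var doc = SpreadsheetDocument.Create("Report.xlsx", SpreadsheetDocumentType.Workbook))
 {
     WorkbookPart wbPart = doc.AddWorkbookPart();
     WorksheetPart wsPart = wbPart.AddNewPart<WorksheetPart>();

     // OpenXmlWriter streams XML to the part instead of building a DOM in memory.
     using (OpenXmlWriter writer = OpenXmlWriter.Create(wsPart))
     {
         writer.WriteStartElement(new Worksheet());
         writer.WriteStartElement(new SheetData());

         foreach (string value in rows)   // assumption: your row source
         {
             writer.WriteStartElement(new Row());
             writer.WriteElement(new Cell(new InlineString(new Text(value)))
             {
                 DataType = CellValues.InlineString
             });
             writer.WriteEndElement();    // Row
         }

         writer.WriteEndElement();        // SheetData
         writer.WriteEndElement();        // Worksheet
     }

     // The workbook still needs a Sheets entry pointing at the worksheet part.
     wbPart.Workbook = new Workbook(
         new Sheets(new Sheet
         {
             Id = wbPart.GetIdOfPart(wsPart),
             SheetId = 1,
             Name = "Report"
         }));
     wbPart.Workbook.Save();
 }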

