What is the best/fastest way to export a large dataset from C# to Excel?

I have code that uses the OpenXML library to export data.

I have 20,000 rows and 22 columns, and it takes ages (about 10 minutes).

Is there any solution that exports data from C# to Excel faster? I am doing this from an ASP.NET MVC application, and many users' browsers are left waiting.

+4
7 answers

I ended up using an open source library called ClosedXML, which worked great.
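A minimal sketch of the ClosedXML approach, assuming the data is already in a DataTable (the sheet name and output path here are placeholders):

```csharp
using System.Data;
using ClosedXML.Excel;

class ClosedXmlExport
{
    public static void Export(DataTable data, string path)
    {
        using (var workbook = new XLWorkbook())
        {
            var worksheet = workbook.Worksheets.Add("Export");
            // InsertTable writes the whole DataTable in one call, which is
            // far faster than setting 20,000 x 22 cells individually.
            worksheet.Cell(1, 1).InsertTable(data);
            workbook.SaveAs(path);
        }
    }
}
```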

+1

Assuming 20,000 rows and 22 columns with 100 bytes per cell, that is only about 42 megabytes of raw data. Add XML tags and formatting, and I would say you end up zipping (an .xlsx file is nothing but a few zipped XML files) around 100 MB of data.

Of course that takes time, as does fetching the data. I recommend that you use the EPPlus package instead of the Office OpenXML SDK: http://epplus.codeplex.com/

There is probably a bug or performance issue in the Microsoft code.
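A hedged sketch of the EPPlus route; LoadFromDataTable fills the sheet in one pass, which is typically much faster than cell-by-cell OpenXML writes (the sheet name and path are placeholders):

```csharp
using System.Data;
using System.IO;
using OfficeOpenXml;

class EpplusExport
{
    public static void Export(DataTable data, string path)
    {
        using (var package = new ExcelPackage())
        {
            var sheet = package.Workbook.Worksheets.Add("Export");
            // Load the entire table at once; true = write column headers.
            sheet.Cells["A1"].LoadFromDataTable(data, true);
            package.SaveAs(new FileInfo(path));
        }
    }
}
```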

+2

CSV: it is a plain text file, but it can be opened by any version of Excel.

Without a doubt, it is the easiest way to export data to Excel. Many websites offer data export in CSV format.

All you need to do is add a comma (,) to separate the values and a line break to separate the records. Creating a CSV file requires no extra resources, so it is pretty fast.
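An illustrative CSV export of a DataTable; fields containing commas, quotes, or line breaks are quoted so Excel still parses them correctly:

```csharp
using System.Data;
using System.IO;
using System.Linq;

class CsvExport
{
    public static void Export(DataTable data, string path)
    {
        using (var writer = new StreamWriter(path))
        {
            // Header row, then one line per record.
            writer.WriteLine(string.Join(",",
                data.Columns.Cast<DataColumn>().Select(c => Escape(c.ColumnName))));
            foreach (DataRow row in data.Rows)
                writer.WriteLine(string.Join(",", row.ItemArray.Select(Escape)));
        }
    }

    // Quote a field only when it contains a delimiter, quote, or newline.
    static string Escape(object value)
    {
        var s = value?.ToString() ?? "";
        return s.IndexOfAny(new[] { ',', '"', '\n', '\r' }) >= 0
            ? "\"" + s.Replace("\"", "\"\"") + "\""
            : s;
    }
}
```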

+1

Depending on which version of Excel you are targeting, you can expose the data as an OData service, which Excel 2010 can natively consume, and it will handle the loading and formatting for you.
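A heavily hedged sketch of that idea, assuming the ASP.NET Web API OData package is available; ExportRow and its fields are hypothetical placeholders. Excel then pulls the feed via Data > From Other Sources > From OData Data Feed:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Web.OData; // assumption: Web API 2 OData package is installed

// Hypothetical row type exposed to Excel.
public class ExportRow
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ExportRowsController : ODataController
{
    static readonly List<ExportRow> Rows = new List<ExportRow>();

    [EnableQuery] // lets the client apply $filter, $top, etc. server-side
    public IQueryable<ExportRow> Get()
    {
        return Rows.AsQueryable();
    }
}
```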

0

I assume this data has already been pre-filtered in some way, but still needs to be sent in full to the person who made the request.

In this case, you want to perform this operation asynchronously. I'm not sure whether this fits your workflow, but say a person requests this large document in XML format; I would: a) spin up another worker thread to start generating the document, returning a "token" (perhaps a GUID) to the requester; b) return a link to a page where, by passing the token, the requester can check for the results.

When the thread finishes generating the document, it places it in a special folder under a unique name and adds the token to a database table recording the document's location. When the person visits the results page, if the token exists in the database and the document exists in the file system, they get a link to download it over HTTP. If not, they are either told it does not exist or asked to wait for the results. (That message could be based on the time the request was received.)

Once the person has downloaded the document successfully (and you can detect that through a script), you can delete the database entry for that token and remove the file from the file system.
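A rough sketch of that token workflow; the ConcurrentDictionary and temp folder here stand in for the database table and "special folder" described above, and all names are hypothetical:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

public class ExportBroker
{
    // "Database table": token -> path of the finished document.
    private readonly ConcurrentDictionary<Guid, string> _finished =
        new ConcurrentDictionary<Guid, string>();

    // a) start generation in the background, return the token immediately.
    public Guid BeginExport(Action<string> writeDocument)
    {
        var token = Guid.NewGuid();
        Task.Run(() =>
        {
            var path = Path.Combine(Path.GetTempPath(), token + ".xlsx");
            writeDocument(path);      // the long-running document build
            _finished[token] = path;  // record completion
        });
        return token;
    }

    // b) the results page polls with the token; null means "still working".
    public string TryGetDocumentPath(Guid token)
    {
        return _finished.TryGetValue(token, out var path) && File.Exists(path)
            ? path
            : null;
    }

    // After a successful download, clean up the entry and the file.
    public void Complete(Guid token)
    {
        if (_finished.TryRemove(token, out var path) && File.Exists(path))
            File.Delete(path);
    }
}
```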

Hope I read this question correctly.

0

I found that I could speed up exporting data from a database to an Excel spreadsheet by limiting the number of write operations. By accumulating 100 rows of data before writing them out, creation speed increased by at least 5-10x.
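An illustrative sketch of that batching idea, assuming each row has already been serialized to a string: buffer 100 rows and write them in one call instead of one write per row.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Text;

class BatchedWriter
{
    public static void Write(IEnumerable<string> rows, string path, int batchSize = 100)
    {
        var buffer = new StringBuilder();
        var count = 0;
        using (var writer = new StreamWriter(path))
        {
            foreach (var row in rows)
            {
                buffer.AppendLine(row);
                if (++count % batchSize == 0)
                {
                    writer.Write(buffer.ToString()); // one I/O call per 100 rows
                    buffer.Clear();
                }
            }
            writer.Write(buffer.ToString()); // flush the remainder
        }
    }
}
```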

0

The mistake most often made when exporting data is following this workflow:

  • Build the model
  • Build the XML DOM
  • Save the XML DOM to a file

This workflow carries a lot of overhead: it takes time to build the XML DOM, the whole DOM sits in memory alongside the model, and only then is the entire data set written to the file.

A better approach is to convert your model record by record into the target format, writing directly to a (buffered) file.

A low-overhead format that Excel reads and writes quickly is CSV (yes, it is old-fashioned and awkward ...).
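A sketch of that record-by-record approach: no DOM and no in-memory document, just each record converted and written straight to a buffered file. The Person model is a hypothetical placeholder.

```csharp
using System.Collections.Generic;
using System.IO;

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

class StreamingCsvExport
{
    public static void Export(IEnumerable<Person> people, string path)
    {
        // StreamWriter buffers internally, so each WriteLine is cheap.
        using (var writer = new StreamWriter(path))
        {
            writer.WriteLine("Name,Age");
            foreach (var p in people)
                // Record in, line out; real code would escape commas/quotes.
                writer.WriteLine($"{p.Name},{p.Age}");
        }
    }
}
```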

0
