What is the best approach for exporting big CSV data using PHP / MySQL?

I am working on a project where I need to pull almost 10 thousand rows from a database and then export them to CSV. I tried the usual method of building the CSV, but I always hit a memory limit error, even after setting memory_limit to 256 MB.

If any of you have run into the same problem, please share your ideas about the best solutions or approaches. I would really appreciate your thoughts.

Here is my actual code:

$filename = date('Ymd_His') . '-export.csv';

// Output the headers for the CSV file
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");
header('Content-Description: File Transfer');
header("Content-type: text/csv");
header("Content-Disposition: attachment; filename={$filename}");
header("Expires: 0");
header("Pragma: public");

// Open the output stream
$fh = @fopen('php://output', 'w');

$headerDisplayed = false;
foreach ($formatted_arr_data_from_query as $data) {
    // Add a header row if it hasn't been added yet -- using the field keys from the first row
    if (!$headerDisplayed) {
        fputcsv($fh, array_keys($data));
        $headerDisplayed = true;
    }
    // Put the data from the multi-dimensional array into the stream
    fputcsv($fh, $data);
}

// Close the file stream
fclose($fh);
4 answers

If you really have to handle this in PHP, you need to use MySQL's LIMIT clause to grab a subset of your data. Take only a certain number of rows at a time, write them to the file, and then grab the next set.

You may need to run unset() on a few variables inside your query loop. The key is to avoid holding too many huge arrays in memory at once.

If you are grabbing from joined tables, sort by insert date in ascending order so that the second grab picks up any new items.
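
A minimal sketch of that batching idea, assuming a mysqli connection in $mysqli and a hypothetical orders table (the batch size, table name, and column names are placeholders, not part of the original answer):

$fh = fopen('php://output', 'w');
$batchSize = 1000;
$offset = 0;
do {
    // Grab one LIMIT/OFFSET slice of the data at a time.
    $result = $mysqli->query(sprintf(
        'SELECT id, customer, total, created_at FROM orders ORDER BY created_at ASC LIMIT %d OFFSET %d',
        $batchSize,
        $offset
    ));
    $rowCount = $result->num_rows;
    while ($row = $result->fetch_assoc()) {
        fputcsv($fh, $row);
    }
    // Free the result set and loop variables before the next batch so memory stays flat.
    $result->free();
    unset($result, $row);
    $offset += $batchSize;
} while ($rowCount === $batchSize);
fclose($fh);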


As explained in this comment: fooobar.com/questions/151196/... using mysqldump is probably the best option. If necessary, you can run it from PHP with the exec() command, as described here: php exec() - mysqldump creates an empty file
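
For reference, a hedged sketch of shelling out to mysqldump from PHP; the credentials, database, table, and export directory below are placeholders. Note that --tab makes the MySQL server itself write the file, so the directory must be writable by the server and allowed by secure_file_priv:

// Build the mysqldump command; --tab with custom field separators produces CSV-style output.
$cmd = sprintf(
    "mysqldump --tab=%s --fields-terminated-by=',' --fields-enclosed-by='\"' --no-create-info -u%s -p%s %s %s 2>&1",
    escapeshellarg('/var/lib/mysql-files'),   // hypothetical export directory
    escapeshellarg('dbuser'),
    escapeshellarg('dbpass'),
    escapeshellarg('mydb'),
    escapeshellarg('orders')
);
exec($cmd, $output, $exitCode);
if ($exitCode !== 0) {
    // Surfacing the command output is usually what explains the "empty file" problem linked above.
    error_log(implode("\n", $output));
}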

Instead of creating some kind of large array or building the whole CSV in memory:

  • read each row of data separately from the query result set,
  • write it directly to php://output,
  • then read the next row, and so on.
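
A minimal sketch of that streaming approach, assuming a mysqli connection in $mysqli and a hypothetical orders table; MYSQLI_USE_RESULT asks the client not to buffer the whole result set, so only one row sits in PHP memory at a time:

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');

$fh = fopen('php://output', 'w');
fputcsv($fh, array('id', 'customer', 'total'));   // header row

// Unbuffered query: rows are fetched from the server one at a time.
$result = $mysqli->query('SELECT id, customer, total FROM orders ORDER BY id ASC', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    fputcsv($fh, $row);   // write this row, then move on to the next one
}
$result->close();
fclose($fh);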


SHORT DESCRIPTION: export the data in batches of a few hundred rows, reusing the same variables for each batch, so memory pressure stays low. The main problem is that you cannot load the whole MySQL table into an array (and then into a CSV file) at once.

LONG DESCRIPTION: try this to export a large table with column names (I used it and it worked well; it can be improved, compressed, and optimized further, but later):

  • Open the CSV file (headers, fopen, etc.)
  • Define an array with the column names and write it: fputcsv($f, $line, $delimiter);
  • Get the list of identifiers you want (not whole rows, only the ids): SELECT id FROM table WHERE condition ORDER BY your_desired_field ASC → now you have $ids
  • $perpage = 200; // how many rows you export to the CSV in one pack
  • Loop over the ids in packs of $perpage (a runnable sketch follows below):
    for ($z = 0; $z < count($ids); $z += $perpage) {
        // Important: use the same query as for retrieving the ids, only add LIMIT/OFFSET.
        // Advice: keep the ORDER BY, even if you do not strictly need it.
        $q = "SELECT * FROM table WHERE same_condition ORDER BY your_desired_field ASC LIMIT ".$perpage." OFFSET ".$z;
        $x = [execute query $q];
        for ($k = 0; $k < count($x); $k++) {
            $line = array($x[$k]->id, $x[$k]->field1, $x[$k]->field2, ...);
            fputcsv($f, $line, $delimiter);
        }
    } // end for $z
  • close CSV

So you walk through the whole result set, fetch 200 rows at a time, and write them to the CSV, which streams out while the remaining rows are still being written. Only the memory for 200 rows is needed at any moment, because you overwrite the variables on every pass. I am sure this can be done better, but it took me several hours and I did not find a cleaner solution; it also fit my architecture and application requirements, so I went with it.
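
A tidied-up, runnable version of those steps, assuming a mysqli connection in $mysqli; the table, condition, and field names below are placeholders:

$fh = fopen('php://output', 'w');
fputcsv($fh, array('id', 'field1', 'field2'), ',');   // column-name header row

// Step 1: fetch only the ids that match the condition.
$ids = array();
$res = $mysqli->query('SELECT id FROM my_table WHERE status = 1 ORDER BY created_at ASC');
while ($row = $res->fetch_assoc()) {
    $ids[] = (int) $row['id'];
}
$res->free();

// Step 2: export in packs of $perpage rows, reusing the same variables on every pass.
$perpage = 200;
for ($z = 0; $z < count($ids); $z += $perpage) {
    // Same query as for the ids, with LIMIT/OFFSET added and the same ORDER BY.
    $q = 'SELECT * FROM my_table WHERE status = 1 ORDER BY created_at ASC '
       . 'LIMIT ' . $perpage . ' OFFSET ' . $z;
    $res = $mysqli->query($q);
    while ($x = $res->fetch_object()) {
        fputcsv($fh, array($x->id, $x->field1, $x->field2), ',');
    }
    $res->free();
}

fclose($fh);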

