SHORT DESCRIPTION: export the table in packets of a few hundred rows each, reusing the same variables, so memory pressure stays low. The main problem is that you cannot load the whole MySQL table into an array (and then into a CSV file) in one go.
LONG DESCRIPTION: try this to export a large table together with its column names (I used it and it worked well; it can still be improved, compressed and optimized, but that can come later):
- Open the CSV file (headers, fopen, etc.)
- Define an array with the column names and write it as the header row:

      fputcsv($f, $line, $delimiter);

- Get the list of identifiers you want (not whole rows, only the ids), so that you end up with them in $ids:

      SELECT id FROM table WHERE condition ORDER BY your_desired_field ASC

- Loop over the ids in packs and write each pack to the CSV (a complete runnable sketch follows after this list):

      $perpage = 200; // how many rows you export to the CSV in one pack
      for ($z = 0; $z < count($ids); $z += $perpage) {
          // Important: use the same query as for retrieving the ids, only add
          // LIMIT/OFFSET. Advice: keep the ORDER BY even if you do not really
          // need it, otherwise the packs may overlap or skip rows.
          $q = "SELECT * FROM table WHERE same_condition "
             . "ORDER BY your_desired_field ASC "
             . "LIMIT " . $perpage . " OFFSET " . $z;
          $x = $pdo->query($q)->fetchAll(PDO::FETCH_OBJ); // assuming PDO; the original said "[execute query q]"
          for ($k = 0; $k < count($x); $k++) {
              $line = array($x[$k]->id, $x[$k]->field1, $x[$k]->field2 /* ... */);
              fputcsv($f, $line, $delimiter);
          }
      } // end for $z

- Close the CSV file.
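For reference, here is a minimal self-contained sketch of the same packet-based export, assuming a PDO/MySQL connection and a hypothetical table `orders` with columns id, customer and total; the connection details, file name and WHERE/ORDER BY clauses are placeholders to adapt to your own schema, not anything from the original recipe:

    <?php
    // Minimal sketch of the packet-based CSV export described above.
    // Assumptions (placeholders, not from the original answer): a PDO/MySQL
    // connection and a table `orders` with columns id, customer, total.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $f = fopen('export.csv', 'w');
    $delimiter = ';';

    // Header row with the column names.
    fputcsv($f, array('id', 'customer', 'total'), $delimiter);

    // Only the ids first, to know how many packs are needed.
    $ids = $pdo->query('SELECT id FROM orders WHERE total > 0 ORDER BY id ASC')
               ->fetchAll(PDO::FETCH_COLUMN);

    // Fetch and write the rows in packs of $perpage: the same query as for
    // the ids, with the same ORDER BY, plus LIMIT/OFFSET.
    $perpage = 200;
    $stmt = $pdo->prepare(
        'SELECT * FROM orders WHERE total > 0 ORDER BY id ASC LIMIT :lim OFFSET :off'
    );
    for ($z = 0; $z < count($ids); $z += $perpage) {
        $stmt->bindValue(':lim', $perpage, PDO::PARAM_INT);
        $stmt->bindValue(':off', $z, PDO::PARAM_INT);
        $stmt->execute();
        $x = $stmt->fetchAll(PDO::FETCH_OBJ); // overwritten each pass, so memory stays flat
        foreach ($x as $row) {
            fputcsv($f, array($row->id, $row->customer, $row->total), $delimiter);
        }
    }
    fclose($f);

One design note: binding the LIMIT/OFFSET values as integers (PDO::PARAM_INT) matters here, because MySQL rejects quoted string values in those clauses.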
So you walk through the whole result set, fetching 200 rows at a time and writing them to the CSV, which simply grows until all rows are written. You only ever need memory for 200 rows, because you overwrite the variable on each pass. I am sure this can be done better, but I spent several hours on it and did not find another solution; besides, my architecture and application requirements influenced the choice somewhat, so I went with this approach.
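If you want to check that memory really stays flat, one quick way (my addition, not part of the original recipe) is to print memory_get_usage() once per pack inside the loop; the figure should stay roughly constant instead of growing with the table:

    // Illustrative check, placed at the end of each pass of the pack loop:
    printf("pack at offset %d: %.1f MB in use\n",
           $z, memory_get_usage(true) / 1048576);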