Why are only about 1000 records read from a very large CSV file using PHP?

I am just exploring PHP for a requirement that arose at work, and I am trying to understand why, out of 124938 entries in my CSV file, only 1011 are read. Here is the main code I use.

<?php
print "<table>\n";

$fp = fopen('STDPRICE_FULL.csv', 'r') or die("can't open file");

// Read the file row by row and print each field as a table cell
while ($csv_line = fgetcsv($fp, 1024)) {
    print '<tr>';
    for ($i = 0, $j = count($csv_line); $i < $j; $i++) {
        print '<td>' . $csv_line[$i] . '</td>';
    }
    print "</tr>\n";
}

print "</table>\n";

fclose($fp) or die("can't close file");
?>

When I print the rows that were read ($csv_line), I see only 1011 entries.
I believe this may be due to integer size limits, but I'm not sure.
I also looked for ways to increase the size of an integer, but PHP seems to handle type conversions itself.
Can anyone suggest how I can read all the lines from the CSV file?
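For reference, here is a minimal diagnostic sketch of the same read loop that rules out two common causes: it passes 0 to fgetcsv() so rows of any length are read, enables auto_detect_line_endings in case the file uses old Mac-style \r line endings, and simply counts the rows so the total can be compared against the file. The file name is taken from the question; everything else is an assumption, not a fix confirmed to solve this case.

<?php
// Diagnostic sketch: count how many rows fgetcsv() actually returns.
// auto_detect_line_endings helps if the file uses \r-only line endings.
ini_set('auto_detect_line_endings', '1');

$fp = fopen('STDPRICE_FULL.csv', 'r') or die("can't open file");

$rows = 0;
// A length of 0 tells fgetcsv() to read lines of any length.
while (($csv_line = fgetcsv($fp, 0)) !== false) {
    $rows++;
}

fclose($fp);
print "rows read: " . $rows . "\n";
?>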

2 answers

Use the ParseCSV class. We have used it for very large files with great success.

ParseCSV Class on GitHub

It handled a lot of the problems we ran into when processing CSV files in PHP.
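A minimal usage sketch, assuming the classic parseCSV API (the include path, class name, and property names may differ depending on the version you install); the file name is taken from the question:

<?php
// Assumes parsecsv.lib.php from the ParseCSV project is available locally.
require_once 'parsecsv.lib.php';

$csv = new parseCSV();
$csv->parse('STDPRICE_FULL.csv');   // parse the whole file into $csv->data

// $csv->data is an array of rows; count it to verify all records were read.
print "rows read: " . count($csv->data) . "\n";
?>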
