I have a file that contains a JSON object with two arrays: one holds the column names (4 values) and the other holds more than 10,000 record values.
I use Symfony and Propel. When I call json_decode on the file, PHP fails with an "Allowed memory size exhausted" fatal error. In my php.ini I raised the memory limit to 500 MB, but the script still hits the error within about 10 seconds.
The data file contains:

{
  "columns_map": ["Name", "Age", "Address", "State"],
  "rows_map": {
    "1": ["value1", "value2", "value3", "value4"],
    "4": ["value1", "value2", "value3", "value4"],
    "6": ["value1", "value2", "value3", "value4"],
    ... up to 10,000 and more records
  }
}
On my Symfony page I have this code:

$file = "path to the file";
$content = file_get_contents($file);
$array = json_decode($content);
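Two things worth checking in the snippet above, sketched below with an inline sample payload standing in for the real file (the sample string is illustrative, not the actual data): decoding into an associative array rather than stdClass objects, which is lighter on memory for large inputs, and verifying that the decode actually succeeded.

```php
<?php
// Sample payload with the same shape as the question's file.
// In the real script this string would come from file_get_contents($file).
$content = '{"columns_map":["Name","Age","Address","State"],'
         . '"rows_map":{"1":["v1","v2","v3","v4"],"4":["v1","v2","v3","v4"]}}';

// Optionally raise the limit for this one script instead of editing php.ini.
ini_set('memory_limit', '512M');

// Passing true decodes JSON objects as associative arrays, which uses
// noticeably less memory than stdClass objects for a large rows_map.
$array = json_decode($content, true);

if ($array === null) {
    die('JSON decode failed: ' . json_last_error_msg());
}

$columns = $array['columns_map'];  // e.g. ["Name","Age","Address","State"]
$rows    = $array['rows_map'];     // keyed by record id
```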
I want to load the file's values into a PHP array and process them, and I want the read to work regardless of the memory limit set in php.ini.
I would like either to load all the values at once, or to split the work and process the file in chunks (for example, read the first 1,000 records and loop until the end). But how do I read the first 1,000 records of the rows_map array?
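One caveat: json_decode always parses the whole document, so reading truly independently of the memory limit would require a streaming JSON parser. But if the decoded array does fit in memory, the rows can then be processed 1,000 at a time with array_chunk, whose third argument preserves the original record ids. A sketch under that assumption, with generated placeholder rows standing in for the real data:

```php
<?php
// Placeholder data: assume $rows holds the decoded rows_map
// (record id => array of 4 values), as in the question's file.
$rows = [];
for ($i = 1; $i <= 10000; $i++) {
    $rows[(string)$i] = ["value1", "value2", "value3", "value4"];
}

// Split into batches of 1,000; true keeps the record ids as keys.
$batches = array_chunk($rows, 1000, true);

foreach ($batches as $batch) {
    foreach ($batch as $id => $values) {
        // Process one record here, e.g. hydrate and save a Propel object.
    }
    // Any per-batch temporaries created here can be unset between
    // iterations to keep peak memory flat.
}
```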
— Arasu