Import huge JSON-encoded arrays without hitting the memory limit

I have a file that contains two JSON arrays: one holds the column names (4 values) and the other holds more than 10,000 records.

I use Symfony and Propel. When I call json_decode, PHP fails with an "Allowed memory size exhausted" error. In my php.ini I raised the memory limit to 500 MB, but the script still runs for about 10 seconds and then throws the error.

The data file contains:

{ "columns_map":["Name","Age","Address","State"], "rows_map":{"1":["value1","value2","value3","value4"], "4":["value1","value2","value3","value4"] "6":["value1","value2","value3","value4"].......upto 10,000 and more records } } 

On my Symfony page I have this code:

 $file = "path to the file"; $content = file_get_contents($file); $array = json_decode($content); 

I want to load the file's values into a PHP array and process them, and I want to be able to read the file regardless of the memory limit set in php.ini.

I would like to either store all the values at once, or split the file up and process it in parts (for example, read the first 1,000 records and then loop through to the end, but how do I read just the first 1,000 records of the rows_map array?).
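To illustrate what I mean by processing in batches of 1,000, here is a rough sketch; it still requires the full json_decode to succeed, which is exactly where it currently runs out of memory:

    $array = json_decode(file_get_contents($file), true); // decode to associative arrays

    // Process rows_map in batches of 1,000 rows, keeping the original row keys.
    foreach (array_chunk($array['rows_map'], 1000, true) as $batch) {
        foreach ($batch as $rowId => $row) {
            // Combine the column names with the row values,
            // e.g. ["Name" => "value1", "Age" => "value2", ...]
            $record = array_combine($array['columns_map'], $row);
            // ... process $record ...
        }
        unset($batch); // free the batch before moving on
    }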

+4
3 answers

I solved this by creating my own custom class with encoding and decoding functions.

-2

Make sure you are updating the correct php.ini (on Linux systems there are usually separate files for Apache, CGI and CLI). You can verify that the new memory limit is actually in effect by checking the return value of ini_get('memory_limit') in the same context. And don't forget to restart the web server if you are using Apache or some kind of CGI server.
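As a quick sanity check, a minimal snippet you could drop into the page that does the decoding (the var_dump style is only for illustration):

    // Confirm which limit the web context actually sees, and how much is used.
    var_dump(ini_get('memory_limit'));       // e.g. "500M" if the right php.ini was edited
    var_dump(memory_get_usage(true));        // bytes currently allocated to PHP
    var_dump(memory_get_peak_usage(true));   // peak allocation so far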

Even 10,000 items should not exhaust 500 MB of memory, and if they do, you will likely hit the same problem trying to parse the data yourself. Reading and parsing raw JSON strings in arbitrary pieces is impractical. Pick whatever format works best for you: insert the data into a database, or write the data out in chunks to separate files and parse each file separately.
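A rough sketch of the "separate files" idea, assuming you can run the split once from the CLI with a raised memory limit (the file names and chunk size are just placeholders):

    // One-off split, e.g. run via: php -d memory_limit=-1 split.php
    $data = json_decode(file_get_contents('data.json'), true);

    foreach (array_chunk($data['rows_map'], 1000, true) as $i => $chunk) {
        // Each part keeps the column map so it can be processed on its own.
        $part = ['columns_map' => $data['columns_map'], 'rows_map' => $chunk];
        file_put_contents(sprintf('part_%d.json', $i + 1), json_encode($part));
    }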

+1

Can you store the data in separate files? That would already make things much easier. Consider, for example, the following structure (see the sketch after the list):

  • 1.json (first 1000 rows + column map)
  • 2.json (second 1000 rows + column map)
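A minimal sketch of processing such parts one at a time, assuming the hypothetical file names above; only one part is ever decoded at once, so memory stays bounded:

    // Read 1.json, 2.json, ... until no more part files exist.
    for ($i = 1; file_exists($i . '.json'); $i++) {
        $part = json_decode(file_get_contents($i . '.json'), true);
        foreach ($part['rows_map'] as $rowId => $row) {
            // ... process one row, e.g. insert it into the database ...
        }
        unset($part); // release this chunk before loading the next file
    }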

The problem may also be caused by something else. I ran into a similar issue with Doctrine, which forced me to use plain PDO to insert the objects. Doctrine eats up all the memory and CPU, while prepared statements with PDO handle this amount of data easily.
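A minimal sketch of the plain-PDO approach, with a made-up DSN, credentials and table name (adjust them and the columns to your schema):

    // Hypothetical connection details and table name -- adjust to your setup.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'password');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $stmt = $pdo->prepare(
        'INSERT INTO people (name, age, address, state) VALUES (?, ?, ?, ?)'
    );

    $pdo->beginTransaction();          // one transaction keeps the bulk insert fast
    foreach ($array['rows_map'] as $row) {
        $stmt->execute($row);          // each row already holds exactly 4 values
    }
    $pdo->commit();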

Another option is to use CSV (very 1980s, I know), but it would let you read the file line by line.
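For completeness, a sketch of line-by-line CSV reading, assuming the data were exported to a hypothetical data.csv with the column names as the header row:

    $handle = fopen('data.csv', 'r');
    $columns = fgetcsv($handle);                    // header: Name, Age, Address, State
    while (($row = fgetcsv($handle)) !== false) {
        $record = array_combine($columns, $row);    // one record at a time, constant memory
        // ... process $record ...
    }
    fclose($handle);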

0
