Convert JSON string to array WITHOUT json_decode

I use PHP on a shared server to access an external site through an API that returns JSON with two levels of data (level 1: performer; level 2: a category object nested inside each performer). I want to convert this to a multidimensional associative array WITHOUT USING json_decode on the full response (it uses too much memory for this!).

Sample JSON data:

[ { "performerId": 99999, "name": " Any performer name", "category": { "categoryId": 99, "name": "Some category name", "eventType": "Category Event" }, "eventType": "Performer Event", "url": "http://www.novalidsite.com/something/performerspage.html", "priority": 0 }, { "performerId": 88888, "name": " Second performer name", "category": { "categoryId": 88, "name": "Second Category name", "eventType": "Category Event 2" }, "eventType": "Performer Event 2", "url": "http://www.novalidsite.com/somethingelse/performerspage2.html", "priority": 7 } ] 

I tried using substr to strip off the outer "[" and "]".

Then I made this call:

    preg_match_all('/\{([^}]+)\}/', $input, $matches);

This gives me one row per performer, but each row is truncated at the first "}", i.e. right after the nested category data.

How can I return the FULL ROW of data AS AN ARRAY using something like preg_split, preg_match_all, etc., INSTEAD of a heavy call like json_decode on the whole JSON string?

Once I have an array with each row correctly isolated, I can then run json_decode on each row individually without exhausting the memory on the shared server.
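To illustrate the flow I am after, here is a rough sketch (the recursive pattern is just my guess at one way to keep the nested braces together; it still loads the whole response string and would break if a string value ever contained { or }):

    $aryPerformers = file_get_contents('https://subdomain.domain.com/dir/getresults?id=1234');

    // Match each top-level {...} object, letting the pattern recurse into the
    // nested category object instead of stopping at its closing brace.
    preg_match_all('/\{(?:[^{}]++|(?R))*+\}/', $aryPerformers, $matches);

    foreach ($matches[0] as $row) {
        $performer = json_decode($row); // decode one small object at a time
        print_r($performer);
    }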


For those who want more detail, here is the json_decode call that causes the error:

    $aryPerformersfile[] = file_get_contents('https://subdomain.domain.com/dir/getresults?id=1234');
    $aryPerformers = $aryPerformersfile[0];
    unset($aryPerformersfile);

    $mytmpvar = json_decode($aryPerformers);
    print_r($mytmpvar);
    exit;
2 answers

If you have a limited amount of memory, you can read the data as a stream and parse the JSON one piece at a time, instead of parsing it all at once.

getresults.json:

 [ { "performerId": 99999, "name": " Any performer name", "category": { "categoryId": 99, "name": "Some category name", "eventType": "Category Event" }, "eventType": "Performer Event", "url": "http://www.novalidsite.com/something/performerspage.html", "priority": 0 }, { "performerId": 88888, "name": " Second performer name", "category": { "categoryId": 88, "name": "Second Category name", "eventType": "Category Event 2" }, "eventType": "Performer Event 2", "url": "http://www.novalidsite.com/somethingelse/performerspage2.html", "priority": 7 } ] 

PHP:

    $stream = fopen('getresults.json', 'rb');

    // Read one character at a time from $stream until
    // $count number of $char characters is read
    function readUpTo($stream, $char, $count)
    {
        $str = '';
        $foundCount = 0;
        while (!feof($stream)) {
            $readChar = stream_get_contents($stream, 1);
            $str .= $readChar;
            if ($readChar == $char && ++$foundCount == $count) {
                return $str;
            }
        }
        return false;
    }

    // Read one JSON performer object
    function readOneJsonPerformer($stream)
    {
        if ($json = readUpTo($stream, '{', 1)) {
            return '{' . readUpTo($stream, '}', 2);
        }
        return false;
    }

    while ($json = readOneJsonPerformer($stream)) {
        $performer = json_decode($json);
        echo 'Performer with ID ' . $performer->performerId
            . ' has category ' . $performer->category->name, PHP_EOL;
    }
    fclose($stream);

Output:

    Performer with ID 99999 has category Some category name
    Performer with ID 88888 has category Second Category name

This code could, of course, be improved: use a buffer for faster reading, and take into account that string values can contain { and } characters, etc.
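As a rough sketch of those improvements (assuming PHP 5.5+ for generators; the function and variable names are made up for illustration), a chunked reader can track brace depth and whether it is inside a double-quoted string, yielding one complete object at a time:

    // Sketch: read the stream in chunks, track nesting depth and in-string state,
    // and yield each complete top-level {...} object as a string.
    function readJsonObjects($stream, $chunkSize = 8192)
    {
        $depth = 0;
        $inString = false;
        $escaped = false;
        $buffer = '';

        while (!feof($stream)) {
            $chunk = fread($stream, $chunkSize);
            $len = strlen($chunk);
            for ($i = 0; $i < $len; $i++) {
                $c = $chunk[$i];
                if ($depth > 0) {
                    $buffer .= $c;          // inside an object: keep the character
                }
                if ($inString) {            // inside "..." — braces do not count here
                    if ($escaped) {
                        $escaped = false;
                    } elseif ($c === '\\') {
                        $escaped = true;
                    } elseif ($c === '"') {
                        $inString = false;
                    }
                    continue;
                }
                if ($c === '"') {
                    $inString = true;
                } elseif ($c === '{') {
                    if (++$depth === 1) {
                        $buffer = '{';      // start of a new top-level object
                    }
                } elseif ($c === '}') {
                    if (--$depth === 0) {
                        yield $buffer;      // a complete performer object
                        $buffer = '';
                    }
                }
            }
        }
    }

    $stream = fopen('getresults.json', 'rb');
    foreach (readJsonObjects($stream) as $json) {
        $performer = json_decode($json);
        echo 'Performer with ID ' . $performer->performerId
            . ' has category ' . $performer->category->name, PHP_EOL;
    }
    fclose($stream);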


You have two options, and neither of them involves writing your own decoder; don't overcomplicate the solution with unnecessary work.

1) Reduce the size of the JSON that is being decoded, or 2) increase the memory available to PHP on your server.

The first option requires access to wherever the JSON is generated. This may or may not be possible, depending on whether you are the one who originally creates it. The easiest approach is to unset() any data you don't need before it is encoded. For example, if there is debugging information you will never use, you can do unset($json_array['debug']); on that data. http://php.net/manual/en/function.unset.php
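A minimal sketch of that idea, assuming you control the code that produces the JSON and that it carries a hypothetical 'debug' key the consumer never reads:

    // On the producing side: drop data the consumer never needs before encoding,
    // so the payload that later has to be decoded stays small.
    $data = buildPerformerData();   // hypothetical function that assembles the array
    unset($data['debug']);          // remove the (made-up) debugging block
    echo json_encode($data);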

The second option requires access to the php.ini file on your server. Find the line that looks like memory_limit = 128M and increase the value; try doubling whatever is already there (in this case, 256M). This may not actually solve your problem, since the sheer size of the JSON can still be the root cause; raising the limit only buys room for inefficient code to keep running.
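For example, in php.ini or, on shared hosting where php.ini is off-limits, as a runtime override at the top of the script (only effective if the host does not lock the setting):

    // php.ini: memory_limit = 256M
    //
    // Or, per script:
    ini_set('memory_limit', '256M');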

