I have a massive PHP script.
So much so that I had to do:
ini_set('memory_limit', '3000M');
set_time_limit(0);
It works fine on one server, but on the other I get:
Out of memory (allocated 1653342208) (tried to allocate 71 bytes) in /home/writeabo/public_html/propturk/feedgenerator/simple_html_dom.php on line 848
Both are on the same hosting package from the same host, but on different servers.
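In case it matters, here is a small diagnostic snippet (only standard PHP functions) that could be dropped into the script on both servers to compare the configured limit with the real peak usage; the logging target is just an example:

register_shutdown_function(function () {
    // Log the configured limit and the actual peak usage at the end of the run,
    // so the two servers can be compared directly.
    error_log(sprintf(
        "memory_limit=%s, peak usage=%d bytes",
        ini_get('memory_limit'),
        memory_get_peak_usage(true)
    ));
});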
The problem above is solved; the bounty is for the new problem below.
Update: the script is so big because it scrapes and parses a site across 252 pages, including more than 60,000 images, of which it makes two copies. I have since broken it into chunks, roughly as sketched below.
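The chunking I mean is along these lines (just a sketch; $pages and processPage() are placeholders for my real scraping code):

// Process the 252 pages in small batches so memory is released between
// batches instead of accumulating over the whole run.
$pages = range(1, 252);          // placeholder for the real page list
$batchSize = 10;

foreach (array_chunk($pages, $batchSize) as $batch) {
    foreach ($batch as $page) {
        processPage($page);      // parse one page and copy its images
    }
    gc_collect_cycles();         // free what the batch left behind
}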
I now have one more problem: when I write an image from an external site to the server as follows:
try {
    $imgcont = file_get_contents($va); // $va is an img src from an array of thousands of srcs
    $h = fopen($writeTo, 'w');
    fwrite($h, $imgcont);
    fclose($h);
} catch (Exception $e) {
    $error .= (!isset($error)) ? "error with <img src='" . $va . "' />" : "<br/>And <img src='" . $va . "' />";
}
Every so often it hits a page that returns an internal server error, and I have to run the script again; the rerun gets further because files are only copied if they do not already exist. Is there any way I can detect the 500 response code and resend the request to that URL so it is retried? How can I make the whole thing an automated process?
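To make the question concrete, something like the sketch below is what I am aiming for: fetch each image with cURL so the HTTP status code is visible, and retry a few times before giving up. fetchWithRetry() and the retry count/delay are only illustrative, not code I already have working:

// Sketch of a retrying downloader: fetch the image with cURL, check the
// HTTP status code, and try again a few times on failure or a non-2xx
// response. $va and $writeTo come from the existing loop.
function fetchWithRetry($url, $dest, $maxAttempts = 3)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        $body = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        if ($body !== false && $code >= 200 && $code < 300) {
            file_put_contents($dest, $body);
            return true;
        }
        sleep(2);                // brief pause before retrying
    }
    return false;                // caller can log the failed URL
}

// Usage inside the existing loop, only copying files that do not exist yet:
if (!file_exists($writeTo) && !fetchWithRetry($va, $writeTo)) {
    $error = (isset($error) ? $error . "<br/>And " : "error with ") . "<img src='" . $va . "' />";
}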