PHP: coding long-running scripts when servers set a run-time limit

FastCGI servers, for example, impose run-time limits on PHP scripts that cannot be changed using set_time_limit() in PHP. IIS does this too, I believe.

I wrote an import script for a PHP application that works well under mod_php, but does not work under FastCGI (mod_fcgid), because the script is killed after a certain number of seconds. I still do not know how to determine what the time limit is in that case, and have not decided how I am going to get around it. Doing the work in small redirected chunks seems like one possible kludge, but how?

What methods would you use when coding a long-running task, such as an import or export, where an individual PHP script can be stopped by the server partway through?

Assume that you are creating a portable script, so you cannot know in advance whether PHP will run under mod_php, FastCGI, or IIS, or whether there will be a maximum execution time enforced at the server level. This probably also rules out shell scripts and the like.

php portability fastcgi execution-time
3 answers

Use the PHP command line interface, which is not subject to the execution-time limits imposed by web servers. If you need to automate the script, you can schedule it with cron.
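For example (the script path, PHP binary location, and log file below are illustrative assumptions, not from the original answer):

```shell
# Run the import once from the command line; the CLI SAPI defaults
# to max_execution_time = 0, i.e. no time limit.
php /var/www/app/import.php

# Or schedule it nightly at 02:30 via `crontab -e` (paths are examples):
# 30 2 * * * /usr/bin/php /var/www/app/import.php >> /var/log/app-import.log 2>&1
```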


What you are really talking about is job queuing: the practice of running PHP code asynchronously from the front-end request. There are two main ways to do this in PHP. One is to use Gearman; the other is to use the Zend Server Job Queue, which is the one I know personally. I have a blog post on how you can do this: "It's your turn". I found the implementation I describe there extremely easy to use.

What you can also try is setting max_execution_time to 0 while your logic executes.
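A minimal sketch of that approach. Note that, as the question points out, a FastCGI process manager can still terminate the request externally regardless of these settings:

```php
<?php
// Ask PHP for an unlimited run; 0 means "no limit".
// Effective under mod_php; under mod_fcgid the process manager's own
// timeouts (e.g. its busy timeout) may still kill the request.
set_time_limit(0);
ini_set('max_execution_time', '0');

// Keep running even if the user closes the browser mid-import.
ignore_user_abort(true);

// ... long-running import/export logic goes here ...
```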


Doing this in small redirected chunks seems like one kludge, but how?

This is how I handled a full database backup (phpBB) when the built-in export mechanism started hitting the max_execution_time limit.

I did this one table at a time, and for large tables in chunks of 5,000 rows. (It turned out that the limiting factor in the whole process was not the export's execution time, but the file size that phpMyAdmin could handle on import.)

After each export chunk, I returned a page with a meta refresh tag in the header, redirecting the script back to itself with the next table number and starting row in the query string.

 <?php
 if (!$all_done) {
     $new_url = $_SERVER['PHP_SELF'] . '?tablecount=' . $count;
     if (!$tabledone && "" != $start_row && null != $start_row) {
         $new_url .= "&startrow=" . $start_row;
     } else {
         $new_url .= "&startrow=0";
     }
     echo('<meta http-equiv="refresh" content="0.5;url=' . $new_url . '" />');
 }
 ?>

The counters let me iterate over the array of table names I got with SHOW TABLES.
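A sketch of how those counters might drive one chunk of the export. Variable names, the chunk size, and the mysqli-based query code are illustrative assumptions, not the original script:

```php
<?php
// One request processes one chunk, then the meta-refresh above
// redirects back here with updated tablecount/startrow values.
$chunkSize = 5000; // illustrative; the answer used chunks of 5,000 rows

// Build the table list (assumes $db is an open mysqli connection).
$tables = [];
$res = mysqli_query($db, 'SHOW TABLES');
while ($row = mysqli_fetch_row($res)) {
    $tables[] = $row[0];
}

// Resume position from the query string.
$count     = isset($_GET['tablecount']) ? (int)$_GET['tablecount'] : 0;
$start_row = isset($_GET['startrow'])   ? (int)$_GET['startrow']   : 0;

$table = $tables[$count];
$res = mysqli_query($db, sprintf(
    'SELECT * FROM `%s` LIMIT %d, %d', $table, $start_row, $chunkSize
));
// ... append the fetched rows to the dump file here ...

// Advance the counters for the next redirect.
$tabledone = mysqli_num_rows($res) < $chunkSize;
if ($tabledone) {
    $count++;
    $start_row = 0;
} else {
    $start_row += $chunkSize;
}
$all_done = $count >= count($tables);
```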

Before I had the wit to drop a gigantic word-match table (which phpBB can rebuild on its own) from the export, this backup script took half an hour to run.

