PHP closes the connection early; script freezes if anything is output afterwards

I close the client connection early with this:

    static public function early_close( $output ) {
        ignore_user_abort(true);
        echo $output;

        // Disable gzip compression in Apache, as it can result in this request
        // being buffered until it is complete, regardless of other settings.
        if (function_exists('apache_setenv')) {
            apache_setenv('no-gzip', 1);
        }

        // Get the size of the output
        $size = ob_get_length();

        // Send headers to tell the browser to close the connection
        header("Content-Length: $size");
        header('Connection: close');
        header("Content-Encoding: none"); // To stop Apache compressing anything

        // If using PHP-FPM:
        // fastcgi_finish_request();

        // Flush all output
        if( ob_get_level() > 0 ) {
            ob_end_flush();
            ob_get_level() ? ob_flush() : null;
            flush();
        }

        // If you're using sessions, this prevents subsequent requests
        // from hanging while the background process executes
        if( session_id() ) {
            session_write_close();
        }
    }

It works fine, but after this point, if the script outputs anything (either by echoing or by sending a new header), execution stops right there.
I tried starting output buffering after the early close and then discarding it, but that does not work:

    Server::early_close();
    ob_start();
    heavy_work();
    ob_clean();

Any ideas? I'm using PHP 5.3.x.

+7
php
4 answers

Classic code for this:

    ob_end_clean();
    header("Connection: close");
    ignore_user_abort(); // optional
    ob_start();
    echo 'Text the user will see';
    $size = ob_get_length();
    header("Content-Length: $size");
    ob_end_flush(); // Strange behaviour, will not work
    flush();        // unless both are called!

    // Do processing here
    sleep(30);

    echo 'Text the user will never see';

Otherwise, if you want to make asynchronous calls, I recommend reading: Methods of asynchronous processes in PHP
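One technique of the kind that article covers can be sketched like this: hand the heavy work to a detached background process so the HTTP response can return immediately. This is only a sketch for a Unix-like server, and `worker.php` is a hypothetical script name:

```php
<?php
// Sketch (assumes a Unix-like shell): build a command that runs a
// worker script detached from the request, so exec() returns at once.
// "worker.php" is a hypothetical script name.

function background_command($script)
{
    // Redirect all output and append '&' so the shell does not wait
    return 'php ' . escapeshellarg($script) . ' > /dev/null 2>&1 &';
}

// In the request handler:
// exec(background_command(__DIR__ . '/worker.php'));
```

The redirection matters: if the child process keeps the request's stdout/stderr open, some servers will still hold the connection until it exits.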

+3

IMHO, you should not go down this route. HTTP requests should be kept as short as possible to improve usability.

If some kind of "heavy processing" needs to be done, you can offload it onto a queue. A separate process/daemon on the server can pick these tasks up from the queue and execute them. The HTTP application can then check whether such a job is still queued / has started / has completed.

There are many libraries that facilitate this: Gearman, ØMQ, RabbitMQ, etc.

HTTP requests are simply not suited to lengthy operations, so trying to use them that way leads to all kinds of problems :)

UPDATE

If you cannot install libraries on the server (Gearman, etc.), you can build your own queue based on files or a database: push "commands" onto the queue from your application, and have a cronjob read the queue and execute the tasks.
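The file-based variant of that idea can be sketched as below, kept roughly PHP 5.3-compatible to match the question. The spool file name and the "command" shape are made up for illustration; a production worker would need locking around the drain step too:

```php
<?php
// Hypothetical sketch of a file-based queue: the web request appends a
// JSON "command" line to a spool file, and a cron-driven worker drains
// the file later.

function enqueue($file, array $command)
{
    // LOCK_EX so concurrent requests do not interleave their lines
    file_put_contents($file, json_encode($command) . "\n", FILE_APPEND | LOCK_EX);
}

function drain($file)
{
    if (!is_file($file)) {
        return array();
    }
    $lines = file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    unlink($file); // naive: a real worker would rename the file first
    $jobs = array();
    foreach ($lines as $line) {
        $jobs[] = json_decode($line, true);
    }
    return $jobs;
}

// Web request side:
$queue = sys_get_temp_dir() . '/jobs.queue';
enqueue($queue, array('task' => 'resize', 'id' => 42));

// Cron side (e.g. "* * * * * php worker.php"):
foreach (drain($queue) as $job) {
    // ... execute $job['task'] ...
}
```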

+2

You need echo chr(0); after echo $output;. Sending a null byte will cause the browser to end the connection. Also, I assume ob_start() is called before Server::early_close()? If not, you need it for ob_get_length() to work correctly.
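A minimal sketch of the ordering this answer assumes, runnable on its own outside a web server:

```php
<?php
// Sketch: the output buffer must already be open before the early-close
// code runs, otherwise ob_get_length() has nothing to measure and
// returns false.
ob_start();                   // open the buffer first
echo "response body";
$size = ob_get_length();      // 13 (bytes buffered so far), not false
// header("Content-Length: $size"); // then the headers, as in the question
ob_end_clean();               // cleanup for this standalone sketch
```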

+1

Based on your comments today, I recommend replacing your current solution with an AJAX request made after page load. Instead of closing the connection early and continuing processing on the server, finish the response as usual, then issue an AJAX request that triggers the additional processing once the page has loaded on the client. This completely eliminates the problem of stray output, and it also lets you report success or failure back to the user.

Another solution is to queue the work in a table or in memory and set up a cron job to process it in the background.

+1
