Something faster than get_headers()

I am trying to create a PHP script that will check the status of an HTTP site as quickly as possible.

I am currently using get_headers() and running it in a loop over 200 random URLs from a MySQL database.

Testing all 200 takes an average of 2 minutes 48 seconds.

Is there anything I can do to make this much faster?

(I know about fsockopen: it can check port 80 on 200 sites in about 20 seconds, but that's not the same as requesting the HTTP status code, because the server can respond on the port and still fail to serve the site correctly, etc.)

Here is the code.

<?php
function get_httpcode($url)
{
    $headers = get_headers($url, 0);
    // Return the HTTP status code
    return substr($headers[0], 9, 3);
}

###
## Grab task and execute it
###

// Loop through tasks
while ($data = mysql_fetch_assoc($sql)):
    $result = get_httpcode('http://' . $data['url']);
    echo $data['url'] . ' = ' . $result . '<br/>';
endwhile;
?>
2 answers

You can try the cURL extension. It lets you send multiple requests simultaneously, in parallel, with curl_multi_exec().

Example:

$ch = curl_init('http_url');
curl_setopt($ch, CURLOPT_HEADER, 1);
$c = curl_exec($ch);
$info = curl_getinfo($ch, CURLINFO_HTTP_CODE);
print_r($info);

UPDATED

Check out this example: http://www.codediesel.com/php/parallel-curl-execution/
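A minimal sketch of the curl_multi approach for this exact use case, assuming you only need the status code for each URL. The function name check_status_codes and the 10-second timeout are my own choices, not from the linked article:

```php
<?php
// Fetch HTTP status codes for many URLs in parallel using curl_multi.
function check_status_codes(array $urls)
{
    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD request: headers only
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo response bodies
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // don't hang on dead hosts
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Run all transfers until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($active && $status == CURLM_OK);

    // Collect the status code from each handle, then clean up.
    $codes = array();
    foreach ($handles as $url => $ch) {
        $codes[$url] = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $codes;
}
```

Because all 200 transfers overlap instead of running one after another, total time is roughly that of the slowest single request rather than the sum of all of them.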


I don't know if this is an option you can consider, but you could run all of them almost simultaneously by forking, so the whole script would take only a little longer than a single request: http://www.php.net/manual/en/function.pcntl-fork.php

You could add this to a script that runs in CLI mode and fire off all the requests at the same time.

Edit: you say you have 200 calls, so you may run into database connection problems. The issue is that the MySQL link is destroyed when the first child script completes, so you need to create a new connection in each child. Since you are using the standard mysql_* functions, be sure to pass the 4th parameter (new_link) to mysql_connect() so a new link is created each time. Also check the maximum number of concurrent connections on your server.
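The forking idea above can be sketched roughly as follows. This is a minimal CLI-only example under the assumption that the URL list has already been read from the database before forking; get_httpcode() here is the helper from the question, and the placeholder URLs are mine:

```php
<?php
// Fan out HTTP checks across child processes with pcntl_fork (CLI only).
// Each child must open its own resources (e.g. its own MySQL link) after
// the fork; don't share the parent's connection.

function get_httpcode($url)
{
    $headers = get_headers($url, 0);
    return substr($headers[0], 9, 3);
}

$urls = array('http://example.com', 'http://example.org'); // placeholder list

$children = array();
foreach ($urls as $url) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("fork failed\n");
    } elseif ($pid == 0) {
        // Child: perform one check, then exit so we never fall
        // through into the parent's loop.
        echo $url . ' = ' . get_httpcode($url) . "\n";
        exit(0);
    }
    $children[] = $pid; // parent: remember the child PID
}

// Parent: reap every child so no zombie processes are left behind.
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
```

All the children run concurrently, so as with curl_multi the total time is close to that of the slowest single request; the curl_multi approach is usually simpler, though, since everything stays in one process.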

