What I would also consider with curl is that you can run the requests in parallel (via curl_multi). That helped me a lot, since I don't have access to a version of PHP that allows threading at the moment.
For example, I was fetching 7 images from a remote server using file_get_contents, and each request took 2-5 seconds. That one step alone added 30 seconds or so to the process while the user waited for a PDF to be generated.
Fetching them in parallel literally reduced the total time to about that of one image. Another example: I now check 36 URLs in the time it previously took to check one. I think you get the point. :-)
$timeout = 30;
$retTxfr = 1;
$user = '';
$pass = '';

$master = curl_multi_init();
$node_count = count($curlList);
$keys = array("url");

for ($i = 0; $i < $node_count; $i++) {
    foreach ($keys as $key) {
        if (empty($curlList[$i][$key])) continue;
        // -- create one handle per URL and register it with the multi handle
        $ch[$i][$key] = curl_init($curlList[$i][$key]);
        curl_setopt($ch[$i][$key], CURLOPT_TIMEOUT, $timeout); // -- timeout after X seconds
        curl_setopt($ch[$i][$key], CURLOPT_RETURNTRANSFER, $retTxfr);
        curl_setopt($ch[$i][$key], CURLOPT_HTTPAUTH, CURLAUTH_ANY);
        curl_setopt($ch[$i][$key], CURLOPT_USERPWD, "{$user}:{$pass}");
        curl_multi_add_handle($master, $ch[$i][$key]);
    }
}

// -- run all requests at once; finish when done or when the timeout is met --
do {
    curl_multi_exec($master, $running);
} while ($running > 0);
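One caveat with that wait loop: calling curl_multi_exec() in a tight do/while spins the CPU while the transfers run. A minimal sketch of a gentler variant, assuming nothing beyond the standard curl extension, uses curl_multi_select() to block until a socket has activity:

do {
    $status = curl_multi_exec($master, $running);
    if ($running > 0) {
        // Sleep until at least one handle has activity, for up to 1 second,
        // instead of busy-looping.
        curl_multi_select($master, 1.0);
    }
} while ($running > 0 && $status === CURLM_OK);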
Then check the results:
for ($i = 0; $i < $node_count; $i++) {
    foreach ($keys as $key) {
        if (empty($curlList[$i][$key])) continue;
        // -- grab the response body; drop failures (HTTP >= 400 or empty body)
        $results[$i][$key] = curl_multi_getcontent($ch[$i][$key]);
        if ((int) curl_getinfo($ch[$i][$key], CURLINFO_HTTP_CODE) > 399 || empty($results[$i][$key])) {
            unset($results[$i][$key]);
        } else {
            $results[$i]["options"] = $curlList[$i]["options"];
        }
        curl_multi_remove_handle($master, $ch[$i][$key]);
        curl_close($ch[$i][$key]);
    }
}
Then close the multi handle:
curl_multi_close($master);
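For context, the snippets above expect a $curlList array with a "url" key per entry (and, for the success branch, an "options" key that gets carried through to $results). This is just a hypothetical example of that shape; the URLs and option values here are made up:

$curlList = array(
    array("url" => "https://example.com/img/page1.jpg", "options" => array("page" => 1)),
    array("url" => "https://example.com/img/page2.jpg", "options" => array("page" => 2)),
);
// After running the three blocks above, $results[$i]["url"] holds each
// response body and $results[$i]["options"] carries your metadata through.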
Mike Q, May 10 '17 at 16:33