I hope someone can help me understand something. If I access a website hosted on the other side of the planet, PHP reports a page runtime of ~300 ms (simple math with microtime(true) ). If I access the same website locally on the server, PHP reports ~20 ms execution time. It seems that network latency is a factor in the PHP runtime, which surprises me because I thought the whole page was generated by PHP and handed back to Apache to be sent to the client (thus, the script execution time would be the same regardless of where the request came from).
Apache, PHP5 (mod_php), CentOS 5.
Could it be that PHP is somehow waiting for output to reach the browser before proceeding? I have a feeling there are output buffering factors involved, but I'm really not sure. I have output buffering enabled in php.ini.
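For reference, here is a minimal sketch of the kind of measurement I mean (a hypothetical page skeleton; the usleep stands in for real page work). Note that the end time is taken before the final buffer flush:

```php
<?php
// Hypothetical page skeleton illustrating the timing approach described above.
$start = microtime(true);

ob_start();                    // buffer output, as with output_buffering in php.ini

usleep(20000);                 // simulate ~20 ms of actual page work
echo str_repeat("x", 100000);  // simulated page body

// If buffers fill up (or are flushed) before this point, the measurement
// can include time spent pushing data toward the client, not just PHP work.
$elapsedMs = (microtime(true) - $start) * 1000;
echo "\nPage generated in " . round($elapsedMs) . " milliseconds\n";

ob_end_flush();                // release everything to the web server at once
```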
My question is: what is happening here, and how does network latency affect PHP execution time?
I hope this is the right place to ask such a question. I searched SO and Google, but had no luck finding anything even remotely relevant.
Edit: I'm not talking about measuring the time it takes to transfer the web page from server to client. I'm talking about PHP itself reporting a faster runtime when I load the page from the local machine. Same server, same page, consistent results. I am using `curl http://example.com | grep milliseconds` in both places to see the runtime PHP reports inside the page.
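To separate PHP's server-reported number from time spent on the wire, curl can report client-side timings alongside it; a sketch (http://example.com is a stand-in for the real site):

```shell
# Server-reported runtime: the "milliseconds" line PHP prints into the page
curl -s http://example.com/ | grep milliseconds || true

# Client-side view of the same request: time to first byte vs. total
# transfer time (both in seconds, as measured by curl itself)
curl -s -o /dev/null \
     -w "ttfb=%{time_starttransfer} total=%{time_total}\n" \
     http://example.com/
```

If the server-reported runtime tracks the client-side total rather than staying flat, that points at PHP blocking on output delivery rather than doing more work.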
joneszach