After doing some profiling in a personal pet project, I've been playing with some silly micro-optimizations. This is more of an experiment than anything else, so nothing I'm measuring actually needs optimizing, but it's still an interesting exercise.
In any case, I came across a strange performance difference between my PHP 5.3 installations: one on OS X via MacPorts and one on Ubuntu via apt.
The following code shows a big speed difference between the two approaches on OS X, but only a small one on Ubuntu.
$x = array(9);

// As per BarsMonster's comment, this ensures each loop runs for more
// than a second in order to avoid possible kernel scheduler differences
$count = 10000000;

$s = microtime(true);
for ($i = 0; $i < $count; $i++) {
    $q = is_string($x);
}
var_dump(microtime(true) - $s);

$s = microtime(true);
for ($i = 0; $i < $count; $i++) {
    // This is obviously only useful if you'll never need a string
    // with 'Array' in it here
    $q = (string)$x != 'Array';
}
var_dump(microtime(true) - $s);
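As an aside, here is a quick sketch (not part of the benchmark above) of the pitfall that the inline comment warns about: the cast trick gives the wrong answer for the literal string 'Array', whereas is_string() does not.

<?php
$real_array    = array(9);
$tricky_string = 'Array';

// The array casts to "Array", so the trick correctly reports "not a string"
var_dump((string)$real_array != 'Array');    // bool(false)

// But the literal string 'Array' also fails the test, which is wrong
var_dump((string)$tricky_string != 'Array'); // bool(false)

// is_string() handles it correctly
var_dump(is_string($tricky_string));         // bool(true)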
Output on OS X:

float(17.977133989334)
float(4.2555270195007)

Output on Ubuntu:

float(5.2112979888916)
float(3.4337821006775)
It doesn't surprise me that the numbers for the hacky cast version are fairly similar on both systems, but the is_string() timings are wildly different.
What could be causing this? And if performance varies this much between installations for trivial type-checking functions, how can I trust profiling results gathered on an OS that doesn't match my target deployment platform?
Turning APC on or off makes no difference to the timings on either Ubuntu or OS X.
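For what it's worth, here is a rough sketch (not from my original test) of how the comparison could be repeated and averaged to smooth out per-run scheduler noise before trusting numbers from different installations. The bench() helper is just an illustrative name I made up, and the closure call adds some overhead to both measurements equally.

<?php
// Run the callback $iterations times, repeat the whole run 5 times,
// and report the minimum and average wall-clock time.
function bench($label, $iterations, $callback) {
    $runs = array();
    for ($r = 0; $r < 5; $r++) {
        $s = microtime(true);
        for ($i = 0; $i < $iterations; $i++) {
            $callback();
        }
        $runs[] = microtime(true) - $s;
    }
    printf("%s: min %.4fs, avg %.4fs\n",
        $label, min($runs), array_sum($runs) / count($runs));
}

$x = array(9);
bench('is_string', 10000000, function () use ($x) { return is_string($x); });
bench('cast hack', 10000000, function () use ($x) { return (string)$x != 'Array'; });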