Making PHP Performance Profiling Predictable

I am using Xdebug with PHP to do performance profiling. But when I run the same script more than once, I often get wildly different timings, so it is hard to know how much faith to put in the results.

Obviously, a lot of things on the machine can affect PHP performance. But is there anything I can do to reduce the number of variables, so that successive tests are more consistent?

I am running PHP under Apache on Mac OS X.
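For reference, the profiler in Xdebug 2 is typically switched on with php.ini settings along these lines (in Xdebug 3 the equivalent is `xdebug.mode=profile`); the `%t`/`%p` specifiers keep separate runs from overwriting each other:

```ini
; Xdebug 2: write one cachegrind file per request
xdebug.profiler_enable = 1
xdebug.profiler_output_dir = /tmp
; %t = timestamp, %p = pid, so repeated runs stay distinguishable
xdebug.profiler_output_name = cachegrind.out.%t.%p
```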

+4
3 answers
  • Shut down as many unrelated services as possible.
  • Reduce the number of Apache processes.
  • Prime the various caches by loading the script several times. Perhaps use a benchmarking tool like Apache ab or siege to make sure all of the Apache children get hit.
  • Request your script from the command line with curl or wget so that Apache only serves one resource: the script itself.
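The last step can be sketched as a small shell helper. The `bench` function below is my own hypothetical illustration, not part of any tool named above; it relies on GNU date's `%N` nanosecond format (on Mac OS X, install coreutils or time the loop another way):

```shell
#!/bin/sh
# bench: run a command N times and report min/mean/max wall time in ms.
# Usage: bench N command [args...]
bench() {
  n=$1; shift
  i=0
  while [ "$i" -lt "$n" ]; do
    start=$(date +%s%N)            # nanoseconds since epoch (GNU date)
    "$@" > /dev/null 2>&1
    end=$(date +%s%N)
    echo $(( (end - start) / 1000000 ))
    i=$((i + 1))
  done | awk '{ sum += $1
                if (NR == 1 || $1 < min) min = $1
                if ($1 > max) max = $1 }
              END { printf "min=%dms mean=%.1fms max=%dms\n", min, sum/NR, max }'
}

# Warm the caches first, e.g.:  ab -n 100 -c 5 http://localhost/script.php
# Then time only the script:    bench 10 curl -s -o /dev/null http://localhost/script.php
```

Reporting min/mean/max rather than a single number also makes the run-to-run variability visible instead of hiding it.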

There may be an argument for omitting some of these steps in order to get more "real-world" numbers. I look forward to the other answers this question may receive.

+4

There are two different tasks: measuring performance and finding problems.

When you measure how long something takes, you must expect variability, because it depends on whatever else is going on on the machine. This is normal.

To find problems, what you need to know is the percentage of time used by the various activities. Those percentages do not change much as a function of other load, and their exact values do not matter much anyway.

The important thing is to find the activities responsible for a healthy percentage of the time that you can fix, and then fix them. When you do, you save up to that percentage. The finding is what you need to do; the measurement is secondary.

Added: You may ask, "Don't you need to measure in order to find?" Consider an example. Suppose you run your program under a debugger, pause it at an arbitrary moment, and see it in the middle of closing a log file. You resume it, pause it again, and see the same thing. That crude "measurement" says it spends 100% of its time doing this. Naturally the time spent is not really 100%, but whatever it is, it is big, and you have found it. So maybe you do not need to open and close the file so often, or something along those lines. Typically more samples are needed, but not too many.
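The arithmetic behind "not too many samples" (my own illustration, not from the answer): if an activity accounts for a fraction p of the run time, the chance that n random pauses catch it at least once is 1 - (1 - p)^n:

```shell
# Chance that n random pauses catch an activity taking fraction p of run time.
awk 'BEGIN {
  p = 0.5                       # activity takes 50% of the time
  for (n = 1; n <= 5; n++)
    printf "n=%d  P(caught at least once)=%.3f\n", n, 1 - (1 - p)^n
}'
# n=3 already gives P = 0.875
```

For any activity big enough to be worth fixing, a handful of pauses is almost certain to catch it at least once.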

+3
  • As others have said, reduce the number of running services and programs to a minimum
  • Run the test several times in a row and average the results, discarding outliers
  • Make sure that caching of any kind is disabled (unless you specifically want to test with the cache)
  • If the results still vary wildly, the problem is most likely in the code you are profiling. It may have race conditions or depend on network connections. You would get more specific advice if you posted the code.
  • You may also be hitting some bottleneck only on some runs. If you carefully profile the different parts of the script, you can catch it.
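The averaging step above can be sketched as a small filter that drops the single fastest and slowest run before averaging; the millisecond values here are made up for illustration:

```shell
# Trimmed mean: drop the fastest and slowest run, average the rest.
printf '%s\n' 120 118 119 340 117 121 | sort -n | awk '
  { t[NR] = $1 }
  END {
    for (i = 2; i < NR; i++) sum += t[i]
    printf "trimmed mean: %.1f ms over %d runs\n", sum / (NR - 2), NR - 2
  }'
# → trimmed mean: 119.5 ms over 4 runs
```

Note how the single 340 ms outlier, which would drag a plain mean up to about 156 ms, no longer distorts the result.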
+1
