The simple answer is that you cache data when something slows down. Obviously, any medium-to-large application needs more planning than a wait-and-see approach, but for the vast majority of websites the question comes down to: "Are you satisfied with your load times?" Of course, if you are obsessive about load times, like me, you will want to make things even faster regardless.
Then you must determine what exactly is causing the slowness. You suggested that your application code is the source, but it is worth checking whether external factors are to blame, such as large page size, excessive requests, missing gzip compression, etc. Use a site like http://tools.pingdom.com/ or an extension like YSlow to get started. (Quick tip: make sure keep-alives and gzip are actually working.)
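As a rough sketch of that quick tip, here is a small hypothetical helper (not from any library) that inspects response headers you might copy from your browser's network tab and flags the two transfer-level problems mentioned above:

```python
# Hypothetical helper: given HTTP response headers (as a dict), report
# common transfer-level problems worth fixing before touching app code.
def check_transfer_headers(headers):
    """Return a list of warnings about missing gzip / keep-alive."""
    # Normalize header names and values to lowercase for comparison.
    h = {k.lower(): v.lower() for k, v in headers.items()}
    warnings = []
    if "gzip" not in h.get("content-encoding", ""):
        warnings.append("response is not gzip-compressed")
    if h.get("connection", "") == "close":
        warnings.append("keep-alive is disabled (Connection: close)")
    return warnings
```

For example, `check_transfer_headers({"Content-Encoding": "gzip", "Connection": "keep-alive"})` returns an empty list, while a response with `Connection: close` and no compression produces both warnings.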
Assuming the problem is the time your application code takes to run, profile it with something like Xdebug (http://www.xdebug.org/) and view the output with KCachegrind or WinCacheGrind. This will show you which parts of your code consume the most time. From there, you can decide what to cache and how to cache it (or improve the logic of your code).
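As a sketch, with Xdebug 2 the profiler is typically enabled in php.ini along these lines (directive names changed in Xdebug 3, so check the docs for your version):

```ini
; Enable Xdebug's profiler (Xdebug 2 directive names;
; Xdebug 3 uses xdebug.mode=profile and xdebug.output_dir instead)
xdebug.profiler_enable = 1
; Cachegrind output files land here; open them in
; KCachegrind (Linux) or WinCacheGrind (Windows)
xdebug.profiler_output_dir = /tmp
```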
There are too many possible causes and corresponding solutions for me to guess at here. Once you have identified the problem, you can post a new question about solving that particular issue. I will say that the MySQL query cache can be counterproductive if it is not used properly. Also, I generally avoid the APC user cache in favor of memcached.
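The usual pattern with memcached is cache-aside: check the cache, fall back to the real data source on a miss, and store the result with a TTL. A minimal Python sketch, using a plain dict as a stand-in for a real memcached client (in real code you would use an actual client library):

```python
import time

# Stand-in for a memcached client: maps key -> (value, expiry timestamp).
_cache = {}

def cache_get(key):
    entry = _cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.time() >= expires_at:   # entry has expired, drop it
        del _cache[key]
        return None
    return value

def cache_set(key, value, ttl=60):
    _cache[key] = (value, time.time() + ttl)

def get_user(user_id, load_from_db):
    """Cache-aside read: try the cache first, hit the database on a miss."""
    key = "user:%s" % user_id
    user = cache_get(key)
    if user is None:
        user = load_from_db(user_id)  # the slow call we want to avoid
        cache_set(key, user, ttl=300)
    return user
```

The second read for the same key is served from the cache, so the slow loader runs only once per TTL window; this is the shape of the win you get from memcached once profiling has told you which calls are worth caching.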
Sameer Parwani