If you're just going to cache query results keyed directly on the query string, the MySQL query cache already does this for you; don't reinvent the wheel. The main difference is that the MySQL query cache is aggressively invalidated, so stale data is never returned; depending on how you handle invalidation, your own strategy may reduce database load further, but at the cost of regularly serving stale data.
In addition, you won't really be able to selectively expire your cache keys when updates occur (how would you know which cached query results are stale when an INSERT/UPDATE happens?). As a result, you have to fall back on a short expiration time (probably measured in seconds) to minimize the window during which you serve stale data, and that in turn means a low cache hit ratio. In the end, the caching strategy you describe is simple to implement but not very effective.
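As a rough illustration of that short-TTL approach, here is a minimal sketch using a toy in-process cache (a real setup would use memcached or similar; the `db` object is assumed to be a DB-API-style connection, and the key scheme is made up for the example):

```python
import hashlib
import time

# Toy in-process cache: {key: (expires_at, rows)}.
_cache = {}

def cached_query(db, sql, params=(), ttl=5):
    """Cache results keyed by the query string, with a short TTL.

    Because we can't tell which cached entries a later INSERT/UPDATE
    invalidates, the TTL has to stay short (seconds), which keeps the
    hit ratio low.
    """
    key = hashlib.sha1(repr((sql, params)).encode()).hexdigest()
    entry = _cache.get(key)
    if entry and entry[0] > time.time():
        return entry[1]                        # cache hit
    rows = db.execute(sql, params).fetchall()  # cache miss: go to the database
    _cache[key] = (time.time() + ttl, rows)
    return rows
```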
Be sure to read the "General Design Approaches" section of the FAQ. A good caching strategy deletes or replaces cached data immediately when the underlying data is updated; that lets you cache data for hours, days, or weeks while never serving stale data to users.
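A minimal sketch of that invalidate-on-write idea, again with a toy in-process cache (the per-object key scheme, table, and `db` interface are assumptions for illustration):

```python
# Toy cache keyed per object rather than per query string.
_obj_cache = {}

def get_user(db, user_id):
    """Read through the cache; entries can live for a long time because
    every write below keeps them consistent."""
    key = f"user:{user_id}"
    if key in _obj_cache:
        return _obj_cache[key]
    row = db.execute("SELECT * FROM users WHERE id = ?", (user_id,)).fetchone()
    _obj_cache[key] = row
    return row

def update_user_email(db, user_id, email):
    """Write to the database, then immediately replace (or delete) the
    cached copy so readers never see stale data."""
    db.execute("UPDATE users SET email = ? WHERE id = ?", (email, user_id))
    db.commit()
    _obj_cache[f"user:{user_id}"] = db.execute(
        "SELECT * FROM users WHERE id = ?", (user_id,)).fetchone()
    # Alternatively: _obj_cache.pop(f"user:{user_id}", None) to just invalidate.
```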
Frank Farmer