I have a map on which I want to show live data collected from several database tables, some of which have a huge number of rows. Naturally, gathering this information takes a lot of time. On top of the queries, hosts are pinged, so depending on how many servers are offline or remote, collecting all the data can take anywhere from 1 to 10 minutes.
I want the map to be fast and responsive, so I decided to add a new table to my database containing only the data the map needs. That means I need a background process that constantly refreshes this new table. Cron is of course an option, but I want each update to start as soon as the previous cycle has finished. And what if the number of offline IP addresses suddenly spikes and a cycle takes longer than the cron interval, so that runs start overlapping? (See the sketch below.)
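For illustration, this is roughly how the cron variant would have to guard against overlapping runs. The lock-file path and `update_map_data()` are made up; this is just a sketch of the concern, not code I have:

```php
<?php
// Hypothetical guard for the cron variant (script invoked every minute
// from crontab). If the previous cycle is still running, skip this tick
// instead of letting two cycles pile up on top of each other.
$lock = fopen('/tmp/map_update.lock', 'c');
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // previous run still going; the next cron tick will try again
}

update_map_data(); // placeholder for the actual 1-10 minute collection work

flock($lock, LOCK_UN);
fclose($lock);

function update_map_data(): void
{
    // ... ping hosts, query the big tables, rewrite the map table ...
}
```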
My own idea is an endless loop in a PHP script run from the command line. Each cycle would update the map data in MySQL, record some useful diagnostics such as cycle time and failed pings, and then start over after a short pause (a few seconds).
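To make that concrete, here is a minimal sketch of the loop I have in mind. `collect_map_data()`, the `cycle_log` table, the DSN, and the 256 MB limit are all placeholders, not code I actually have:

```php
<?php
// Minimal sketch of the endless CLI loop (all names are placeholders).
$db = new PDO('mysql:host=localhost;dbname=mapdata', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

function collect_map_data(PDO $db): array
{
    // ... ping the hosts, query the big tables, rewrite the map table ...
    return ['failed_pings' => 0];
}

while (true) {
    $start = microtime(true);
    $stats = collect_map_data($db);

    // Log cycle time and failed pings so I can watch the loop's health.
    $db->prepare('INSERT INTO cycle_log (duration, failed_pings) VALUES (?, ?)')
       ->execute([microtime(true) - $start, $stats['failed_pings']]);

    // Watch memory so a slow leak shows up early; exit and let a
    // supervisor (systemd, supervisord, ...) restart the script.
    if (memory_get_usage(true) > 256 * 1024 * 1024) {
        exit(1);
    }

    sleep(5); // short pause before the next cycle
}
```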
However, people have repeatedly told me that a PHP script that runs forever is BAD, and that after a while it will eat gigabytes of RAM (and do other terrible things).
I am partly writing this question to find out whether that is true, but any tips and tricks on how to write a clean loop that doesn't leak memory (if that's possible) would be very welcome. Opinions on the approach in general are also appreciated.
I'll accept the answer that I feel sheds the most light on the question.
Hubro