~1k/hour: with 3600 seconds per hour, that is one run roughly every 3.6 seconds (not to mention 50k/hour, which would be about 14 runs per second) ...
There are a lot of questions. Some of them:
- Does your PHP script need to read/process all data source records on each individual run? If not, what subset does it need (~size, criteria, ...)?
- The same question for the Flash application; plus, who delivers the data? The PHP script? A direct request for a full static XML file?
- What operations are performed on the data source?
- Do you need some kind of concurrency mechanism?
- ...
And just because you want to deliver XML data to Flash clients does not necessarily mean that you need to store the data as XML on the server. If, for example, clients only ever need a tiny subset of the available records, it might be much faster not to store the data as XML, but in something better suited to speed and searching, and then create the XML output for that subset on the fly, possibly with some caching, depending on what data the clients request and how much/how often the data changes.
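To illustrate, here is a minimal sketch of that approach in PHP. It assumes a hypothetical SQLite table `records(id, category, payload)` and clients that request one category at a time; the table name, columns, paths, and the 60-second file cache are all assumptions for the example, not anything stated in the question:

```php
<?php
// Clients request a subset by category; only that subset is read
// from the database and serialized to XML on the fly.
$category = $_GET['category'] ?? 'default';

// Naive file cache: reuse the generated XML for up to 60 seconds.
$cacheFile = sys_get_temp_dir() . '/records_' . md5($category) . '.xml';
if (is_file($cacheFile) && time() - filemtime($cacheFile) < 60) {
    header('Content-Type: text/xml');
    readfile($cacheFile);
    exit;
}

// Hypothetical data store: anything searchable works (SQLite here).
$db = new PDO('sqlite:/path/to/records.db');
$stmt = $db->prepare('SELECT id, payload FROM records WHERE category = ?');
$stmt->execute([$category]);

// Build the XML answer from just the requested subset.
$xml = new SimpleXMLElement('<records/>');
foreach ($stmt as $row) {
    $node = $xml->addChild('record', htmlspecialchars($row['payload']));
    $node->addAttribute('id', $row['id']);
}

$output = $xml->asXML();
file_put_contents($cacheFile, $output);
header('Content-Type: text/xml');
echo $output;
```

The point of the cache is that at ~14 requests per second (the 50k/hour case) most requests never touch the data store at all; how long you can cache depends entirely on how often the data changes.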
edit: Suppose you really do need the truly complete dataset and are running a continuous simulation. Then you could think about a continuous process that keeps the entire "world model" in memory and works on this model for every run (world tick). That way you at least don't have to load the data on every tick. But such a process is usually written in something other than PHP.
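The structure of such a process is language-agnostic; the sketch below uses PHP CLI only for continuity with the rest of this answer (as said, a long-running daemon would more commonly be another language). All file names, the state layout, and the per-tick logic are placeholders:

```php
<?php
// World state is loaded ONCE, then mutated in memory on every tick,
// instead of being reloaded from the data source per run.
$world = json_decode(file_get_contents('/path/to/world.json'), true);

$tickLength = 3.6; // seconds per run at ~1k runs/hour

while (true) {
    $start = microtime(true);

    // Advance the simulation one step, entirely in memory.
    foreach ($world['entities'] as &$entity) {
        $entity['age'] += 1; // placeholder for the real per-tick logic
    }
    unset($entity);

    // Optionally persist a snapshot so serving scripts (or a restart
    // after a crash) can pick up the current state.
    file_put_contents('/path/to/world.json', json_encode($world));

    // Sleep away the remainder of the tick.
    $elapsed = microtime(true) - $start;
    if ($elapsed < $tickLength) {
        usleep((int)(($tickLength - $elapsed) * 1e6));
    }
}
```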