I would like to build a kind of distributed setup for running tons of small/simple REST web queries in a production environment. For every 5-10 related queries executed from a node, a very small amount of resulting data will need to be stored in a standard relational database (e.g. PostgreSQL).
What platforms are built for this kind of problem? The nature and size of the data, and the volume of jobs, seem to go against the grain of Hadoop. There are also more grid-style architectures such as Condor and the Sun Grid Engine, which I've seen mentioned. I'm not sure whether these platforms have any kind of failure recovery (i.e. verifying that a job completed successfully).
What I would really like is a FIFO-style queue that I can add jobs to, with the end result being an update to my database.
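To make the desired workflow concrete, here is a minimal single-process sketch of the FIFO pattern I mean, using Python's standard library. The `fetch` function is a hypothetical stand-in for the real REST call, the `results` list stands in for the PostgreSQL insert, and the retry loop sketches the kind of failure recovery I'm asking about; the URLs are made up.

```python
import queue
import threading

def fetch(url):
    # Hypothetical stand-in for the actual REST request
    # (in reality: urllib / requests + error handling).
    return {"url": url, "status": "ok"}

def worker(jobs, results, max_retries=3):
    # Drain the FIFO queue; retry each job a few times before giving up.
    while True:
        try:
            url = jobs.get_nowait()
        except queue.Empty:
            return
        for attempt in range(max_retries):
            try:
                row = fetch(url)
                results.append(row)  # in practice: INSERT into PostgreSQL
                break
            except Exception:
                if attempt == max_retries - 1:
                    results.append({"url": url, "status": "failed"})
        jobs.task_done()

jobs = queue.Queue()  # queue.Queue is FIFO by default
for i in range(10):
    jobs.put(f"https://example.com/api/item/{i}")  # made-up endpoint

results = []
threads = [threading.Thread(target=worker, args=(jobs, results)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

This is only a sketch of the semantics I want; the actual question is what platform provides this (queueing, distribution across nodes, and completion verification) in a production-grade way.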
Any suggestions on the best tool for the job?