Using a distributed processing system for FOREGROUND requests in PHP

I am familiar with php-resque and other job processing systems for background jobs, but I don't think they will do what I need here.

In this case, I have an incoming web service request that needs to make several (2-4) independent calls to external systems and return a consolidated response to the client. Each callout can take 300-500 ms, so I want the callouts to run in parallel so that the whole process takes no more than about 500 ms total, rather than the sum of the callouts.

My problem with php-resque and similar systems is that waiting even one second before these callouts start is already too long, so I am considering a different approach.

What I have in mind:

  • each individual callout is described and stored in the database as a task, keyed by a unique request identifier
  • the jobs are kicked off immediately as asynchronous PHP processes (i.e., "workers")
  • each worker writes its result back to its task record and marks it as completed
  • meanwhile, the original request polls the task table every 50-100 ms to check the status of each task
  • once they are all completed, we process the results as needed and return the consolidated response

Of course, there would be a timeout on each callout and on the overall process. A rough sketch of this approach follows.
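A minimal sketch of the dispatching side, assuming a hypothetical `tasks` table (columns `request_id`, `name`, `status`, `result`), a PDO connection, and a made-up `worker.php` that performs one callout and updates its row:

```php
<?php
// Hypothetical dispatcher for the plan above. Table/column names, the DSN,
// and worker.php are all assumptions for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$requestId = uniqid('req_', true);
$callouts  = ['serviceA', 'serviceB', 'serviceC'];

// 1. Record one task row per callout.
$insert = $pdo->prepare(
    "INSERT INTO tasks (request_id, name, status) VALUES (?, ?, 'pending')"
);
foreach ($callouts as $name) {
    $insert->execute([$requestId, $name]);
    // 2. Fire off an asynchronous worker for this task (non-blocking).
    exec('php worker.php ' . escapeshellarg($requestId) . ' '
        . escapeshellarg($name) . ' > /dev/null 2>&1 &');
}

// 3. Poll every ~50 ms until all tasks are done or the overall deadline hits.
$deadline = microtime(true) + 1.0;   // 1 s overall budget
$pending  = $pdo->prepare(
    "SELECT COUNT(*) FROM tasks WHERE request_id = ? AND status != 'completed'"
);
do {
    usleep(50000);
    $pending->execute([$requestId]);
    $remaining = (int) $pending->fetchColumn();
} while ($remaining > 0 && microtime(true) < $deadline);

// 4. Collect whatever results are in and build the consolidated response.
$select = $pdo->prepare("SELECT name, result FROM tasks WHERE request_id = ?");
$select->execute([$requestId]);
$results = $select->fetchAll(PDO::FETCH_KEY_PAIR);
```

The 50 ms sleep keeps the polling cheap; per-callout timeouts would be enforced inside the worker itself.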

Thoughts? Am I mistaken? Can php-resque run multiple jobs in parallel almost instantly?

2 answers

Your plan should work, but I think you could avoid the database connections and even the polling entirely by using PHP's process control functions.

  • Fork your main process once for each task you need to run in parallel. See: pcntl_fork
  • Run your tasks inside these forked child processes and let them do their work.
  • The process that started the tasks waits for them to complete by listening for their SIGCHLD signals as they exit. If they have not finished within your chosen timeout, send them a SIGTERM to clean up. See: pcntl_sigtimedwait and posix_kill. (A sketch follows this list.)
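A minimal CLI-only sketch of those three steps, assuming the pcntl and posix extensions are available; `doCallout()` and the callout names are placeholders:

```php
<?php
// Fork one child per callout, reap them via SIGCHLD, and SIGTERM stragglers.
pcntl_sigprocmask(SIG_BLOCK, [SIGCHLD]);   // queue SIGCHLD so sigtimedwait can see it

$children = [];
foreach (['serviceA', 'serviceB', 'serviceC'] as $callout) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        exit("fork failed\n");
    }
    if ($pid === 0) {
        doCallout($callout);   // child: placeholder for the external call; report the result somewhere
        exit(0);
    }
    $children[$pid] = $callout;   // parent: remember the child
}

// Wait up to 1 second, reaping children as their SIGCHLD signals arrive.
$deadline = microtime(true) + 1.0;
while ($children && microtime(true) < $deadline) {
    $info = [];
    $secs = (int) ceil($deadline - microtime(true));
    if (pcntl_sigtimedwait([SIGCHLD], $info, $secs) === SIGCHLD) {
        // Reap every child that has exited so far.
        while (($pid = pcntl_waitpid(-1, $status, WNOHANG)) > 0) {
            unset($children[$pid]);
        }
    }
}

// Anything still running is past the timeout: terminate and reap it.
foreach (array_keys($children) as $pid) {
    posix_kill($pid, SIGTERM);
    pcntl_waitpid($pid, $status);
}
```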

You will have to use these functions from a CLI PHP script, though, because:

http://www.php.net/manual/en/intro.pcntl.php

Process control should not be enabled in the web server environment and unexpected results can occur if any process control functions are used in the web server environment.

But your web server can easily exec() a CLI script of its own that does all the hard work and returns the status of those tasks, etc. A small example follows.
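For example (runner.php, its argument, and the output format are made up for illustration):

```php
<?php
// From the web request: exec a CLI runner synchronously. The runner forks the
// workers (as sketched above), waits for them, and prints the consolidated
// status, which exec() captures here.
exec('php /path/to/runner.php ' . escapeshellarg($requestId), $output, $exitCode);
$response = implode("\n", $output);   // whatever runner.php printed
```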


If your external calls are just HTTP requests, you can simply use curl's multi interface (the curl_multi_* functions) to issue several requests in parallel. That seems to do exactly what you need.
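For instance, a minimal sketch using the curl multi handle (URLs and timeouts are placeholders):

```php
<?php
// Fire several HTTP requests in parallel and collect the response bodies.
$urls = [
    'https://api.example-a.test/lookup',
    'https://api.example-b.test/lookup',
];

$mh = curl_multi_init();
$handles = [];

foreach ($urls as $key => $url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT_MS     => 700,   // per-request timeout
    ]);
    curl_multi_add_handle($mh, $ch);
    $handles[$key] = $ch;
}

// Drive all transfers until they finish (or time out individually).
do {
    $status = curl_multi_exec($mh, $active);
    if ($active) {
        curl_multi_select($mh, 0.1);   // wait for activity instead of busy-looping
    }
} while ($active && $status === CURLM_OK);

$results = [];
foreach ($handles as $key => $ch) {
    $results[$key] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```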

If your calls are something other than HTTP, I can recommend Gearman.
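A rough sketch with the pecl gearman client, assuming a Gearman job server on localhost and workers registered for a hypothetical `do_callout` function:

```php
<?php
// Submit the callouts as parallel Gearman tasks and collect results as each
// one completes. 'do_callout' and the payloads are placeholders.
$results = [];

$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->setCompleteCallback(function (GearmanTask $task) use (&$results) {
    $results[$task->unique()] = $task->data();   // data() holds the worker's result
});

$client->addTask('do_callout', json_encode(['service' => 'A']), null, 'a');
$client->addTask('do_callout', json_encode(['service' => 'B']), null, 'b');

$client->runTasks();   // blocks until all tasks finish; the callback fires per task
```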

If you want to get your hands dirty and write your own daemon, I would suggest skipping the IPC functions and reaching for something higher level like ZeroMQ, and perhaps using Supervisord to restart the PHP processes if they die. Writing long-running PHP processes is relatively hard, so you should build it with the expectation that the worker scripts will die randomly and be prepared to handle that gracefully.
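Purely as an illustration, using the pecl zmq extension: the request side PUSHes jobs to whichever long-running worker daemons Supervisord is keeping alive, and the workers PULL and process them. The endpoint, port, and message format here are all assumptions:

```php
<?php
// Request side: fan jobs out to long-running worker daemons over ZeroMQ.
$requestId = uniqid('req_', true);

$context = new ZMQContext();
$push = new ZMQSocket($context, ZMQ::SOCKET_PUSH);
$push->connect('tcp://127.0.0.1:5555');
$push->send(json_encode(['request_id' => $requestId, 'callout' => 'serviceA']));

// Worker daemon (separate, Supervisord-managed script): pull and process jobs.
// $pull = new ZMQSocket(new ZMQContext(), ZMQ::SOCKET_PULL);
// $pull->bind('tcp://*:5555');
// while (true) {
//     $job = json_decode($pull->recv(), true);
//     // ... perform the callout, report the result back via another socket or a DB row ...
// }
```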


Source: https://habr.com/ru/post/926921/

