The scenario is as follows:
A call to a given URL, including the Id of a known SearchDefinition, should create a new Search entry in the DB and return the new Search.Id.
Before returning the Id, I need to spawn a new process / kick off asynchronous execution of a second PHP script, which takes the new Search.Id and performs the search.
The UI then polls a third PHP script for the search status (the 2nd script keeps updating the Search entry in the DB as it goes).
The problem is kicking off that second PHP script asynchronously.
This will be running on a third-party host, so I have little control over permissions. I would therefore prefer to avoid setting up a cron job / similar that polls for new Search entries (and I don't really like polling if I can avoid it). I'm also not a big fan of using the web server for work that isn't serving the website, but that may be what's needed to avoid permission problems.
This seems to leave me with two options:
- Have the 1st script return the Id and close the connection, but carry on executing and actually do the search (e.g. stick script 2 on the end of script 1, but close the response at the point where they join).
- Run the second PHP script asynchronously.
I'm not sure how either of the above could be done, and the first one still feels pretty hacky. Rough sketches of what I have in mind are below.
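To make option 1 concrete, this is roughly what I'm imagining, assuming PHP-FPM for fastcgi_finish_request(); create_search() and run_search() are just placeholders for my own code:

```php
<?php
// Sketch of option 1: respond with the Id, close the connection, keep working.
// create_search() and run_search() are placeholders for my own code; the
// fastcgi_finish_request() path assumes PHP-FPM.

ignore_user_abort(true);   // keep running after the client has gone away
set_time_limit(0);         // the search may take a while

$searchId = create_search($_GET['searchDefinitionId']);  // insert the Search row

ob_start();
header('Content-Type: application/json');
echo json_encode(['searchId' => $searchId]);

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request();                      // PHP-FPM: flush the response now
} else {
    header('Connection: close');                   // best-effort fallback elsewhere
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();
}

// ...the response is already on its way; carry on with "script 2" inline.
run_search($searchId);
```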
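And a rough sketch of option 2, assuming exec() isn't disabled on the shared host and a CLI php binary is available to the web server user (search_worker.php is a placeholder name for script 2):

```php
<?php
// Sketch of option 2: launch a separate PHP process and return immediately.
// Assumes exec() is not disabled on the host and a CLI php binary is on the
// PATH for the web server user; search_worker.php is a placeholder for script 2.

$searchId = create_search($_GET['searchDefinitionId']);  // placeholder helper

$cmd = sprintf(
    'php %s %s > /dev/null 2>&1 &',                  // redirect output and
    escapeshellarg(__DIR__ . '/search_worker.php'),  // background with & so
    escapeshellarg((string) $searchId)               // exec() returns at once
);
exec($cmd);

header('Content-Type: application/json');
echo json_encode(['searchId' => $searchId]);
```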
If I have to use cURL or a similar "fake" web call (see the sketch below), I will, but I was hoping for some neat multi-threaded approach where I could just spawn a new thread, point it at the relevant function, and have permissions inherited from the caller (i.e. the web server user).
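For reference, the kind of "fake" web call I'd fall back to would be roughly this: open a socket back to our own server, send the request that kicks off script 2, and hang up without waiting for a reply. The host, port, path and function name here are placeholders:

```php
<?php
// Sketch of the "fake web call" fallback: open a socket back to our own server,
// send the request that starts script 2, and close without reading the reply.
// Host, port and path are placeholders for wherever search_worker.php lives.

function trigger_search_worker(int $searchId): void
{
    $host = 'www.example.com';                        // placeholder host
    $fp = fsockopen($host, 80, $errno, $errstr, 2.0); // 2 s connect timeout
    if ($fp === false) {
        error_log("Could not trigger search worker: $errstr ($errno)");
        return;
    }

    $path = '/search_worker.php?searchId=' . urlencode((string) $searchId);
    fwrite($fp, "GET $path HTTP/1.1\r\n");
    fwrite($fp, "Host: $host\r\n");
    fwrite($fp, "Connection: Close\r\n\r\n");
    fclose($fp);   // don't wait; script 2 would need ignore_user_abort(true)
                   // so it keeps running after we hang up
}
```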