I have an ASP.NET website that processes requests using a third-party exe. Currently my workflow is:
1. The user accesses the website from any browser and fills out a task form.
2. The website calls a self-hosted WCF service (a Windows service) that listens on a port.
3. The Windows service launches a third-party exe to process the job and returns the result to the website.
4. The website displays the result to the user.
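The step where the Windows service launches the third-party exe can be sketched roughly as below. This is an illustrative Python sketch, not the actual service code; the real exe path and arguments are unknown, so the Python interpreter stands in for the third-party tool so the snippet runs anywhere.

```python
import subprocess
import sys

def run_job(job_input: str) -> str:
    """Run the 'third-party exe' synchronously and capture its output.
    In the real service this would be the vendor's .exe; here the Python
    interpreter is used as a runnable placeholder."""
    cmd = [sys.executable, "-c",
           "import sys; print(sys.argv[1].upper())", job_input]
    proc = subprocess.run(cmd, capture_output=True, text=True, timeout=60)
    if proc.returncode != 0:
        # A crash here is exactly the failure mode the question worries about:
        # the request is lost unless the caller handles it.
        raise RuntimeError(f"job failed: {proc.stderr}")
    return proc.stdout.strip()

result = run_job("hello")
```

Note that because the call is synchronous, a crash or hang of the exe blocks or loses the in-flight request, which is part of the motivation for the queued design below.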
This website was a prototype that now needs to be made production-ready. I understand that the above architecture has several single points of failure: for example, if the machine is turned off, or if the Windows service crashes and stops listening on the port, all in-flight requests are lost. To make the architecture more reliable, I am considering the following:
1. The user accesses the website from any browser and fills out a task form.
2. The website writes the job data to a database.
3. A Windows service polls the database for new jobs every 10 seconds, picks up a job, and runs it with the third-party application. The results are written back to the database.
4. The website, which in turn polls the database, picks up the results and displays them to the user.
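The database-as-queue step can be sketched as follows. This is a minimal, hedged illustration using SQLite in Python; the table name, columns, and status values are assumptions, not taken from the original post. The key detail is claiming a job atomically so that two workers never run the same job twice.

```python
import sqlite3

# Hypothetical jobs table: queued -> running -> done.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE jobs (
    id      INTEGER PRIMARY KEY,
    payload TEXT,
    status  TEXT DEFAULT 'queued',
    result  TEXT)""")
conn.execute("INSERT INTO jobs (payload) VALUES ('task-1')")
conn.commit()

def claim_next_job(conn):
    """Atomically claim the oldest queued job, or return None."""
    row = conn.execute(
        "SELECT id, payload FROM jobs "
        "WHERE status = 'queued' ORDER BY id LIMIT 1").fetchone()
    if row is None:
        return None
    job_id, payload = row
    # The status check in the WHERE clause makes the claim atomic:
    # if another worker got there first, rowcount is 0.
    claimed = conn.execute(
        "UPDATE jobs SET status = 'running' "
        "WHERE id = ? AND status = 'queued'", (job_id,)).rowcount
    conn.commit()
    return (job_id, payload) if claimed else None

def finish_job(conn, job_id, result):
    """Write the result back so the website's poll can pick it up."""
    conn.execute("UPDATE jobs SET status = 'done', result = ? WHERE id = ?",
                 (result, job_id))
    conn.commit()

job = claim_next_job(conn)
if job:
    job_id, payload = job
    # ... the real service would launch the third-party exe here ...
    finish_job(conn, job_id, f"processed {payload}")
```

Because jobs persist in the table, a crashed worker can be restarted and any job still marked `queued` (or reset from `running`) will be retried, which is the reliability gain this design is after.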
The second architecture gives me better logging, and jobs can be retried because they remain in the queue. However, I am concerned that all this polling will not scale as the number of requests grows. Can anyone recommend a better architecture?