"Direct response with Node.js" - Sending an HTTP response using another Node.js process (different from the main process)

With Node.js servers, I am wondering whether it is possible and advisable to send an HTTP response from a delegated worker process instead of the main process. These workers could be separate Node.js servers themselves, or simply Node.js child processes that communicate with the main process via IPC.

I don't think the core cluster module https://nodejs.org/api/cluster.html can do what I want, because in that model all workers listen on the same port and each of them handles whole requests on behalf of the main process. I am looking for a single core Node.js process that accepts all HTTP requests; it authenticates them and handles some of them itself, but it is also able to delegate CPU-intensive requests to a worker pool.

Imagine that we have a GET request for a large amount of data, say, 2-3 MB.

We have at least 3 possible scenarios:

  • The main process receives the request, queries the database for the large amount of data, and then sends the data back to the requester.
  • The main process receives the request and sends a small message about it to a worker over IPC; the worker fetches the data from the database, performs some heavy processing, and then uses IPC to send all 2-3 MB of data back to the main process, which finally sends the response.
  • The main process receives the request, sends as little information about it as possible to a worker, the worker does all the work, and the worker itself sends the HTTP response.

I am particularly curious about what would make scenario 3 possible.

Here is a simple diagram of scenario 3:

[Diagram of scenario 3: the client's request reaches the main process ("MAIN"), which hands it to a worker ("W"); the worker sends the HTTP response directly back to the client.]

(To be clear, I do not want to receive three responses to one request; I am just trying to show that a worker can send the response on behalf of the main process.)

Does anyone know how this can be done with Node.js? How is it done in other languages? Usually I have no problem with the Node.js concurrency model, but for some kinds of data, using the cluster module is probably not the best way to achieve the highest level of concurrency.

I believe one term for this pattern is "direct response", i.e. the worker responds directly to the request. Perhaps the core cluster module https://nodejs.org/api/cluster.html can simply be used for this after all.
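For what it's worth, here is a rough sketch of what I imagine scenario 3 could look like, based on the fact that child_process IPC can pass net.Socket handles (subprocess.send(message, socket)). The route, file names and heavy-work function are made up, and this glosses over keep-alive, request bodies, pipelining and error handling:

```js
// main.js - main process: answers light requests itself and hands the raw
// socket of heavy requests over to a pre-forked worker.
const http = require('http');
const { fork } = require('child_process');

const worker = fork('./worker.js'); // hypothetical worker script

http.createServer((req, res) => {
  if (req.url === '/big-report') {
    // Send only a tiny description of the request plus the socket handle.
    // After this, the main process must not touch req/res again.
    worker.send({ url: req.url }, req.socket);
  } else {
    res.end('handled by the main process\n');
  }
}).listen(3000);
```

```js
// worker.js - receives the socket over IPC and answers the client directly.
process.on('message', (msg, socket) => {
  if (!socket) return; // the socket may already have been closed in transit

  const body = doHeavyWork(msg.url); // placeholder for the CPU-heavy part

  // Write a raw HTTP/1.1 response straight to the socket and close it.
  socket.end(
    'HTTP/1.1 200 OK\r\n' +
    'Content-Type: text/plain\r\n' +
    `Content-Length: ${Buffer.byteLength(body)}\r\n` +
    'Connection: close\r\n' +
    '\r\n' +
    body
  );
});

function doHeavyWork(url) {
  return `big response for ${url}\n`; // stand-in for 2-3 MB of real data
}
```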

2 answers

"I am wondering whether it is possible and advisable to send an HTTP response from a delegated worker process."

Yes, this is possible, and it is probably the easiest and most common way to scale application servers. Unlike IPC, it also works across nodes over a network. (It will work locally too if you want it to... but make sure you really are CPU-bound in your application. Although JavaScript itself is single-threaded, most I/O libraries and some npm modules use thread pools.)

There is no reason to use Node.js as a load balancer in front of the backend servers; Node.js is better suited as your application server. For something as simple as proxying HTTP requests I would use Nginx or the like. Nginx can efficiently handle all interaction with the client and is easily configured to balance the load.
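As a rough illustration (the upstream name and ports are made up; it assumes several identical Node instances are already listening on them), the load-balancing part of an Nginx config could look something like this:

```nginx
# hypothetical /etc/nginx/conf.d/app.conf
upstream node_app {
    server 127.0.0.1:3001;
    server 127.0.0.1:3002;
    server 127.0.0.1:3003;
}

server {
    listen 80;

    location / {
        proxy_pass http://node_app;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Nginx then spreads incoming requests across the Node instances, and each instance sends its response back to the client through Nginx.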


If you are trying to make use of the multiple CPU cores on your machines (Node itself runs as a single process), just use PM2:

https://www.npmjs.com/package/pm2

PM2 runs several instances of your application, on as many cores as you tell it to. If the application is stateless (which is the ideal way to use Node), an instance of your application will run on each core and PM2 will handle the routing between them.
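For example, a minimal ecosystem file might look like the sketch below ("app.js" and the app name are placeholders for your real entry point); you would start it with pm2 start ecosystem.config.js:

```js
// ecosystem.config.js - minimal sketch; "app.js" is a placeholder entry point
module.exports = {
  apps: [{
    name: 'api',
    script: './app.js',
    instances: 'max',     // one instance per available CPU core
    exec_mode: 'cluster'  // PM2 balances incoming connections across instances
  }]
};
```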

To verbally redraw the diagram you posted for scenario 3: PM2 would take the place of "MAIN", the "W" boxes would simply be instances of your application, and you would not have to worry about workers and forking yourself.

We use PM2 in production and it works well for us.

