Node.js: force only one thread to execute code

When I run my node app.js application, the process has only 1 thread. However, the longer it runs, the more threads the process accumulates. The problem is that when I want to execute code like the following:

    var io = require('socket.io')(process.env.PORT);

it fails, because the call is made from multiple threads, and therefore the code is not executed successfully.

A simple test: if you do this:

    var io = require('socket.io')(9001);
    var io = require('socket.io')(9002);
    var io = require('socket.io')(9003);
    var io = require('socket.io')(9004);

It works fine, but this code:

    var cPort = 9001;
    setInterval(function() {
        var io = require('socket.io')(cPort);
        cPort++;
    }, 1000 * 60 * 2); // 1 second * 60 * 2 = 2 minute interval

will not execute, because after 2 minutes node will have many threads and all of them will try to run the code; as a result you will see an "address in use" error.

So, even though the process running this file is multi-threaded, how can I get node to execute this code only once?

11/06/2017 EDIT ----

To clarify the problem:

To be clear, I have no problem with resources: if I start all the servers at once (for example, 40 servers), they all start successfully and run indefinitely. The problem occurs if I start only one server and then run code that starts additional servers automatically when needed. At that point I always get an "address in use" error, even though the address is clearly not in use at the moment the code runs. Currently I have to manually start more servers on weekends, when more people use the service, and fewer servers on other days of the week, so I wanted to build an automated system that starts and stops servers based on usage.

This is the server startup code:

    var cp = require('child_process'),
        servers = [],
        per_server = config.per_server,
        check_servers = function(callback) {
            for(var i = 0; i < servers.length; i++) {
                callback(i, servers[i]);
            }
        };

    this.add_server = function(port) {
        var server = {
            port: port,
            load: 0,
            process: cp.fork(__dirname + '/../server_instance.js', [], { env: { port: port } })
        };
        server.process.on('message', function(message) {
            server.load = message.load;
        });
        servers.push(server);
    };

    this.find_server = function() {
        var min = Infinity,
            port = false;
        check_servers(function(index, details) {
            if(details.load < min) {
                min = details.load;
                port = details.port;
            }
        });
        return port;
    };

Now, if I call controller.add_server() 40 times in a row, it starts 40 servers correctly, but if I do this:

    var start_port = 3185;
    setInterval(function() {
        var min = Infinity;
        check_servers(function(index, details) {
            if(details.load < min) {
                min = details.load;
            }
        });
        if(min > config.per_server) {
            controller.add_server(start_port);
            start_port++;
        }
    }, 5000);

I randomly get an "address already in use" error when creating the second, third, or fourth server.

11/07/2017 EDIT ----

As I said, I have tried libraries for scanning / finding open ports.

With them, for the first time I was able to start at least 2 servers; this is the code I used:

    setInterval(function() {
        var min = Infinity;
        check_servers(function(index, details) {
            if(details.load < min) {
                min = details.load;
            }
        });
        if(min > per_server) {
            _self.add_server();
        }
    }, 5000);

    var portfinder = require('portfinder');
    portfinder.basePort = 3185;

    this.add_server = function() {
        portfinder.getPortPromise()
            .then((port) => {
                console.log('port found', port);
                var server = {
                    port: port,
                    load: 0,
                    process: cp.fork(__dirname + '/../server_instance.js', [], { env: { port: port } })
                };
                server.process.on('message', function(message) {
                    server.load = message.load;
                });
                servers.push(server);
            })
            .catch((err) => {
                console.log('error happened');
            });
    };

After many tests, it looks like I can start 2 servers and then it randomly crashes on the third or fourth attempt. Clearly the problem is deeper than port discovery; this library only tells me what I already know. I know which ports are open, and I double-check that before the script tries to start the server, using netstat -anp | grep PORT.

So it's clear that the problem is not in finding open ports; judging by the result, it looks like node is trying to start the server several times from a single command.

EDIT ----

Adding the server_instance.js code:

    var io = require('socket.io')(process.env.port),
        connections_current = 0,
        connections_made = 0,
        connections_dropped = 0;

    io.on('connection', function(socket) {
        connections_current++;
        connections_made++;

        // ... service logic here, not relevant (like query db, send data to users etc)

        socket.on('disconnect', function() {
            connections_current--;
            connections_dropped++;
        });
    });

    setInterval(function() {
        process.send({ load: connections_current });
    }, 5000);

11/08/2017 EDIT ----

I tested many solutions to this problem and observed the following:

  • Local test on macOS, where I can create a maximum of 3,000 connections to the server. The error never happens; node has 1 process and 6 threads for the router file. With 3,000 connections I can create even 200 servers without any problems.

  • Server test on Linux Debian, where I generate 2 million connections to the server. The error always occurs on the 3rd or 4th server instance; when everyone is connected, node has 6 processes with 10 threads per process for the router file.

This is clearly the source of the problem: the more load I have, the more node processes appear, and the sooner they collide when trying to start a new server.

2 answers

You can use portfinder to discover available network ports on your system (by default it starts scanning from port 8000). Usage is as simple as:

    const http = require('http');
    const portfinder = require('portfinder');
    const pid = process.pid;

    portfinder.getPort((err, port) => {
        if (err) throw err;
        http.createServer((req, res) => {
            res.end(`Response from server ${pid}.\n`);
        }).listen(port, () => {
            console.log(`Server ${pid} running on port ${port}...`);
        });
    });



** EDIT **

It seems that portfinder returns the same port several times, which is why the EADDRINUSE error occurs. My suspicion was that the previous port is not yet being listened on when portfinder tries to find a new one (thus returning the same port again), but this seems to contradict the fact that starting multiple servers with a simple loop works fine:

    for (let i = 0; i < max_number_of_servers; ++i) {
        this.add_server();
    }


A simple fix for your code could be to increase portfinder's base port with every call to add_server:

    portfinder.basePort = 8000;

    this.add_server = function() {
        portfinder.getPortPromise()
            .then((port) => {
                portfinder.basePort += 1;
                var server = {
                    port: port,
                    load: 0,
                    process: cp.fork('server_instance.js', [], { env: { port: port } })
                };
                server.process.on('message', function(message) {
                    server.load = message.load;
                    console.log("message");
                });
                servers.push(server);
            })
            .catch((err) => {
                console.log(err);
            });
    };

This code works fine, at least on my machine.
In any case, I suggest you consider another approach. IMHO, if you find that under peak traffic you need N servers to handle all requests correctly, there is no point in starting with fewer servers and then dynamically adjusting their number based on current traffic, for several reasons:

  • Spawning a new process is an expensive operation, and it takes time to start up.
  • Under heavy traffic, your whole fleet of servers is already running and ready to serve requests without additional delay.
  • Under low/medium traffic, your servers are simply less loaded, but you gain in redundancy and availability (if for some reason one server process crashes, there are many other servers that can serve requests while you start a new server process, which takes some time).


You can use the native cluster module to easily create a distributed server application with automatic load balancing and fault tolerance. By default, the cluster module uses a round-robin algorithm to distribute incoming requests among workers, so you get load balancing for free!
A possible simple implementation (for this test I used a different port-finding library, get-port):

    // main.js
    const cluster = require('cluster');
    const getPort = require('get-port');
    const max_servers = 40;

    // master process
    if (cluster.isMaster) {
        for (let i = 0; i < max_servers; ++i) {
            getPort().then(port => {
                cluster.fork({ port: port });
            });
        }

        // detect exit event on workers
        cluster.on("exit", (worker, errCode) => {
            console.log(worker);
            // start new worker in case of crashes
            if (errCode != 0 && !worker.suicide) {
                console.log("Worker-server crashed. Starting new worker...");
                getPort().then(port => {
                    cluster.fork({ port: port });
                });
            }
        });
    }
    // worker process --> start server
    else {
        require('./server_instance.js');
    }
    // server_instance.js
    const http = require("http");
    const pid = process.pid;
    let port = process.env.port;

    console.log(`Starting server on process ${pid} running on port ${port}...`);

    let io = require('socket.io')(process.env.port),
        connections_current = 0,
        connections_made = 0,
        connections_dropped = 0;

    io.on('connection', function(socket) {
        console.log(`Socket.io on process ${pid} running on port ${port}...`);
        connections_current++;
        connections_made++;

        // ... service logic here, not relevant (like query db, send data to users etc)

        socket.on('disconnect', function() {
            connections_current--;
            connections_dropped++;
        });
    });

The best solution would be to generate the port numbers in your main process and then pass them to the worker processes, so that they can never overlap.
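A minimal sketch of that idea, reusing names from your question (server_instance.js, the starting port 3185, and the servers array are assumptions; adapt them to your controller):

    var cp = require('child_process');

    var next_port = 3185;   // assumed starting port; only the master ever touches it
    var servers = [];

    // The master picks the port before forking, so no two children
    // can ever be handed the same port, no matter how fast they start.
    function add_server() {
        var port = next_port++;
        var child = cp.fork(__dirname + '/server_instance.js', [], {
            env: { port: port }
        });
        servers.push({ port: port, load: 0, process: child });
    }

Because the counter lives only in the master and is incremented before each fork, starting several servers within the same interval tick cannot produce duplicate ports.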

You can also check whether a port is in use and obtain a free one using an npm module, for example test-port-provider.
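If you prefer not to add a dependency, here is a minimal sketch of such a check using only Node's built-in net module (this is just an illustration, not the module mentioned above; isPortFree and findFreePort are hypothetical helper names):

    const net = require('net');

    // Resolves true if the port can be bound (i.e. it is free), false otherwise.
    function isPortFree(port) {
        return new Promise((resolve) => {
            const tester = net.createServer()
                .once('error', () => resolve(false))       // e.g. EADDRINUSE
                .once('listening', () => {
                    tester.close(() => resolve(true));     // release it again right away
                })
                .listen(port);
        });
    }

    // Example: find the first free port starting from 3185.
    async function findFreePort(start) {
        let port = start;
        while (!(await isPortFree(port))) {
            port++;
        }
        return port;
    }

Note that this kind of check is inherently racy: another process can grab the port between the check and your own listen() call, which is why handing ports out from a single master process, as described above, is the more robust option.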

