I recently ran into a problem with my node.js API where memory usage grew with every request. I host the server on Heroku's free tier, which only has 512 MB of RAM. After getting a lot of traffic over the weekend, I started getting memory errors from Heroku, so I went looking for a memory leak in my code, to no avail. I wasn't holding on to any objects, everything should have been cleaned up, and frankly I was lost.
However, after doing some research, I found that node.js only starts the garbage collector when it reaches the max-old-space-size limit, and that the default is 1024 MB on 64-bit systems. I set this to 410 (80% of my available memory), but I am wondering whether I should just handle this in code instead. Obviously it would be ideal to upgrade my plan and simply have more memory, but for now that is not an option.
Example:
function manipulateData(callback) {
    apiGet(function (error, veryLargeObject) {
        var users = veryLargeObject.data.users;
        var usefulUsers = users.map(function (user) {
            // keep only the fields I actually need (illustrative fields)
            return { id: user.id, name: user.name };
        });
        callback(null, usefulUsers);
    });
}
So now I am wondering: as long as that callback is alive, does "veryLargeObject" stay in memory too, because the closure still references it (even though I only ever use users)? Or is it free to be garbage collected right away? Would it help to set veryLargeObject = null or undefined once I have pulled out the users?
My second question is about the garbage collector itself. If node only runs it when max-old-space-size is reached, does setting variables to null or undefined make any difference before that point? And does the same code then behave differently on a machine with 512 MB of RAM than on one with 8 GB?
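To make the first question concrete, this is the kind of change I am asking about, with a stub in place of the real API call (the apiGet shape and the user fields are just illustrative):

```javascript
// Stand-in for the real API call -- hypothetical shape, for illustration only.
function apiGet(cb) {
  cb(null, { data: { users: [{ id: 1, name: 'Ada', bio: '...lots of data...' }] } });
}

function manipulateData(callback) {
  apiGet(function (error, veryLargeObject) {
    var users = veryLargeObject.data.users;
    // Drop the closure's reference to the big response object as soon as
    // the needed part is extracted -- this is the change I am asking about.
    veryLargeObject = null;

    var usefulUsers = users.map(function (user) {
      return { id: user.id, name: user.name }; // keep only needed fields
    });
    callback(null, usefulUsers);
  });
}
```

Is nulling the variable like this worthwhile, or does V8 figure this out on its own?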