I assume you mean https://github.com/pgriess/node-msgpack .
Just looking at the source, I'm not sure how it could be. For example, in src/msgpack.cc they have the following:
```cpp
Buffer *bp = Buffer::New(sb._sbuf.size);
memcpy(Buffer::Data(bp), sb._sbuf.data, sb._sbuf.size);
```
In Node terms, they allocate and fill a new SlowBuffer for every request. You can measure how much of the total time the allocation alone accounts for by running the following:
```js
var msgpack = require('msgpack');
var SB = require('buffer').SlowBuffer;
var tmpl = { 'abcdef': 1, 'qqq': 13, '19': [1, 2, 3, 4] };

console.time('SlowBuffer');
for (var i = 0; i < 1e6; i++)
  // 20 is the resulting size of their "DATA_TEMPLATE"
  new SB(20);
console.timeEnd('SlowBuffer');

console.time('msgpack.pack');
for (var i = 0; i < 1e6; i++)
  msgpack.pack(tmpl);
console.timeEnd('msgpack.pack');

console.time('stringify');
for (var i = 0; i < 1e6; i++)
  JSON.stringify(tmpl);
console.timeEnd('stringify');

// result - SlowBuffer: 915ms
// result - msgpack.pack: 5144ms
// result - stringify: 1524ms
```
So, just by allocating memory for the message, they have already spent 60% of the time that JSON.stringify takes for the entire serialization. That is one reason why it is so slow.
Also note that JSON.stringify has received a lot of love from Google. It is highly optimized and will be hard to beat.
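To illustrate the allocation point, here is a minimal sketch (not from node-msgpack itself) of how the per-message allocation could be avoided by reusing one preallocated buffer as a pool. `packInto` and the pool size of 256 are assumptions for illustration; a real msgpack encoder would write binary data rather than JSON:

```javascript
// Sketch: reuse a single preallocated buffer instead of allocating
// a new SlowBuffer for every message.
const pool = Buffer.alloc(256); // reused across calls; size is an assumption

function packInto(obj) {
  // Stand-in "encoder": write the serialized form into the pooled buffer.
  // Assumes the serialized form fits in the pool.
  const json = JSON.stringify(obj);
  const len = pool.write(json, 0); // bytes written into the pool
  return pool.subarray(0, len);    // a view over the pool, no new allocation
}
```

Note the trade-off: the returned view shares the pool's backing memory, so it is only valid until the next call; the caller must copy it if it needs to outlive that.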
Trevor Norris