How to do bulk insertion using Sequelize and Node.js

I am using Node.js + Sequelize to insert 280K rows of data from JSON. The JSON is an array of 280K objects. Is there a way to do the bulk insertion in pieces? I see that it takes a long time to insert the data; when I reduced it to 40 thousand rows, it worked quickly. Am I taking the right approach? Please advise. I am using PostgreSQL as the backend.

PNs.bulkCreate(JSON_Small)
    .catch(function(err) {
        console.log('Error ' + err);
    })
    .finally(function() {
        console.log('FINISHED \n +++++++ \n');
    });
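Something like the following chunked version is what I have in mind (a sketch; the 5000 chunk size is arbitrary):

// Sketch: split the big array into chunks and bulkCreate them one after
// another, so each INSERT stays small. Chunk size is arbitrary.
var chunkSize = 5000;
var chunks = [];
for (var i = 0; i < JSON_Small.length; i += chunkSize) {
  chunks.push(JSON_Small.slice(i, i + chunkSize));
}

// Chain the inserts sequentially so only one chunk is in flight at a time.
chunks.reduce(function(promise, chunk) {
  return promise.then(function() {
    return PNs.bulkCreate(chunk);
  });
}, Promise.resolve()).then(function() {
  console.log('FINISHED');
}).catch(function(err) {
  console.log('Error ' + err);
});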
4 answers

I used the cargo utility from the async library to load up to 1000 rows at a time. See the following code for loading a CSV file into the database:

var fs = require('fs'),
    async = require('async'),
    csv = require('csv');

var input = fs.createReadStream(filename);
var parser = csv.parse({
  columns: true,
  relax: true
});
// cargo collects queued rows and hands bulkCreate up to 1000 of them per call.
var inserter = async.cargo(function(tasks, inserterCallback) {
    model.bulkCreate(tasks).then(function() {
        inserterCallback();
      }
    );
  },
  1000
);
parser.on('readable', function () {
  var line;
  while (line = parser.read()) {
    inserter.push(line);
  }
});
parser.on('end', function () {
  // Fire the completion callback once the last queued batch has been written.
  inserter.drain = function() {
    doneLoadingCallback();
  };
});
input.pipe(parser);
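Since the question has a JSON array rather than a CSV file, the same cargo pattern can be fed straight from the array. A minimal sketch, reusing the PNs model and JSON_Small array from the question:

var async = require('async');

// Same batching idea, without the CSV parser: insert up to 1000 rows per call.
var inserter = async.cargo(function(tasks, inserterCallback) {
  PNs.bulkCreate(tasks).then(function() {
    inserterCallback();
  });
}, 1000);

// Assign drain before pushing so the completion callback cannot be missed.
inserter.drain = function() {
  console.log('FINISHED');
};

// Pushing an array queues each element as an individual task.
inserter.push(JSON_Small);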

You can use Sequelize's built-in bulkCreate method:

User.bulkCreate([
  { username: 'barfooz', isAdmin: true },
  { username: 'foo', isAdmin: true },
  { username: 'bar', isAdmin: false }
]).then(() => { // Notice: There are no arguments here, as of right now you'll have to...
  return User.findAll();
}).then(users => {
  console.log(users) // ... in order to get the array of user objects
})

See the Sequelize documentation for bulkCreate.
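On PostgreSQL (which the question uses), bulkCreate also accepts a returning option so the created rows come back without a second query. A small sketch:

User.bulkCreate([
  { username: 'barfooz', isAdmin: true },
  { username: 'foo', isAdmin: true }
], { returning: true }).then(function(users) {
  console.log(users); // the inserted rows, ids included
});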


If you really want to use bulkInsert, the answer above will do. However, you will run out of memory if you have a lot of data! It is really best to use some built-in database method designed for bulk loading. The problem is that you are holding all the data in memory until the bulkCreate executes. If you have a million rows, you will probably run out of memory before it ever runs. Even if you batch it with something like async.cargo, you will still be waiting for the db to come back to you while the data asynchronously consumes all of your memory.

My solution was to ditch sequelize for loading data (at least until they implement streaming or something similar; see github issue # 2454). I ended up creating db-streamer, but it only has pg support for now. You will want to look at streamsql for mysql.
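For reference, a minimal sketch of the database-native route on Postgres, using the pg and pg-copy-streams packages instead of db-streamer (the connection string, table name, and columns are assumptions):

var fs = require('fs');
var pg = require('pg');
var copyFrom = require('pg-copy-streams').from;

// COPY ... FROM STDIN is Postgres' native bulk-load path; it streams,
// so the whole dataset never has to sit in memory at once.
var client = new pg.Client('postgres://user:pass@localhost/mydb'); // assumed
client.connect(function(err) {
  if (err) throw err;
  var stream = client.query(copyFrom('COPY pns (name, value) FROM STDIN'));
  var fileStream = fs.createReadStream('data.tsv'); // tab-separated rows
  fileStream.pipe(stream);
  stream.on('finish', function() {
    client.end();
  });
  stream.on('error', function(e) {
    console.log('Error ' + e);
    client.end();
  });
});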

