Mongoose (mongodb) batch insert?

Does Mongoose v3.6+ support batch insert? I searched for a few minutes, but everything matching this query was a couple of years old, and the answer was an unequivocal no.

Edit:

For future reference, the answer is to use Model.create(). create() accepts an array as its first argument, so you can pass your documents to be inserted as an array.

See the Model.create() documentation.

+101
mongodb mongoose
May 24 '13 at 1:14
9 answers

Model.create() vs Model.collection.insert(): the faster approach

Model.create() is a bad way to do inserts when you are dealing with very large bulks. It will be very slow. In that case you should use Model.collection.insert, which performs much better. Depending on the size of the bulk, Model.create() may even fail! I tried with a million documents with no luck; using Model.collection.insert took just a few seconds.

 Model.collection.insert(docs, options, callback)
  • docs is an array of documents to insert;
  • options is an optional configuration object (see the documentation);
  • callback(err, docs) is called after all documents have been saved or an error has occurred. On success, docs is the array of persisted documents.

As the author of Mongoose points out here, this method bypasses any validation and calls the MongoDB driver directly. It is a trade-off you have to make, because you are handling an amount of data large enough that you could not insert it into your database otherwise (remember, we are talking about hundreds of thousands of documents here).

Simple example

    var Potato = mongoose.model('Potato', PotatoSchema);
    var potatoBag = [/* a humongous amount of potato objects */];

    Potato.collection.insert(potatoBag, onInsert);

    function onInsert(err, docs) {
        if (err) {
            // TODO: handle error
        } else {
            console.info('%d potatoes were successfully stored.', docs.length);
        }
    }

Update 2019-06-22: although insert() can still be used just fine, it has been deprecated in favor of insertMany(). The parameters are exactly the same, so you can use it as a drop-in replacement, and everything should work just fine (well, the return value is slightly different, but you are probably not using it anyway).


+146
Jul 20 '14 at 6:54

Mongoose 4.4.0 now supports bulk insertion

Mongoose 4.4.0 introduces true bulk insert with the model-level .insertMany() method. It is much faster than looping over .create() or passing it an array.

Using:

    var rawDocuments = [/* ... */];

    Book.insertMany(rawDocuments)
        .then(function (mongooseDocuments) {
            /* ... */
        })
        .catch(function (err) {
            /* Error handling */
        });

or

    Book.insertMany(rawDocuments, function (err, mongooseDocuments) {
        /* Your callback function... */
    });


+105
Feb 03 '16 at 10:38

In fact, you can use Mongoose's "create" method; it can accept an array of documents. See this example:

    Candy.create({ candy: 'jelly bean' }, { candy: 'snickers' }, function (err, jellybean, snickers) {
    });

The callback function receives the inserted documents. You do not always know in advance how many items you need to insert (fixed argument length, as above), so you can iterate over them via arguments:

    var insertedDocs = [];
    for (var i = 1; i < arguments.length; ++i) {
        insertedDocs.push(arguments[i]);
    }
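The same collection step can be written without the manual index loop. A minimal sketch (collectInserted is an illustrative name, not part of Mongoose):

```javascript
// Collect every argument after the first (the err slot) into an array.
// Array.prototype.slice.call works on any array-like object,
// including the implicit `arguments` object inside a function.
function collectInserted() {
    return Array.prototype.slice.call(arguments, 1);
}

// Simulate a callback invoked as callback(err, doc1, doc2, ...):
var docs = collectInserted(null, { candy: 'jelly bean' }, { candy: 'snickers' });
// docs is [{ candy: 'jelly bean' }, { candy: 'snickers' }]
```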



Update: Best Solution

A better solution is to use Candy.collection.insert() instead of the Candy.create() used in the example above, because it is faster (create() calls Model.save() for each item, which makes it slower).

For more information, see the Mongo documentation: http://docs.mongodb.org/manual/reference/method/db.collection.insert/

(thanks arcseldon for pointing this out)

+23
Mar 29 '14 at 20:30

You can perform bulk insertion in the mongoDB shell by passing the values as an array.

 db.collection.insert([{values},{values},{values},{values}]); 
+5
Aug 08 '14 at

You can perform bulk insert with Mongoose as the top-scoring answer shows. But that example cannot work as-is; it should be:

    /* a humongous amount of potatoes */
    var potatoBag = [{ name: 'potato1' }, { name: 'potato2' }];

    var Potato = mongoose.model('Potato', PotatoSchema);
    Potato.collection.insert(potatoBag, onInsert);

    function onInsert(err, docs) {
        if (err) {
            // TODO: handle error
        } else {
            console.info('%d potatoes were successfully stored.', docs.length);
        }
    }

Do not use schema instances for the bulk insert; you should use plain objects.
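If you already have Mongoose documents in hand, they need to be converted back to plain data first; Mongoose documents expose .toObject() for this. As a minimal stand-in that runs without a database, a JSON round-trip shows the idea, since it drops methods and keeps only plain properties (toPlainObject is an illustrative name):

```javascript
// Strip methods and prototype baggage, leaving only plain data.
// Real Mongoose documents provide .toObject() for this purpose;
// the JSON round-trip here is only a stand-in to illustrate it.
function toPlainObject(doc) {
    return JSON.parse(JSON.stringify(doc));
}

var fancyDoc = {
    name: 'potato1',
    save: function () { /* a method a driver-level insert cannot use */ }
};

var plain = toPlainObject(fancyDoc);
// plain is { name: 'potato1' }; the save method is gone
```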

+4
Mar 16 '15 at 3:35

When using Mongoose, there seems to be a limit of 1000 documents per call when using

 Potato.collection.insert(potatoBag, onInsert); 

You can use:

    var bulk = Model.collection.initializeOrderedBulkOp();

    async.each(users, function (user, callback) {
        bulk.insert(user);
        callback();
    }, function (err) {
        var bulkStart = Date.now();
        bulk.execute(function (err, res) {
            if (err) console.log("gameResult.js > err", err);
            console.log("gameResult.js > BULK TIME", Date.now() - bulkStart);
            console.log("gameResult.js > BULK INSERT", res.nInserted);
        });
    });

But this is almost twice as fast when tested with 10,000 documents:

    function fastInsert(arrOfResults) {
        var startTime = Date.now();
        var count = 0;
        var c = Math.round(arrOfResults.length / 990);
        var fakeArr = [];
        fakeArr.length = c;
        var docsSaved = 0;

        async.each(fakeArr, function (item, callback) {
            var sliced = arrOfResults.slice(count, count + 999);
            count = count + 999;
            if (sliced.length != 0) {
                GameResultModel.collection.insert(sliced, function (err, docs) {
                    docsSaved += docs.ops.length;
                    callback();
                });
            } else {
                callback();
            }
        }, function (err) {
            console.log("gameResult.js > BULK INSERT AMOUNT:", arrOfResults.length,
                "docsSaved", docsSaved, "DIFF TIME:", Date.now() - startTime);
        });
    }
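The slicing above can be factored into a small pure helper so the batch size is stated once; a sketch (chunkArray is an illustrative name, not part of Mongoose or the driver):

```javascript
// Split an array into batches of at most `size` elements, e.g. to stay
// under the ~1000-document per-call limit mentioned above.
function chunkArray(arr, size) {
    var chunks = [];
    for (var i = 0; i < arr.length; i += size) {
        chunks.push(arr.slice(i, i + size));
    }
    return chunks;
}

var batches = chunkArray([1, 2, 3, 4, 5], 2);
// batches is [[1, 2], [3, 4], [5]]
```

Each batch can then be passed to Model.collection.insert (or insertMany) in turn.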
+3
Aug 2 '15 at 18:37

Here are both ways of saving data, with insertMany and with save.

1) Mongoose: save an array of documents in bulk with insertMany

    /* write mongoose schema model and export it */
    var Potato = mongoose.model('Potato', PotatoSchema);

    /* write this api in the routes directory */
    router.post('/addDocuments', function (req, res) {
        const data = [/* array of objects to save in the db */];
        Potato.insertMany(data)
            .then((result) => {
                console.log("result ", result);
                res.status(200).json({ 'success': 'new documents added!', 'data': result });
            })
            .catch(err => {
                console.error("error ", err);
                res.status(400).json({ err });
            });
    })

2) Mongoose save an array of documents using .save()

These documents will be saved in parallel.

    /* write mongoose schema model and export it */
    var Potato = mongoose.model('Potato', PotatoSchema);

    /* write this api in the routes directory */
    router.post('/addDocuments', function (req, res) {
        const saveData = [];
        const data = [/* array of objects to save in the db */];
        data.map((item) => {
            console.log(item);
            var potato = new Potato(item);
            potato.save()
                .then((result) => {
                    console.log(result);
                    saveData.push(result);
                    if (saveData.length === data.length) {
                        res.status(200).json({ 'success': 'new documents added!', 'data': saveData });
                    }
                })
                .catch((err) => {
                    console.error(err);
                    res.status(500).json({ err });
                });
        });
    })
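Counting completed saves by hand is fragile: a single failed save means the length check never fires and the request hangs. Collecting the save promises and waiting on all of them is a simpler shape. A sketch where fakeSave is a stub standing in for potato.save(), so it runs without a database:

```javascript
// Run one async save per item and resolve once all of them finish.
// `saveFn` stands in for something like (item) => new Potato(item).save().
function saveAll(items, saveFn) {
    return Promise.all(items.map(saveFn));
}

// Stub save: resolves with the item marked as saved.
function fakeSave(item) {
    return Promise.resolve({ name: item.name, saved: true });
}

saveAll([{ name: 'potato1' }, { name: 'potato2' }], fakeSave)
    .then((results) => {
        console.log('%d documents saved.', results.length);
    });
```

With Promise.all, a single rejection propagates to .catch(), so the error path is handled in one place instead of per document.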
+3
May 16 '18 at 17:36

I used the async-foreach npm package to achieve the same.

My code snippet is as follows. I get documents in req.body.

    var forEach = require('async-foreach').forEach;

    exports.save_Ctrl = function (req, res) {
        forEach(req.body, function (item, index) {
            console.log(req.body[index]);
            var post = new saveObj(req.body[index]);
            // save model to MongoDB
            post.save(function (err) {
                if (err) {
                    console.log('error saving: ' + err.message);
                    return err;
                } else {
                    console.log("Post saved");
                }
            });
        });
    }
+2
Nov 29 '16 at 6:34

Sharing working, relevant code from our project:

    // documentsArray is the list of sampleCollection objects
    sampleCollection.insertMany(documentsArray)
        .then((res) => {
            console.log("insert sampleCollection result ", res);
        })
        .catch(err => {
            console.log("bulk insert sampleCollection error ", err);
        });
+1
Dec 26 '17 at 17:01
