I have been looking for the right way to insert a large number of documents into MongoDB using Mongoose.
My current solution looks like this:
MongoClient.saveData = function (RecordModel, data, priority, SCHID, callback) {
    var dataParsed = parseDataToFitSchema(data, priority, SCHID);
    console.log("Model created. Inserting in batches.");

    RecordModel.insertMany(dataParsed)
        .then(function (mongooseDocuments) {
            console.log("Insertion was successful.");
            callback(null);
        })
        .catch(function (err) {
            callback("Error while inserting data to DB: " + err);
        });
};
But it seems to me that there are other proposed solutions, like this one: http://www.unknownerror.org/opensource/Automattic/mongoose/q/stackoverflow/16726330/mongoose-mongodb-batch-insert
It uses collection.insert. How is this different from Model.insertMany?
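For comparison, this is roughly what I understand the collection-level approach from that link to look like with the same data. It is only a sketch: the function name saveDataNative and the reuse of parseDataToFitSchema are my own assumptions. RecordModel.collection is the underlying native driver collection, so this bypasses Mongoose casting, validation and middleware:

    // Sketch of the collection-level batch insert (names are illustrative only).
    MongoClient.saveDataNative = function (RecordModel, data, priority, SCHID, callback) {
        var dataParsed = parseDataToFitSchema(data, priority, SCHID);
        // Goes straight through the native MongoDB driver, no Mongoose hooks.
        RecordModel.collection.insert(dataParsed, function (err, result) {
            if (err) {
                return callback("Error while inserting data to DB: " + err);
            }
            console.log("Inserted " + result.insertedCount + " documents via the driver.");
            callback(null);
        });
    };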
The same goes for updating. My previous question, What is the correct approach to updating many records in MongoDB using Mongoose, asks how to update a large chunk of data, identified by _id, using Mongoose. The answer there is to use collection.bulkWrite, although I find Model.insertMany appealing as well.
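For context, this is roughly how I understand the bulkWrite-based update from that answer. Again just a sketch: the updateData name and the { _id, fieldsToSet } shape of the updates array are assumptions for illustration:

    // Sketch of a bulk update by _id using the native driver's bulkWrite.
    // `updates` is assumed to be an array of { _id, fieldsToSet } objects.
    MongoClient.updateData = function (RecordModel, updates, callback) {
        var ops = updates.map(function (u) {
            return {
                updateOne: {
                    filter: { _id: u._id },
                    update: { $set: u.fieldsToSet }
                }
            };
        });
        RecordModel.collection.bulkWrite(ops, { ordered: false }, function (err, result) {
            if (err) {
                return callback("Error while bulk updating: " + err);
            }
            console.log("Modified " + result.modifiedCount + " documents.");
            callback(null);
        });
    };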
Ondrej Tokar, Aug 04 '16 at 8:23