Here is a solution that avoids pushing all previous documents into an array and processing them afterwards. (If that array grows too large, you can exceed the maximum BSON document size of 16 MB.)
Calculating the subtotals is also simple:
db.collection1.aggregate( [
  { $lookup: {
      from: 'collection1',
      let: { date_to: '$time' },
      pipeline: [
        { $match: { $expr: { $lt: [ '$time', '$$date_to' ] } } },
        { $group: { _id: null, summary: { $sum: '$value' } } }
      ],
      as: 'sum_prev_days'
  } },
  { $addFields: { sum_prev_days: { $arrayElemAt: [ '$sum_prev_days', 0 ] } } },
  { $addFields: { running_total: { $sum: [ '$value', '$sum_prev_days.summary' ] } } },
  { $project: { sum_prev_days: 0 } }
] )
What we did: in the $lookup we selected all documents with an earlier date and time and immediately calculated their sum (using $group as the second stage of the lookup pipeline). $lookup places that result into the first element of the array. We pull out the first element of the array and then calculate the sum: the current value plus the sum of the previous values.
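The logic of that pipeline can be sketched in plain JavaScript. The sample documents below are assumptions (the time field is assumed to be an ISO date-time string), chosen just to make the steps concrete:

```javascript
// Hypothetical sample documents standing in for collection1.
const docs = [
  { time: '2013-10-10T01:00:00', value: 3 },
  { time: '2013-10-11T01:00:00', value: 4 },
  { time: '2013-10-11T02:00:00', value: 3 },
  { time: '2013-10-12T01:00:00', value: 5 },
];

// For each document: running_total = value + sum of values with an earlier time,
// mirroring the $lookup + $match + $group + $arrayElemAt steps.
const withTotals = docs.map(d => {
  const sumPrev = docs
    .filter(p => p.time < d.time)          // $match: { $lt: ['$time', '$$date_to'] }
    .reduce((s, p) => s + p.value, 0);     // $group: { summary: { $sum: '$value' } }
  return { ...d, running_total: d.value + sumPrev }; // $sum: ['$value', '$sum_prev_days.summary']
});

console.log(withTotals.map(d => d.running_total)); // [ 3, 7, 10, 15 ]
```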
If you want to group the transactions by day before calculating the subtotals, insert a $group stage at the beginning of the pipeline and also inside the $lookup pipeline:
db.collection1.aggregate( [
  { $group: { _id: { $substrBytes: ['$time', 0, 10] }, value: { $sum: '$value' } } },
  { $lookup: {
      from: 'collection1',
      let: { date_to: '$_id' },
      pipeline: [
        { $group: { _id: { $substrBytes: ['$time', 0, 10] }, value: { $sum: '$value' } } },
        { $match: { $expr: { $lt: [ '$_id', '$$date_to' ] } } },
        { $group: { _id: null, summary: { $sum: '$value' } } }
      ],
      as: 'sum_prev_days'
  } },
  { $addFields: { sum_prev_days: { $arrayElemAt: [ '$sum_prev_days', 0 ] } } },
  { $addFields: { running_total: { $sum: [ '$value', '$sum_prev_days.summary' ] } } },
  { $project: { sum_prev_days: 0 } }
] )
Result:
{ "_id" : "2013-10-10", "value" : 3, "running_total" : 3 }
{ "_id" : "2013-10-11", "value" : 7, "running_total" : 10 }
{ "_id" : "2013-10-12", "value" : 5, "running_total" : 15 }
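The daily grouping can be checked the same way in plain JavaScript. The raw documents here are assumptions chosen to reproduce the totals above:

```javascript
// Hypothetical raw documents; day totals come out as 3, 7, and 5.
const docs = [
  { time: '2013-10-10T01:00:00', value: 3 },
  { time: '2013-10-11T01:00:00', value: 4 },
  { time: '2013-10-11T02:00:00', value: 3 },
  { time: '2013-10-12T01:00:00', value: 5 },
];

// Group by day: first 10 characters of the time string,
// just like $substrBytes: ['$time', 0, 10].
const byDay = {};
for (const d of docs) {
  const day = d.time.slice(0, 10);
  byDay[day] = (byDay[day] || 0) + d.value;   // $group + $sum per day
}

// Accumulate the running total across the sorted days,
// mirroring value + sum of all previous days.
let total = 0;
const result = Object.keys(byDay).sort().map(day => {
  total += byDay[day];
  return { _id: day, value: byDay[day], running_total: total };
});

console.log(result);
// [ { _id: '2013-10-10', value: 3, running_total: 3 },
//   { _id: '2013-10-11', value: 7, running_total: 10 },
//   { _id: '2013-10-12', value: 5, running_total: 15 } ]
```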