Handling large volumes of denormalized read model updates in CQRS

I am developing a system based on CQRS events (not the first), where my read models are denormalized and stored in a database of documents optimized for reading (MongoDb). Nothing special. Now this particular reading model is a document containing a user ID and a potentially large array of groups of which the user is a member:

{
  "userId": 1,
  "userName": "aaron",
  "groups": [
    {
      "groupId": 1,
      "name": "group 1"
    },
    {
      "groupId": 2,
      "name": "group 2"
    }
  ]
}

There can be 10 thousand users who are members of the same group (as one example: imagine a group of which each employee is a member).

, CQRS, , (, , -, ), . , CQRS event-sourcing, .

, - (, , ), . , 10 .

, , , .

SO, , , , . , . ( - , , , ) , - , . , , , ( , ), .

- , , , " , , , ". , , , , , , , , , , , , . read .

( , ):

  • . , , , (- ), . , , . ,

  • . , . , * tablename.

:

? , ?

, CQRS, , , , , , , .

+4
1

, , 10 , . , . , , , . .

+3

All Articles