Domain objects containing a lot of data

Our domain must deal with large volumes (possibly more than 1000 records) of objects as domain concepts. This is largely historical data that the domain business logic needs to use. Typically this kind of processing would be delegated to a stored procedure or some other service, but since it is entirely domain logic, and we want to preserve the purity of the model, we would like to find a solution that allows the aggregate to manage all the business logic and rules necessary for working with the data.

Essentially, we are talking about past transaction data. Our idea was to create a lightweight class and instantiate one object for each transaction we need to work with from the database. We are wary of this because of the sheer number of objects we would create and the potential performance impact, but we are also reluctant to push this domain logic into a stored procedure, since that would break the consistency of our model.

Any ideas on how we can do this?

+4
4 answers

"1000" is actually not so much when it comes to simple objects. I know that a given thread in the system in which I work can store up to tens of thousands of domain objects at a given time, while other threads do the same at the same time. By the time you look at all the different things going on in a rather complicated application, 1000 objects is a kind of fall in a bucket.

YMMV, depending on what these objects hold, on system load, on strict performance requirements, or on any number of other factors, but if, as you say, these are just "lightweight" objects, I would make sure you actually have a performance problem on your hands before trying anything too fancy.

+1

Lazy loading is one way to mitigate this problem, and most popular Object-Relational Mapping (ORM) solutions support it. It has its detractors (see, for example, this answer to Lazy loading - what's the best approach?), but others find lazy loading indispensable.

Pros

  • Can reduce the memory footprint of your aggregates to a manageable level.
  • Lets your ORM infrastructure manage your unit of work for you.
  • In cases where you do not need much of the child data, it can be faster than fully materializing ("hydrating") your aggregate root.

Cons

  • Chattier than materializing your aggregates up front: you make many small round trips to the database.
  • Typically requires architectural changes to your domain entity classes, which could compromise your design. (For example, NHibernate only requires that you expose a default constructor and make members virtual for lazy loading to work, but I have seen other solutions that are much more intrusive.)

Alternatively, another approach would be to create several classes to represent each entity. These classes would essentially be partial aggregates tailored to specific use cases. The main disadvantage is that you risk inflating the number of classes and spreading business logic across them.
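A small sketch of the partial-aggregate idea (all class and field names here are hypothetical): rather than one aggregate that hydrates the full transaction history, a single use case gets its own slimmer model carrying only the data and rules it needs:

```python
from dataclasses import dataclass
from decimal import Decimal

@dataclass(frozen=True)
class MonthlyStatementLine:
    description: str
    amount: Decimal

class MonthlyStatement:
    """Partial aggregate serving only the 'render a statement' use case."""

    def __init__(self, lines):
        self.lines = list(lines)

    def balance(self) -> Decimal:
        # The one business rule this use case needs lives right here.
        return sum((line.amount for line in self.lines), Decimal("0"))

stmt = MonthlyStatement([
    MonthlyStatementLine("coffee", Decimal("-3.50")),
    MonthlyStatementLine("salary", Decimal("2000.00")),
])
print(stmt.balance())  # prints: 1996.50
```

The trade-off is exactly the one named above: each new use case tends to add another class, so the design pays in class count for what it saves in memory per request.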

0

When you say 1000 records, do you mean 1000 tables or 1000 rows? How much data would be loaded into memory?

0

It all depends on the memory footprint of your objects. Lazy loading can really help if the objects in question reference other objects that are of no interest to your process.

If it turns out to be a memory hog, you should ask yourself (or perhaps your client) whether the process really has to run synchronously, or whether it could be offloaded to a batch process somewhere else.

Related: Using DDD, how to do batch processing?

0
