Your two questions:
- How do I cache updates?
- How can I commit updates to related tables at the same time?
Cached updates can be implemented in several different ways. Which one is best depends on your specific situation:
ADO Batch Updates
Since you have already said you are using ADO for data access, this is a natural option. You just need to set LockType to ltBatchOptimistic and CursorType to ctKeyset or ctStatic before opening the dataset, then call TCustomADODataSet.UpdateBatch when you are ready to commit.
Note: the underlying OLE DB provider must support batch updates for this to work. The provider for SQL Server fully supports it.
I don't know of any way to preserve the master/detail relationship when saving other than calling UpdateBatch on both datasets in sequence:
    Parent.UpdateBatch;
    Child.UpdateBatch;
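A minimal sketch of the full setup, assuming two TADOQuery components and a TADOConnection (the names qryMaster, qryDetail, and ADOConnection1 are placeholders). Wrapping the two UpdateBatch calls in a connection-level transaction makes the commit all-or-nothing:

    uses ADODB;

    procedure TOrderForm.OpenForBatchEditing;
    begin
      // Must be set before Open: edits are cached client-side instead of
      // being posted to the server one row at a time.
      qryMaster.LockType := ltBatchOptimistic;
      qryMaster.CursorType := ctStatic;   // ctKeyset works as well
      qryDetail.LockType := ltBatchOptimistic;
      qryDetail.CursorType := ctStatic;
      qryMaster.Open;
      qryDetail.Open;
    end;

    procedure TOrderForm.SaveChanges;
    begin
      ADOConnection1.BeginTrans;
      try
        qryMaster.UpdateBatch;   // parent rows first, so detail FKs resolve
        qryDetail.UpdateBatch;
        ADOConnection1.CommitTrans;
      except
        ADOConnection1.RollbackTrans;
        raise;
      end;
    end;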
Client Datasets
Caching data is one of the main reasons TClientDataSet exists, and keeping the master/detail relationship synchronized is not difficult.
To do this, you define the master/detail relationship on the two dataset components as usual (in your case TADOQuery or TADOTable). Then create a single provider and connect it to the master dataset. Connect a single TClientDataSet to the provider and you're done. TClientDataSet exposes the detail dataset as a nested dataset field, which can be accessed and bound to data controls just like any other dataset.
After that, you simply call TClientDataSet.ApplyUpdates, and the client dataset takes care of resolving the updates for both the master and detail data in the proper order.
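Expressed in code, the wiring might look like this (a sketch; component names like qryMaster, dsMaster, prvOrders, and cdsOrders are placeholders, and in practice most of these properties would be set at design time):

    uses DB, ADODB, Provider, DBClient;

    procedure TOrderForm.WireUpClientDataset;
    begin
      // Ordinary ADO master/detail link: the detail query is parameterized
      // and driven by the master's current record.
      qryDetail.SQL.Text := 'SELECT * FROM Items WHERE OrderId = :OrderId';
      dsMaster.DataSet := qryMaster;
      qryDetail.DataSource := dsMaster;

      // One provider on the master; the detail travels along as nested data.
      prvOrders.DataSet := qryMaster;
      cdsOrders.ProviderName := 'prvOrders';
      cdsOrders.Open;

      // The detail rows surface as a nested dataset field, named after the
      // detail dataset; a second client dataset can expose it to controls.
      cdsItems.DataSetField :=
        cdsOrders.FieldByName('qryDetail') as TDataSetField;
      cdsItems.Open;

      // A single call resolves master and detail changes in the right order
      // (the argument is MaxErrors; 0 aborts on the first failure).
      cdsOrders.ApplyUpdates(0);
    end;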
ORMs
There is a lot to say about ORMs. Too much to fit in a StackOverflow answer, so I will try to be brief.
ORMs have gotten a bad rap lately. Some have gone so far as to label them an anti-pattern. Personally, I think that is a little unfair. Object-relational mapping is an incredibly difficult problem to solve correctly. ORMs try to help by shouldering most of the complexity involved in moving data between relational tables and object instances. But, as with everything else in software development, there are no silver bullets, and ORMs are no exception.
For a simple data entry application without many business rules, an ORM is probably overkill. But as an application grows more complex, an ORM starts to look more attractive.
In most cases you will want to use a third-party ORM rather than rolling your own. Writing a custom ORM that fits your requirements perfectly sounds like a good idea, and it is easy to get started with simple mappings, but you will soon run into problems like parent/child relationships, inheritance, caching, and cache invalidation (trust me, I know this from experience). Third-party ORMs have already hit these problems and have spent enormous resources solving them.
With many ORMs you trade code complexity for configuration complexity. Most of them are actively working to reduce boilerplate configuration by relying on conventions. For example, if you name all your primary keys Id, then instead of mapping each table's Id column to the corresponding Id property of each class, you simply tell the ORM about the convention once, and it assumes all tables and classes follow it. You then only override the convention for the specific cases where it does not apply. I am not familiar with all the ORMs for Delphi, so I cannot say which ones support this and which do not.
In any case, you will want to design your application's architecture so that you can defer the decision about which ORM framework (or any framework, for that matter) to use for as long as possible.
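One common way to keep that option open is to hide persistence behind a small interface that the rest of the application codes against. A hypothetical sketch (TOrder and IOrderRepository are made-up names, not part of any framework):

    type
      TOrder = class
      public
        Id: Integer;
        CustomerName: string;
        // plain object; no persistence code in here
      end;

      IOrderRepository = interface
        ['{5D9C2A7E-3F41-4C06-8B2A-91E4D7C30F58}']
        function GetById(Id: Integer): TOrder;
        procedure Save(Order: TOrder);
      end;

    // The UI and business logic depend only on IOrderRepository.
    // Swapping the data layer later means writing one new implementation:
    //   TAdoOrderRepository - hand-written SQL against TADOQuery
    //   TOrmOrderRepository - delegates to whichever ORM you eventually pick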