We have an object model used by three applications. Two of the programs collect data; the third reads it and generates reports. The system is highly disconnected, so we cannot have a single database that all of the programs talk to.
Currently, the programs simply use a shared library to populate the object model and serialize/deserialize it to disk; specifically, we use XML serialization.
There are a couple of problems with this approach. 1) XML can be wasteful; the files can become large and bulky. Honestly, file size is not a big concern right now. 2) My bigger problem is the memory footprint: the entire file is loaded into the object model, manipulated, and then saved back out.
Hopefully that explains my concern: at some point we will run into memory problems at runtime, once enough data has been collected that a single "database" (XML file) can no longer be loaded into memory all at once.
I would like my object model to be backed by the file on disk rather than held entirely in memory, with minimal changes to the object model itself. When an object is accessed, it should come from the disk; when it is set, it should be saved back (automatically, if possible).
We have looked at NHibernate with SQLite, SQL Server Compact 4.0, EF 4, and (briefly) LINQ to XML. I have also used db4o in the past to cache objects to disk, but that was on an unrelated project.
Before diving in and devoting time to learning one of these, I would like to know whether my idea even makes sense. Can I have an object model that is "magically" backed by a storage medium, rather than one that inflates my memory footprint indefinitely? What is the shortest path to this, even if it is not the most elegant?
Are there other technologies that could help: memory-mapped files, LINQ to SQL, Lazy&lt;T&gt; (to pull objects from the file only when needed)?
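For the Lazy&lt;T&gt; option, the semantics I am after are: the loader runs only on first access, and the result is memoized after that. A small sketch of that behavior (again in Python for illustration; `LazyRecord` and `loader` are names I made up, not a real API):

```python
class LazyRecord:
    """Lazy<T>-style deferral: `loader` is any zero-argument callable,
    e.g. one that deserializes a single record from disk."""

    _UNSET = object()  # sentinel so None is a valid loaded value

    def __init__(self, loader):
        self._loader = loader
        self._value = LazyRecord._UNSET

    @property
    def value(self):
        if self._value is LazyRecord._UNSET:
            # First access triggers the load; later accesses reuse it.
            self._value = self._loader()
        return self._value
```

Note this only defers the *read* side; unlike an ORM, it does nothing about writing changes back, so it would address load-time memory but not persistence.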
I understand this is an open-ended question. I am looking for a good big-picture answer, with details if someone has real-world experience. Links would be helpful...
Thanks.