I inherited a project in which the application's data model is an XML document. The developers before me created an object model based on the XML schema and then coded against that object model.
After several years in service, the application has gradually begun to show its age. The team lead says the key reason is "slow" XML serialization. I am tempted to call BS on this, but many of the XML files we deal with are larger than 2 MB, and considering what happens behind the scenes with objects marked [Serializable], 2 MB is a lot to churn through, so there may be some truth to the slowness theory.
In your experience, is serialization really so slow/bad that it is worth choosing an XML → XPath model over an XML → POCO model?
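For readers unfamiliar with the distinction, the two access styles can be sketched as follows. This is a hypothetical illustration in Python (not the project's .NET code, and the element names are invented): the "POCO" style eagerly maps the whole document into an object graph up front, roughly analogous to what XmlSerializer does with a [Serializable] type, while the "XPath" style keeps the raw tree and queries only the values it needs on demand.

```python
import xml.etree.ElementTree as ET

# Hypothetical document; the real files in question are > 2 MB.
XML = """<orders>
  <order id="1"><total>10.50</total></order>
  <order id="2"><total>99.00</total></order>
</orders>"""

# XML -> POCO style: build the entire object model before any lookup.
class Order:
    def __init__(self, elem):
        self.id = elem.get("id")
        self.total = float(elem.findtext("total"))

root = ET.fromstring(XML)
orders = [Order(e) for e in root.findall("order")]  # whole graph built up front
print(orders[1].total)  # 99.0

# XML -> XPath style: no object model; a targeted query pulls one value.
total = float(root.findtext("./order[@id='2']/total"))
print(total)  # 99.0
```

The tradeoff the question is really about: the POCO approach pays the full mapping cost on every load even if only a few fields are read, while the XPath approach defers work to each query, which can win when documents are large and access is sparse.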
By the way, this is a .NET 2.0 project, and our clients can upgrade to .NET 3.5 at the end of next year.
design-patterns serialization xml-serialization
Saint domino