Several colleagues and I have run into an architectural decision with serious performance implications: our product includes a UI-based schema builder that lets non-programmers create their own data types for a web application. It currently builds properly normalized schemas behind the scenes, and it includes some sophisticated logic to alter the schema and migrate legacy data automatically when administrators change a data type.
The normalized schemas have hit performance bottlenecks in the past, and a major refactoring is planned. One of the development teams wants to store each property of the data types in its own table, so that changing a data type never requires a schema change. (For example, a property can be converted to a 1:n relationship by changing only the application logic.)
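To make the proposal concrete, here is a minimal sketch of that design in Python with sqlite3. All table and column names (`entities`, `prop_values`, etc.) are my own illustration, not the product's actual schema; the point is that every property lives as rows in one generic table, so a scalar property becomes 1:n with no `ALTER TABLE` at all:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE entities (entity_id INTEGER PRIMARY KEY, type_name TEXT);
    -- One generic table holds every property value, so adding, renaming,
    -- or restructuring a data type never alters the physical schema.
    CREATE TABLE prop_values (entity_id INTEGER, prop_name TEXT, value TEXT);
""")

con.execute("INSERT INTO entities VALUES (1, 'Customer')")
con.execute("INSERT INTO prop_values VALUES (1, 'name', 'Ada')")
# A property becomes 1:n simply by inserting more rows -- only the
# application logic that reads it needs to change:
con.execute("INSERT INTO prop_values VALUES (1, 'phone', '555-0100')")
con.execute("INSERT INTO prop_values VALUES (1, 'phone', '555-0199')")

phones = [r[0] for r in con.execute(
    "SELECT value FROM prop_values "
    "WHERE entity_id = 1 AND prop_name = 'phone' ORDER BY rowid")]
print(phones)  # -> ['555-0100', '555-0199']
```

The flexibility is real, but so is the cost: reading one logical record now means one row per property instead of one row per record, which is exactly where the benchmarks below start to hurt.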
Because early benchmarking indicated this would cause a huge performance hit, they built a caching layer in the application code that maintains denormalized versions of each data type. This does speed up queries, but I am skeptical of the complexity it pushes into the application layer, and I would like feedback: am I being pessimistic? Are other developers confident in this kind of design? Should I stick to my guns, or is shifting complexity from schema-modification tools to mirroring/caching tools a good thing?
[Answer text garbled beyond recovery. The surviving fragments indicate the answerer had worked with a system built this way: every record is identified by an ObjectTypeCode and an ObjectId, attribute data lives in generic ObjectAttributes and ObjectProperties tables, and much of the logic ended up in javascript.]
As the old SO saying goes: normalize until it hurts, de-normalize until it works.