The short answer is that NTFS journals its metadata operations, which is what makes the metadata reliable. Changes to the file contents themselves are not journaled, so they carry no such guarantee.
There are file systems that do log every write (AIX has one, if memory serves), but with them you usually trade disk space for write speed. IOW, you need plenty of free space to get decent performance: they basically append every write to free space and link the new data into the file at the appropriate points. Later they go back and collect the garbage (i.e., free the blocks that have since been overwritten, and at the same time coalesce the pieces of each file). That can slow things down if it has to happen too often, though.
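To make that trade-off concrete, here is a deliberately toy sketch in Python of the log-structured idea (the class and method names are made up for illustration and don't correspond to any real file system): writes are appended to free space, an index maps file offsets to log positions, and a compaction pass reclaims superseded records.

```python
class LogStructuredFile:
    def __init__(self):
        self.log = []    # append-only list of (offset, data) records
        self.index = {}  # file offset -> position of the live record in the log

    def write(self, offset, data):
        # Never overwrite in place: append a new record and repoint the index.
        self.log.append((offset, data))
        self.index[offset] = len(self.log) - 1

    def read(self, offset):
        pos = self.index.get(offset)
        return None if pos is None else self.log[pos][1]

    def compact(self):
        # Garbage collection: keep only the live records, dropping every
        # record that has since been superseded by a newer write.
        live = [(off, self.log[pos][1]) for off, pos in sorted(self.index.items())]
        self.log = list(live)
        self.index = {off: i for i, (off, _) in enumerate(self.log)}

f = LogStructuredFile()
f.write(0, b"old")
f.write(0, b"new")  # supersedes the first record but leaves it in the log
f.compact()         # reclaims the space held by the stale record
assert f.read(0) == b"new"
```

Notice that until `compact()` runs, the stale record still occupies space, which is exactly why this scheme needs plenty of free space and slows down when compaction has to run frequently.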