DumpSave stores the values associated with a symbol, i.e. its OwnValues, DownValues, UpValues, SubValues, NValues, and FormatValues.
The entire evaluation was done in a Mathematica session, and then DumpSave saved the result.
These values are stored in an internal format. Reading an MX file merely creates the symbols and fills them with these values, reading the internal format back and bypassing the evaluator.
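For instance, here is a minimal sketch (the file path and the definition of f are illustrative) showing that a definition survives a DumpSave/Get round trip as a stored DownValue, with no re-evaluation on load:

f[x_] := x^2 + 1;
DownValues[f]   (* {HoldPattern[f[x_]] :> x^2 + 1} *)

file = FileNameJoin[{$TemporaryDirectory, "f.mx"}];  (* illustrative path *)
DumpSave[file, f];
Remove[f]

Get[file]       (* reads the stored internal format back; the evaluator is not invoked *)
DownValues[f]   (* the DownValue is restored exactly as it was stored *)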
Perhaps you could share the problem that prompted you to ask this question.
[EDIT] Clarification of the issue raised by Alexei. MX files retain the internal representation of symbol definitions, and Mathematica appears to track internally whether a definition was made with Set or SetDelayed:
f[x_Real] := x^2 + 1   (* definition via SetDelayed *)
DumpSave[FileNameJoin[{$HomeDirectory, "Desktop", "set_delayed.mx"}], f];
Remove[f]

f[x_Real] = x^2 + 1;   (* definition via Set *)
DumpSave[FileNameJoin[{$HomeDirectory, "Desktop", "set.mx"}], f];

setBytes = Import[FileNameJoin[{$HomeDirectory, "Desktop", "set.mx"}], "Byte"];
setDelayedBytes = Import[FileNameJoin[{$HomeDirectory, "Desktop", "set_delayed.mx"}], "Byte"];
You can use SequenceAlignment[setBytes, setDelayedBytes] to see the difference. I do not know why this is done, but my point stands: all evaluation of values constructed using Set was already done in the Mathematica session before they were saved with DumpSave. When an MX file is read, the internal representation is read back into the Mathematica session, and no evaluation of the loaded definitions actually takes place.
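To see why the Set/SetDelayed distinction matters for evaluation timing, here is a small sketch (the function names g and h are illustrative, not from the original):

g[x_] = RandomReal[];    (* Set: RHS evaluated once, at definition time *)
h[x_] := RandomReal[];   (* SetDelayed: RHS held, evaluated afresh on each call *)

{g[1], g[2]}   (* both calls return the same stored number *)
{h[1], h[2]}   (* the two calls generally return different numbers *)

A definition made with Set therefore carries an already-evaluated right-hand side into the MX file, which is exactly what DumpSave preserves.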
Sasha