Surviving the Release Version gives a good overview.
Things I've encountered - most of them have already been mentioned:
Variable initialization is by far the most common. In Visual Studio, debug builds explicitly initialize allocated memory to known fill values (see e.g. "Memory Values" here). These values are usually easy to spot: used as an index they cause an out-of-bounds error, used as a pointer an access violation. An uninitialized boolean, however, reads as true and can let uninitialized-memory bugs go undetected for years.
In a release build, where memory is not explicitly initialized, it simply keeps whatever contents were there before. This leads to "funny values" and "random" crashes, but just as often to deterministic crashes that require an apparently unrelated command to run before the command that actually crashes. This is caused by the first command "setting up" a memory location with particular values; when that memory is recycled, the second command sees those values as its initialization. This is more common with uninitialized stack variables than with heap memory, but the latter has happened to me, too.
The initial contents of memory can also differ in a release build depending on whether you launch from Visual Studio (with a debugger attached) or from Explorer. This makes for the "nicest" kind of release-build bugs that never show up under a debugger.
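A minimal sketch of the uninitialized-boolean case above (all names are invented for illustration). In a VS debug build the 0xCC stack fill pattern makes the forgotten flag read as true; in a release build it holds whatever the stack happened to contain. A default member initializer makes the value the same in every build:

```cpp
#include <cassert>

// Bug: the constructor forgets to set `strict`.
// Debug build: stack fill 0xCC reads as true. Release build: garbage.
struct Parser {
    bool strict;
    Parser() {}  // `strict` left uninitialized
};

// Fix: a default member initializer removes the build-dependence.
struct FixedParser {
    bool strict = false;
};

bool fixed_parser_strict() {
    FixedParser p;
    return p.strict;  // false in every build configuration
}
```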
Optimization comes second in my experience. The C++ standard allows many optimizations that may be surprising but are entirely legitimate, e.g. when two pointers alias the same memory location, when initialization order is not taken into account, or when multiple threads modify the same memory locations and you expect a certain order in which thread B sees the changes made by thread A. The compiler is often blamed for these. Not so fast, young yedi! - see below.
Timing. Release builds aren't just "faster" - for a variety of reasons (optimization, logging functions providing a thread synchronization point, debug code such as asserts not being executed, etc.), the relative timing between operations also changes dramatically. The most common problems uncovered by this are race conditions, but also deadlocks and simple "different order" execution of message/timer/event-driven code. Even though these are timing problems, they can be surprisingly stable across builds and platforms, with reproductions that "work always, except on PC 23".
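A sketch of the simplest race this uncovers (names invented): two threads bump a shared counter. With a plain int the increments can interleave and lose updates once release timing removes the accidental serialization; std::atomic makes the count exact regardless of build:

```cpp
#include <atomic>
#include <cassert>
#include <thread>

// With a plain `int counter`, `++counter` is a read-modify-write that two
// threads can interleave, losing updates. fetch_add is a single atomic step.
std::atomic<int> counter{0};

void worker(int n) {
    for (int i = 0; i < n; ++i)
        counter.fetch_add(1, std::memory_order_relaxed);
}

int count_with_two_threads(int n) {
    counter = 0;
    std::thread a(worker, n);
    std::thread b(worker, n);
    a.join();
    b.join();
    return counter.load();
}
```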
Guard bytes. Debug builds often put (more) guard bytes around selected instances and allocations, to protect against index overflows (and sometimes underflows). In the rare cases where the code relies on offsets or sizes, e.g. when serializing raw structs, they differ between builds.
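A sketch of the raw-struct serialization hazard (the record type is invented). A memcpy of sizeof(Record) also writes padding bytes - which hold debug fill patterns in one build and stack leftovers in another - so the byte stream differs between builds. Serializing field by field, with an explicit byte order, produces identical bytes everywhere:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// 1-byte tag followed by 3 padding bytes, then a 4-byte value:
// memcpy(&out, &r, sizeof(Record)) would also emit the padding.
struct Record {
    uint8_t  tag;
    uint32_t value;
};

// Field-by-field, little-endian: 5 meaningful bytes, no padding, no
// build-dependent content.
std::vector<uint8_t> serialize(const Record& r) {
    std::vector<uint8_t> out;
    out.push_back(r.tag);
    for (int i = 0; i < 4; ++i)
        out.push_back(static_cast<uint8_t>((r.value >> (8 * i)) & 0xFF));
    return out;
}
```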
Other code differences. Some statements - e.g. asserts - evaluate to nothing in release builds. Sometimes they have different side effects. This is prevalent with macro trickery, as in the classic (warning: multiple errors):
#ifdef DEBUG
  #define Log(x) cout << #x << x << "\n";
#else
  #define Log(x)
#endif

if (foo)
  Log(x)
if (bar)
  Run();
Which, in the release build, evaluates to if (foo && bar) Run(); This type of error is very rare with normal C/C++ code and correctly written macros.
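The conventional fix is to make both expansions a single statement that consumes the trailing semicolon, typically with the do-while(0) idiom - a sketch, with the surrounding code invented for the test:

```cpp
#include <cassert>
#include <iostream>

// Both branches expand to exactly one statement, so the macro behaves
// identically inside an if/else in debug and release builds.
#ifdef DEBUG
  #define Log(x) do { std::cout << #x << " = " << (x) << "\n"; } while (0)
#else
  #define Log(x) do { (void)0; } while (0)
#endif

int run(bool foo, bool bar) {
    int ran = 0;
    if (foo)
        Log(foo);   // a single statement: the dangling-if trap is gone
    if (bar)
        ran = 1;    // executes whenever bar is true, regardless of foo
    return ran;
}
```

With the broken macro from above, run(false, true) in a release build would skip Run() entirely; with the do-while form the two ifs stay independent.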
Compiler bugs. This never happens. Well, it does - but for most of your career you are better off assuming it doesn't. In ten years of working with VC6, I found one case where I am still convinced it is an unfixed compiler bug, compared to dozens of patterns (maybe even hundreds of instances) of insufficient understanding of the scriptures (a.k.a. the standard).