Your code basically boils down to two possibilities:
```c
long LIMIT = 600851475143;
x = LIMIT / i;
```
vs.
```c
#define LIMIT 600851475143
x = LIMIT / i;
```
The first is equivalent to converting the constant to `long`:

```c
x = (long)600851475143 / i;
```
while the second is expanded by the preprocessor into:

```c
x = 600851475143 / i;
```
And here is the difference: 600851475143 is too big to fit in a `long` on your compiler, so when it is stored into a `long`, it overflows and the value is garbage. But when the literal is used directly in an expression, the compiler sees that it does not fit in a `long` and automatically gives it the type `long long`; `i` is then promoted to `long long` as well, and the division is performed in `long long`.
Note, however, that even if the algorithm gets most of the way through, you still have overflows elsewhere, so the code is incorrect. You must declare every variable that may hold these large values as `long long`.