I use a lot of #defines in my code, and now a strange problem has arisen.
I have this:
#define _ImmSign 010100
(I'm trying to simulate a binary number)
Obviously, I expect the value to be 10100, but when I use the macro, it comes out as 4160.
What's going on here, and how do I stop it?
EDIT
OK, so this happens because the language interprets the literal as octal. Is there any smart way to make the language read such numbers as decimal instead? A leading 0 denotes octal and 0x denotes hexadecimal, now that I think about it ...
c c-preprocessor
Nomen