I know this is a simple question, but I'm confused. I have a pretty typical gcc warning, which is usually easy to fix:
warning: comparison between signed and unsigned integer expressions
Whenever I have a hexadecimal constant with the most significant bit set, like 0x80000000L, the compiler interprets it as unsigned. For example, compiling this code with -Wextra raises the warning (gcc 4.4.x, 4.5.x):
#include <stdio.h>

int main()
{
    long test = 1;
    long *p = &test;
    if (*p != 0x80000000L) printf("test");
}
I deliberately suffixed the constant with L to make it a long, so why is this happening?
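For what it's worth, here is a small sketch I used to probe the actual type of the literal (my own check, not part of the program above; it assumes a C11-capable compiler with _Generic, which is newer than gcc 4.4/4.5):

#include <stdio.h>

/* Map a few candidate types to their names; anything else prints "other".
   This TYPE_NAME macro is only an illustration. */
#define TYPE_NAME(x) _Generic((x),                \
    long: "long",                                 \
    unsigned long: "unsigned long",               \
    long long: "long long",                       \
    unsigned long long: "unsigned long long",     \
    default: "other")

int main(void)
{
    /* If long is 32 bits, 0x80000000 does not fit in long, so the L-suffixed
       hex constant ends up with type unsigned long. */
    printf("%s\n", TYPE_NAME(0x80000000L));
    return 0;
}

On a platform where long is 64 bits this prints "long", which seems consistent with the warning only appearing where long is 32 bits.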