A 32-bit signed integer division gives 0x7fffffff as the quotient on PowerPC

I am debugging production code written in C; reduced to its simplest form, it looks like this:

void
test_fun(int sr)
{   
    int hr = 0;
    #define ME 65535
    #define SE 256

    sr = sr/SE;             /* <-- This should yield 0 */
    if(sr == 1)
        hr = ME;
    else
        hr = (ME+1)/sr;     /* <-- We should crash here. */
}

We pass `sr` as 128, which ideally should give a division-by-zero error in the processor. Instead, the division succeeds with a quotient of 0x7fffffff (this is the value of `hr`). This does not happen when I compile and run the same code on the Intel platform with gcc: there it crashes when dividing by zero.

I want to understand where this big quotient comes from. I am not sure whether this is just some other mistake that I still need to find. Can someone help me with another program that reproduces the same behavior?


Division by zero is undefined behavior in C, so anything may happen. See C11 6.5.5 #5 ("if the value of the second operand is zero, the behavior is undefined").

Whether you get SIGFPE is a matter of the CPU/OS. PowerPC is a RISC architecture whose integer divide instructions do not trap on a zero divisor. x86, OTOH, is CISC and does trap.

If you need the crash, you have to check the divisor yourself before dividing and handle the error explicitly, for example by raising SIGFPE manually on POSIX systems.


On PPC (per IBM's documentation), integer division by 0 does not trap; the result is undefined, so you get whatever the hardware leaves in the destination register. On the PPC implementations I have seen, that is MAXINT (for a positive dividend) or 0.

