I have a piece of code that behaves differently on Mac OS X and Linux (Ubuntu, Fedora, ...). It concerns type casting of arithmetic expressions inside printf calls. The code is compiled with gcc/g++.
The following program
#include <stdio.h>
int main() {
    float days = (float)(153 * 86400) / 86400.0;
    printf("%f\n", days);
    float foo = days / 30.6;
    printf("%d\n", (int)foo);
    printf("%d\n", (int)(days / 30.6));
    return 0;
}
produces on Linux
153.000000
5
4
and on Mac OS X
153.000000
5
5
Why?
To my surprise, the following prints the same values on both Mac OS X and Linux.
printf ("%d\n", (int) (((float)(153 * 86400) / 86400.0) / 30.6));
printf ("%d\n", (int) (153 / 30.6));
printf ("%.16f\n", (153 / 30.6));
Why? I have no clue. Thanks.
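For reference, here is a minimal diagnostic sketch (the names q_double and q_float are mine, not from the program above) that prints the quotient with extra digits, once kept in a double and once rounded to a float before the cast to int; the extra digits should show whether the intermediate value is exactly 5 or just below it on each platform:
#include <stdio.h>
int main(void) {
    float days = (float)(153 * 86400) / 86400.0;  /* 153.0, exactly representable */
    double q_double = days / 30.6;                /* quotient kept in double (or wider) precision */
    float  q_float  = (float)(days / 30.6);       /* quotient rounded to float first */
    printf("%.20f\n", q_double);                  /* exactly 5 or slightly below? */
    printf("%.20f\n", (double)q_float);
    printf("%d %d\n", (int)q_double, (int)q_float);
    return 0;
}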