The original intent was that int would be the machine's word size, the most efficient size for data processing. What tends to happen, though, is that a huge amount of code gets written assuming int is X bits wide, and when the hardware the code runs on moves to a larger word size, that carelessly written code breaks. Compiler vendors want to keep their customers happy, so they say, "OK, we'll leave int the size it was, but we'll make long bigger now." Or, "argh... too many people complained when we made long longer, so we'll create a long long type and leave sizeof(int) == sizeof(long)." So these days it's all a mess:
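For a concrete picture of the mess, here is a minimal sketch that just prints the sizes of the common integer types. The output depends on the platform's data model: on a typical LP64 Linux or macOS system it usually prints 4 / 8 / 8, while 64-bit Windows (LLP64) usually prints 4 / 4 / 8.

```c
#include <stdio.h>

int main(void)
{
    /* sizeof yields a size_t, printed with %zu (C99 and later). */
    printf("int:       %zu bytes\n", sizeof(int));
    printf("long:      %zu bytes\n", sizeof(long));
    printf("long long: %zu bytes\n", sizeof(long long));
    return 0;
}
```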
Does the ANSI C specification define int to be the same size as the system's word (32-bit / 64-bit)?
That was pretty much the idea, but the standard doesn't insist on it.
In other words, can I reliably deduce the system's word size from the space allocated to an int?
Not in practice.
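As a rough illustration (an assumption about a typical LP64 platform, not a guarantee), int is usually still 4 bytes on 64-bit Linux/macOS while pointers and size_t are 8 bytes, so sizeof(int) tells you little about the word size; sizeof(void *) or sizeof(size_t) is usually a closer, though still imperfect, proxy:

```c
#include <stdio.h>
#include <stddef.h>

int main(void)
{
    /* On a 64-bit LP64 platform this typically prints 4, 8, 8:
       int stays at 32 bits even though the machine word is 64 bits. */
    printf("sizeof(int)    = %zu\n", sizeof(int));
    printf("sizeof(void *) = %zu\n", sizeof(void *));
    printf("sizeof(size_t) = %zu\n", sizeof(size_t));
    return 0;
}
```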
Tony Delroy