Hex 0x0001 vs 0x00000001

Often in code that does permission checking, I see that some people write 0x0001 while others write 0x00000001. Both appear to be equivalent to decimal 1, if I'm not mistaken.

Why use one over the other? Is it just a matter of preference?

+6
Tags: decimal, hex
2 answers

Assuming this is C, C++, Java, C#, or something similar, they are the same value. Writing 0x0001 suggests a 16-bit quantity and 0x00000001 suggests a 32-bit one, but the actual type of the literal is determined by the language's rules when the compiler evaluates it, not by how many digits you write. It is purely a question of coding style and makes no difference in the compiled code.
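For instance, here is a minimal C sketch (the same reasoning applies to the other languages mentioned): both literals have type int and value 1, so they compare equal and compile to the same thing.

    #include <stdio.h>

    int main(void) {
        /* Both literals have type int and the value 1; the extra leading
           zeros are purely cosmetic and vanish after compilation. */
        int a = 0x0001;
        int b = 0x00000001;

        printf("%d %d\n", a, b);                  /* prints: 1 1 */
        printf("equal: %d\n", a == b);            /* prints: equal: 1 */
        printf("sizes: %zu %zu\n",
               sizeof 0x0001, sizeof 0x00000001); /* same size on a given platform */
        return 0;
    }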

+12

What you are looking at is a bitmask, and it is conventional to pad the leading zeros out to the full width of the mask. I would also guess that the width of the mask was changed at some point to make room for more specialized permissions.
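As a rough illustration of that convention (the permission names below are hypothetical, not taken from the asker's code): flags in a 16-bit mask are typically padded to four hex digits, and widening the mask to 32 bits both adds digits and frees up higher bits for new permissions.

    #include <stdio.h>

    /* Hypothetical permission flags. With a 16-bit mask they would be
       written as 0x0001, 0x0002, ...; after widening the mask to 32 bits
       it is conventional to pad every flag to eight hex digits. */
    #define PERM_READ   0x00000001
    #define PERM_WRITE  0x00000002
    #define PERM_DELETE 0x00000004
    #define PERM_ADMIN  0x00010000  /* a newer, more specialized permission */

    int main(void) {
        unsigned int perms = PERM_READ | PERM_ADMIN;

        printf("can read:  %d\n", (perms & PERM_READ)  != 0); /* 1 */
        printf("can write: %d\n", (perms & PERM_WRITE) != 0); /* 0 */
        printf("is admin:  %d\n", (perms & PERM_ADMIN) != 0); /* 1 */
        return 0;
    }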

+3
