Assuming it's C, C++, Java, C#, or something similar, they are the same. Writing 0x0001 may suggest a 16-bit value to a human reader and 0x00000001 a 32-bit value, but the actual width is determined by the language's rules for hexadecimal literals when the compiler evaluates them, not by how many leading zeros you write. This is purely a coding-style question; it makes no difference in the compiled code.
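A minimal C sketch illustrating the point (assuming a typical compiler where these constants have type int):

```c
#include <stdio.h>

int main(void)
{
    /* Leading zeros change neither the value nor the type of a hex literal:
       both constants below are plain int with the value 1. */
    int a = 0x0001;
    int b = 0x00000001;

    printf("%d\n", a == b);                                /* prints 1 */
    printf("%d\n", sizeof 0x0001 == sizeof 0x00000001);    /* prints 1 */
    return 0;
}
```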
Tamas Czinege