This is an endianness problem. When you reinterpret the char* as an int*, the first byte of the string becomes the least significant byte of the integer (because you ran this code on x86, which is little-endian), whereas when you assemble the value manually, the first byte becomes the most significant one.
To put this in a picture, here is the original array:
  "abc\0"
+------+------+------+------+
| 0x61 | 0x62 | 0x63 | 0x00 |   <---- bytes in memory
+------+------+------+------+
When these bytes are interpreted as an integer on a little-endian architecture, the result is 0x00636261, which is 6513249 in decimal. Placing each byte manually, on the other hand, puts the first byte in the most significant position and gives 0x61626300, decimal 1633837824.
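As an illustration, here is a minimal sketch of the manual conversion, assuming it is the usual shift-and-or assembly (the exact code from the question is not shown here); this version produces 0x61626300 regardless of the machine's endianness:

#include <stdio.h>

int main(void) {
    const unsigned char s[] = "abc";              /* 0x61 0x62 0x63 0x00 in memory */

    /* The first byte goes into the most significant position. */
    unsigned int manual = ((unsigned int)s[0] << 24) |
                          ((unsigned int)s[1] << 16) |
                          ((unsigned int)s[2] <<  8) |
                           (unsigned int)s[3];

    printf("%u\n", manual);                       /* prints 1633837824 (0x61626300) */
    return 0;
}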
Of course, reinterpreting a char* as an int* is undefined behavior (it violates the strict aliasing rule, and the array may not even be suitably aligned for an int), so the difference is not really important in practice: you are not allowed to perform the first conversion at all. However, there is a legal way to achieve the same result, called type punning:
#include <stdio.h>
#include <string.h>

int main(void) {
    union {
        char str[4];
        unsigned int ui;
    } u;
    strcpy(u.str, "abc");   /* writes the bytes 0x61 0x62 0x63 0x00 */
    printf("%u\n", u.ui);   /* prints 6513249 on a little-endian machine */
    return 0;
}
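An equivalent approach (a sketch, not part of the original answer) is to copy the bytes with memcpy; this is well defined in both C and C++, and compilers typically optimize it into a single load:

#include <stdio.h>
#include <string.h>

int main(void) {
    const char str[4] = "abc";    /* 0x61 0x62 0x63 0x00 */
    unsigned int ui;

    memcpy(&ui, str, sizeof ui);  /* copy the four bytes into the integer */
    printf("%u\n", ui);           /* 6513249 on little-endian, 1633837824 on big-endian */
    return 0;
}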
Jon