Use int instead of char in char array and mask

In the following example of a bit shift:

    unsigned long int longInt = 1234567890;
    unsigned char byteArray[4];

    // convert from an unsigned long int to a 4-byte array
    byteArray[0] = (int)((longInt >> 24) & 0xFF);
    byteArray[1] = (int)((longInt >> 16) & 0xFF);
    byteArray[2] = (int)((longInt >> 8) & 0XFF);
    byteArray[3] = (int)((longInt & 0XFF));

Three questions:

  • Why is the cast (int) instead of (unsigned char)? I tried it with (unsigned char) and it seems to compile just fine.
  • Do I need the & 0xFF? Aren't the new bits shifted in as 0 anyway, since Wikipedia says C uses a logical shift, and logical shifts shift in zeros? (EDIT: at least it doesn't seem necessary for the one using >> 24?)
  • Couldn't I just use memcpy() to copy longInt into an unsigned char buffer? Is the reason not to do that endianness, or is there another reason?
1 answer

1.

The expression ((longInt >> 24) & 0xFF) has type unsigned long int. With the cast, the value is first converted to int and then, by the assignment, to unsigned char. Without the cast, it is converted directly to unsigned char. Since the value has already been masked down to fit in a byte, both paths produce the same result, so the cast is redundant.
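A minimal sketch of this point (the variable names are mine, not from the question): with or without the (int) cast, the byte that ends up stored is identical.

    #include <stdio.h>

    int main(void)
    {
        unsigned long int longInt = 1234567890;          /* 0x499602D2 */
        unsigned char withCast    = (int)((longInt >> 24) & 0xFF);
        unsigned char withoutCast = (longInt >> 24) & 0xFF;

        /* Both print 73 (0x49): the intermediate conversion to int
           changes nothing once the value already fits in a byte. */
        printf("%d %d\n", withCast, withoutCast);
        return 0;
    }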

2.

The & 0xFF is not required here. The conversion to unsigned char effectively does the same thing: only the low 8 bits of the value are kept (the value is reduced modulo 256).
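A small sketch illustrating that (again with my own variable names): dropping the mask gives the same byte, because the assignment to unsigned char truncates to the low 8 bits anyway.

    #include <stdio.h>

    int main(void)
    {
        unsigned long int longInt = 1234567890;      /* 0x499602D2 */
        unsigned char masked   = (longInt >> 16) & 0xFF;  /* explicit mask */
        unsigned char unmasked = longInt >> 16;            /* truncated by the conversion */

        /* Both print 150 (0x96). */
        printf("%d %d\n", masked, unmasked);
        return 0;
    }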

3.

You can use memcpy, but the result is not portable, because it depends on the endianness of the system. It will give a different byte order on a big-endian machine than on a little-endian one, while the bit-shift solution gives the same result everywhere.
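A sketch of the difference (array names are mine; it copies sizeof longInt bytes rather than assuming unsigned long is 4 bytes wide): memcpy reproduces whatever byte order the host uses in memory, while the shifts pick bytes out by value, so their order is fixed by the code.

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned long int longInt = 1234567890;      /* 0x499602D2 */
        unsigned char viaMemcpy[sizeof longInt];
        unsigned char viaShift[4];

        /* Copies the object's bytes in whatever order the host stores them. */
        memcpy(viaMemcpy, &longInt, sizeof longInt);

        /* Picks the bytes out by value: same order on every machine. */
        viaShift[0] = (longInt >> 24) & 0xFF;
        viaShift[1] = (longInt >> 16) & 0xFF;
        viaShift[2] = (longInt >> 8)  & 0xFF;
        viaShift[3] =  longInt        & 0xFF;

        /* On a little-endian machine the first memcpy'd bytes are D2 02 96 49;
           on a big-endian machine with 4-byte longs they are 49 96 02 D2.
           The shifted array is always 49 96 02 D2. */
        for (size_t i = 0; i < sizeof longInt; i++)
            printf("%02X ", (unsigned)viaMemcpy[i]);
        printf("\n");
        for (int i = 0; i < 4; i++)
            printf("%02X ", (unsigned)viaShift[i]);
        printf("\n");
        return 0;
    }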

