Consider the following simple program:
#include <stdio.h>

int main(void) {
    FILE *fp;
    printf("sizeof(int)=%zu\n", sizeof(int));
    fp = fopen("test.bin", "wb");
    fprintf(fp, "%d\n", 14);
    fclose(fp);
    return 0;
}
It outputs the following result to stdout:
sizeof(int)=4
test.bin has the following contents when viewed in any text editor:
14
When browsing it in vi using the hex dump (xxd) option:
0000000: 3134 0a                                  14.
When viewing with hexdump -c:
0000000   1   4  \n
0000003
Clearly an int on my machine is four bytes, yet both hexdump and vi tell me that representing 14 takes only two bytes, with the newline character using one more. This is confirmed by the fact that test.bin is only three bytes long. But an int is four bytes, so why do only two bytes represent it?
What obvious fact am I completely forgetting here?
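For comparison, this is the kind of variant I would have expected to produce a four-byte file: it uses fwrite to dump the raw in-memory bytes of the int instead of fprintf. This is just my assumption of how raw output would look, not part of my original program (the file name test_raw.bin is made up):

#include <stdio.h>

int main(void) {
    int value = 14;
    FILE *fp = fopen("test_raw.bin", "wb");   /* hypothetical file name */
    if (fp == NULL)
        return 1;
    /* write the in-memory representation of value: sizeof(int) bytes,
       e.g. 0e 00 00 00 on a little-endian machine with a 4-byte int */
    fwrite(&value, sizeof value, 1, fp);
    fclose(fp);
    return 0;
}

If I understand correctly, an xxd dump of that file would show four bytes (the exact byte order depending on the machine), unlike the three bytes I get from my program above.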