Endianness: “most significant” and “least significant” bytes

I have read descriptions of big-endian and little-endian online, but they all seem to say roughly the same thing, and I'm still confused about what “most significant” and “least significant” bytes actually mean in practice. I understand that on a little-endian machine the “least significant” byte comes first, and on a big-endian machine the “most significant” byte comes first. However, I don't understand what “most” and “least” significant mean. I think a concrete example will help me understand, so here is one:

I have an integer value: 12345

If I convert it to hex using the Windows calculator, I get the value 3039 (a two-byte value). Does 3039 show the bytes of the integer 12345 stored as little-endian or big-endian, and how can I tell from the value itself?

1 answer

Endianness refers to how numbers are stored in memory; it has nothing to do with the order in which bytes are evaluated. In 3039, the byte 30 is the most significant because it carries the larger place value (0x30 × 256 = 12288), while 39 is the least significant (it contributes only 0x39 = 57; together, 12288 + 57 = 12345). If memory addresses increase from left to right on this page, then a big-endian machine stores your number as

30 39

and a little-endian machine stores it as

39 30

Incidentally, you are almost certainly on a little-endian Intel machine, so in your computer's memory the bytes really are stored as 39 30; the calculator simply displays the number with the most significant digits first, the way we conventionally write numbers.
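
If it helps to see this on your own machine, here is a minimal C sketch (my own example, not from the calculator; it assumes a hosted C compiler with stdint.h) that prints the two bytes of 12345 in increasing memory-address order:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint16_t value = 12345;  /* 0x3039 */

        /* Inspecting an object's bytes through an unsigned char
           pointer is well-defined in C. */
        unsigned char *bytes = (unsigned char *)&value;

        /* Prints the bytes in memory-address order:
           "39 30" on a little-endian machine (e.g. Intel x86),
           "30 39" on a big-endian machine. */
        printf("%02X %02X\n", bytes[0], bytes[1]);

        /* Significance is a property of the number itself,
           independent of storage order. */
        printf("0x30 * 256 + 0x39 = %d\n", 0x30 * 256 + 0x39);
        return 0;
    }

On an Intel machine the first line prints 39 30; on a big-endian machine it would print 30 39. The second line prints 12345 either way, which is the point: storage order changes where the bytes sit in memory, not what the number is worth.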
