Significant decimal digits of binary32 and binary64

According to the accuracy figures on Wikipedia, binary32 has from 6 to 9 significant decimal digits and binary64 from 15 to 17.

I found that these significant-decimal-digit counts are calculated from the mantissa width, but I do not understand how. Any ideas?

The mantissa of the 32-bit format is 24 bits; the mantissa of the 64-bit format is 53 bits.

1 answer

First, for this question it is better to use the overall significand widths of 24 and 53 bits. The fact that the leading bit is not stored is only an aspect of the encoding.

Each decimal digit carries log2(10) (approximately 3.32) bits of information. For example, a number with 4 decimal digits carries about 4 × 3.32 ≈ 13.3 bits of information. Dividing the significand width by log2(10) therefore gives the number of decimal digits the format can carry:

53 / log2(10) → 15.95 (16 decimal digits)

24 / log2(10) → 7.22 (7 decimal digits)
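The two divisions above can be reproduced directly. A minimal sketch in Python (the loop and format names are mine, not from the answer):

```python
import math

# Each decimal digit carries log2(10) ≈ 3.32 bits of information,
# so significand_bits / log2(10) estimates the decimal digit count.
for name, significand_bits in [("binary32", 24), ("binary64", 53)]:
    digits = significand_bits / math.log2(10)
    print(f"{name}: {significand_bits} / log2(10) = {digits:.2f}")
```

Running it prints 7.22 for binary32 and 15.95 for binary64, matching the figures above.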

However, significant digits are not the whole story, because conversions between decimal and binary are inexact. That is why the Wikipedia article gives the ranges "6 to 9" and "15 to 17". 6 is the largest number of significant decimal digits such that every decimal number with that many digits survives a round trip through binary32 unchanged, while 9 is the smallest number of decimal digits sufficient to represent every binary32 value so that it survives a round trip through decimal; likewise 15 and 17 for binary64.
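A small sketch of this round-trip behaviour, using Python's struct module to round values through binary32 (Python floats themselves are binary64; the value 8.589973e9 is my own counterexample, not one from the answer):

```python
import struct

def to_binary32(x):
    # Round a Python float (binary64) to the nearest binary32 value.
    return struct.unpack('f', struct.pack('f', x))[0]

# 9 significant decimal digits always suffice to round-trip a binary32:
x = to_binary32(0.1)                      # nearest binary32 to 0.1
assert to_binary32(float(f"{x:.9g}")) == x

# But a 7-digit decimal does not always survive binary32 and back:
d = 8.589973e9                            # 7 significant digits
print(f"{to_binary32(d):.7g}")            # prints 8.589974e+09, not 8.589973e+09
```

Near 8.59e9 the spacing between adjacent binary32 values is 1024, which is wider than the spacing of 1000 between 7-digit decimals, so two neighbouring decimals can collapse onto the same binary32 value.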

The blog Exploring Binary covers these decimal-binary conversion issues in detail. The key constant throughout is log2(10) ≈ 3.32.
