Another floating point question

I have read most of the floating point threads here, and I understand the main problem: with IEEE 754 (and simply by the nature of storing numbers in binary), some fractions cannot be represented exactly. What I am trying to understand is this: if both Python and JavaScript use the IEEE 754 standard, then why does executing

.1 + .1

result in 0.20000000000000001 in Python (which is to be expected),

whereas in JavaScript (at least in Chrome and Firefox) the answer is 0.2?

However, executing

.1 + .2

results in 0.30000000000000004 in both languages.

In addition, executing var a = 0.3; in JavaScript and then printing a gives 0.3,

whereas executing a = 0.3 in Python gives 0.29999999999999999.

I would like to understand the reason for this difference in behavior.
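One way to see what is actually stored in both languages, a minimal sketch assuming CPython 3: the decimal module can print a float's exact binary value, which is the same IEEE 754 double that JavaScript holds.

```python
from decimal import Decimal

# Decimal(float) converts the float's exact binary value, digit for digit,
# so this prints the double that both Python and JavaScript store for 0.1.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625
```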

Maybe JavaScript uses something like Java's BigDecimal for display? Does anyone know?

+5

Executing a = 0.3 in Python does not by itself give 0.29999999999999999; what you see depends on how you display the value:

>>> a = 0.3
>>> print a
0.3
>>> a
0.29999999999999999

You get 0.3 from print ing a because print uses str(), which rounds to fewer significant digits, whereas evaluating a at the interactive prompt uses repr(), which in Python 2 formats the value to 17 significant digits (enough to expose the representation error).
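The two displays in the session above can be reproduced on a modern interpreter; a minimal sketch, assuming CPython 3, where repr() now prints the shortest round-tripping form, so the old behavior has to be requested explicitly via format():

```python
x = 0.3
# Low-precision display, like Python 2's str()/print:
print(format(x, '.12g'))   # -> 0.3
# 17 significant digits, like Python 2's repr():
print(format(x, '.17g'))   # -> 0.29999999999999999
```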

JavaScript also rounds when it displays a value, but the underlying number carries the same error; it just takes a larger magnitude to make it visible (JavaScript console in Chrome):

> (1 + .1) * 1000000000
  1100000000
> (1 + .1) * 100000000000000
  110000000000000.02

See? The error is still there; in the first case the display rounding simply hides it.
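As a side note, a sketch showing that modern Python 3 uses the same shortest-representation idea as the JavaScript console, so it produces exactly the same pair of results:

```python
# Small magnitude: the product rounds to an exact integer double,
# so no error is visible.
print((1 + .1) * 1000000000)        # -> 1100000000.0
# Large magnitude: the rounding error lands above half an ulp,
# so it shows up in the printed value.
print((1 + .1) * 100000000000000)   # -> 110000000000000.02
```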

+6

It is a difference in display, not in arithmetic.

IEEE 754 specifies how the numbers are stored, not how they are printed. Apparently JavaScript rounds the value when converting it to a string, while Python (at the time) printed enough digits to expose the error.

In other words, both languages hold exactly the same bits; only the formatting differs.
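IEEE 754 fixes the stored bits, not the printed string, and that can be checked directly; a minimal sketch (assuming CPython 3) comparing the raw 64-bit patterns instead of the display:

```python
import struct

def bits(x):
    """Return the 64-bit IEEE 754 pattern of a float as hex."""
    return struct.pack('>d', x).hex()

# 0.1 + 0.1 happens to round to the same double as the literal 0.2 ...
assert bits(0.1 + 0.1) == bits(0.2)
# ... but 0.1 + 0.2 does not land on the double nearest to 0.3.
assert bits(0.1 + 0.2) != bits(0.3)
print(bits(0.1 + 0.2), bits(0.3))  # the two patterns differ by one ulp
```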

+3

It depends:

  • On the implementation.

  • On the version.

  • On how the value is being displayed.

What do you mean by "Python": the C implementation, Jython, IronPython? Which version?

It seems that JavaScript engines format repeating binary fractions differently than Python does.

Sometimes JavaScript silently suppresses the error bits at the end of the printed value. Sometimes it does not.

That is the whole reason.

You have the source code for both interpreters, so if you want to know more, you can find out exactly how. Knowing the source changes little in practice, though.
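A quick check, in Python, that the suppressed trailing digits carry no information (this holds in any language using IEEE 754 doubles):

```python
# Both spellings parse to the identical double, so printing "0.2" instead
# of "0.20000000000000001" loses nothing.
assert 0.2 == 0.20000000000000001
# Likewise, Python 2's long repr of 0.3 names the same double as "0.3".
assert 0.3 == 0.29999999999999999
# The two languages only ever disagreed about which spelling to print.
print("same doubles, different spellings")
```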

0
