I've read most of the floating-point questions and I understand the basic problem: because numbers are stored in binary, IEEE 754 cannot represent some decimal fractions exactly. What I'm trying to understand is this: if both Python and JavaScript use the IEEE 754 standard, why does executing

.1 + .1

in Python result in 0.20000000000000001 (which is to be expected), whereas in JavaScript (at least in Chrome and Firefox) the answer is 0.2?
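To see what is stored versus what is printed, here is a small Python sketch using the standard decimal module; I'm assuming the underlying double is the same bit pattern in both languages:

from decimal import Decimal

# Decimal(float) shows the exact binary value of the double.
print(Decimal(0.1 + 0.1))
# 0.200000000000000011102230246251565404236316680908203125
print(0.1 + 0.1 == 0.2)  # True: the sum rounds to the very same double as 0.2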
However, executing

.1 + .2

in both languages yields 0.30000000000000004.
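For comparison, the same kind of Python sketch applied to this sum shows why neither language can print a short answer here (again assuming JavaScript holds the same doubles):

from decimal import Decimal

# The sum and the literal 0.3 round to two *different* doubles,
# so no shorter decimal string round-trips back to the sum.
print(Decimal(0.1 + 0.2))
# 0.3000000000000000444089209850062616169452667236328125
print(Decimal(0.3))
# 0.299999999999999988897769753748434595763683319091796875
print(0.1 + 0.2 == 0.3)  # False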
Also, executing var a = 0.3; in JavaScript and printing it results in 0.3, whereas a = 0.3 in Python prints 0.29999999999999999.
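A short Python sketch of the two display styles involved (note that repr switched to the shortest round-tripping string in Python 2.7/3.1, so newer interpreters print 0.3 here as well):

x = 0.3
print(repr(x))       # 0.3 on Python >= 2.7/3.1 (shortest round-trip repr)
print('%.17g' % x)   # 0.29999999999999999 -- the older 17-significant-digit style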
I would like to understand the reason for this difference in behavior.
Finally, many related posts link to a JavaScript port of Java's BigDecimal. Is that the recommended way around this problem?
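For what it's worth, on the Python side the standard-library decimal module plays the same role as BigDecimal; a minimal example:

from decimal import Decimal

# Constructing from strings keeps the decimal values exact.
a = Decimal('0.1')
b = Decimal('0.2')
print(a + b)                    # 0.3
print(a + b == Decimal('0.3'))  # True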