Why does "decimal.Decimal('0') < 1.0" give False in Python 2.6.5?
From the documentation of the decimal module:

Changed in version 2.7: A comparison between a float instance x and a Decimal instance y now returns a result based on the values of x and y. In earlier versions, x < y returned the same (arbitrary) result for any Decimal instance x and any float instance y.
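To illustrate the documented behavior, here is a minimal sketch (assuming Python 2.7 or later, where mixed float/Decimal comparisons compare by value; on 2.6 the result of the first comparison was arbitrary):

```python
from decimal import Decimal

# On Python 2.7+ the comparison uses the numeric values,
# so Decimal('0') < 1.0 is True; on 2.6 it was arbitrary.
print(Decimal('0') < 1.0)  # → True

# A version-independent workaround: compare Decimal to Decimal
# by converting the float (or its string form) explicitly.
print(Decimal('0') < Decimal('1.0'))  # → True
```

Converting the float side to Decimal explicitly avoids relying on the version-dependent mixed-type comparison.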