This is related to my earlier question, "Why is '' > 0 True in Python?".
In Python 2.6.4:
>>> from decimal import Decimal
>>> Decimal('0') > 9999.0
True
From the answers to my initial question, I understand that when objects of different types are compared in Python 2.x, they are ordered by their type names. But in this case:
>>> type(Decimal('0')).__name__ > type(9999.0).__name__
False
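To make the contradiction concrete, here is a small sketch (runs on any Python version) of what the type-name fallback rule would predict, assuming a hypothetical helper `py2_fallback_compare` that simply compares type names the way I understand Python 2.x does for mismatched types:

```python
from decimal import Decimal

def py2_fallback_compare(a, b):
    # Sketch of my understanding of the CPython 2.x fallback:
    # objects of different types are ordered by their type names.
    return type(a).__name__ > type(b).__name__

# 'Decimal' > 'float' is False, since 'D' (0x44) sorts before 'f' (0x66),
# so by the name rule the original expression should be False.
print(py2_fallback_compare(Decimal('0'), 9999.0))
```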
Why then is Decimal('0') > 9999.0 True?
UPDATE: I usually work on Ubuntu (Linux 2.6.31-20-generic #57-Ubuntu SMP Mon Feb 8 09:05:19 UTC 2010 i686 GNU/Linux, Python 2.6.4 (r264:75706, Dec 7 2009, 18:45:15) [GCC 4.4.1] on linux2). On Windows (WinXP Professional SP3, Python 2.6.4 (r264:75706, Nov 3 2009, 13:23:17) [MSC v.1500 32 bit (Intel)] on win32) my original expression behaves differently:
>>> Decimal('0') > 9999.0
False
Now I am even more puzzled. %-(