I am writing a Haskell code generator for JavaScript, using GHC as a library. Since JavaScript has no integer type, and its Number type can only represent integers exactly up to 2⁵³, I represent integers as Numbers and explicitly perform all arithmetic modulo 2³². This works very well with a 32-bit GHC, but much worse with a 64-bit one.
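As a minimal sketch of the idea (the helper names below are made up for illustration, not my actual code generator), the emitted JavaScript can truncate every result with `| 0`, which reduces it modulo 2³² and reinterprets it as a signed 32-bit value:

```haskell
-- Hypothetical emit helpers, for illustration only: wrap each emitted
-- JavaScript arithmetic expression with `| 0`, which truncates the
-- result to a signed 32-bit value, i.e. reduces it modulo 2^32.
emitAdd :: String -> String -> String
emitAdd a b = "((" ++ a ++ " + " ++ b ++ ") | 0)"

emitSub :: String -> String -> String
emitSub a b = "((" ++ a ++ " - " ++ b ++ ") | 0)"

main :: IO ()
main = putStrLn (emitAdd "x" "y")  -- prints ((x + y) | 0)
```

In the generated code, `((0x7fffffff + 1) | 0)` evaluates to -2147483648, which is exactly the wrap-around a 32-bit Int would give.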
GHC will happily coerce Int64 values to Int and interpret Int constants as 64-bit values (for example, 0xffffffff becomes 4294967295 rather than -1), which causes all kinds of annoying problems.
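To make the difference concrete, here is a small snippet (assuming a 64-bit GHC) showing the literal behaviour: the Int keeps its 64-bit value, while the 32-bit semantics my generated JavaScript emulates would wrap it to -1.

```haskell
import Data.Int (Int32)

main :: IO ()
main = do
  print (0xffffffff :: Int)    -- 4294967295 on a 64-bit GHC
  print (0xffffffff :: Int32)  -- -1 (wraps; GHC warns that the literal overflows)
  print (fromIntegral (0xffffffff :: Int) :: Int32)  -- -1 after truncating to 32 bits
```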
The compiler works well enough for "normal" web code even on a 64-bit system, provided the standard libraries were built on a 32-bit machine, but "please don't use large-ish numbers, OK?" is not something you want to put in your compiler's manual. Some of the problems (though not all) can be fixed by compiling with -O0, but that (unsurprisingly) produces code that is not only slow but also far too large.
So, I need to stop GHC from assuming that Int and Int64 are equivalent. Is that possible?