Base-36 string-to-integer conversion mismatch between languages.

I noticed an inconsistency between Python and JavaScript when parsing a base-36 string into an integer.

Python Method:

>>> print(int('abcdefghijr', 36))

Result: 37713647386641447

JavaScript Method:

<script>document.write(parseInt("abcdefghijr", 36));</script>

Result: 37713647386641450

What causes the different results between the two languages? What would be the best approach to get the same result regardless of language?

Thanks.

1 answer

This number takes 56 bits to represent. JavaScript numbers are binary floating-point values with double precision ("doubles" for short). They are 64 bits wide and can represent a much wider range of values than 64-bit integers, but because of how they achieve this (they have the form mantissa * 2^exponent), they cannot represent every number in that range, only those that are a multiple of 2^exponent where the multiplier fits in the mantissa (this includes 2^0 = 1, so every integer the mantissa can hold is representable exactly). The mantissa is 53 bits, which is not enough for this number, so it is rounded to the nearest representable value.
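To see the 53-bit limit directly (a minimal sketch, using only standard JavaScript `Number` behavior):

```javascript
// Doubles have a 53-bit mantissa, so the largest integer that is
// guaranteed exact is 2^53 - 1.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991, i.e. 2^53 - 1

// Just past that boundary, adjacent integers collapse onto the same
// representable double: 2^53 + 1 rounds back down to 2^53.
console.log(2 ** 53 === 2 ** 53 + 1); // true
```

The base-36 value in the question is well above 2^53, so `parseInt` returns the nearest double, not the exact integer.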

What you can do is use an arbitrary-precision number type provided by a third-party library such as gwt-math or Big.js. Such numbers are not hard to implement if you remember your school arithmetic. Doing it efficiently is a different question, and an area of extensive research, but that is not your problem if you use an existing library.
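As a sketch of the idea: modern JavaScript engines also ship a built-in arbitrary-precision integer type, `BigInt`. It has no radix argument in its constructor, but a base-36 parser is only a few lines (the helper name `parseBase36BigInt` is my own, not from any library):

```javascript
// Parse a base-36 string into a BigInt, digit by digit.
// BigInt is arbitrary-precision, so no rounding occurs.
function parseBase36BigInt(s) {
  const digits = "0123456789abcdefghijklmnopqrstuvwxyz";
  let result = 0n;
  for (const ch of s.toLowerCase()) {
    const d = digits.indexOf(ch);
    if (d < 0) throw new RangeError(`invalid base-36 digit: ${ch}`);
    result = result * 36n + BigInt(d);
  }
  return result;
}

console.log(parseBase36BigInt("abcdefghijr").toString()); // "37713647386641447"
```

This matches Python's `int('abcdefghijr', 36)` exactly, since both sides now use arbitrary-precision integers.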


Source: https://habr.com/ru/post/927944/
