This number takes 56 bits to represent. JavaScript numbers are actually binary floating-point numbers with double precision (IEEE 754 "double" for short). They are 64 bits wide and can represent a much wider range of values than 64-bit integers, but because of how they achieve this (they are numbers of the form mantissa * 2^exponent), they cannot represent every number in that range — only multiples of 2^exponent where the multiplier fits into the mantissa (which includes 2^0 = 1, so you get all the integers the mantissa can hold directly). The mantissa is 53 bits, which is not enough for this number, so it is rounded to the nearest number that can be represented.
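You can see this rounding directly in the console — integers above 2^53 - 1 (exposed as `Number.MAX_SAFE_INTEGER`) silently collapse onto the nearest representable double:

```javascript
// 2^53 - 1 is the largest integer every neighbor of which is still exact.
console.log(Number.MAX_SAFE_INTEGER);               // 9007199254740991

// Above that, distinct integers round to the same double:
console.log(9007199254740992 === 9007199254740993); // true — both become 2^53

// isSafeInteger tells you when a value is past the exact range.
console.log(Number.isSafeInteger(2 ** 53));         // false
```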
What you can do is use an arbitrary-precision number type from a third-party library such as gwt-math or Big.js. Such numbers are not hard to implement if you remember your school arithmetic. Doing it efficiently is a different matter, and an area of extensive research — but that is not your problem if you use an existing library.
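To illustrate the "school arithmetic" point: here is a minimal sketch of digit-by-digit addition over decimal strings (the helper name `addBig` is made up for this example; a real library does this, and much more, far more efficiently):

```javascript
// Add two non-negative integers given as decimal strings, right to left,
// carrying exactly as in grade-school column addition.
function addBig(a, b) {
  let carry = 0;
  let result = '';
  let i = a.length - 1;
  let j = b.length - 1;
  while (i >= 0 || j >= 0 || carry) {
    const sum = (i >= 0 ? +a[i--] : 0) + (j >= 0 ? +b[j--] : 0) + carry;
    result = (sum % 10) + result;
    carry = sum >= 10 ? 1 : 0;
  }
  return result;
}

// Exact, even where plain numbers would round:
console.log(addBig('9007199254740993', '1')); // "9007199254740994"
```

Because the digits live in a string rather than a 53-bit mantissa, the result is exact at any length — which is the same idea the libraries build on.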