The central problem is that I have a large number of doubles that I need to log, each with a different number of digits after the decimal point. The precision varies widely: some values have 0 decimal digits (for example, 5257), some have 2 (for example, 1308.75), and some have up to 7 (for example, 124.1171875). In short, everything from 0 to 7 digits after the decimal point.
The standard Double.toString() works fine for everything except the 7-digit values. Up to 6 decimal digits, it prints the number exactly, with no spurious trailing digits. But for values with 7 decimal digits, toString() rounds the last digit. That is:
5257 -> "5257"
1308.75 -> "1308.75"
124.1171875 -> "124.117188"
Of course, I tried DecimalFormat("#.#######"), and that solved the problem of the missing digit, but it prints spurious digits for many low-precision doubles. That is:
1308.75 -> "1308.7499998"
This is also unacceptable, because 1) it wastes a significant amount of space (the logs typically run to more than 2 GB of data per day) and 2) it breaks the applications that consume the logs.
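For reference, here is roughly the snippet I used (reconstructed from memory; I've pinned the symbols to Locale.ROOT here so the decimal separator is a dot, which my real code may handle differently):

```java
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class FormatDemo {
    public static void main(String[] args) {
        // Pattern allows up to 7 fraction digits; locale pinned so '.' is the separator.
        DecimalFormat df = new DecimalFormat("#.#######",
                DecimalFormatSymbols.getInstance(Locale.ROOT));
        System.out.println(df.format(5257.0));
        System.out.println(df.format(1308.75));
        System.out.println(df.format(124.1171875));
    }
}
```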
DecimalFormat seems to be much worse than toString() at figuring out which digits are meaningful. Is there any way to fix this? I just want toString()-style handling of meaningful digits, with the maximum number of decimal digits raised from 6 to 7.
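The closest workaround I've sketched so far (a hypothetical fmt helper, not something I run in production) is to format with a fixed 7 decimal places via String.format and then strip the trailing zeros and any dangling decimal point:

```java
import java.util.Locale;

public class SevenDigits {
    // Hypothetical helper: fixed 7 fraction digits, then trim what isn't needed.
    static String fmt(double d) {
        String s = String.format(Locale.ROOT, "%.7f", d);
        s = s.replaceFirst("0+$", "");   // drop trailing zeros in the fraction
        s = s.replaceFirst("\\.$", "");  // drop a dangling decimal point
        return s;
    }

    public static void main(String[] args) {
        System.out.println(fmt(5257));        // 5257
        System.out.println(fmt(1308.75));     // 1308.75
        System.out.println(fmt(124.1171875)); // 124.1171875
    }
}
```

It produces the output I want on my three examples, but a regex pass over every logged value feels clumsy, and I'd prefer a built-in way to do this.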
Any ideas? Thanks