I read the timestamp values from the sensor events, but since they are provided in nanoseconds, I convert them to milliseconds and store the result in a Double. The resulting number has 17 significant digits, plus the decimal separator.
Printing it directly gives scientific notation, which I don't want, so I use the DecimalFormat class to format it with 4 decimal places. The problem is that although the debugger shows all 17 significant digits, even after calling doubleValue(), the output line only shows a number with 15 digits.
The code:
...
Double timestamp = (new Date().getTime()) + // Example: 1.3552299670232847E12
        ((event.timestamp - System.nanoTime()) / 1000000D);
DecimalFormat dfmt = new DecimalFormat("#.####");
switch (event.sensor.getType()) {
    case Sensor.TYPE_LINEAR_ACCELERATION:
    case Sensor.TYPE_ACCELEROMETER:
        accel = event.values.clone();
        String line = "A" + LOGSEPARATOR +
                dfmt.format(timestamp.doubleValue()) + // Prints: 1355229967023.28
...
I thought this might be an Android precision issue, but the same wrong accuracy shows up outside Android as well: I tested this in a local Java program, and both calls produce the same number of digits.
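Here is roughly the standalone test I ran locally. The class name and the hard-coded literal are just for illustration; the literal is hand-typed from the example value above, so treat it as an approximation of what the sensor code actually produces:

import java.text.DecimalFormat;

public class TimestampFormatTest {
    public static void main(String[] args) {
        // Hand-typed approximation of the value I see in the debugger
        // (17 significant digits).
        double timestamp = 1355229967023.2847d;

        DecimalFormat dfmt = new DecimalFormat("#.####");

        // What toString / the debugger reports for the double.
        System.out.println("toString:      " + Double.toString(timestamp));

        // What ends up in my log line; for me this also comes out
        // with only 15 digits, just like on the device.
        System.out.println("DecimalFormat: " + dfmt.format(timestamp));
    }
}

The DecimalFormat line is the one whose output I don't understand.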
Is this a DecimalFormat bug or limitation, or am I doing something wrong?