I am trying to calculate how far the device has moved using the accelerometer on an Android device (Sensor.TYPE_LINEAR_ACCELERATION). Here is my onSensorChanged() method:
public void onSensorChanged(SensorEvent event) {
    accelX = event.values[0];
    accelY = event.values[1];
    accelZ = event.values[2];

    // wall-clock time, truncated to whole seconds by integer division
    long currentTime = System.currentTimeMillis() / 1000;
    if (prevTime == 0) prevTime = currentTime;
    long interval = currentTime - prevTime;
    prevTime = currentTime;

    // integrate acceleration into velocity
    velX += accelX * interval;
    velY += accelY * interval;
    velZ += accelZ * interval;

    // integrate velocity into distance
    distX += prevVelX + velX * interval;
    distY += prevVelY + velY * interval;
    distZ += prevVelZ + velZ * interval;

    prevAccelX = accelX;
    prevAccelY = accelY;
    prevAccelZ = accelZ;
    prevVelX = velX;
    prevVelY = velY;
    prevVelZ = velZ;
}
The velocity, however, does not return to 0 after a movement ends. Deceleration has little effect unless it is very abrupt. This leads to some very inaccurate distance values.
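To show how quickly even a tiny constant bias blows up under double integration, here is a standalone sketch (the 0.05 m/s² bias and 50 Hz sample rate are made-up numbers, not measurements from my sensor):

```java
// Demonstrates how a small constant accelerometer bias, integrated
// twice, produces large phantom velocity and distance over time.
// Bias (0.05 m/s^2) and rate (50 Hz) are hypothetical values.
public class BiasDrift {
    public static void main(String[] args) {
        double bias = 0.05;   // m/s^2, constant sensor bias
        double dt = 0.02;     // s, i.e. 50 Hz sampling
        double vel = 0.0, dist = 0.0;
        for (int i = 0; i < 3000; i++) { // 60 seconds of samples
            vel += bias * dt;            // accel -> velocity
            dist += vel * dt;            // velocity -> distance
        }
        // After one minute the phantom velocity is ~3 m/s and the
        // phantom distance is ~90 m, even though nothing moved.
        System.out.printf("vel=%.2f m/s dist=%.2f m%n", vel, dist);
    }
}
```

So even with perfect timing, any uncorrected bias alone is enough to bury a 10-foot threshold.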
The accelerometer is also very noisy, so I added the following:
accelX = (accelX * accelKFactor) + prevAccelX * (1 - accelKFactor);
accelY = (accelY * accelKFactor) + prevAccelY * (1 - accelKFactor);
accelZ = (accelZ * accelKFactor) + prevAccelZ * (1 - accelKFactor);
Where accelKFactor = 0.01.
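That smoothing is a standard exponential moving average. Pulled out into a standalone sketch (the constant step input is just for illustration):

```java
// Exponential moving average, the same form as the accelKFactor
// smoothing above: output = k * newSample + (1 - k) * previousOutput.
public class Ema {
    private final double k;
    private double value;

    Ema(double k, double initial) {
        this.k = k;
        this.value = initial;
    }

    double filter(double sample) {
        value = k * sample + (1 - k) * value;
        return value;
    }

    public static void main(String[] args) {
        Ema ema = new Ema(0.01, 0.0); // k = 0.01, as in my code
        double out = 0.0;
        for (int i = 0; i < 100; i++) out = ema.filter(1.0);
        // With k = 0.01 the filter is very sluggish: after 100 samples
        // of a constant 1.0 input it has only reached ~0.63, so real
        // changes in acceleration get smeared out along with the noise.
        System.out.printf("after 100 samples: %.3f%n", out);
    }
}
```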
Am I missing something obvious?
My ultimate goal is simply to say whether the device was moved more than 10 feet, but right now my data is practically useless.
EDIT:
I stopped dividing currentTime by 1000; with the integer division, interval was always coming out as 0 or 1 second. I now keep currentTime as a double instead.
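One way to avoid wall-clock truncation entirely is to compute the interval from SensorEvent.timestamp, which is in nanoseconds. A sketch of just the timing part (variable and method names are mine, not Android API):

```java
// Sketch: compute dt in seconds from SensorEvent.timestamp (which is
// a long in nanoseconds) instead of System.currentTimeMillis().
// Names like intervalSeconds are hypothetical, not Android API.
public class Dt {
    private static final double NS_PER_S = 1_000_000_000.0;
    private long prevTimestampNs = 0;

    double intervalSeconds(long timestampNs) { // pass event.timestamp
        if (prevTimestampNs == 0) {
            prevTimestampNs = timestampNs;
            return 0.0;                        // no interval on first event
        }
        double dt = (timestampNs - prevTimestampNs) / NS_PER_S;
        prevTimestampNs = timestampNs;
        return dt;
    }

    public static void main(String[] args) {
        Dt d = new Dt();
        d.intervalSeconds(1_500_000_000L);     // first event primes prev
        double dt = d.intervalSeconds(1_520_000_000L);
        System.out.printf("dt = %.3f s%n", dt); // 20 ms between events
    }
}
```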