We are trying to detect which hand the user is holding their mobile device in: the right hand, the left hand, or both. As far as we know, this is impossible to do with 100% accuracy on current hardware, and we doubt it can even exceed 90% accuracy. Still, if you had to attempt it using the sensor data available on most smartphones today, how would you process that data and how would you make the decision?
Our initial thoughts:
- Checking the device's horizontal tilt angle via the gyroscope,
- Face recognition and eye/gaze angle estimation using the front camera.
If you ask why we want to do this: as devices become larger (e.g. the Samsung Note 2 and Note 3), reaching every part of the screen one-handed becomes harder, which causes ergonomic problems for users. We believe that if we can detect the holding hand automatically with reasonable accuracy, we can adapt our layouts to serve users better.
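To make the gyroscope/tilt idea above concrete, here is a minimal sketch of the decision step only. It assumes (this is an assumption, not a measured fact) that a phone held one-handed tends to roll slightly toward one edge, so a sustained lateral tilt read from the accelerometer's x-axis (`SensorEvent.values[0]` on Android, where an edge pointing upward reads a positive gravity component) can hint at the holding hand. The threshold value and the tilt-to-hand mapping are illustrative and would need per-user calibration; the class name `HandGuess` is hypothetical.

```java
// Hypothetical heuristic: guess the holding hand from a sustained
// lateral tilt. accelX is the accelerometer x-axis reading in m/s^2
// (on Android, positive when the device's left edge is raised... er,
// when the x-axis points upward, i.e. the device leans onto its left
// edge is negative; the sign convention here is an assumption to
// verify against real sensor logs and calibrate per user).
public class HandGuess {
    public enum Hand { LEFT, RIGHT, UNKNOWN }

    // thresholdMs2: minimum lateral gravity component (m/s^2) before
    // we commit to a guess; below it we report UNKNOWN.
    public static Hand classify(float accelX, float thresholdMs2) {
        if (accelX > thresholdMs2) {
            return Hand.LEFT;   // assumed mapping: tilt one way -> left hand
        }
        if (accelX < -thresholdMs2) {
            return Hand.RIGHT;  // assumed mapping: opposite tilt -> right hand
        }
        return Hand.UNKNOWN;    // too close to level to decide
    }

    public static void main(String[] args) {
        System.out.println(classify(2.5f, 1.5f));   // strong positive tilt
        System.out.println(classify(-2.5f, 1.5f));  // strong negative tilt
        System.out.println(classify(0.3f, 1.5f));   // near level -> UNKNOWN
    }
}
```

In a real app you would feed `classify` a low-pass-filtered average of recent readings rather than a single sample, since touch taps and walking add high-frequency noise; falling back to UNKNOWN keeps the default layout when the signal is ambiguous.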
Thanks to everyone who shares their thoughts.
android mobile detection android-sensors sensor-fusion
mehmet6parmak