How to recognize a high five on the screen

I have a client who wants to know when the user slaps the screen with their whole hand, as in giving it a high five. I suspect Apple will not approve such an app, but let's set that aside.

I could use a four-finger gesture recognizer, but that doesn't really cover it. A better approach would be to check whether the user covers at least 70% of the screen with their hand, but I don't know how to do that.
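To make the idea concrete, here is roughly what I imagine. It is only a sketch: it assumes that UITouch's majorRadius gives a usable estimate of each contact patch, which may well not hold for a flat palm.

import UIKit

class HandCoverageView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        multipleTouchEnabled = true   // we need every simultaneous touch
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        multipleTouchEnabled = true
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        super.touchesBegan(touches, withEvent: event)
        let allTouches = event?.allTouches() ?? touches
        // Approximate each contact as a circle of radius majorRadius (in points).
        let coveredArea = allTouches.reduce(CGFloat(0)) { area, touch in
            area + CGFloat(M_PI) * touch.majorRadius * touch.majorRadius
        }
        let screenArea = bounds.width * bounds.height
        if coveredArea >= 0.7 * screenArea {
            print("Looks like a whole-hand touch")
        }
    }
}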

Can anyone help me out here?

4 answers

Kind of solved it. The proximity sensor plus the accelerometer works quite well. Multi-touch does not work, since it ignores touches it does not consider finger-like.

import UIKit
import CoreMotion
import AVFoundation

class ViewController: UIViewController {

    var lastHighAccelerationEvent: NSDate? {
        didSet {
            checkForHighFive()
        }
    }
    var lastProximityEvent: NSDate? {
        didSet {
            checkForHighFive()
        }
    }
    var lastHighFive: NSDate?
    var manager = CMMotionManager()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Start proximity monitoring (this turns the screen off when something is close)
        UIDevice.currentDevice().proximityMonitoringEnabled = true
        NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(proximityChanged), name: UIDeviceProximityStateDidChangeNotification, object: nil)

        // Watch the accelerometer for a spike
        manager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (data, error) in
            let sum = abs(data!.acceleration.y + data!.acceleration.z + data!.acceleration.x)
            if sum > 3 {
                self.lastHighAccelerationEvent = NSDate()
            }
        }

        // Enable multitouch
        self.view.multipleTouchEnabled = true
    }

    func checkForHighFive() {
        // Debounce: ignore anything within a second of the last high five
        if let lastHighFive = lastHighFive where abs(lastHighFive.timeIntervalSinceDate(NSDate())) < 1 {
            print("Time filter")
            return
        }
        guard let lastProximityEvent = lastProximityEvent else { return }
        guard let lastHighAccelerationEvent = lastHighAccelerationEvent else { return }
        // The proximity and acceleration events must coincide within 0.1 s
        if abs(lastProximityEvent.timeIntervalSinceDate(lastHighAccelerationEvent)) < 0.1 {
            lastHighFive = NSDate()
            playBoratHighFive()
        }
    }

    func playBoratHighFive() {
        print("High Five")
        let player = try! AudioPlayer(fileName: "borat.mp3")
        player.play()
    }

    func proximityChanged() {
        if UIDevice.currentDevice().proximityState {
            self.lastProximityEvent = NSDate()
        }
    }
}
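Note that AudioPlayer is not an AVFoundation class; it is presumably a small helper of the answerer's own. A hypothetical stand-in along these lines (my code, not theirs) would make the snippet compile:

import AVFoundation

// Hypothetical stand-in for the AudioPlayer helper used above: a thin
// wrapper around AVAudioPlayer that loads a bundled file by name.
class AudioPlayer {
    private let player: AVAudioPlayer

    init(fileName: String) throws {
        // Split "borat.mp3" into name and extension to find it in the bundle.
        let name = (fileName as NSString).stringByDeletingPathExtension
        let ext = (fileName as NSString).pathExtension
        guard let url = NSBundle.mainBundle().URLForResource(name, withExtension: ext) else {
            throw NSError(domain: "AudioPlayer", code: 1, userInfo: [NSLocalizedDescriptionKey: "\(fileName) not found in bundle"])
        }
        player = try AVAudioPlayer(contentsOfURL: url)
        player.prepareToPlay()
    }

    func play() {
        player.play()
    }
}

Also be aware that because player is a local constant in playBoratHighFive, it may be deallocated before the sound finishes; keeping a strong reference in the view controller avoids that.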

You can use the accelerometer to detect the impact of the hand, and examine the front camera feed for a correspondingly dark frame caused by the hand covering the camera* (see the sketch after the note below).

* NB: a human hand may not be big enough to cover the front camera on an iPhone 6 Plus.
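A sketch of the dark-frame half of this idea, assuming Swift 2, the low session preset, a biplanar YUV pixel format, and a hand-picked brightness threshold (all of those are my choices, not a tested recipe):

import AVFoundation

// Watches the front camera and calls onDarkFrame when the average luma of a
// frame drops low enough to suggest the lens is covered.
class DarkFrameDetector: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    let session = AVCaptureSession()
    var onDarkFrame: (() -> Void)?

    func start() {
        session.sessionPreset = AVCaptureSessionPresetLow

        // Pick the front camera.
        let devices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) as? [AVCaptureDevice] ?? []
        guard let front = devices.filter({ $0.position == .Front }).first else { return }
        guard let input = try? AVCaptureDeviceInput(device: front) else { return }
        guard session.canAddInput(input) else { return }
        session.addInput(input)

        // Ask for a biplanar YUV format so plane 0 is plain luma (brightness).
        let output = AVCaptureVideoDataOutput()
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String:
                                Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
        output.setSampleBufferDelegate(self, queue: dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        CVPixelBufferLockBaseAddress(buffer, 0)
        let width = CVPixelBufferGetWidthOfPlane(buffer, 0)
        let height = CVPixelBufferGetHeightOfPlane(buffer, 0)
        let rowBytes = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
        let luma = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(buffer, 0))
        var total = 0, count = 0
        // Sample a sparse grid of pixels instead of every one.
        for y in 0.stride(to: height, by: 8) {
            for x in 0.stride(to: width, by: 8) {
                total += Int(luma[y * rowBytes + x])
                count += 1
            }
        }
        CVPixelBufferUnlockBaseAddress(buffer, 0)
        let average = Double(total) / Double(max(count, 1))
        if average < 25 {   // near-black frame: the camera is probably covered
            dispatch_async(dispatch_get_main_queue()) { self.onDarkFrame?() }
        }
    }
}

You would then pair its onDarkFrame callback with an accelerometer spike in the same way the accepted answer pairs proximity with acceleration.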


I can't comment yet, so this isn't really an answer. But maybe you could add three recognizers, one at the bottom and two on the sides (or some other arrangement), and fire the event when all of them, or at least two, trigger at roughly the same time. A rough sketch of what I mean is below.
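This is only a sketch with invented names and thresholds, not tested: three invisible strips along the bottom and the two sides, each with its own tap recognizer, and the "slap" fires only when at least two of them report a touch within a short window.

import UIKit

class EdgeSlapViewController: UIViewController {

    private var recentHits: [NSDate] = []

    override func viewDidLoad() {
        super.viewDidLoad()

        let thickness: CGFloat = 60
        let bounds = view.bounds
        let frames = [
            CGRect(x: 0, y: bounds.height - thickness, width: bounds.width, height: thickness), // bottom
            CGRect(x: 0, y: 0, width: thickness, height: bounds.height),                        // left side
            CGRect(x: bounds.width - thickness, y: 0, width: thickness, height: bounds.height)  // right side
        ]
        for frame in frames {
            let strip = UIView(frame: frame)
            strip.backgroundColor = UIColor.clearColor()   // transparent views still receive touches
            strip.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(stripTapped)))
            view.addSubview(strip)
        }
    }

    func stripTapped() {
        let now = NSDate()
        recentHits.append(now)
        // Keep only hits from the last 0.2 seconds and require at least two of them.
        recentHits = recentHits.filter { abs($0.timeIntervalSinceDate(now)) < 0.2 }
        if recentHits.count >= 2 {
            print("Possible whole-hand slap")
            recentHits.removeAll()
        }
    }
}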


You can detect the number of fingers when handling multi-touch events; check this answer.
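For reference, counting fingers in the touch callbacks takes only a few lines. A minimal sketch, assuming multipleTouchEnabled is set on the view:

import UIKit

class FingerCountViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()
        view.multipleTouchEnabled = true   // otherwise only one touch is reported
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        super.touchesBegan(touches, withEvent: event)
        // allTouches() also includes touches that began earlier and are still down.
        let fingerCount = event?.allTouches()?.count ?? touches.count
        print("\(fingerCount) fingers on the screen")
    }
}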

