Touch events are propagated as follows:
- Each UIView is sent -hitTest:withEvent: recursively, until some UIView decides that the touch is inside its bounds and not inside any subview with user interaction enabled.
- UIWindow sends -touchesBegan:withEvent: and friends to the UIView returned in step 1, bypassing the rest of the view hierarchy.
In other words, -hitTest:withEvent: is used to determine the target view of the touch, after which the target view receives all the -touches...:withEvent: messages. If you need to intercept gestures that may start inside a UIButton, you will have to override -hitTest:withEvent: to return self.
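A minimal sketch of that override, assuming a custom container view (the class name is illustrative):

```swift
import UIKit

// Hypothetical container that wants to see all touches, even those
// that would normally be delivered to its button subviews.
class SwipeInterceptorView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // Let the default implementation decide whether the touch
        // lands inside this view's bounds at all.
        guard super.hitTest(point, with: event) != nil else { return nil }
        // Claim the touch for ourselves instead of the subview, so the
        // -touches...:withEvent: messages are delivered here.
        return self
    }
}
```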
But there is a problem with this approach. Once you do this, your button will stop working, because it will no longer receive any -touches...:withEvent: messages. You would have to forward the touches to the subviews manually whenever you decide they are not part of a swipe gesture. This is a serious pain in the butt, and it is not guaranteed to work at all. Isn't this exactly what UIGestureRecognizers are for?
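Attaching a UISwipeGestureRecognizer to the common superview sidesteps all of this, because recognizers observe touches without taking them away from the button. A sketch (the controller and action names are assumptions):

```swift
import UIKit

class ContainerViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // The recognizer observes touches delivered anywhere in this
        // view's hierarchy, including over buttons, without requiring
        // any -hitTest:withEvent: tricks.
        let swipe = UISwipeGestureRecognizer(target: self,
                                             action: #selector(didSwipe(_:)))
        swipe.direction = .left
        // Keep delivering touches to the button so taps still work
        // when the swipe is recognized.
        swipe.cancelsTouchesInView = false
        view.addGestureRecognizer(swipe)
    }

    @objc private func didSwipe(_ recognizer: UISwipeGestureRecognizer) {
        // Handle the swipe here.
    }
}
```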
Another approach is to subclass UIWindow and override -sendEvent:, which may work better in your case.
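A sketch of the UIWindow approach: inspect every touch before normal dispatch, then hand the event on (the tracking logic shown is a placeholder):

```swift
import UIKit

class ObservingWindow: UIWindow {
    override func sendEvent(_ event: UIEvent) {
        // Inspect all touches before they are dispatched to the
        // hit-tested view; a global swipe could be tracked here.
        if event.type == .touches, let touches = event.allTouches {
            for touch in touches where touch.phase == .began {
                // e.g. record the starting point of a potential swipe
                _ = touch.location(in: self)
            }
        }
        // Always forward the event, or no view will receive touches.
        super.sendEvent(event)
    }
}
```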
In any case, be sure to read the Event Handling documentation carefully. Among other scary warnings, it says:
The classes of the UIKit framework are not designed to receive touches that do not pertain to them; in programmatic terms, this means that the view property of a UITouch object must hold a reference to the framework object for the touch to be handled.
Costique