hitTest: returns the wrong UIView

I have a view hierarchy that contains smaller views in a scroll view. Each view may contain objects such as buttons, etc.

For some reason, the buttons in those views do not respond to presses. Examining this further showed that although the touchesBegan event arrives in the scroll view, the button never fires. Calling hitTest:withEvent: shows that the button is never returned, even when the touch point lies inside it.

Below is log output showing the location of the touch in the scroll view, the item returned from hitTest:withEvent:, the location of the touch as reported by calling locationInView: on the expected item, and the hierarchy of the expected item (with frames printed). From this output I infer that the button should have been returned ...

Can anyone explain this? Did I miss something?
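For reference, the log below could be produced with code along these lines (a reconstruction; the question does not show the actual logging code, and expectedButton is a hypothetical reference to the button I expect to be hit):

// Hypothetical debug logging producing output like the log below.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    UIView *hit = [self hitTest:location withEvent:event];
    NSLog(@"touched (%@) on %@ (location in expected item: %@)",
          NSStringFromCGPoint(location), hit,
          NSStringFromCGPoint([touch locationInView:expectedButton]));

    // Walk up from the expected item to the window, printing each ancestor.
    for (UIView *v = expectedButton; v != nil; v = v.superview)
        NSLog(@"view: %@", v);
}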

touched ({451, 309}) on <VCViewContainersView: 0x4b31ee0; frame = (0 0; 748 1024); transform = [0, 1, -1, 0, 0, 0]; autoresize = W+H; layer = <CALayer: 0x4b32130>> (location in expected item: {17, 7.5})
expected touched item is:
view: <UIButtonLabel: 0x482b920; frame = (32 5; 36 19); text = 'Click'; clipsToBounds = YES; opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x4831370>>, layer transform: [1, 0, 0, 1, 0, 0]
 view: <UIRoundedRectButton: 0x482c100; frame = (50 50; 100 30); opaque = NO; layer = <CALayer: 0x482c450>>, layer transform: [1, 0, 0, 1, 0, 0]
  view: <UIImageView: 0x480f290; frame = (0 0; 320 255); opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x480e840>>, layer transform: [1, 0, 0, 1, 0, 0]
   view: <VCViewContainer: 0x4b333c0; frame = (352 246.5; 320 471.75); layer = <CALayer: 0x4b33d50>>, layer transform: [1, 0, 0, 1, 0, 0]
    view: <UIScrollView: 0x4b32600; frame = (0 0; 1024 748); clipsToBounds = YES; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x4b32780>>, layer transform: [1, 0, 0, 1, 0, 0]
     view: <VCViewsContainerView: 0x4b31ee0; frame = (0 0; 748 1024); transform = [0, 1, -1, 0, 0, 0]; autoresize = W+H; layer = <CALayer: 0x4b32130>>, layer transform: [0, 1, -1, 0, 0, 0]
      view: <UIWindow: 0x4b1d590; frame = (0 0; 768 1024); opaque = NO; autoresize = RM+BM; layer = <CALayer: 0x4b1d6d0>>, layer transform: [1, 0, 0, 1, 0, 0]

Update. Apart from UIWindow and VCViewsContainerView, all views are created programmatically using initWithFrame: or, in the case of the button, buttonWithType:. The VCViewContainer is initialized with CGRectZero, and when the UIImageView is created, its frame is set to the image size plus extra space for labels at the bottom.
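A minimal sketch of the setup described above, reconstructed from the logged hierarchy (the variable names, image, and frames are assumptions, not the actual code):

// Hypothetical reconstruction of the hierarchy shown in the log.
UIImageView *imageView = [[UIImageView alloc] initWithImage:image];
imageView.frame = CGRectMake(0, 0, 320, 255); // image size + label space
// Note: UIImageView has userInteractionEnabled == NO by default.

UIButton *button = [UIButton buttonWithType:UIButtonTypeRoundedRect];
button.frame = CGRectMake(50, 50, 100, 30);
[button setTitle:@"Click" forState:UIControlStateNormal];
[imageView addSubview:button];

VCViewContainer *container = [[VCViewContainer alloc] initWithFrame:CGRectZero];
[container addSubview:imageView];
[scrollView addSubview:container];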

Comment: have you tried [self.layer hitTest:location]?

+5

hitTest:withEvent: skips any view whose userInteractionEnabled is NO: such a view returns nil from hitTest:withEvent:, and with it its entire subtree is excluded, no matter where the touch lands.

Your UIScrollView has userInteractionEnabled set to NO (see the log). When the hit test reaches VCViewContainersView, it asks its subview, the UIScrollView, which returns nil because userInteractionEnabled is NO; since VCViewContainersView's own pointInside:withEvent: returns YES, the hit test stops there and returns the container itself (which is exactly what your log shows). The button is never reached.

In other words, everything is behaving as documented: once any ancestor has userInteractionEnabled set to NO, no descendant can be hit. Layer hit-testing is a kludge here, because layers know nothing about userInteractionEnabled (or touch handling at all), so the layer's hitTest: can return a layer whose view would never actually receive the touch. That is the difference between hitTest: on CALayer and hitTest:withEvent: on UIView.

So if you want hitTest:withEvent: to descend into the UIScrollView and reach the button, set the scroll view's userInteractionEnabled back to YES.
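For reference, the default traversal behaves roughly like this (a simplified sketch of the documented algorithm, not the actual UIKit source):

// Simplified sketch of what -[UIView hitTest:withEvent:] does by default.
- (UIView *)hitTest:(CGPoint)point withEvent:(UIEvent *)event
{
    // Disabled, hidden, or nearly transparent views are skipped
    // entirely -- along with their whole subtree.
    if (!self.userInteractionEnabled || self.hidden || self.alpha < 0.01)
        return nil;
    if (![self pointInside:point withEvent:event])
        return nil;

    // Subviews are checked front-to-back (the last subview is frontmost).
    for (UIView *subview in [self.subviews reverseObjectEnumerator])
    {
        CGPoint converted = [subview convertPoint:point fromView:self];
        UIView *hit = [subview hitTest:converted withEvent:event];
        if (hit)
            return hit;
    }
    return self; // no subview claimed the touch, so this view does
}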

+5

I ran into the same problem with a UIScrollView swallowing touches.

A UIScrollView intercepts touches itself in order to decide whether the user is dragging, so events do not always make it to its subviews or up the responder chain.

I found a solution here on SO: subclass UIScrollView and, when the user is not dragging, pass the touch on to the next responder.

.h:

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

@interface TouchScroller : UIScrollView 
{
}

@end

.m:

#import "TouchScroller.h"


@implementation TouchScroller

- (id)initWithFrame:(CGRect)frame
{
    return [super initWithFrame:frame];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // If not dragging, hand the touch to the next responder so the
    // views behind the scroll view still receive the event.
    if (!self.dragging)
        [self.nextResponder touchesEnded:touches withEvent:event];
    else
        [super touchesEnded:touches withEvent:event];
}

@end

Hope this helps.
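To use it, drop TouchScroller in wherever the plain UIScrollView was created (a minimal sketch; the frame and containerView are assumptions):

#import "TouchScroller.h"

TouchScroller *scrollView =
    [[TouchScroller alloc] initWithFrame:CGRectMake(0, 0, 1024, 748)];
scrollView.userInteractionEnabled = YES; // must be YES for touches to arrive
[containerView addSubview:scrollView];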

+1

If I read the view hierarchy correctly, your button is a subview of a UIImageView, which has its userInteractionEnabled property set to NO by default, so the image view and all of its subviews will not receive any touch events.

Setting the image view's userInteractionEnabled property to YES should solve the problem.
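That is (the variable name is assumed):

imageView.userInteractionEnabled = YES; // UIImageView defaults to NO
[imageView addSubview:button];          // now the button can receive touches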
