UIButton Clicks Being Intercepted by Programmatic Layer


I'm working on an app that includes the screen shown below. The panel with the list view is instantiated from a Nib, but the pale grey panel with the drawing in it is a dynamically generated UIView, which is a subview of a UIView subclass called FrameView (for the purposes of the question).

The red dot in the corner is a delete button for that drawing. The drawing is the content of a Drawing Object, which has a many-to-many relationship to the item selected in the list. When I select an item in the list, zero or more such panels, showing the drawings for that item, are added as subviews of FrameView.

In order for those delete buttons to be clickable, FrameView has user interaction enabled. This is done when I select an item in the list; it's off when FrameView first appears.

At the bottom left is the key navigation button. It has a variety of gestures and clicks associated with it that allow the user to move between the different editors that use the main screen. This button has a relatively high zPosition in the main view.

But once FrameView has its user interaction turned on, it stops clicks and gestures from reaching the navigation button.

I would have thought that increasing the zPosition of the navigation button above FrameView would solve the problem, but it doesn't. How can I make the navigation button receive taps and gestures, even when FrameView has user interaction enabled? Or am I going about this the wrong way?

EDIT: I meant to mention that the navigation button is the only element added via Storyboard, in case that matters.

[screenshot of the screen described above]

EDIT 2: After some messing around, I'm overriding the hitTest, so:

    // In FrameView: let touches reach subviews (the delete buttons),
    // but if the hit lands on FrameView itself, return nil so the
    // touch falls through to whatever is behind it.
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let view = super.hitTest(point, with: event)
        return view == self ? nil : view
    }
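For reference, an equivalent approach (not from the original post) is to override point(inside:with:) on the container, so it only claims touches that land on one of its interactive subviews. A minimal sketch, with a hypothetical class name:

```swift
import UIKit

// Hypothetical pass-through container: it claims a touch only when the
// point lands on a visible, interactive subview; otherwise the touch
// falls through to views behind it.
class PassthroughView: UIView {
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        for subview in subviews
        where !subview.isHidden && subview.alpha > 0 && subview.isUserInteractionEnabled {
            if subview.frame.contains(point) {
                return true
            }
        }
        return false
    }
}
```

Either override achieves the same effect; hitTest(_:with:) is the lower-level hook, while point(inside:with:) is slightly more self-describing.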

This wasn't sourced from SO, and although there were some answers here that vaguely suggested this approach, they were (as is common on SO) a) associated with obsolete versions of Swift, b) buried in a different context and c) not returned by any obvious searches.

This site has got to do something about obsolete, heavily upvoted answers. I think that Swift has got to be the worst case for this, since there really are so few users of the older versions, thanks to Apple's forced-upgrade policies.

Thanks to Ptit Xav for sticking your head into my mess.

1 Answer

"I'm increasing the zPosition using .layer.zPosition"

OK, that's the issue.

Changing the .zPosition of a view's layer does NOT change its order in the view hierarchy.

Take a look at this layout, with two buttons added in Storyboard / IB:

[screenshot: Storyboard layout with the two buttons]

Using this basic code, we'll add a "frameView" subview, with user interaction enabled, so that it covers both buttons.

We'll use .layer.zPosition to make the First button visible, but we CAN'T tap it.

We'll use .bringSubviewToFront() to make the Second button not only visible, but it will also change the view Hierarchy so we CAN tap it.

class HierarchyViewController: UIViewController {
    
    @IBOutlet var firstButton: UIButton!
    @IBOutlet var secondButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        
        let frameView = UIView()
        frameView.backgroundColor = .systemBlue
        frameView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(frameView)
        let g = view.safeAreaLayoutGuide
        NSLayoutConstraint.activate([
            frameView.topAnchor.constraint(equalTo: g.topAnchor, constant: 20.0),
            frameView.leadingAnchor.constraint(equalTo: g.leadingAnchor, constant: 20.0),
            frameView.trailingAnchor.constraint(equalTo: g.trailingAnchor, constant: -20.0),
            frameView.bottomAnchor.constraint(equalTo: g.bottomAnchor, constant: -20.0),
        ])

        // make sure our added view has user interaction enabled
        frameView.isUserInteractionEnabled = true
        
        // change the zPosition of the first button
        firstButton.layer.zPosition = 999
        
        // change the view Hierarchy for the second button
        view.bringSubviewToFront(secondButton)
    }
    
    @IBAction func firstButtonTap(_ sender: Any) {
        print("First Tapped!")
    }
    
    @IBAction func secondButtonTap(_ sender: Any) {
        print("Second Tapped!")
    }
    
}

Here's how it looks at runtime:

[screenshot: the layout at runtime]

No visual difference, but if we use Debug View Hierarchy we can clearly see that the First button is behind the "frameView" while the Second button is in front of it:

[screenshot: Debug View Hierarchy showing the buttons relative to "frameView"]

While "frameView" has User Interaction disabled, we can "tap through" it to the First button. But once we enable User Interaction on "frameView", it grabs the touch and we can't tap the First button. Changing the view hierarchy with .bringSubviewToFront() resolves the issue.

(Note: I used a blue background to make "frameView" easy to see; the same applies if it has a clear background.)
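Applied back to the original question, this means bringing the navigation button forward at the point where FrameView's user interaction is switched on. A minimal sketch — `frameView` and `navButton` are assumed outlet names, not from the original post:

```swift
import UIKit

// Hypothetical view controller sketch; `frameView` and `navButton`
// are illustrative names for the views described in the question.
class EditorViewController: UIViewController {
    @IBOutlet var frameView: UIView!
    @IBOutlet var navButton: UIButton!

    func didSelectItem() {
        // Turning interaction on makes frameView swallow touches...
        frameView.isUserInteractionEnabled = true
        // ...so move the navigation button above it in the subview
        // order, where hit-testing will reach it first.
        view.bringSubviewToFront(navButton)
    }
}
```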