Buttons Over Views
One enemy of accessible apps is the use of plain views in place of buttons. A button (UIButton on iOS, NSButton on macOS) handles taps and clicks from users, and for users who have trouble doing that, iOS and macOS provide accessibility support so they can still get their work done. Many developers (myself included, once upon a time) have thought to themselves, "Oh hey, there's this tap gesture recognizer; I can just add it to a view rather than use a button!"
At first glance, this seems to do the same thing: if a user taps or clicks the view, the gesture recognizer runs your code. But that replaces only one part of a button's job. The accessibility system doesn't know about your fake button! A blind user who tries to use the feature hidden behind this gesture recognizer is out of luck: they won't hear the hints VoiceOver supplies, and they won't be able to ask their Mac to click the view with Voice Control.
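To make the difference concrete, here is a minimal sketch of both approaches. The names (`ShareViewController`, `didTapShare`) are hypothetical, not from the original text; the point is that both views respond to taps, but only the real button announces itself to assistive technologies:

```swift
import UIKit

final class ShareViewController: UIViewController {
    // The gesture-recognizer approach: taps work, but the accessibility
    // system sees only a plain view, so VoiceOver and Voice Control
    // have no idea it can be activated.
    private func makeFakeButton() -> UIView {
        let view = UIView()
        view.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(didTapShare))
        )
        return view
    }

    // A real UIButton carries the .button accessibility trait, speaks its
    // title as its label, and can be activated by assistive technologies,
    // all without extra work on our part.
    private func makeRealButton() -> UIButton {
        let button = UIButton(type: .system)
        button.setTitle("Share", for: .normal)
        button.addTarget(self, action: #selector(didTapShare), for: .touchUpInside)
        return button
    }

    @objc private func didTapShare() {
        // Hypothetical action shared by both controls.
    }
}
```

With the real button, VoiceOver reads "Share, button" and Voice Control can respond to "Tap Share"; the fake button gets neither for free.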
It can be easy to consider only the happy path for our code, and that includes the able-bodied-user path. As many complaints as we may have about them, engineers at Apple worked hard for a long time on UIKit and AppKit. They put together a comprehensive system that makes it easy for us to help users with low or no vision, limited mobility, and more. Wouldn't it be a shame to throw out that work because we were clever and thought we could replace the button?