I recently needed to support some basic gesture recognition in an iPhone app. As it turns out, this is a little bit tricky to do when the touches occur within the same responder chain as a view that already responds to touches (like a UIScrollView or UIWebView, which might need to scroll or zoom in addition to handling my custom gestures).
When the iPhone detects a touch it determines the first responder for that event by recursively calling hitTest:withEvent: to descend the tree of UIResponder objects in the application’s window. The first responder is then sent the event and can either respond to it or pass it up the responder chain from view to view controller to parent view. (See “Event Delivery” in the iPhone Application Programming Guide.)
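The recursive descent can be sketched with a stand-in `Node` type in place of UIView — the names `Node`, `Point`, and `Rect` below are illustrative, not the UIKit API, but the search order (deepest containing descendant wins, front-most subviews checked first) mirrors what hitTest:withEvent: does:

```swift
struct Point { var x: Double; var y: Double }

struct Rect {
    var x: Double, y: Double, width: Double, height: Double
    func contains(_ p: Point) -> Bool {
        p.x >= x && p.x < x + width && p.y >= y && p.y < y + height
    }
}

class Node {
    let name: String
    let frame: Rect            // in the parent's coordinate space
    var subnodes: [Node] = []
    init(name: String, frame: Rect) { self.name = name; self.frame = frame }

    // Return the deepest descendant containing the point, checking
    // front-most (last-added) subnodes first; nil if the point is outside.
    func hitTest(_ p: Point) -> Node? {
        guard frame.contains(p) else { return nil }
        let local = Point(x: p.x - frame.x, y: p.y - frame.y)
        for sub in subnodes.reversed() {
            if let hit = sub.hitTest(local) { return hit }
        }
        return self
    }
}
```

A touch at a point inside a nested button lands on the button; a touch inside only the window lands on the window itself, which is why a view deep in the hierarchy normally sees the event before anything else does.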
I wanted to be able to detect gestures anywhere in the app so I chose to override sendEvent: in UIWindow. This allowed me to intercept touch events before they were sent to the first responder. Basic left/right swipe detection similar to Apple’s “detecting swipe gestures” example is included below.
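In outline, the subclass looks something like the following. This is a hedged sketch in Swift rather than the Objective-C of the original post; `GestureWindow` and the `onTouches` hook are illustrative names, not UIKit API. The important part is that the touches are only observed, and `super.sendEvent(_:)` is always called so scroll views and web views underneath keep working:

```swift
import UIKit

final class GestureWindow: UIWindow {
    // Hypothetical hook: the app's gesture-detection code subscribes here.
    var onTouches: ((Set<UITouch>) -> Void)?

    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches, let touches = event.allTouches {
            onTouches?(touches)   // inspect without consuming
        }
        super.sendEvent(event)    // then deliver to the first responder as usual
    }
}
```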
Unlike Apple’s example, I track the original touch in the gesture rather than just comparing the last two touches. This way a slow swipe where the individual touch events may be close together will still trigger a swipe, and a slight reverse of direction at the end of the touch will not unexpectedly reverse the direction of the swipe. Depending on the needs of the app and the type of gesture it needs to support, it might also be worth considering the velocity or acceleration between touches, the overall shape of the touch path, or other factors.
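The start-point approach can be sketched as a small tracker that compares each touch against where the gesture began rather than against the previous sample. The type name and thresholds below are illustrative, not from the original code:

```swift
struct SwipeTracker {
    enum Swipe { case left, right }

    let minHorizontal: Double = 100   // net x travel required for a swipe
    let maxVertical: Double = 40      // max y drift before we give up

    private var start: (x: Double, y: Double)?

    mutating func touchBegan(x: Double, y: Double) {
        start = (x, y)
    }

    // Called when the touch ends; returns the detected swipe, if any.
    // Because we measure net displacement from the start point, a slow
    // swipe still qualifies, and a small backtrack at the end cannot
    // flip the reported direction.
    mutating func touchEnded(x: Double, y: Double) -> Swipe? {
        defer { start = nil }
        guard let s = start, abs(y - s.y) <= maxVertical else { return nil }
        let dx = x - s.x
        if dx >= minHorizontal { return .right }
        if dx <= -minHorizontal { return .left }
        return nil
    }
}
```

Velocity or path-shape checks, as mentioned above, would slot in alongside the displacement test, for example by also recording a timestamp in `touchBegan`.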