These notes relate to the gesture and inking functions.
- Positional Touch Events
Positional touch events: Under Mac OS X 10.10.2 and above we discovered that applications were no longer receiving the OS X positional touch events created by UPDD Gestures, only the gesture events.
Our investigations revealed that positional touch events simply weren't being passed along to applications unless they came from a genuine Apple trackpad. When force click events were introduced in 10.10.2 we also found that our attempts to send force click events (from touch screens that support pressure) were blocked. We suspect that the 10.10.2 update blocks both force click events and positional touch events unless they come from a trackpad, ignoring any generated from a different source.
As it stands, we believe few OS X apps, with the exception of drawing apps, use the position of individual touches, so we are not sure whether this restriction will cause any real issues. Further, OS X's positional touch events were always meant to represent a relative 'touch' from a trackpad, not an absolute touch from a touch screen, and are therefore not related to an absolute position on the desktop or within an application. Many applications rely on OS X gesture events, and these are still processed; gesture handlers such as trackSwipeEventWithOptions, rotateWithEvent and magnifyWithEvent will still receive gesture information like rotation direction, swipe direction, zoom amount, etc. Applications that respond to the low-level multi-touch events, such as touchesBegan, touchesMoved and touchesEnded, will not be satisfied.
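The filtering we observed can be modeled as the event system dropping positional touch events (and force clicks) whose source is not a trackpad, while letting gesture events through. A toy Python model of that behavior (our own illustration, not Apple code; the event fields are invented for the sketch):

```python
# Toy model of the 10.10.2 behaviour described above: gesture events are
# delivered regardless of source, but positional touch events and force
# clicks are dropped unless they originate from a genuine trackpad.

def deliver(event):
    """Return True if the event would reach the application."""
    kind = event["kind"]      # "gesture", "touch", or "force_click"
    source = event["source"]  # "trackpad" or "touchscreen"
    if kind == "gesture":
        return True
    # Positional touches and force clicks: trackpad only.
    return source == "trackpad"

print(deliver({"kind": "gesture", "source": "touchscreen"}))     # True
print(deliver({"kind": "touch", "source": "touchscreen"}))       # False
print(deliver({"kind": "force_click", "source": "trackpad"}))    # True
```

This matches what we saw: gesture handlers keep working from any device, while low-level touch handlers only see trackpad input.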
- Gesture renaming
With gesture version 2.6.2 we renamed "Swipe between pages" to "Navigate between pages", as customers were configuring "Swipe between pages" for two finger swipe gestures but it did not produce the required behavior. We concluded the name of this action was misleading and needed to be changed.
To explain why, it's important to note that Apple's own nomenclature regarding swiping between pages in Safari is inconsistent and a bit confusing. There are actually two different trackpad gestures that will go back and forward in Safari. One is by swiping left or right with two fingers which produces a nice horizontal scrolling effect. The second is swiping left or right with three or four fingers, which does not have a transition, and is equivalent to clicking the back or forward buttons. Both of these gestures are called "Swipe between pages" by Apple, but internally they are two very different gestures. You can only configure a trackpad to use one or the other.
In UPDD Gestures, "Swipe between pages" refers to the second gesture that does not have a transition. The first gesture that does have a transition was called "Scroll" because internally that gesture is actually a special version of a scroll gesture.
Therefore in 2.6.2 we renamed "Scroll" to "Scroll & Swipe between pages" and "Swipe between pages" to "Navigate between pages" to make it clearer what they actually do.
- Really important point: When using gestures in Mac OS X the gestures are processed by the application window under the mouse cursor. Dual and multi-touch gestures can be performed on any part of the touch screen but will be processed by the window under the cursor. So, for example, if you have a Preview window open and the cursor is in the viewing area then that area will respond to gestures. If the cursor is on the Preview window but not in the viewing area then gestures will be ignored. Since gesture version 2.0.47 the zoom, rotate and scroll actions have an option for performing the gesture either on the item under the mouse cursor or on the item under the position where the gesture occurs on the screen (the mouse cursor is automatically re-positioned), e.g.
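As a rough illustration of the reposition option, the cursor target can be taken as the centroid of the active touch points. A minimal sketch (our own illustration, not UPDD source code), assuming touches arrive as absolute (x, y) screen coordinates:

```python
# Illustrative sketch only: computes where the cursor would be moved
# before an action when the reposition option is enabled. Touch points
# are assumed to be absolute (x, y) screen coordinates.

def reposition_target(touches):
    """Return the centroid of the active touch points."""
    if not touches:
        raise ValueError("a gesture involves at least one touch")
    n = len(touches)
    x = sum(t[0] for t in touches) / n
    y = sum(t[1] for t in touches) / n
    return (x, y)

# Two-finger gesture: the cursor lands midway between the fingers.
print(reposition_target([(100, 200), (300, 400)]))  # (200.0, 300.0)
```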
We were asked if this setting could be made available for any gesture. Unfortunately the current design of the settings dialog does not readily cater for this, so although the code is in place to facilitate this request, the setting has to be set manually in the settings file.
The setting is gesture related. The command line interface can be used to create a 'nodevice' setting with a name like:

gesture action option [gesture] override reposition cursor

...where [gesture] is replaced with the name of a gesture, such as "tap" or "three finger swipe left".
If you set it to 1 then the cursor will always be re-positioned underneath the gesture before UPDD Gestures performs an action. Currently there is no way to prevent the cursor being re-positioned for gesture actions where that always occurs, such as any of the mouse related actions.
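To illustrate, the per-gesture setting name is formed by substituting the gesture name into the template above. A small sketch (the exact syntax accepted by the UPDD command line interface may differ; only the template itself comes from these notes):

```python
# Sketch of how the per-gesture 'nodevice' setting name is composed from
# the template quoted above. Anything beyond the template is assumed.

TEMPLATE = "gesture action option [gesture] override reposition cursor"

def reposition_setting_name(gesture):
    """Substitute a gesture name into the setting-name template."""
    return TEMPLATE.replace("[gesture]", gesture)

print(reposition_setting_name("tap"))
# gesture action option tap override reposition cursor
print(reposition_setting_name("three finger swipe left"))
# gesture action option three finger swipe left override reposition cursor
```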
- Enabling accessibility features
Most gestures require that accessibility features are enabled for the gesture application, and this will be requested during install. If they are not enabled then the Gesture GUI will offer an Enable Accessibility option that can be selected to invoke the system dialog used to enable accessibility.
Whilst 'accessibility access' is disabled the Gesture GUI will disable any functions that require it to be enabled, as seen below:
- Touches on non-tablet devices simulate tablet input
When this setting is enabled:
- Any mouse events created by Gestures through "Click" and "Click and drag" actions are also tablet events. This is because in OS X tablet events are actually special mouse events with extra tablet data included.
- The "Ink" preference pane will be visible in System Preferences, if it was not already, allowing Inking to be configured. If Inking is enabled, it can be used with Gestures through the "Click" and "Click and drag" actions.
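Conceptually, an OS X tablet event is just a mouse event carrying extra tablet data, which is why clicks synthesized by Gestures can double as tablet input when this setting is enabled. A toy model of that relationship (purely illustrative; the field names are invented and are not the actual NSEvent properties):

```python
# Toy model: a "tablet event" is a mouse event with extra tablet data
# attached. Field names here are invented for illustration only.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TabletData:
    pressure: float   # 0.0 .. 1.0, e.g. from a pressure-capable screen
    device_id: int

@dataclass
class MouseEvent:
    x: float
    y: float
    tablet: Optional[TabletData] = None  # present => also a tablet event

    def is_tablet_event(self) -> bool:
        return self.tablet is not None

plain_click = MouseEvent(10, 20)
touch_click = MouseEvent(10, 20, TabletData(pressure=0.7, device_id=1))
print(plain_click.is_tablet_event(), touch_click.is_tablet_event())
# False True
```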
- Reset Mouse Cursor
The 'reset mouse cursor' feature has some limitations. It may interfere with normal interaction with menus, since menus respond to the position of the mouse cursor. For example, you could tap a menu, and then the cursor could reset to a position inside the menu, over an item you do not wish to select. It's a minor annoyance and one we could address if it crops up in 'real world' usage: Gestures could recognize when a menu is being used and wait to reset the cursor until it closes.
- Accessibility API issues causing gesture processing delay in 10.11
There appears to be a bug in the Accessibility API whereby querying which UI element is under a point on the screen can take a second or two the first time it is done over a particular window or application. If it is an accessibility bug then there is not much we can do about it, and we can only hope it is addressed by Apple in a future update. Gestures uses accessibility in a few specific places: when performing any window-related actions (e.g. close window, minimize window, zoom window), when performing the "Toggle notification center" action, for its "iOS simulator mode" feature, and when scrolling, in order to tell when the element being scrolled has reached the top or bottom.
Our main concern is the detrimental effect this could have when scrolling, as this is one of the most oft-used actions and the delay could occur any time scrolling is attempted in an application window that hasn't been scrolled before. The rest of the actions are either less critical or will only slow down the first time they are used.
The biggest culprit is the "iOS Simulator mode" feature, since it uses the accessibility API every single time a new touch begins - this *really* slowed things down in our tests. Given that this is a rarely used feature we've changed it to default to off.
At this point we don't know how widespread this issue is in OS X 10.11 or whether it's likely to affect any systems other than our own test systems. Please advise if you experience this issue.