Multi Touch


One of our most frequently asked questions is 'Do your drivers support multi-touch?'. A simple question with an apparently simple answer: 'Yes'.

However, this does not really answer the question fully, especially as the real question is often...

"Will my application work with multiple touches?"
or
"Why do I have multi-touch when I run my application in Windows but not in macOS?"
or
"Can I move more than one fader at a time on my DAW system?"
or
"Can I get multi-touch in a browser under macOS?"

Read on...

These are the multi-touch issues to consider...

Hardware
The most obvious requirement in supporting multi-touch is that the touch device in use physically supports multi-touch and is capable of transmitting independent touch data per stylus to waiting drivers.

Broadly speaking, touch devices offer single, dual or multi-touch functionality. Dual touch systems are often advertised as 'multi-touch' but of course would not be useful if your usage requires more than two touches.

Dual touch systems can support many of the widely used gestures, such as pinch, zoom, rotate and two-finger scroll, but of course cannot be used for any gestures requiring more than two touches. For example, our Mac driver supports the two- to five-finger gesture range implemented in macOS, so you could not invoke three-, four- or five-finger gestures with a dual touch device.

True multi-touch devices tend to offer 5 to 10 touch support, but some technologies now offer much greater multi-touch capabilities. We are not too sure what you might do with 100 simultaneous touches, but if you need it, it's available!

For the purposes of this document, 'multi-touch' will now refer to any device that supports more than one touch.

Cautionary note: Some multi-touch devices are aimed solely at multi-touch aware operating systems (e.g. Windows 7 and above). These always operate in multi-touch mode and are not aimed at non multi-touch aware operating systems such as macOS, CE, Solaris and some Linux distributions. Other multi-touch devices operate in single touch mode unless they receive a request from the OS to enter multi-touch mode, so although they are multi-touch devices they can in some cases default to running in single touch mode. When running in single touch mode the touch screen has a better chance of functioning correctly in non multi-touch aware systems than if it sends out multi-touch data, hence the dual mode capability. The native touch drivers in Windows 7 and above can determine that a device is multi-touch capable and will send USB requests to establish the maximum number of contacts it supports and to enter multi-touch mode. Most multi-touch devices that have defaulted to single touch mode will switch into multi-touch mode when they receive these requests.

Fortunately, our driver can also determine if a device is multi-touch capable and will send the same USB multi-touch requests, irrespective of operating system, hence we can induce multi-touch functionality in all operating systems that we support. However, this does rely on the ability of the device to switch into multi-touch mode when these requests are received. We know of one multi-touch device that ignores these requests and always operates in single touch mode in non-Windows operating systems, but it is very rare that we cannot get a multi-touch capable device to switch into multi-touch mode when using our driver.

Driver
Our driver supports multi-touch input. When two or more styluses are in contact with the device the driver receives touch data relating to the individual touch points. Each data stream contains a unique contact id, its X and Y co-ordinates and various other related data, such as pressure if the device supports it. The driver is also informed of the number of current stylus contacts.
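The per-contact data streams described above can be pictured as follows. This is an illustrative sketch of such a data stream, with invented names; it is not the actual UPDD API structure:

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class TouchReport:
    """One per-contact packet: a unique contact id, co-ordinates and
    related data such as pressure (field names are illustrative)."""
    contact_id: int
    x: float               # normalised 0.0 - 1.0 across the screen
    y: float
    touching: bool         # stylus down (True) or lifted (False)
    pressure: float = 0.0

class ContactTracker:
    """Maintains the current set of contacts from a stream of reports."""
    def __init__(self) -> None:
        self.contacts: Dict[int, TouchReport] = {}

    def update(self, report: TouchReport) -> int:
        if report.touching:
            self.contacts[report.contact_id] = report
        else:
            self.contacts.pop(report.contact_id, None)
        # the driver also knows the number of current stylus contacts
        return len(self.contacts)

tracker = ContactTracker()
tracker.update(TouchReport(0, 0.25, 0.50, True, 0.8))
print(tracker.update(TouchReport(1, 0.75, 0.50, True, 0.6)))  # → 2
```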

The driver's function is to interface with the hardware to receive touch data and then post it to various interfaces. The interface(s) defined to receive this touch data can be configured within the driver's settings and will be dictated by the system's and/or application's touch requirements.

Interfaces
There are various interfaces that will receive the touch data from the driver and, although they are implemented differently across the different operating systems, broadly speaking they perform similar functions.
UPDD API (always processed)
In all cases our driver will post the touch data on its own Application Programming Interface. The API will pass over touch position, touch status, pressure and gesture information.

Any application that uses the UPDD API functions (known as a UPDD Client program) will receive the touch data directly into the application.

Our test program is a UPDD client application and it will show you the touches being received from the API interface:

In this instance the touch data path from the physical touch screen to the application is touch hardware > driver > UPDD client application. It does not utilise any operating system interfaces, functions or APIs to receive touch data. It is a multi-touch application purely because it is processing all touches coming directly from the driver. The system pointer is not affected because the data comes direct from the driver.

Applications utilising the UPDD API operate independently of the system mouse movements and clicks and can be written to control all aspects of the touch interface, be it multi-touch or multi-user.

 Mouse Active_touch_interface
Windows - tbupddsu
macOS - simpletouch_cg
Linux - xtouch
Android - xtouch
All operating systems offer an interface for the posting of 'mouse' type data. Mouse data is, by its very nature, a single entity in that it relates to the system mouse device that controls the system pointer. When a touch screen is used to move the system mouse pointer and generate a click this is commonly known as 'mouse emulation'. The main difference between a real mouse and a touch screen is that the mouse/trackpad moves the cursor relative to its movement whereas a touch screen moves the cursor to the absolute point of touch.
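The relative-versus-absolute distinction above can be sketched in a few lines. The function names and screen size here are illustrative, not part of any UPDD interface:

```python
def absolute_to_pixels(nx: float, ny: float, width: int, height: int):
    """Touch screen: the cursor jumps to the absolute point of touch.
    nx, ny are normalised 0.0 - 1.0 positions on the screen."""
    return round(nx * (width - 1)), round(ny * (height - 1))

def relative_move(cursor, dx: int, dy: int, width: int, height: int):
    """Mouse/trackpad: the cursor moves relative to its current
    position, clamped to the screen edges."""
    x, y = cursor
    return (min(max(x + dx, 0), width - 1),
            min(max(y + dy, 0), height - 1))

# A touch at the bottom-right corner always lands at the same pixel...
print(absolute_to_pixels(1.0, 1.0, 1920, 1080))   # → (1919, 1079)
# ...whereas a mouse delta depends on where the cursor already is.
print(relative_move((100, 100), 50, -20, 1920, 1080))  # → (150, 80)
```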

Any application written to be driven via the mouse movement and clicks will work via this interface.

Our test program reacts to mouse events and it will show you the touches being received via this interface:

If the driver is configured to post touch data on the system's mouse interface the first or only data stream is used to control the mouse pointer.

In this instance the touch data path from the physical touch screen to the application is touch hardware > driver > mouse 'emulation' interface > desktop / application.

Changing the active_touch_interface setting to a non-mouse interface setting will effectively disable the mouse interface.

 Multi Active_touch_interface
Windows - upddvh
macOS - n/a (gestures)
Linux - uinput (multi)
Android - uinput (multi)

Some operating systems offer an interface for the posting of multi-touch data; some have open interfaces, others closed, proprietary interfaces.

Windows: Offers a HID interface to post multi-touch data into the OS. The UPDD driver posts touch data via a virtual HID device such that desktop and applications will receive multi-touch data as native touch events. Windows caters for both gesture and positional touch data.

Linux/Android: Offers an input subsystem for the posting of multi-touch data into the OS. The UPDD driver posts touch data via the uinput interface such that the desktop and applications will receive multi-touch data as native touch events. Uinput offers a user mode bridge to the Linux input subsystem and allows for the posting of both single and multi-touch data events. This interface requires uinput support to be part of the Linux kernel and should be available in most modern distributions.
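As a rough illustration of what gets posted through uinput, the sketch below builds the (type, code, value) event triplets of one Linux multi-touch protocol 'type B' frame as plain data. The constants come from linux/input-event-codes.h; a real driver would write these events through /dev/uinput, and this simplified sketch resends slot and tracking-id state every frame. It illustrates the kernel protocol, not UPDD's actual implementation:

```python
# Event type/code constants from linux/input-event-codes.h
EV_SYN, EV_ABS = 0x00, 0x03
SYN_REPORT = 0x00
ABS_MT_SLOT, ABS_MT_TRACKING_ID = 0x2F, 0x39
ABS_MT_POSITION_X, ABS_MT_POSITION_Y = 0x35, 0x36

def mt_frame(contacts):
    """Build one multi-touch 'type B' frame for a list of
    (slot, tracking_id, x, y) tuples. A tracking_id of -1
    lifts the contact in that slot."""
    events = []
    for slot, tracking_id, x, y in contacts:
        events.append((EV_ABS, ABS_MT_SLOT, slot))
        events.append((EV_ABS, ABS_MT_TRACKING_ID, tracking_id))
        if tracking_id != -1:
            events.append((EV_ABS, ABS_MT_POSITION_X, x))
            events.append((EV_ABS, ABS_MT_POSITION_Y, y))
    events.append((EV_SYN, SYN_REPORT, 0))  # end of frame
    return events

# Contact 45 held in slot 0, contact in slot 1 lifted:
frame = mt_frame([(0, 45, 512, 300), (1, -1, 0, 0)])
```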

In this instance the touch data path from the physical touch screen to the desktop/application is touch hardware > driver > multi-touch interface > desktop / application.

macOS: Because this operating system does not natively support touch screens, the desktop and applications have mainly been designed for multi-touch gesture control, not multi-touch positional data. Apple offers multi-touch trackpads and Magic Mice on which gestures are performed. Multiple touches on these devices are converted into gestures that are then processed by Apple's proprietary device drivers. Therefore desktop control and most 'multi-touch' applications are actually designed to support gestures and not positional touch data. Controlling a Mac system using multi-touch and UPDD can be seen in this video.

Drawing apps do support positional data, and these work fine with Apple's proprietary hardware (and drivers), but unfortunately positional data posted to the OS from our software has been ineffective since macOS 10.10.2.

To implement gesture support we have created our own gesture applications, UPDD Gestures and, latterly, UPDD Commander. These applications are UPDD API and TUIO client applications in that they can receive touch data from either interface, calculate the gesture being performed and post it into the OS in exactly the same manner as the native Apple gesture interface utilised by Apple's multi-touch trackpads and Magic Mice.
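As a toy illustration of the kind of calculation a gesture application performs when turning raw touches into a gesture (the actual UPDD Gestures/Commander implementation is not shown here), a pinch/zoom can be reduced to the ratio of the distances between two contacts:

```python
from math import hypot

def pinch_scale(start, current):
    """start and current are ((x0, y0), (x1, y1)) pairs: the positions
    of two contacts when the gesture began and now. Returns a zoom
    factor: > 1.0 means the fingers spread (zoom in), < 1.0 means
    they pinched together (zoom out)."""
    (a0, b0), (a1, b1) = start
    (c0, d0), (c1, d1) = current
    start_dist = hypot(a1 - a0, b1 - b0)
    current_dist = hypot(c1 - c0, d1 - d0)
    return current_dist / start_dist

# Two fingers 100 units apart spreading to 200 units apart:
print(pinch_scale(((0, 0), (100, 0)), ((0, 0), (200, 0))))  # → 2.0
```

A recogniser would track this ratio frame by frame and post the corresponding zoom events into the OS.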

In this instance the touch data path from the physical touch screen to the desktop/application is touch hardware > driver > gesture interface > desktop / application.

Calculated gestures are also posted to the UPDD driver which makes them available on the driver's API.

Our test program uses the gesture APIs to receive and display gesture information:

 TUIO

UPDD TUIO server running
UPDD Commander

TUIO is an industry-standard interface for 2D and 3D objects. The two-dimensional specification caters for X and Y co-ordinates, so lends itself very well to touch data.

Applications can be written to use the TUIO interface to receive touch data - these applications are referred to as TUIO client applications.

Our test program is a TUIO client application and it will show you the touches being received from the TUIO server:

The UPDD program suite offers a TUIO server. When loaded, the TUIO server utilises the UPDD API to receive touch data, which is then broadcast over IP to waiting TUIO client applications.
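The core logic of a TUIO client can be sketched as below. This minimal tracker handles the 'set' and 'alive' messages of the TUIO 1.1 /tuio/2Dcur cursor profile, presented here as plain tuples for clarity; a real client would receive them as OSC bundles over UDP (port 3333 by default), and the class name is invented for this sketch:

```python
class TuioCursorTracker:
    """Tracks live touch points from /tuio/2Dcur messages."""
    def __init__(self) -> None:
        self.cursors = {}   # session id -> (x, y), normalised 0.0 - 1.0

    def handle(self, args):
        command = args[0]
        if command == "set":
            # ("set", session_id, x, y, vel_x, vel_y, accel)
            session_id, x, y = args[1], args[2], args[3]
            self.cursors[session_id] = (x, y)
        elif command == "alive":
            # ("alive", sid, sid, ...) lists all currently live cursors;
            # anything missing from the list has been lifted.
            live = set(args[1:])
            self.cursors = {s: p for s, p in self.cursors.items()
                            if s in live}

tracker = TuioCursorTracker()
tracker.handle(("set", 1, 0.2, 0.3, 0.0, 0.0, 0.0))
tracker.handle(("set", 2, 0.8, 0.5, 0.0, 0.0, 0.0))
tracker.handle(("alive", 2))           # cursor 1 has been lifted
print(tracker.cursors)                 # → {2: (0.8, 0.5)}
```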

In this instance the touch data path from the physical touch screen to the application is touch hardware > driver > TUIO server > TUIO client application. It does not utilise any operating system interfaces, functions or APIs to receive touch data. It is a multi-touch application purely because it is processing all touches coming from the TUIO server. The system pointer is not affected because the data comes direct from the driver via the TUIO server.

Applications utilising the TUIO interface operate independently of the system mouse movements and clicks and can be written to control all aspects of the touch interface, be it multi-touch or multi-user.

Desktop control

If a desktop has been written to support multi-touch then it will most likely be controlled by various gestures. It could be that the desktop 'application' is itself calculating the gesture from the received touches or, as in the case of macOS, the touch driver calculates the gesture and invokes the system function associated with it. Windows, macOS and Linux (via various utilities) all have some form of desktop gesture control.

Application consideration

Multi-touch applications can be written to receive touch data via the operating system's native touch interface, as a TUIO client or as a UPDD client utilising the UPDD API interface. When using the native touch interface, touch data will be received via the native multi-touch interface described above and will be subject to any restrictions imposed by that interface, such as single-user usage. TUIO and UPDD client applications are in full control of the touch interface with no imposed restrictions. TUIO servers are available for many multi-touch devices and, by using our own TUIO server, for any device supported by our driver. TUIO and UPDD API interfaces tend to be utilised in cross-platform applications, those written to function across many different operating systems, so that an application's multi-touch functionality works the same everywhere.

Applications that use native touch interfaces tend to be written for only one operating system, or differ in their multi-touch support across operating systems.

A good example of this is Digital Audio Workstations (DAWs) and mixer-type applications used in music production, which graphically create virtual systems of knobs and sliders. A DAW that uses native touch events will receive positional touch data when used in Windows so, for example, you could move multiple sliders at once. However, in macOS native touch events only give single touch or gesture information, so only a single slider can be used at a time. A multi-touch application that works as expected in Windows may therefore not be multi-touch in macOS.

Unfortunately, since macOS 10.10.2 the OS touch interface will not process positional touch data generated by our software, so any application functionality reliant on positional data will not work via the touch interface; see here for more details.

When dealing with a third-party application that you believe to be multi-touch, you need to determine how multi-touch has been implemented so you can determine whether it will work in your OS. It is always worth asking if it is a TUIO or UPDD client application.

Programming multi-touch

Given the limitations with multi-touch, either at an application or OS level, it is often possible to add specific multi-touch support with some additional custom functionality or application programming.

Applications can interface with the driver using the driver's API, to receive touches directly into the application. An application can also receive touch data via the TUIO protocol, using the TUIO server built into UPDD Commander.

In cases of very specific multi-touch support, we have added targeted support in UPDD Commander, such as moving multiple sliders on the Logic Pro and Pro Tools mixing desks via Apple's accessibility API.

To satisfy touch events in Qt touch-aware client applications that use the QTouchEvent API, we have created a Qt plugin to satisfy the API's touch functions in macOS. The Qt touch functions work natively in Windows but not in macOS.

There are a number of example programs in the UPDD API documentation that show how to utilise the driver's API, and we also have an example of a multi-touch slider application that uses the API to receive touches and move multiple sliders.

Further details can be found here.

MacOSX browser consideration 

The problem is that macOS does not pass its system-level touch events to web browsers as HTML5 / JavaScript touches. So while such pages will work in Windows and iOS, they do not by default work in macOS, even when using Apple's Magic Trackpad.

However, we have created browser extensions for Firefox and Chrome that cater for multi-touch input, as described here.

Summary

The answer to the question 'does your software support multi-touch?' is a straightforward... YES!
The answer to the question 'will my application work with multi-touch?' is not so straightforward and needs to consider...

Is the touch device capable of multi-touch?
Is the driver that supports the device able to handle multi-touch data?
Is the application a multi-touch application?
Was it written using a multi-touch interface available on the operating system in question (Qt touch events, TUIO, UPDD API, etc.)?

If the answer is yes to all the above then it just might work with multi-touch!

https://support.touch-base.com/Documentation/50502/Multi-Touch