Dealing with touch sensitive areas of graphical objects

Niall Murphy

June 28, 2009

Application-level handling of a touchscreen is fundamentally different from an interface that uses a mouse, trackball, or other off-screen device. This article will explore different algorithms for establishing the exact point of intended touch, and whether that touch should be applied to an object at or near that location.

At a lower level, the conversion of a touch event to an x,y location on the display is achieved by a calibration process (1) and a driver (2) that converts the analog signals from the touch-sensitive device into an x,y position on the output display, measured in pixels.
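As a rough illustration of what that calibration step involves, the sketch below maps a raw ADC reading to a pixel coordinate with a two-point linear fit per axis. The struct layout and function names are assumptions for this example; production drivers commonly use a three-point calibration that also corrects for rotation and skew.

```c
/* Two-point linear calibration for one axis (illustrative sketch).
   Assumes raw_max != raw_min, i.e. calibration succeeded. */
typedef struct {
    long raw_min, raw_max;   /* ADC readings captured at calibration time */
    long pix_min, pix_max;   /* corresponding pixel positions */
} cal_axis_t;

/* Convert a raw ADC reading to a pixel coordinate by linear interpolation. */
int cal_to_pixels(const cal_axis_t *c, long raw)
{
    return (int)(c->pix_min +
        ((raw - c->raw_min) * (c->pix_max - c->pix_min)) /
        (c->raw_max - c->raw_min));
}
```

The same function is applied independently to the x and y axes, each with its own calibration constants.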

For the purposes of this piece, we will assume that we have an x,y position, but the software then has to decide how to trigger an event based on that reading. There are filters and modifications that can be applied to that x,y position to allow for the fact that it originated from a touchscreen and not a mouse.

There is a fundamental difference between input from a touchscreen and the input from a mouse, trackball or other off-screen device. The mouse produces a location which is then displayed, usually with a small arrow, or cursor, pointing at the current location.

Because the location is displayed, and the user can see that location, it is an absolute measurement, with no error. If there is an error in the reading of the mouse movement, that error is reflected in the position of the cursor, so the user gets to see the new position.

If the error requires correction, the user simply moves the mouse in the appropriate direction to correct the overshoot. The human is in the feedback loop, and that provides correction for errors from the analog reading of the mouse, or errors generated by the user over or under shooting with their hand movement.

The touchscreen does not have this closed loop control. When the user touches the screen with his finger, he can see the thing that he was trying to touch, but there is no indication of where the software believes the touch occurred.

There will always be an element of error due to temperature effects on the screen, inaccurate calibration, or non-linear electronics. This error is the distance from where the user physically touched and the x,y location that the touch driver returned to the application. The error can be seen if the application makes a cursor visible, as shown in Figure 1 below.

Figure 1: a) accurately shows the cursor position where the finger actually touched, while b) shows error in both the x and y direction so the cursor is displayed away from the center of the finger's touch on the screen

If there is a small vertical error, then as the user moves their finger around, the cursor will be displayed a small distance above the finger. If you try this experiment, it can also be interesting to see how sliding the finger quickly may display a cursor that follows behind.

This is due to filtering, usually performed in the driver, which adds a time delay to the touch position. Filtering makes the cursor move more smoothly, since it eliminates some electrical noise, but can also add a lagging effect.
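One common form of such filtering is a moving average over the last few samples. The sketch below, with an assumed window size of four, shows why smoothing and lag go together: the output only reaches a new finger position after the window fills with new samples.

```c
/* Moving-average filter over touch coordinates (illustrative sketch).
   Run one instance per axis. WINDOW is an assumed tuning value. */
#define WINDOW 4

typedef struct {
    int samples[WINDOW];
    int index;   /* next slot to overwrite */
    int count;   /* samples seen so far, up to WINDOW */
} avg_filter_t;

void filter_init(avg_filter_t *f)
{
    f->index = 0;
    f->count = 0;
}

/* Add a raw reading and return the filtered value. After a sudden jump
   in input, the output trails behind until the window refills - this is
   the lag seen when sliding a finger quickly across the screen. */
int filter_add(avg_filter_t *f, int raw)
{
    int i, sum = 0;

    f->samples[f->index] = raw;
    f->index = (f->index + 1) % WINDOW;
    if (f->count < WINDOW)
        f->count++;
    for (i = 0; i < f->count; i++)
        sum += f->samples[i];
    return sum / f->count;
}
```

A larger window gives a smoother cursor but a longer lag, so the window size is a trade-off that must be tuned for the screen and application.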

Another artifact that may be visible in this experiment is that the cursor may shake. This noise can be due to a number of factors. One is that the human finger is just not that steady. It may be partly noise in the analog electronics which can be reduced with filtering, either electronically or in software.

Another possibility, with resistive screens, is that variations in the pressure applied by the user are varying the cursor position slightly. If this is the case then most of the jitter can be removed by ignoring touch events where the pressure is below a certain threshold.

This means that very gentle touches on the screen will not register at all, but when the touches are detected they will be stable and accurate. Finding a good balance here is important because you do not want to force your user to have to exert a lot of pressure for the touch to register.
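The pressure gate described above can be as simple as the check below. The threshold constant and the sample layout are assumptions for illustration; the units of the pressure reading depend on the touch controller, so the threshold has to be tuned empirically against how hard you are willing to make users press.

```c
/* Pressure threshold gate (illustrative sketch). Samples below the
   threshold are discarded rather than passed to the application. */
#define PRESSURE_MIN 30   /* assumed value; units are controller-specific */

typedef struct {
    int x, y;
    int pressure;
} touch_sample_t;

/* Returns 1 if the sample is firm enough to act on, 0 to ignore it. */
int touch_accept(const touch_sample_t *s)
{
    return s->pressure >= PRESSURE_MIN;
}
```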

Test Screens
I am not suggesting that your touchscreen applications should have a visible cursor, but it is very useful to have the ability to turn it on during test and debug of the touch event handling algorithms.

If you are implementing this cursor yourself, then you may find that a large cross works better than the small arrow used in Windows: the arrow is likely to be completely covered by the user's finger, making it difficult to assess accuracy.

It is useful to print the x,y position on the display as well. When you see the cursor on the screen, you will then know its precise position in pixels. Draw a few target objects on the screen, for example a small circle with its center at 100,200.

If you press on that circle and the position detected by the software is 105,198, then you have an error of 5,-2. This error may vary across different parts of the screen. Measuring this error using a finger is not very exact.
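For a test screen, the error at each target can be computed and logged as a simple offset. The helper below is an illustrative sketch using the 100,200 target from the text; the struct and function names are assumptions.

```c
/* Offset between a drawn target and the reported touch position
   (illustrative sketch for a test screen). */
typedef struct {
    int dx, dy;
} touch_error_t;

/* Positive dx means the reported touch was to the right of the target;
   positive dy means below it (assuming y grows downward). */
touch_error_t touch_error(int target_x, int target_y,
                          int touched_x, int touched_y)
{
    touch_error_t e;
    e.dx = touched_x - target_x;
    e.dy = touched_y - target_y;
    return e;
}
```

Logging these offsets for targets in each corner and the center of the screen gives a quick map of how the error varies across the panel.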

A stylus will be a bit more precise and in some cases it is worth constructing a mechanical test jig which will guarantee reproducible touching for a few predefined points.
