Re: touch events



On Tue, Feb 07, 2012 at 07:58:23PM +0100, Benjamin Otte wrote:
> So, here are a few interesting things I learned in the last few days while
> talking to people about touch (mostly Peter Hutterer and Chase
> Douglas, more power to them for getting my X up and running with touch
> events).
> 
> 1) There is a huge difference between touchscreens and touchpads.
> 
> No platform exists that caters to both touchscreens and touchpads
> equally. OS X is a touchpad world, iOS is a touchscreen world. Windows
> is a touchscreen world. (I have no idea if and how Qt caters to both,
> but my guess is it doesn't.) So XInput 2.2 seems to be the first
> platform that tries.
> Now, if you get a touch event sequence, those events have X and Y
> coordinates. Unfortunately, these coordinates refer to the position of
> the mouse pointer and not the position of the touch. And for
> touchpads, the mouse pointer doesn't move. So you get the same x/y
> coordinate all the time, for every event. For touchscreens of course,
> that is entirely different as touchscreens move the mouse pointer to
> wherever you touched last.
> Of course, on touchpads you also get the position of the touch on the
> touchpad. These coordinates refer to the physical touchpad and are not
> related to the screen at all. What does that mean? Exactly, it means
> that if you start doing math based on these coordinates, you are
> wrong. How many units does one have to move to get a noticeable effect?
> Who knows, it's not related to pixels anyway! Unless you're thinking
> of touchscreens again, where things match the screen perfectly fine
> for these axes.

Note that we do export the axis range and the resolution of the axis for
x/y, so you can calculate based on that. I just checked that on my synaptics
and it tells me:
        Detail for Valuator 0:
                          Label: Rel X
                          Range: 1472.000000 - 5472.000000
                          Resolution: 75000 units/m
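
To make that concrete, here's a rough sketch of the conversion you'd do
(assuming "resolution" is the units-per-metre value reported for the
valuator, as in the output above):

  /* Rough sketch: convert a delta in raw touchpad units into millimetres,
   * using the per-valuator resolution (in units per metre) that the server
   * exports, e.g. 75000 units/m for the Rel X axis above. */
  static double
  units_to_mm (double delta_units, double resolution_units_per_m)
  {
      return delta_units / resolution_units_per_m * 1000.0;
  }

  /* e.g. a 750-unit delta on the axis above is 750 / 75000 m = 10 mm of
   * finger travel, regardless of screen size or pixel density. */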

> Last but not least, touchpads don't emit touch events for single-finger
> touches at all. For those touches, you use regular mouse events -
> GdkEventButton and GdkEventMotion are your friends.

This is technically an implementation issue, but can be assumed for sanity.
We require only that touches "that only serve to move the pointer" do not
send touch events, so in theory you _may_ get touch events for single-finger
touches from a device where this differs.
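
So for the single-finger case the usual GTK signals are all you need;
roughly like this (GTK 3 style sketch; the widget and the setup lines are
just for illustration):

  #include <gtk/gtk.h>

  /* Single-finger touchpad input arrives as ordinary pointer events,
   * so existing motion/button handlers keep working unchanged. */
  static gboolean
  on_motion (GtkWidget *widget, GdkEventMotion *event, gpointer user_data)
  {
      g_print ("pointer at %.1f,%.1f\n", event->x, event->y);
      return FALSE;   /* let other handlers see the event too */
  }

  /* during widget setup: */
  gtk_widget_add_events (widget, GDK_POINTER_MOTION_MASK);
  g_signal_connect (widget, "motion-notify-event",
                    G_CALLBACK (on_motion), NULL);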

> Of course, developers usually only have either a touchpad or a
> touchscreen. So when they develop code, they will only test on one of
> the two. And I guess this usually means that the other type of device
> will be broken. In fact, I have been using a test program written by
> Peter himself, and even that one is pretty useless for touchpads.
> So my idea so far - though I'm not sure about this yet, which is why
> I'm typing all this - is to make this difference very prominent inside
> GTK and to use different event types. So for touchscreens, you'd get a
> GDK_DIRECT_TOUCH_BEGIN/UPDATE/END and for your pads, you'd get
> GDK_INDIRECT_TOUCH_BEGIN/UPDATE/END events. And we might even end up
> giving you different events - a struct GdkEventDirectTouch vs a struct
> GdkEventIndirectTouch - so we have the useful information in there,
> and don't encourage people to write code that looks at x/y.

I think separating direct and indirect touch out is a good idea since I
expect the actual event handling will be quite different.
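
Purely to make the proposal concrete, such events might look roughly like
this (a sketch only - none of these types exist in GDK today, and all the
fields are made up):

  /* Hypothetical sketch of the proposed split; not actual GDK API. */
  typedef struct {
      GdkEventType type;      /* GDK_DIRECT_TOUCH_BEGIN/UPDATE/END */
      GdkWindow   *window;
      guint        touch_id;  /* stable id for this touch sequence */
      gdouble      x, y;      /* window-relative pixels: meaningful here */
  } GdkEventDirectTouch;

  typedef struct {
      GdkEventType type;      /* GDK_INDIRECT_TOUCH_BEGIN/UPDATE/END */
      GdkWindow   *window;
      guint        touch_id;
      gdouble      device_x, device_y;  /* raw touchpad units, not pixels */
  } GdkEventIndirectTouch;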

> 2) system-level gestures?
> 
> If you've read http://who-t.blogspot.com/2012/01/multitouch-in-x-touch-grab-handling.html
> (if not, it may help to read it to understand this paragraph), you
> know that touch events are only forwarded to the application when all
> grabs have rejected the touch event.

This is either a bad explanation by me or a misunderstanding. Touches can be
accepted and rejected on a per-touch basis and the event will be passed on
to the next client as such.
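
For reference, the accept/reject happens per touch via XIAllowTouchEvents();
roughly like this (XI 2.2 / libXi sketch, error handling omitted, and
looks_like_my_gesture() is a made-up placeholder for the client's own
recognition logic):

  #include <X11/Xlib.h>
  #include <X11/extensions/XInput2.h>

  /* Placeholder for the client's own gesture recognition. */
  static int looks_like_my_gesture (XIDeviceEvent *ev);

  /* A grabbing client decides per touch whether to keep the sequence.
   * ev->detail carries the touch id; ev->event is assumed to be the
   * grab window here. */
  static void
  decide_touch (Display *dpy, XIDeviceEvent *ev)
  {
      int mode = looks_like_my_gesture (ev) ? XIAcceptTouch : XIRejectTouch;

      XIAllowTouchEvents (dpy, ev->deviceid, ev->detail, ev->event, mode);
  }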

> Now, the WM will probably want to do a touch grab, so that the usual
> window management gestures (OS X users will recognize 4-finger swipes
> for Exposé, for example) can be recognized. And this means that until
> the WM rejects the touch event (because it realizes it's not a
> relevant gesture), the application doesn't get those touch events
> delivered. And that can cause lag.
> Now there's multiple ways to cope with this:
> - Make the WM reject touch events quickly
> This requires a well-written WM (Ce Oh Em Pe I Zeeeee!!!) that rejects
> touches quickly. So quickly, in fact, that it's not noticeable for a
> regular person that the touches weren't sent through to the app
> immediately. Even when they use it to draw squiggly lines on the
> screen. I'm not sure if that is even possible.
> - Use a different model in the WM
> The WM could skip the grab and just listen for touch events on the
> root window. In that case it'd only get touch events for touches
> that applications haven't accepted. But applications accept
> touches by default. So unless applications are well-written and
> carefully reject all touches they don't care about, your system-level
> gestures won't trigger if apps listen for touch events...

This cannot work. The X event delivery model delivers an event to the first
window in the stack, starting from the event window and walking up towards
the root, that has a client selecting for it. I.e. if any window below the
root listens for touch events, the root window will not get them.

> - Use XI_TouchOwnership
> This way the app would get pending touch events even when it's not the
> owner and could already do things. But it would require a very
> prominent GDK_TOUCH_EVENT_CANCEL so that when the WM accepts the
> touch, the app can revert everything it already did for the current
> touch.
> - something else?
> Did I miss something here? Or is getting system-level gestures right
> really complicated?

yes.
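
For the record, selecting for ownership events itself is simple; roughly
(XI 2.2 sketch, error handling omitted):

  #include <X11/Xlib.h>
  #include <X11/extensions/XInput2.h>

  /* Select touch events plus TouchOwnership notifications on a window,
   * so the client can start reacting to touches it does not yet own and
   * undo that work if it never becomes the owner. */
  static void
  select_touch_with_ownership (Display *dpy, Window win)
  {
      unsigned char bits[XIMaskLen (XI_LASTEVENT)] = { 0 };
      XIEventMask mask;

      mask.deviceid = XIAllMasterDevices;
      mask.mask_len = sizeof (bits);
      mask.mask = bits;

      XISetMask (bits, XI_TouchBegin);
      XISetMask (bits, XI_TouchUpdate);
      XISetMask (bits, XI_TouchEnd);
      XISetMask (bits, XI_TouchOwnership);

      XISelectEvents (dpy, win, &mask, 1);
  }

The hard part is still the cancel/undo: anything the client does for a touch
it does not own yet has to stay revertible until it either receives the
ownership event or the touch ends.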

> 
> 3) Drivers still do gesture recognition
> 
> You all know and love this feature of the synaptics driver: two-finger
> swipes cause scrolling, three-finger taps cause a middle mouse click. And
> that's not gonna change: If you tap with 3 fingers in the future, you
> get a bunch of touch events _and_ a button press/release event combo.
> Why is that bad? Because if you implement a 3-finger tap gesture in
> your widget that does a clipboard paste, you will now suddenly be
> doing that paste twice.
> Can't you just disable one of those? No, not really, because some
> drivers (like non-synaptics) may not be doing gestures, so a 3-finger
> tap will do nothing; likewise, some old widgets (like all of GTK 2)
> may not be listening to touch events.
> Couldn't the X server do it for you then? No, not really, because if
> you have an old widget (for example a GtkHTML widget) and a new widget
> and put them in the same GtkWindow, there'll only be one X window
> because of client-side windows. And the X server wouldn't know which
> event to send.
> So how do we fix this? I have no idea, but I suppose it'll cause lots
> of madness.

This situation is the main source of headaches right now. More
specifically: we have the choice between accepting that some gestures just
won't work as touch gestures (if interpreted by the driver) or that we lose
these same gestures on legacy apps (if interpreted by the toolkit).

To expand: if we interpret a gesture in the driver and convert it to any
other event E, we have no mechanism to tell a client that the new event E
corresponds to some other touch event and should be ignored. This will
likely lead to mishandling of some events. This is why I'm advocating an
either-or approach: if you enable in-driver gestures you're stuck with those
and you won't get touch events for them. If you want toolkit gestures, well,
you'll need all apps to support them.

This isn't a nice solution but it is at least predictable.
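
To illustrate the in-driver half of that either-or: the synaptics gestures
can already be switched off per device, e.g. with a snippet along these
lines (xorg.conf.d sketch; option names from the synaptics driver, adjust
as needed):

  # Sketch: disable in-driver gesture interpretation so that multi-finger
  # input is left to toolkits/apps that understand touch events.
  Section "InputClass"
          Identifier "touchpad without driver gestures"
          MatchIsTouchpad "on"
          Driver "synaptics"
          Option "TapButton2" "0"
          Option "TapButton3" "0"
          Option "VertTwoFingerScroll" "off"
          Option "HorizTwoFingerScroll" "off"
  EndSection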

Cheers,
  Peter

