Re: RFC: gestures
- From: Bastien Nocera <hadess hadess net>
- To: Carlos Garnacho <carlosg gnome org>
- Cc: gtk-devel-list <gtk-devel-list gnome org>
- Subject: Re: RFC: gestures
- Date: Tue, 04 Mar 2014 02:28:14 +0100
On Tue, 2014-03-04 at 00:55 +0100, Carlos Garnacho wrote:
Hey everyone,
In the past days I've been hacking again on the gestures branch, and
it's reaching a state where I feel it's getting quite solid, so I would
like to get discussion started, tentatively aiming to get this included
early in 3.13.
Overview
========
The two object types this relies on are GtkEventController and
GtkGesture. GtkEventController is a very low-level abstraction for
something that just "handles events". GtkGesture is a subclass very much
centered around handling single or multiple sequences of
press/update/.../release events. By default it's restricted to handling
touch events, although it can be made to listen to mouse events, either
through API or through the GTK_TEST_TOUCHSCREEN envvar (a NULL
GdkEventSequence is used in those cases).
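For illustration, here is a minimal sketch of what attaching one of these
gestures to a widget could look like in application code. The names used
below (gtk_gesture_drag_new, gtk_gesture_single_set_touch_only, the
"drag-update" signal) follow the API as it eventually shipped in GTK+ 3.14,
so the branch may still differ in detail:

#include <gtk/gtk.h>

static void
on_drag_update (GtkGestureDrag *gesture,
                gdouble         offset_x,
                gdouble         offset_y,
                gpointer        user_data)
{
  /* The offset is reported relative to the drag start point */
  g_print ("drag offset: %f, %f\n", offset_x, offset_y);
}

static void
attach_drag_gesture (GtkWidget *widget)
{
  /* The gesture is created for a specific widget and tracks its events;
   * keep a reference around for as long as the gesture is needed */
  GtkGesture *drag = gtk_gesture_drag_new (widget);

  /* Per the description above, gestures default to touch sequences;
   * this opts into pointer events too (a NULL GdkEventSequence) */
  gtk_gesture_single_set_touch_only (GTK_GESTURE_SINGLE (drag), FALSE);

  g_signal_connect (drag, "drag-update",
                    G_CALLBACK (on_drag_update), NULL);
}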
Multiple GtkGesture implementations are offered in the branch:
* Drag: keeps track of drags, reporting the offset from the drag
start point.
* Swipe: reports x/y velocity at the end of a begin/update/end
sequence.
* LongPress: reports long presses, or cancellation once the press
threshold or timeout is exceeded.
* MultiPress: reports multiple presses, as long as they're within the
double-click threshold/timeout.
* Rotate: reports angle changes from two touch sequences
* Zoom: reports distance changes from two touch sequences as a
factor of the initial distance.
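As a concrete example of the signal shape these gestures expose, a Zoom
user could look like the sketch below; the "scale-changed" signal name is
the one GtkGestureZoom ended up with in GTK+ 3.14 and is only an assumption
for the branch under discussion:

static void
on_scale_changed (GtkGestureZoom *gesture,
                  gdouble         scale,
                  gpointer        user_data)
{
  /* 'scale' is the current distance between the two sequences,
   * expressed as a factor of the distance when the gesture began */
  g_print ("zoom factor: %f\n", scale);
}

static void
attach_zoom_gesture (GtkWidget *widget)
{
  GtkGesture *zoom = gtk_gesture_zoom_new (widget);

  g_signal_connect (zoom, "scale-changed",
                    G_CALLBACK (on_scale_changed), NULL);
}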
What about the single tap/press? Do the gestures for which it makes
sense also give back the center of the operation? Do the gestures know
about each other? (e.g. if there's no long-press in my widget, will it
understand whether I'm starting a drag or just slow at tapping?)
Cheers