Touch Gestures in Windows Embedded Compact 7
Touch screen and gesture support, which just a few years ago was a niche technology found only in specialized business and industrial applications, suddenly seems to be everywhere. One day in the near future, operating a gadget, computer, or other electronic device without a touch screen may seem as obsolete to us as the idea of using a PC without a mouse does today.
Moreover, gesture support has been the most promising feature of WEC7 from the user interface perspective. For some time now, we had wanted to develop a really cool graphical user interface with Windows CE, and in every brainstorming session someone would bring up the lack of full-fledged gesture support in Windows CE 6.0. Without gestures, I don't think we can associate "cool" with "UI" anymore. So as soon as I learned that WEC7 has complete gesture support, the thought of combining gestures and Silverlight made my imagination run wild.
Windows Embedded Compact 7 supports gestures that are single-touch, dual-touch symmetrical, and multi-touch with two contact points. This blog explains how to use the gesture events that Compact 7 already supports in WIN32 applications; it does not explain how to customize the touch gestures.
So, I started to look at what Microsoft says about gestures in WEC7. In this blog post, I shall highlight what is possible with gestures in WEC7 under the following headings:
1. Touch Gestures in Compact 7
2. OS Design Requirements
3. WIN32 Gesture Application
- Enabling the Gestures
- Processing the Gestures
- Multi-touch Gestures
- Auto Gestures
4. Building WIN32 Gestures Application
Touch Gestures in WEC 7
Compact 7 supports the following gestures:
- Direct Manipulation – The user manipulates an object on the screen, which reacts so that the same point on the object always remains under the same finger. Direct manipulation can be single-touch, dual-touch symmetrical, or multi-touch with two contact points.
- Double Tap – A double-tap represents the left double-click of a mouse.
- Flick – The user presses a finger on the screen, moves in any direction, and then lifts up the finger to initiate scrolling that continues for a short time.
- Hold – The user presses and holds a finger on the screen. This represents the right-click of a mouse.
- Pan – The user presses and holds a finger on the screen and then drags the finger in any direction. This represents a mouse move event. Panning can occur after a hold gesture.
- Tap – A tap represents the left click of a mouse.
OS Design Requirements
To use the Touch Gestures in Compact 7, the OS image must include the following sysgen variables.
- SYSGEN_TOUCHGESTURE – Core OS -> Windows Embedded Compact –> Shell and User Interface -> Graphics, Windowing and Events -> Touch Gesture GWES component.
- SYSGEN_PHYSICSENGINE – Core OS -> Windows Embedded Compact –> Shell and User Interface -> Graphics, Windowing and Events -> Gesture Physics Engine
The OS image must also include a properly working touch driver.
We can use CETouchView, a standalone tool in the Windows Embedded Compact Test Kit (CTK), to verify the raw touch data and gesture messages generated by the touch device.
WIN32 Gesture Application
Enabling the Gestures
Gesture events must be enabled for every window by calling the EnableGestures API; the QueryGestures API can be used to identify the gestures that are already enabled. If a gesture event is enabled, the gesture engine posts the WM_GESTURE message to the window procedure to indicate that the gesture has occurred or is in progress.
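As a sketch of this step, gestures might be enabled in the window's WM_CREATE handler. The EnableGestures call and the TGF_* flag and scope names below are assumptions based on the WEC7 SDK headers; check WinUser.h in your SDK for the exact set:

```cpp
// Sketch: enable gesture messages for this window, e.g. in WM_CREATE.
// EnableGestures and the TGF_* flags come from the WEC7 SDK (WinUser.h);
// the flag names used here are assumptions based on the documented API.
case WM_CREATE:
{
    ULONGLONG ullFlags = TGF_GID_PAN | TGF_GID_SCROLL | TGF_GID_HOLD |
                         TGF_GID_SELECT | TGF_GID_DOUBLESELECT;
    if (!EnableGestures(hWnd, ullFlags, TGF_SCOPE_WINDOW))
    {
        // Fails if, for example, SYSGEN_TOUCHGESTURE is missing from the image.
        RETAILMSG(1, (TEXT("EnableGestures failed\r\n")));
    }
    return 0;
}
```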
Processing the Gestures
When a gesture occurs, the window procedure receives the WM_GESTURE message with the following parameters:
- wParam – ID of the gesture command
- lParam – HGESTUREINFO handle to the gesture information
If the application processes the gesture, it returns a nonzero value; if it does not, it must pass the message to DefWindowProc.
The gesture ID can be used to decide whether the window procedure should handle the gesture or leave it to DefWindowProc. If the window procedure handles the gesture, it must call the GetGestureInfo API to obtain more information about the gesture, and it must not pass the message to the default window procedure. After processing the message, CloseGestureInfoHandle must be called to close the handle.
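This pattern can be sketched as follows. It assumes the GESTUREINFO layout documented for WEC7 (cbSize, dwID, ptsLocation); treat it as an outline rather than a complete handler:

```cpp
// Sketch: dispatching WM_GESTURE in a window procedure.
case WM_GESTURE:
{
    HGESTUREINFO hgi = (HGESTUREINFO)lParam;
    GESTUREINFO gi = { 0 };
    gi.cbSize = sizeof(GESTUREINFO);

    if (GetGestureInfo(hgi, &gi))
    {
        switch (gi.dwID)          // gesture command, same ID as wParam
        {
        case GID_PAN:
            // gi.ptsLocation is the current finger position.
            CloseGestureInfoHandle(hgi);
            return 1;             // nonzero return: gesture handled
        case GID_SELECT:
            // Treat as a left click at gi.ptsLocation.
            CloseGestureInfoHandle(hgi);
            return 1;
        }
    }
    // Gestures this window does not handle go to the default window procedure.
    return DefWindowProc(hWnd, message, wParam, lParam);
}
```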
The following list describes the commands that are delivered through the WM_GESTURE message:
- GID_BEGIN – Marks the beginning of each touch gesture, when the user first touches the screen.
- GID_END – Marks the end of each touch gesture, when the user lifts their finger from the screen.
- GID_PAN – A pan gesture occurs when the user touches the screen and moves their finger in any direction. This represents a mouse move event. When the finger moves a distance equal to or greater than the pan threshold, the gesture recognizer sends an initial GID_PAN message that contains the current location of the finger (for more information about defining the pan threshold, see GESTUREMETRICS). The gesture recognizer sends a new GID_PAN message for each mouse move message until the user lifts their finger; GID_END marks the end of the pan movement. Mouse messages are interleaved for backward compatibility, but the gesture messages always arrive before the corresponding mouse events. You can calculate the movement delta by comparing two consecutive pan messages. Panning can occur after a hold gesture.
- GID_SCROLL – A flick gesture occurs when the user touches the screen and moves their finger quickly in any direction before lifting it. The gesture recognizer sends the GID_SCROLL message after the flick, to the window that received the first gesture message (usually a pan or a hold message) for the current touch session. The ullArguments member of GESTUREINFO carries additional information about the flick, such as its direction and velocity. A flick frequently occurs after a pan (one or more GID_PAN gesture messages followed by a GID_END message immediately before the flick).
- GID_HOLD – A hold gesture occurs when the user touches the screen and holds their finger down for more than the hold time-out period. This represents the right click of a mouse. The gesture recognizer sends a GID_HOLD gesture message and then sends a GID_END message when the user lifts their finger or at the end of the hold time threshold. The hold gesture can be followed by a panning movement that generates several GID_PAN messages, but the gesture recognizer never sends a GID_HOLD message after a GID_PAN message; it always sends GID_HOLD first after the user touches the screen, if the gesture meets the hold time-out recognition values. For more information about defining the hold time-out period, see GESTUREMETRICS.
- GID_SELECT – A select, or tap, gesture occurs when the user taps the screen for a period of time that is less than the hold time-out period. A tap represents the left click of a mouse. There can be several WM_MOUSEMOVE messages after the WM_LBUTTONDOWN event and before the GID_SELECT message.
- GID_DOUBLESELECT – A double-select, or double-tap, gesture occurs when the user taps twice on the screen within a period of time that is less than the hold time-out period. A double tap represents the left double-click of a mouse.
- GID_DIRECTMANIPULATION – Objects on the screen react so that the same point on an object always remains underneath the same finger. Direct manipulation maps points in an object's local space to points in screen space, without the need for gesture processing. During direct manipulation, the gesture recognizer does not send GID_* messages.
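The GID_PAN movement delta described above, obtained by comparing two consecutive pan positions, can be sketched with a small tracker. Plain ints are used here so the idea stays independent of the SDK's POINTS type; the struct and method names are mine, not part of the WEC7 API:

```cpp
#include <utility>

// Sketch: track the (dx, dy) movement between consecutive GID_PAN positions.
struct PanTracker {
    bool havePrev = false;
    int prevX = 0, prevY = 0;

    // Feed each GID_PAN position; returns the delta since the previous one.
    std::pair<int, int> Update(int x, int y) {
        int dx = havePrev ? x - prevX : 0;
        int dy = havePrev ? y - prevY : 0;
        prevX = x;
        prevY = y;
        havePrev = true;
        return std::make_pair(dx, dy);
    }

    // Call on GID_END so the next pan sequence starts fresh.
    void Reset() { havePrev = false; }
};
```

The first pan message of a sequence yields a zero delta, since there is no earlier position to compare against.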
Multi-touch Gestures
Depending on the touch screen driver, Compact 7 supports multi-touch gestures with two contact points, or dual-touch symmetrical gestures. If the touch screen driver reports dual-symmetrical touch data, the gesture engine tries to determine how the X and Y coordinates should be paired. For both dual-touch symmetrical and multi-touch gestures, the gesture recognizer designates one contact point as the primary contact and keeps track of the distance between the primary and secondary contacts. The gesture engine posts the GID_DIRECTMANIPULATION command through the WM_GESTURE message when it detects a multi-touch or dual-symmetrical gesture.
Auto Gestures
Automatic handling of flick gestures can be enabled in all Win32 windows that allow scrolling by including the following sysgen variable in the OS image:
- SYSGEN_GESTUREANIMATION – Core OS -> Windows Embedded Compact –> Shell and User Interface -> Graphics, Windowing and Events -> Default Gesture Response
Window auto gestures only work with windows that have the WS_HSCROLL or WS_VSCROLL styles.
If automatic gestures are enabled, the application does not need to process the WM_GESTURE message for scrolling, because the window procedure can simply pass the WM_GESTURE message to the default window procedure. Instead, the window procedure can process the WM_HSCROLL and WM_VSCROLL messages. Window auto gesture processes the flick gesture messages and generates the corresponding horizontal or vertical scroll messages, with the physics engine effect applied.
Window auto gesture does not process the pan gesture. The application must process the pan gesture in order to respond to it. This gives the application the flexibility to determine how to respond to the gesture.
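With auto gestures enabled and the WS_VSCROLL style set, a flick therefore arrives as ordinary scroll messages. A minimal sketch of handling them, using the standard Win32 scroll bar API:

```cpp
// Sketch: handle the WM_VSCROLL messages generated by the auto gesture
// (flick + physics engine) on a window created with WS_VSCROLL.
case WM_VSCROLL:
{
    SCROLLINFO si = { 0 };
    si.cbSize = sizeof(SCROLLINFO);
    si.fMask  = SIF_ALL;
    GetScrollInfo(hWnd, SB_VERT, &si);

    switch (LOWORD(wParam))
    {
    case SB_THUMBPOSITION:
    case SB_THUMBTRACK:
        si.nPos = HIWORD(wParam);  // position produced by the physics engine
        break;
    case SB_LINEDOWN: si.nPos += 1; break;
    case SB_LINEUP:   si.nPos -= 1; break;
    }

    si.fMask = SIF_POS;
    SetScrollInfo(hWnd, SB_VERT, &si, TRUE);
    InvalidateRect(hWnd, NULL, TRUE);  // repaint at the new scroll position
    return 0;
}
```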
Building WIN32 Gestures Application
All the gesture macros and functions are defined in WinUser.h. Add TouchGesture.lib to the Project -> Application Properties -> Configuration Properties -> Linker -> Input field.
I am rigging up my development board (e-con's Alioth, a PXA300-based reference platform running Windows Embedded Compact 7) with a multi-touch sensor and am going to play around with gestures in WEC7. Stay tuned to this blog to hear more from my experiments with gestures.
If you want to know anything in particular about gestures, go ahead and post a comment here, and maybe I will include it in my experiments.
OK! Let's get my hands dirty on this one now...