LightTouch Enhancements: Popups and “Layered” Controls

While I am actively working on new functionality for the LightTouch library for an upcoming release (such as adding the Pinch gesture to bring it to parity with the GestureListener from the Silverlight for Windows Phone Toolkit), a couple of issues came to light that warranted an interim release. As the title of this post suggests, these issues relate to touch support in Popups (especially ChildWindow controls) and in controls where different layers of the control’s visual tree make use of different behaviors.

Support for Popups / Child Windows

At the heart of LightTouch is the Manipulation2DBehavior. Much of the functionality behind this behavior is discussed here. One of its basic responsibilities is to determine, based on the coordinates of a touch event, which controls are under that point and which of those controls have the Manipulation2DBehavior attached. To do this lookup, the VisualTreeHelper.FindElementsInHostCoordinates function is used, with Application.Current.RootVisual supplied as the relative UIElement. This is where the difficulty with popups arises: Popup controls exist outside the visual tree of the application’s RootVisual. This can be seen in the following UI and its Visual Tree as reported by Silverlight Spy. Note how the Popup control appears as a sibling of the MainPage element.

[Screenshots: the sample UI and its Visual Tree in Silverlight Spy, showing the Popup as a sibling of MainPage]
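
For context, here is a minimal sketch of that kind of lookup. The class and method names (TouchHitTesting, FindBehaviorTargets) and the use of Interaction.GetBehaviors are my own illustration, not LightTouch’s actual code:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Windows;
using System.Windows.Interactivity;
using System.Windows.Media;

public static class TouchHitTesting
{
    // Illustrative sketch: return the elements under a touch point that have
    // a behavior of the given type attached (e.g. LightTouch's
    // Manipulation2DBehavior), hit-testing against the application's RootVisual.
    public static IEnumerable<UIElement> FindBehaviorTargets<TBehavior>(Point touchPoint)
        where TBehavior : Behavior
    {
        var elementsUnderPoint = VisualTreeHelper.FindElementsInHostCoordinates(
            touchPoint, Application.Current.RootVisual);

        return elementsUnderPoint.Where(element =>
            Interaction.GetBehaviors(element).OfType<TBehavior>().Any());
    }
}
```

Because the hit test is rooted at RootVisual, anything hosted in a Popup is never returned, which is exactly the problem described above.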

So in order to locate the controls with behaviors in the popup, we have to use the Popup as the relative UIElement for the call to VisualTreeHelper.FindElementsInHostCoordinates. Finding the popup in the Visual Tree can be accomplished with the VisualTreeHelper.GetOpenPopups call. Unfortunately, the documentation for this call does not specify how the returned list is ordered relative to the Z-order of the ChildWindows (in case multiple ChildWindows are being displayed, issues of UI propriety with that approach notwithstanding). Empirically, the topmost ChildWindow (if present) appears first in the list, so if any ChildWindows are being shown, we locate the first one in the list and use it as the argument to FindElementsInHostCoordinates. From there, things proceed as they did previously.
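
A rough sketch of the adjusted lookup follows. The PopupAwareHitTesting class, its method name, and the check on Popup.Child are assumptions for illustration; the actual LightTouch implementation may differ:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Controls.Primitives;
using System.Windows.Media;

public static class PopupAwareHitTesting
{
    // Illustrative sketch: if a ChildWindow is open, hit-test against its
    // hosting Popup instead of the RootVisual, since the Popup lives outside
    // the RootVisual's visual tree.
    public static IEnumerable<UIElement> FindElementsUnderPoint(Point touchPoint)
    {
        UIElement subtree = Application.Current.RootVisual;

        // GetOpenPopups appears (empirically) to return the topmost popup first.
        Popup childWindowPopup = VisualTreeHelper.GetOpenPopups()
            .FirstOrDefault(p => p.Child is ChildWindow);

        if (childWindowPopup != null)
        {
            subtree = childWindowPopup;
        }

        return VisualTreeHelper.FindElementsInHostCoordinates(touchPoint, subtree);
    }
}
```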

Supporting “Layers”

Shrek: “NO!  Layers.  Onions have layers.  Ogres have layers.   Onions have layers.  You get it?  We both have layers”
Donkey: “Oh, you both have layers.  Oh.  You know, not everybody likes onions.  What about cake?  Everybody loves cake!”
– Shrek (2001)

So what do I mean by layers? In this case, I am referring to a situation where multiple related controls under a given point in a visual tree have manipulation behaviors attached. An example would be a ListBox control with the ListBoxTouchScrollBehavior attached, where the data template for the items in the list box has a GestureListener attached with a handler for the DoubleTap event. The original implementation stopped at the first matching behavior, so the ListBox scrolled but the DoubleTap event was ignored. The new implementation identifies and processes all Manipulations under the given touch point, as sketched below. This is similar to a bubbling event, with a couple of key differences: first, there is no notion of “handled”; second, the relationships do not have to exist in common branches of the same visual tree.
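
Here is a simplified sketch of that difference. The class and method names are hypothetical, and the real LightTouch dispatch code is more involved, but it shows the idea of collecting every attached behavior under the point rather than stopping at the first match:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Windows;
using System.Windows.Interactivity;
using System.Windows.Media;

public static class LayeredManipulationDispatch
{
    // Illustrative sketch: instead of returning only the first element with an
    // attached behavior, gather every behavior found on every element under the
    // touch point, so a scrolling ListBox and a DoubleTap listener in its item
    // template can both respond to the same touch.
    public static IEnumerable<Behavior> FindAllBehaviorsUnderPoint(Point touchPoint)
    {
        return VisualTreeHelper
            .FindElementsInHostCoordinates(touchPoint, Application.Current.RootVisual)
            .SelectMany(element => Interaction.GetBehaviors(element).OfType<Behavior>());
    }
}
```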

Available for Download

The latest code is up on CodePlex, and an Alpha-2 release based on these changes can be downloaded here. I hope people find it helpful.
