Enhanced Touch Manipulations for Silverlight and Windows Phone

Following up on Jeff’s series on Touch Interfaces for Windows Phone, I wanted to show an additional technique that adds support for Manipulations in Silverlight. Furthermore, when this is extended to the Phone, it allows two or more UI elements to be moved independently, which, as Jeff noted, is not possible with the Manipulation events exposed in Silverlight for Windows Phone.

The article by Charles Petzold that Jeff refers to discusses the fact that while Manipulation events are present in Silverlight 4, they are not available for use. Attempts to hook these events, either in code or in markup, will result in a runtime exception. The good news is that there is a downloadable assembly that, with some additional work on the developer’s part, can provide the benefits of Manipulations to Silverlight 4 applications. This assembly ships as part of the Microsoft Surface Manipulations and Inertia Sample for Microsoft Silverlight, which is available for download here. The download includes the System.Windows.Input.Manipulations.dll assembly (hereafter the “Manipulations Assembly”) and a sample application, including code, which offers guidance on how it can be used.

The Manipulations Assembly is a Silverlight version of the equivalent assembly included in .NET 4.0, and it exposes a set of manipulation- and inertia-related classes. Apart from the “2D” suffix in each member’s name, these classes work together to provide functionality similar to what Jeff described. Documentation for these types is available at http://msdn.microsoft.com/en-us/library/system.windows.input.manipulations(VS.100).aspx.

As I mentioned above, using these manipulation classes requires some additional work. Specifically, your code must provide input to the manipulation engine, exposed through the ManipulationProcessor2D class, converting events of your choosing into individual manipulation actions and indicating when a particular manipulation has completed. The manipulation engine takes that input and converts it into the appropriate Started, Delta, and Completed events, calculating offsets, velocities, etc. along the way.
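
To make that division of labor concrete before digging into the behavior itself, here is a minimal sketch of the round trip, independent of any UI: raw finger positions go in through ProcessManipulators, and the engine’s Started, Delta, and Completed events come back out. This is illustration only (the class name is throwaway), not the behavior code discussed below.

Code Snippet
// Minimal, self-contained sketch of the ManipulationProcessor2D round trip.
// Illustration only - the actual behavior code is shown later in the article.
using System;
using System.Diagnostics;
using System.Windows.Input.Manipulations;

public class ManipulationEngineSketch
{
    private readonly ManipulationProcessor2D _processor =
        new ManipulationProcessor2D(Manipulations2D.Translate);

    public ManipulationEngineSketch()
    {
        // The engine raises these once it has been fed manipulator data.
        _processor.Started += (o, e) => Debug.WriteLine("Started at " + e.OriginX + "," + e.OriginY);
        _processor.Delta += (o, e) => Debug.WriteLine("Moved by " + e.Delta.TranslationX + "," + e.Delta.TranslationY);
        _processor.Completed += (o, e) => Debug.WriteLine("Completed");
    }

    // Call once per touch frame; each finger becomes one Manipulator2D.
    public void Feed(int deviceId, float x, float y)
    {
        _processor.ProcessManipulators(DateTime.Now.Ticks, new[] { new Manipulator2D(deviceId, x, y) });
    }

    // Call when the touch sequence ends (e.g. on touch-up).
    public void Finish()
    {
        _processor.CompleteManipulation(DateTime.Now.Ticks);
    }
}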

With apologies to Christopher Walken: “Enough talkie-talkie, more ping-pong” (Let’s see some code!)

First we’ll start with a Silverlight 4.0 project and add a reference to the Manipulations Assembly (found in the folder into which the download from the link above was extracted). As has been discussed, Silverlight UIElements expose several Manipulation events, none of which can be used. However, it would be handy if the Manipulations Assembly could be used to simulate the presence of those events on these objects. So we turn to Behaviors, with the intent of providing a behavior that lets us raise these events from the UIElement instances in our markup. This means we also need a reference to System.Windows.Interactivity.dll.
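
Before walking through the pieces, here is a rough skeleton of the behavior so the static fields referenced in the snippets below have a home. This is a sketch reconstructed from those snippets; the complete class in the accompanying project may differ in detail.

Code Snippet
// Rough skeleton of the behavior - a sketch to anchor the snippets that follow;
// the complete class in the accompanying project may differ in detail.
using System.Collections.Generic;
using System.Windows;
using System.Windows.Input.Manipulations;
using System.Windows.Interactivity;

public class Manipulate2DBehavior : Behavior<UIElement>
{
    // One manipulation engine per behavior instance (and thus per UIElement).
    private readonly ManipulationProcessor2D _manipulationProcessor;

    // Every attached behavior instance, so the static touch handler can locate them.
    private static readonly List<Manipulate2DBehavior> _subscribedBehaviors =
        new List<Manipulate2DBehavior>();

    // Maps a touch DeviceId to the behavior that has "captured" it.
    private static readonly Dictionary<int, Manipulate2DBehavior> _capturedBehaviors =
        new Dictionary<int, Manipulate2DBehavior>();

    // Ensures Touch.FrameReported is hooked only once for the whole type.
    private static bool _touchEventsHooked;

    // ... event declarations, constructor, and touch processing shown below ...
}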

Moving from Touches to Manipulations

Once we have our references figured out, we can get to work. As I mentioned above, one of the caveats of working with the Manipulations Assembly is that some extra work is required. Specifically, you need to catch the application-wide Touch events and then feed them to the classes in the Manipulations Assembly. That’s a multi-step process, because Touch events are application-wide and we want manipulations to appear as if they were sourced from individual UIElements. So step 1 is to hook the touch events when the behavior is attached to its UIElement.

First, when the behavior is attached:

Code Snippet
protected override void OnAttached()
{
    base.OnAttached();
    HookTouchEvents();
    HookMouseEvents();
    _subscribedBehaviors.Add(this);
}

And the actual event subscription:

Code Snippet
private static void HookTouchEvents()
{
    if (!_touchEventsHooked)
    {
        _touchEventsHooked = true;
        System.Windows.Input.Touch.FrameReported += OnFrameReported;
    }
}

There are two important things to note here. First, when the behavior is attached, the current instance is added to a static list of attached behaviors (and removed from that list in the OnDetaching override)…more on this in a minute. Also, notice that the Touch events are hooked only once per type rather than once per instance, which prevents the touch handler from firing once for every attached behavior.
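
The corresponding OnDetaching override is not reproduced in this post; presumably it just reverses the bookkeeping, roughly along these lines (a sketch, not the library’s actual code):

Code Snippet
// Presumed OnDetaching counterpart - a sketch that simply reverses the bookkeeping.
protected override void OnDetaching()
{
    _subscribedBehaviors.Remove(this);
    base.OnDetaching();
}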

The majority of the hard work happens in the OnFrameReported handler and the associated ProcessTouchPoints method, which performs three key processing steps.
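
The OnFrameReported handler itself is presumably mostly glue; a hedged sketch is shown below. TouchEventData is assumed here to be a small wrapper type in the library that flattens Silverlight’s TouchPoint into the DeviceId, Position, and Action values the later snippets consume, and ProcessTouchPoints is assumed to accept that wrapped list.

Code Snippet
// Hedged sketch of the frame handler: collect the frame's touch points and hand
// them to ProcessTouchPoints. TouchEventData is assumed to be a small wrapper
// that exposes DeviceId, Position, PositionX/PositionY, and Action.
private static void OnFrameReported(object sender, TouchFrameEventArgs e)
{
    // null = positions relative to the Silverlight content area (host coordinates).
    var touchPoints = e.GetTouchPoints(null)
                       .Select(tp => new TouchEventData
                       {
                           DeviceId = tp.TouchDevice.Id,
                           Position = tp.Position,
                           PositionX = (float)tp.Position.X,
                           PositionY = (float)tp.Position.Y,
                           Action = tp.Action
                       })
                       .ToList();

    ProcessTouchPoints(touchPoints);
}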

First, it identifies any new touch-point/behavior associations:

Code Snippet
// Find any new touch-point/behavior associations.
foreach (var touchPoint in touchPoints.Where(x => x.Action == TouchAction.Down))
{
    // There shouldn't be any of these!
    if (!_capturedBehaviors.ContainsKey(touchPoint.DeviceId))
    {
        // "Unowned" touch device - see if a behavior can be found for it to associate with at its current position...
        var matchingBehavior = GetBehaviorAssociatedWithTouchPoint(touchPoint.Position, _subscribedBehaviors);
        if (matchingBehavior != null)
        {
            _capturedBehaviors.Add(touchPoint.DeviceId, matchingBehavior);
        }
    }
}

This is done by examining all of the incoming touch points for those whose action type is “Down.” For each of those touch points, we look for a UIElement with an associated behavior directly under the touchpoint. To do this, we call VisualTreeHelper.FindElementsInHostCoordinates and iterate over the returned elements to see whether any of them match the AssociatedObject of a behavior in the list we maintain as behaviors are attached and detached.

Code Snippet
private static Manipulate2DBehavior GetBehaviorAssociatedWithTouchPoint(Point touchPointPosition, IEnumerable<Manipulate2DBehavior> behaviorsToCheck)
{
    Manipulate2DBehavior result = null;
    IEnumerable<UIElement> elements = VisualTreeHelper.FindElementsInHostCoordinates(touchPointPosition, Application.Current.RootVisual);
    foreach (var element in elements)
    {
        result = behaviorsToCheck.FirstOrDefault(x => x.AssociatedObject == element);
        if (result != null) break;
    }
    return result;
}

If a matching behavior is found, it is considered to be “captured” and is added to a static dictionary that maps the touchpoint’s DeviceId to the behavior. This step is what actually enables us to track multiple concurrent manipulations, since we are now able to react to multiple “captured” manipulations, as will be explained shortly.

The next step in the touch-point processing is to actually process the touch events for all of the captured behaviors. This includes the behaviors that were just captured in the previous step, as well as those from earlier frames whose touch-down has been received but not yet ended by a touch-up.

Code Snippet
// Process any current touch-point/behaviors
foreach (var capturedBehavior in _capturedBehaviors.Values.Distinct())
{
    var associatedTouchDeviceIds = _capturedBehaviors.Where(x => x.Value == capturedBehavior).Select(x => x.Key);
    var associatedTouchPoints = touchPoints.Where(x => associatedTouchDeviceIds.Contains(x.DeviceId));
    capturedBehavior.HandleTouchEvents(associatedTouchPoints);
}

To do this, the current list of touch points associated with any one behavior (in case multiple fingers are touching the same UIElement) is passed to that behavior’s HandleTouchEvents method.

Code Snippet
private void HandleTouchEvents(IEnumerable<TouchEventData> points)
{
    IEnumerable<Manipulator2D> manips = points.Select(p => new Manipulator2D
                                                               {
                                                                   Id = p.DeviceId,
                                                                   X = p.PositionX,
                                                                   Y = p.PositionY
                                                               });
    _manipulationProcessor.ProcessManipulators(DateTime.Now.Ticks, manips);
}

The HandleTouchEvents method introduces an instance of ManipulationProcessor2D, which is the “engine” that drives the manipulation calculations. Every instance of the behavior class has its own ManipulationProcessor2D instance, called _manipulationProcessor, which is responsible for handling the manipulation inputs (via the Manipulator2D class and the ProcessManipulators method) as well as raising the appropriate manipulation events. Once the correct behavior (and hence ManipulationProcessor2D instance) has been determined from the location of the input touchpoints, those touchpoints can be provided to the manipulation engine for it to handle the rest.

The final task of the ProcessTouchPoints method is to remove any “captured” behaviors based on the corresponding touchpoint TouchUp actions.

Code Snippet
// Find and remove touch point/behavior associations as needed
foreach (var touchPoint in touchPoints.Where(x => x.Action == TouchAction.Up))
{
    if (_capturedBehaviors.ContainsKey(touchPoint.DeviceId))
    {
        var matchingBehavior = _capturedBehaviors[touchPoint.DeviceId];
        _capturedBehaviors.Remove(touchPoint.DeviceId);
        if (!_capturedBehaviors.ContainsValue(matchingBehavior))
        {
            // That was the last device holding on to the specific behavior...
            matchingBehavior.CompleteManipulation();
        }
    }
}

For each touchpoint whose action is “Up”, the corresponding touchpoint/behavior entry is removed from the “captured” dictionary. If that removal leaves no remaining references to a particular behavior in the dictionary, CompleteManipulation is called, which simply calls CompleteManipulation on the behavior’s manipulation engine, signaling it that the manipulation it was tracking is now complete.
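
The behavior’s CompleteManipulation method is not reproduced here; based on the description above, it presumably amounts to a one-line forward to the engine:

Code Snippet
// Presumed implementation: forward completion to this behavior's manipulation engine.
private void CompleteManipulation()
{
    _manipulationProcessor.CompleteManipulation(DateTime.Now.Ticks);
}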

Observant readers may have noticed that the method for associating behaviors and touches within the OnFrameReported handler means that the previous discussion treats multiple finger presses on a UIElement as a single unit – there is no support for Pinch and Rotate, which is being left as an exercise for another day…

Raising the Events

Once all of this plumbing is done, raising the events is simple enough – it is just a matter of subscribing to the manipulation engine’s events and echoing them to the behavior’s consumers. The manipulation engine events are hooked in the behavior instance’s constructor (declaring the events with the delegate {} assignment avoids all of that nasty check-for-null-when-firing-an-event boilerplate).

Code Snippet
public event EventHandler<Manipulation2DStartedEventArgs> ManipulationStarted = delegate { };
public event EventHandler<Manipulation2DDeltaEventArgs> ManipulationDelta = delegate { };
public event EventHandler<Manipulation2DCompletedEventArgs> ManipulationCompleted = delegate { };

Code Snippet
/// <summary>
/// Initializes a new instance of the <see cref="Manipulate2DBehavior"/> class.
/// </summary>
public Manipulate2DBehavior()
{
    _manipulationProcessor = new ManipulationProcessor2D(Manipulations2D.Translate);
    _manipulationProcessor.Started += (o, e) => ManipulationStarted(AssociatedObject, e);
    _manipulationProcessor.Delta += (o, e) => ManipulationDelta(AssociatedObject, e);
    _manipulationProcessor.Completed += OnManipulationCompleted;
}
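
The OnManipulationCompleted handler wired up above, along with the inertia support implied by the IsInertial property used in the markup below, is not shown in this post. A hedged sketch of how it could be built on the assembly’s InertiaProcessor2D class follows; the IsInertial property and the deceleration value are assumptions for illustration, not the library’s actual code.

Code Snippet
// Hedged sketch: when inertia is requested, seed an InertiaProcessor2D with the
// final velocities and drive it from the render loop until it reports completion.
// IsInertial and the deceleration constant are illustrative assumptions.
private void OnManipulationCompleted(object sender, Manipulation2DCompletedEventArgs e)
{
    if (!IsInertial)
    {
        ManipulationCompleted(AssociatedObject, e);
        return;
    }

    var inertia = new InertiaProcessor2D
    {
        InitialOriginX = e.OriginX,
        InitialOriginY = e.OriginY,
        TranslationBehavior = new InertiaTranslationBehavior2D
        {
            InitialVelocityX = e.Velocities.LinearVelocityX,
            InitialVelocityY = e.Velocities.LinearVelocityY,
            DesiredDeceleration = 0.0001F // arbitrary value, in units per ms^2
        }
    };

    // Echo inertia updates through the same events the "live" manipulation used.
    inertia.Delta += (o, args) => ManipulationDelta(AssociatedObject, args);
    inertia.Completed += (o, args) => ManipulationCompleted(AssociatedObject, args);

    // Drive the extrapolation from the render loop until it finishes.
    EventHandler onRendering = null;
    onRendering = (o, args) =>
    {
        if (!inertia.Process(DateTime.Now.Ticks))
        {
            CompositionTarget.Rendering -= onRendering;
        }
    };
    CompositionTarget.Rendering += onRendering;
}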

Hooking it All Up

To wire up the behavior, simply attach it to your desired UIElement and provide handlers for the events, as in the following code, adapted from Jeff’s articles…sorry, no animated penguins.

Markup:

Code Snippet
<navigation:Page x:Class="Wintellect.Touch.MultiManipulatePage"
                 xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                 xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                 xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
                 xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
                 xmlns:navigation="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Navigation"
                 xmlns:i="clr-namespace:System.Windows.Interactivity;assembly=System.Windows.Interactivity"
                 xmlns:Touch="clr-namespace:Wintellect.Touch;assembly=Wintellect.Touch"
                 mc:Ignorable="d" d:DesignWidth="800" d:DesignHeight="600"
                 Title="MultiManipulatePage Page">
    <Grid x:Name="LayoutRoot" Background="White">
        <Rectangle x:Name="RedRect" Width="200" Height="200" Fill="Red">
            <Rectangle.RenderTransform>
                <TranslateTransform Y="-200"/>
            </Rectangle.RenderTransform>
            <i:Interaction.Behaviors>
                <Touch:Manipulate2DBehavior
                    IsInertial="True"
                    ManipulationStarted="TouchShapeBehavior_ManipulationStarted"
                    ManipulationDelta="TouchShapeBehavior_ManipulationDelta"
                    ManipulationCompleted="TouchShapeBehavior_ManipulationCompleted"/>
            </i:Interaction.Behaviors>
        </Rectangle>
        <Ellipse x:Name="BlueCircle" Width="200" Height="200" Fill="Blue">
            <Ellipse.RenderTransform>
                <TranslateTransform Y="200" />
            </Ellipse.RenderTransform>
            <i:Interaction.Behaviors>
                <Touch:Manipulate2DBehavior
                    IsInertial="True"
                    ManipulationStarted="TouchShapeBehavior_ManipulationStarted"
                    ManipulationDelta="TouchShapeBehavior_ManipulationDelta"
                    ManipulationCompleted="TouchShapeBehavior_ManipulationCompleted"/>
            </i:Interaction.Behaviors>
        </Ellipse>
    </Grid>
</navigation:Page>

Codebehind:

Code Snippet
public partial class MultiManipulatePage : Page
{
    public MultiManipulatePage()
    {
        InitializeComponent();
    }

    // Executes when the user navigates to this page.
    protected override void OnNavigatedTo(NavigationEventArgs e)
    {
    }

    private void TouchShapeBehavior_ManipulationStarted(Object sender, Manipulation2DStartedEventArgs e)
    {
        var senderShape = (Shape)sender;
        senderShape.Tag = senderShape.Fill;
        senderShape.Fill = new SolidColorBrush(Colors.Yellow);
    }

    private void TouchShapeBehavior_ManipulationDelta(Object sender, Manipulation2DDeltaEventArgs e)
    {
        var senderShape = (Shape)sender;
        var translateTransform = (TranslateTransform)senderShape.RenderTransform;
        translateTransform.X += e.Delta.TranslationX;
        translateTransform.Y += e.Delta.TranslationY;
    }

    private void TouchShapeBehavior_ManipulationCompleted(Object sender, Manipulation2DCompletedEventArgs e)
    {
        var senderShape = (Shape)sender;
        //TODO: This should really be done at the end of inertia, otherwise it will take the original color and keep moving...
        senderShape.Fill = senderShape.Tag as Brush;
    }
}

That’s Interesting – Now What about the Phone?

What we have seen so far is the use of Behaviors and the Manipulations Assembly to provide manipulation event support in Silverlight 4 similar to that available in Windows Phone – in fact, better than that available in Windows Phone, since it can handle the simultaneous manipulation of multiple objects. But I said at the beginning that this could be extended to the phone – soooo…

It turns out this is quite simple. Because Windows Phone can consume regular Silverlight assemblies (as long as they do not stray outside of the phone’s own unique ‘sandbox’), the process is straightforward. Starting with a regular Silverlight for Windows Phone project, once again add references to System.Windows.Interactivity and to the Manipulations Assembly, System.Windows.Input.Manipulations.dll. Then include the exact same Manipulate2DBehavior class in the phone project (there are techniques for sharing this code, including compiling it into its own assembly or using Add Existing as Link, for which information can be found elsewhere). Using the same markup and codebehind content will produce similar results on the phone.

Markup:

Code Snippet
<phone:PhoneApplicationPage
    x:Class="MultiManipulate.Views.MultiManipulatePage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:phone="clr-namespace:Microsoft.Phone.Controls;assembly=Microsoft.Phone"
    xmlns:shell="clr-namespace:Microsoft.Phone.Shell;assembly=Microsoft.Phone"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    xmlns:i="clr-namespace:System.Windows.Interactivity;assembly=System.Windows.Interactivity"
    xmlns:Touch="clr-namespace:Wintellect.Touch"
    FontFamily="{StaticResource PhoneFontFamilyNormal}"
    FontSize="{StaticResource PhoneFontSizeNormal}"
    Foreground="{StaticResource PhoneForegroundBrush}"
    SupportedOrientations="Portrait" Orientation="Portrait"
    mc:Ignorable="d" d:DesignHeight="768" d:DesignWidth="480"
    shell:SystemTray.IsVisible="True">

    <!--LayoutRoot is the root grid where all page content is placed-->
    <Grid x:Name="LayoutRoot" Background="Transparent">
        <Grid.RowDefinitions>
            <RowDefinition Height="Auto"/>
            <RowDefinition Height="*"/>
        </Grid.RowDefinitions>

        <!--TitlePanel contains the name of the application and page title-->
        <StackPanel x:Name="TitlePanel" Grid.Row="0" Margin="12,17,0,28">
            <TextBlock x:Name="ApplicationTitle" Text="MY APPLICATION" Style="{StaticResource PhoneTextNormalStyle}"/>
            <TextBlock x:Name="PageTitle" Text="page name" Margin="9,-7,0,0" Style="{StaticResource PhoneTextTitle1Style}"/>
        </StackPanel>

        <!--ContentPanel - place additional content here-->
        <Grid x:Name="ContentPanel" Grid.Row="1" Margin="12,0,12,0">
            <Rectangle x:Name="RedRect" Width="100" Height="100" Fill="Red">
                <Rectangle.RenderTransform>
                    <TranslateTransform/>
                </Rectangle.RenderTransform>
                <i:Interaction.Behaviors>
                    <Touch:Manipulate2DBehavior
                        IsInertial="True"
                        ManipulationStarted="TouchShapeBehavior_ManipulationStarted"
                        ManipulationDelta="TouchShapeBehavior_ManipulationDelta"
                        ManipulationCompleted="TouchShapeBehavior_ManipulationCompleted"/>
                </i:Interaction.Behaviors>
            </Rectangle>
            <Ellipse x:Name="BlueRect" Width="100" Height="100" Fill="Blue">
                <Ellipse.RenderTransform>
                    <TranslateTransform />
                </Ellipse.RenderTransform>
                <i:Interaction.Behaviors>
                    <Touch:Manipulate2DBehavior
                        IsInertial="True"
                        ManipulationStarted="TouchShapeBehavior_ManipulationStarted"
                        ManipulationDelta="TouchShapeBehavior_ManipulationDelta"
                        ManipulationCompleted="TouchShapeBehavior_ManipulationCompleted"/>
                </i:Interaction.Behaviors>
            </Ellipse>
        </Grid>
    </Grid>
</phone:PhoneApplicationPage>

Codebehind:

Code Snippet
public partial class MultiManipulatePage : PhoneApplicationPage
{
    public MultiManipulatePage()
    {
        InitializeComponent();
    }

    private void TouchShapeBehavior_ManipulationStarted(Object sender, Manipulation2DStartedEventArgs e)
    {
        var senderShape = (Shape)sender;
        senderShape.Tag = senderShape.Fill;
        senderShape.Fill = new SolidColorBrush(Colors.Yellow);
    }

    private void TouchShapeBehavior_ManipulationDelta(Object sender, Manipulation2DDeltaEventArgs e)
    {
        var senderShape = (Shape)sender;
        var translateTransform = (TranslateTransform)senderShape.RenderTransform;
        translateTransform.X += e.Delta.TranslationX;
        translateTransform.Y += e.Delta.TranslationY;
    }

    private void TouchShapeBehavior_ManipulationCompleted(Object sender, Manipulation2DCompletedEventArgs e)
    {
        var senderShape = (Shape)sender;
        //TODO: This should really be done at the end of inertia, otherwise it will take the original color and keep moving...
        senderShape.Fill = senderShape.Tag as Brush;
    }
}

The Wintellect Silverlight Touch Library – aka LightTouch

Ultimately, the code referenced above has made its way into a larger library that provides additional touch functionality beyond attachable support for Manipulations, including support for Gestures as well as enhancements that make it very easy to add touch support to scrollable controls like the ListBox. For Manipulations, Inertia is also added to the mix. The core manipulation behavior described here is at the heart of each of these additional elements. The project has been published on CodePlex and can be accessed at http://lighttouch.codeplex.com.

Summary

Obviously, it is a little hard to see all of this in action. To that end, if you have a multi-touch-enabled monitor, you can download the project from CodePlex and try it out. While some tedious boilerplate code was required to hook the touch events up to the individual UIElements’ manipulation behaviors, the end result is a behavior that can easily be reused throughout an application.

Author’s Note: Parts of the content prepared for this article were inspired by Jeff Prosise’s blog series on “Touch Interfaces for Windows Phone”, as well as Mike Taulty’s “Touched” blog series.
