Following on from that previous post I wanted to take a look at processing raw touch events.
In truth, I suspect that most application developers don’t want raw touch events and would rather work at the level of gestures and/or manipulations. But being able to get at the raw events at least means that you (or someone else) can build a higher-level framework on top of them, so I think it’s worth starting at the bottom of the stack.
Let’s take a look at just picking up some raw events across different frameworks.
Silverlight 4
In Silverlight 4, raw touch events are pretty much all you get in the framework, and they’re surfaced in a very simple way. I wrote a little control that displays an arc (from Blend 4’s shapes) and spins it around, called it SpinnyControl, and then authored a UI consisting of only a black Grid and wrote this tiny bit of code behind for it;
public partial class MainPage : UserControl
{
    Dictionary<int, SpinnyControl> controls;

    public MainPage()
    {
        InitializeComponent();
        this.controls = new Dictionary<int, SpinnyControl>();
        this.Loaded += OnLoaded;
    }
    void OnLoaded(object sender, RoutedEventArgs e)
    {
        Touch.FrameReported += OnTouchFrameReported;
    }
    void OnTouchFrameReported(object sender, TouchFrameEventArgs e)
    {
        TouchPointCollection points = e.GetTouchPoints(this.LayoutRoot);

        // New touch: create a control and key it on the touch device id.
        foreach (var item in points.Where(p => p.Action == TouchAction.Down))
        {
            SpinnyControl control = new SpinnyControl()
            {
                HorizontalAlignment = HorizontalAlignment.Left,
                VerticalAlignment = VerticalAlignment.Top
            };
            this.controls[item.TouchDevice.Id] = control;
            this.LayoutRoot.Children.Add(control);

            // Centre the control under the touch point.
            control.RenderTransform = new TranslateTransform()
            {
                X = item.Position.X - (control.Width / 2),
                Y = item.Position.Y - (control.Height / 2)
            };
        }
        // Touch released: remove the control and forget the id.
        foreach (var item in points.Where(p => p.Action == TouchAction.Up))
        {
            SpinnyControl control = this.controls[item.TouchDevice.Id];
            this.LayoutRoot.Children.Remove(control);
            this.controls.Remove(item.TouchDevice.Id);
        }
        // Touch moved: re-centre the control under the touch point.
        foreach (var item in points.Where(p => p.Action == TouchAction.Move))
        {
            SpinnyControl control = this.controls[item.TouchDevice.Id];
            ((TranslateTransform)control.RenderTransform).X = item.Position.X - (control.Width / 2);
            ((TranslateTransform)control.RenderTransform).Y = item.Position.Y - (control.Height / 2);
        }
    }
}
and you can see that running here on my touch screen;
and you can try it yourself here if you’ve got a touch screen;
and here’s the source.
WPF
We can get the same raw touch events delivered in WPF, so I made a blank WPF 4.0 application and left the UI as a single black Grid with event handlers added to it for the touch events. In WPF, touch events are routed to controls, whereas in Silverlight I just get the events delivered to the main window and it’s then up to me to figure out which controls they may or may not impact.
<Grid x:Name="LayoutRoot" Stylus.IsPressAndHoldEnabled="False" Background="Black" TouchDown="OnTouchDown" TouchUp="OnTouchUp" TouchMove="OnTouchMove"> </Grid>
There’s also something else pretty important on that Grid definition and that’s the Stylus.IsPressAndHoldEnabled="False" because without that the OS will try and look for press-and-hold gestures rather than just allow the touch events through to the app.
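As an aside, if you’d rather not set it in markup, WPF also exposes this as an attached property setter in code behind (a sketch, assuming the Grid is named LayoutRoot as above);

```csharp
// Same effect as Stylus.IsPressAndHoldEnabled="False" in the XAML:
// stop the OS waiting to see whether this touch is a press-and-hold
// gesture and deliver the raw touch events straight to the app.
Stylus.SetIsPressAndHoldEnabled(this.LayoutRoot, false);
```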
I also copied over my SpinnyControl user control although, to be honest, I found it easier to re-create the animations quickly in Blend rather than try and remember the property syntax as I changed them from CompositeTransform to TransformGroup.
With that done, the code’s the same as it was but re-org’d to fit with WPF’s eventing scheme;
public partial class MainWindow : Window
{
    Dictionary<int, SpinnyControl> controls;

    public MainWindow()
    {
        InitializeComponent();
        this.controls = new Dictionary<int, SpinnyControl>();
    }
    void OnTouchDown(object sender, TouchEventArgs e)
    {
        TouchPoint point = e.GetTouchPoint(this.LayoutRoot);

        // New touch: create a control and key it on the touch device id.
        SpinnyControl control = new SpinnyControl()
        {
            HorizontalAlignment = HorizontalAlignment.Left,
            VerticalAlignment = VerticalAlignment.Top
        };
        this.controls[e.TouchDevice.Id] = control;
        this.LayoutRoot.Children.Add(control);

        // Centre the control under the touch point.
        control.RenderTransform = new TranslateTransform()
        {
            X = point.Position.X - (control.Width / 2),
            Y = point.Position.Y - (control.Height / 2)
        };
        e.Handled = true;
    }
    void OnTouchUp(object sender, TouchEventArgs e)
    {
        // Touch released: remove the control and forget the id.
        SpinnyControl control = this.controls[e.TouchDevice.Id];
        this.LayoutRoot.Children.Remove(control);
        this.controls.Remove(e.TouchDevice.Id);
        e.Handled = true;
    }
    void OnTouchMove(object sender, TouchEventArgs e)
    {
        // Touch moved: re-centre the control under the touch point.
        TouchPoint point = e.GetTouchPoint(this.LayoutRoot);
        SpinnyControl control = this.controls[e.TouchDevice.Id];
        ((TranslateTransform)control.RenderTransform).X = point.Position.X - (control.Width / 2);
        ((TranslateTransform)control.RenderTransform).Y = point.Position.Y - (control.Height / 2);
        e.Handled = true;
    }
}
and you can find that code here; I even changed the colours for WPF.
Silverlight on Windows Phone
What about Windows Phone 7? Ideally, I’d just lift my Silverlight 4 code and drop it straight down onto the phone.
I hit a stumbling block because I’d gone and used a shape from Microsoft.Expression.Drawing.dll and that’s not available on the phone. So, I went back to Blend, converted it into a Path for the phone and used that instead.
Other than that, the source code copied straight across from Silverlight on the desktop and just worked. In fact, it took me more time to figure out how on earth to capture the screenshot of the device in action, because that’s not so easy to do while the device actually has the focus (which it will do if you’ve got 2 fingers pressed to the screen).
So, all in all, the raw touch capabilities are there in a pretty consistent form across each framework and it’s easy enough to at least get going with them.
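In fact, all three samples share one pattern: each Down/Move/Up event carries a touch device id, and we keep a dictionary from that id to the visual tracking that finger. Stripped of any UI framework, the pattern looks something like this (a sketch; TouchTracker is my made-up name, not an API from any of the frameworks);

```csharp
using System.Collections.Generic;

// Framework-neutral sketch of the id-to-visual bookkeeping used in
// each sample: one visual per finger, keyed on the touch device id
// that arrives with every Down/Move/Up event.
class TouchTracker<TVisual> where TVisual : new()
{
    readonly Dictionary<int, TVisual> visuals = new Dictionary<int, TVisual>();

    // Down: create and remember a visual for this finger.
    public TVisual Down(int touchId)
    {
        TVisual visual = new TVisual();
        this.visuals[touchId] = visual;
        return visual;
    }
    // Move: look up the visual for this finger so it can be repositioned.
    public TVisual Move(int touchId)
    {
        return this.visuals[touchId];
    }
    // Up: forget the visual for this finger.
    public void Up(int touchId)
    {
        this.visuals.Remove(touchId);
    }
}
```

The frameworks differ only in how the events arrive – one frame-level event in Silverlight, routed events in WPF – not in this bookkeeping.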
What about gestures?…