I want to measure and analyze user movements and gestures in the UI in order to refine the application's user experience. I had imagined that feature-tracking libraries (like EQATEC or PreEmptive's Runtime Intelligence) would allow this, but that does not appear to be the case.
Ideally, I'd like to be able to instrument a UI and then capture mouse and keyboard navigation gestures to display via a heat-map.
My searches have come up empty. Does anything OSS or commercial exist here?
After trying a number of approaches, including the ones here as well as UIAutomation events and ETW for WPF, I've settled on simply attaching handlers to WPF events. This lets me capture not only the event data but also the `UIElement` that has the user's attention, which makes it much easier to trace the user's action and intention. Without this, I'd need to capture a visual of the screen and make a visual determination of what is going on.
Here's a sample:
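The original sample isn't preserved here, so below is a minimal sketch of the approach: attaching application-wide class handlers via `EventManager.RegisterClassHandler` so every preview mouse and keyboard event is observed, even ones marked handled. The `LogGesture` method is a hypothetical placeholder for whatever telemetry sink you use.

```csharp
using System;
using System.Windows;
using System.Windows.Input;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        // Class handlers fire for every UIElement in the application,
        // and handledEventsToo: true means we still see events that
        // controls have already marked as handled.
        EventManager.RegisterClassHandler(
            typeof(UIElement),
            UIElement.PreviewMouseDownEvent,
            new MouseButtonEventHandler(OnPreviewMouseDown),
            handledEventsToo: true);

        EventManager.RegisterClassHandler(
            typeof(UIElement),
            UIElement.PreviewKeyDownEvent,
            new KeyEventHandler(OnPreviewKeyDown),
            handledEventsToo: true);
    }

    private static void OnPreviewMouseDown(object sender, MouseButtonEventArgs e)
    {
        // e.OriginalSource is the element the user actually interacted with.
        var element = e.OriginalSource as UIElement;
        var position = e.GetPosition(element);
        LogGesture("MouseDown", element, position.ToString());
    }

    private static void OnPreviewKeyDown(object sender, KeyEventArgs e)
    {
        LogGesture("KeyDown", e.OriginalSource as UIElement, e.Key.ToString());
    }

    private static void LogGesture(string kind, UIElement element, string detail)
    {
        // Hypothetical sink: replace with your logging/telemetry pipeline.
        Console.WriteLine($"{DateTime.UtcNow:o} {kind} {element?.GetType().Name} {detail}");
    }
}
```

Because these are preview (tunneling) events registered at the `UIElement` class level, no per-control wiring is needed; the instrumentation is a one-time setup at application startup.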
While it is not shown here, once I get the `UIElement` I perform logging, and can then use the element's `DataContext` (defined on `FrameworkElement`) to determine the state of the ViewModel driving the view. That lets us find patterns of usage during certain workflows, data states, and visual states, and then produce reports that differentiate and compare our heat maps by path through the workflow and by data values.
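As a rough illustration of that `DataContext` inspection: the ViewModel type and its properties below are hypothetical, and note that `DataContext` lives on `FrameworkElement` rather than `UIElement`, so a cast is needed.

```csharp
using System.Windows;
using System.Windows.Input;

// Hypothetical ViewModel, for illustration only.
public class OrderViewModel
{
    public string CurrentStep { get; set; }
    public bool HasErrors { get; set; }
}

public static class GestureStateLogger
{
    public static void OnPreviewMouseDown(object sender, MouseButtonEventArgs e)
    {
        // DataContext is defined on FrameworkElement, not UIElement,
        // so cast before reading the bound ViewModel.
        if ((e.OriginalSource as FrameworkElement)?.DataContext is OrderViewModel vm)
        {
            // Record workflow/data state alongside the gesture so heat maps
            // can later be sliced by workflow path and data values.
            System.Diagnostics.Trace.WriteLine(
                $"MouseDown step={vm.CurrentStep} errors={vm.HasErrors}");
        }
    }
}
```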