Building a mobile usability testing kit

Category: We made this
Written by: Sam Hails
Date: 05.07.2016

With a series of apps in the pipeline and a couple of super-secret projects down the road, we in the UX team at POKE have been turning our attention to usability testing. This fits with our renewed interest in sprint techniques and iterative methodology, but goes further, reflecting our belief that testing and showing work-in-progress to an intended audience should be done early and often.

There seems to be a bit of a myth out there that testing is time-consuming and expensive. We wanted to challenge the notion that testing somehow requires enormous expense: two-way mirrors, eye tracking, white lab coats and a king’s ransom. So we set out to select and develop a suite of tools we could use to perform usability testing anywhere – from a gym locker room (yes, we did this) to a more formalised setting at POKE towers.

To really understand the usability of a digital product, it isn’t enough to observe the user’s interactions with the screen; you also need to observe the subtle, often subconscious, behavioural cues that communicate so much of the user’s experience. A holistic testing kit should therefore allow recording not just of the screen, but of the user’s hands, their facial expressions and, of course, audio.

There is a plethora of software solutions out there to choose from. However, we found that most were overly complicated for the scenarios we had in mind. Many offered cloud storage of the video recordings and online commenting – features we felt were unnecessary and achievable more cheaply and effectively elsewhere. None provided the combination of observations we were looking for. So we embarked on a well-trodden POKE path: we got hacking.

Recording a user’s hands allows you to observe off-screen clues such as hesitation or aborted interactions. It also lets you observe screen interactions in the absence of our jailbroken iPhone (see below). Production-wise, these recordings are usually achieved by shooting over the user’s shoulder or with static, rather awkward and unnatural, setups of phone, camera and tripod. The best approach uses a sled – essentially a camera rig attached to the device.

Developing our own sled became one of the most interesting aspects of the project, progressing from a first prototype built from existing consumer items bought on Amazon to the latest: a multi-phone-compatible version developed with our new 3D printer. In true hack style we have open-sourced the design on GitHub, both to share what we think is a pretty cool tool and for ongoing collaboration and development with the GitHub community.

Having solved the hands, we then turned our attention to capturing the clearest possible feed of the screen. Recording a phone’s screen is fairly straightforward, but to generate a polished and completely unambiguous output we wanted to go further and overlay a precise record of the user’s touch interactions. Solutions for this exist, but only within the walled gardens of third-party apps. We wanted to go free-range across the iOS estate and, after a nervous weekend of iPhone jailbreaking, we found our solution in the rather brilliant TouchPosé+.
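
For the curious, the idea behind a touch overlay like this is fairly simple. Touchposé-style tweaks hook into the event handling of every app on the jailbroken device; the sketch below shows the same principle at app level, as a UIWindow subclass that watches each event and draws a marker under every active touch. The class and property names are ours, purely for illustration – this is not Touchposé’s actual code, just a minimal approximation of the technique.

    import UIKit

    // Illustrative sketch only: a UIWindow subclass that mirrors the basic idea
    // behind a Touchposé-style overlay. Every UIEvent passes through sendEvent(_:),
    // so the window can place a semi-transparent circle under each active touch.
    final class TouchVisualisingWindow: UIWindow {

        // One indicator view per active touch.
        private var indicators: [UITouch: UIView] = [:]

        override func sendEvent(_ event: UIEvent) {
            super.sendEvent(event)
            guard let touches = event.allTouches else { return }

            for touch in touches {
                switch touch.phase {
                case .began, .moved, .stationary:
                    let indicator = indicators[touch] ?? makeIndicator()
                    indicators[touch] = indicator
                    addSubview(indicator)               // keep the marker above the app's UI
                    indicator.center = touch.location(in: self)
                case .ended, .cancelled:
                    indicators[touch]?.removeFromSuperview()
                    indicators[touch] = nil
                default:
                    break
                }
            }
        }

        // A semi-transparent circle, sized so it reads clearly on a screen recording.
        private func makeIndicator() -> UIView {
            let size: CGFloat = 44
            let view = UIView(frame: CGRect(x: 0, y: 0, width: size, height: size))
            view.backgroundColor = UIColor.red.withAlphaComponent(0.4)
            view.layer.cornerRadius = size / 2
            view.isUserInteractionEnabled = false       // never intercept the real touches
            return view
        }
    }

In your own build you would use a window like this as the app’s main window (for example by assigning it in the app delegate); the jailbreak route gives the same effect system-wide without touching any app’s code.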

The 3D printed sled

The testing dashboard, showing video feeds from the sled, the face-cam and the screen capture.

The final task was to bring all our monitoring together. Mixing three video feeds and a separate audio track while keeping them in sync is normally the preserve of pro (read: expensive) systems. But some lateral thinking led us to the world of CCTV monitoring software. We combined this with a custom desktop background and some screen capture software, resulting in what we think is a low-cost yet slick dashboard.
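
If live mixing isn’t an option, there is also a cheap way to stitch everything together after the session. The sketch below is not the CCTV-style dashboard we actually use, just a post-session alternative: it calls out to ffmpeg to scale the three recordings to the same height, place them side by side and marry them to the separately recorded audio. The file names and the ffmpeg path are placeholders.

    import Foundation

    // Rough post-session alternative to the live dashboard: stitch the three video
    // recordings and the separate audio track into one reviewable file with ffmpeg.
    // The file names and the ffmpeg path below are placeholders, not part of our kit.
    let ffmpeg = Process()
    ffmpeg.executableURL = URL(fileURLWithPath: "/usr/local/bin/ffmpeg")
    ffmpeg.arguments = [
        "-i", "sled.mov",    // hands, shot from the sled
        "-i", "face.mov",    // face cam
        "-i", "screen.mov",  // screen capture with touch overlay
        "-i", "audio.wav",   // separately recorded audio
        // Scale each feed to the same height, then place the three side by side.
        "-filter_complex",
        "[0:v]scale=-2:480[s];[1:v]scale=-2:480[f];[2:v]scale=-2:480[c];[s][f][c]hstack=inputs=3[v]",
        "-map", "[v]",       // the stacked video
        "-map", "3:a",       // the external audio track
        "-c:v", "libx264", "-c:a", "aac",
        "dashboard.mp4"
    ]

    do {
        try ffmpeg.run()
        ffmpeg.waitUntilExit()
    } catch {
        print("ffmpeg failed to launch: \(error)")
    }

The output is a single file you can scrub through in any player, which is often all that is needed when reviewing a session with the wider team.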

So far, the initial trial runs have been pretty successful and have provided the kind of comprehensive monitoring we were aiming for. The kit isn’t yet suitable for all circumstances, but it gives us a base setup from which we can now pick, mix and adapt for different scenarios.

 

Download the 3D files and/or collaborate with us on GitHub


3D print files, instructions, parts and software list

Collaborate on GitHub