At UX London this year, Jon Kolko talked about using our UX skills to solve problems beyond the commercial, digital world. I think he’s tapping into a common desire in our industry to work outside of our typical domain.
I got lucky in this respect a couple of months ago: I usability tested a booklet that teachers use. I wish I could claim responsibility for this idea; however, it came from a rather smart client of mine! He recognised some potential issues with the booklet, and he wanted to test it and use the results to advocate internally for design changes.
When googling I couldn’t find much about anyone testing print products, though I am vaguely aware that marketers use quantitative A/B testing: when running direct mail campaigns they will compare the response rates from two different designs. I doubt I’m the first to do qualitative testing on a print product, but if anyone wants to do this in future, I hope this post proves useful.
I did my best to replicate the standard recording tools used when usability testing a digital product; i.e. two cameras: one on the participant to capture emotional reactions, and one on the booklet in their hands, à la Silverback (full disclosure: I work for Clearleft!). I didn’t manage to create a rig that captured a readable version of what the participant was looking at on the page, but it was certainly good enough to make a reasonable judgement about what happened during analysis. With a bit more hacking I’m sure you could get closer to a mobile testing rig. Here’s what I managed to achieve.
I used my MacBook Pro with the built-in iSight camera on the participant, and an external USB camera on the booklet in their hands. I mixed the video together using CamTwist Studio (it’s free) in studio mode, sending the combined feed to QuickTime, which recorded the output and audio.
A few ‘gotchas’ I encountered:
- The setup was fairly temperamental with two cameras. I’m told by the experts that it’s best to have one on FireWire and one on USB, so it’s likely that there were hardware complications with both on USB. A third of my sessions were conducted with a single camera when the twin-camera setup failed to work.
- QuickTime runs in 64-bit these days, and doesn’t recognise the video coming from CamTwist in this mode. You can make QuickTime start up in 32-bit mode by going to ‘Applications’ > right-click on ‘QuickTime Player’ > ‘Get Info’ > check ‘run in 32-bit mode’.
- CamTwist prefers the cameras to use the same resolution; check the defaults and change them as required.
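If the CamTwist/QuickTime combination proves too fragile, a command-line alternative worth trying (an assumption on my part, not the setup I actually used) is ffmpeg’s AVFoundation capture on the Mac, which can grab two cameras and stack them side by side into a single recording. The device indices below are hypothetical and will differ per machine:

```shell
# Hypothetical alternative to CamTwist + QuickTime, using ffmpeg.
# First list the capture devices to find the right indices:
ffmpeg -f avfoundation -list_devices true -i ""

# Capture camera 0 (participant) plus the default microphone ("0:0"),
# and camera 1 (booklet), stack the two video feeds side by side,
# and record video + audio to one file:
ffmpeg -f avfoundation -framerate 30 -i "0:0" \
       -f avfoundation -framerate 30 -i "1" \
       -filter_complex "[0:v][1:v]hstack=inputs=2[v]" \
       -map "[v]" -map 0:a session.mov
```

This sidesteps the 32-bit QuickTime workaround entirely, though you still need the cameras to agree on a resolution for the stacking filter to work.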
The test script
Broadly speaking, I made no significant changes to the usability testing method. I tested mostly with new users: I interviewed them about their domain experiences first, and used their story to create a realistic scenario in which they would encounter the booklet. At this point I handed them the booklet (or a competitor’s), and gave them a task to complete, asking them to think aloud as they went.
We had already suspected that the booklet was difficult to use, and the sessions confirmed this from the word go. The first participant got lost immediately when she mistook the distracting inside cover for the table of contents (which was visible on the facing page!).
Assumed knowledge: The traditional use of this booklet has required that users know a lot of things up front to get anything done. This problem reared its head very quickly: the terminology was confusing even to the domain-savvy participants. The problem wasn’t clearly visible to the organisation because the market is steeped in very old traditions that are rarely questioned, and new users are rarely encountered; the booklet is introduced to a teacher perhaps once in a lifetime.
Visual design: The typography, visual hierarchy, and layout were clearly very problematic; easy page scanning was near-impossible. We could all see this problem without testing it, but we needed actionable evidence so that stakeholders could clearly empathise with the users.
Navigation: The navigation of the booklet also proved interesting to watch: the participants spent a lot of time jumping back and forth between sections, and in a couple of cases searching in vain for an index at the back. It’s hard, in retrospect, to gauge how much of this was the booklet design, and how much was typical print-consumption behaviour. The booklet is not designed to be consumed in a linear fashion (and probably couldn’t be – as the sections would then contain huge amounts of repetitive information).
What was different?
Print behaviours: because I don’t have a history of watching how people consume print, it was harder to distinguish between genuine problems and common print use, although some of the flicking back and forth was clearly the result of a design issue. In the digital world we call this pogosticking – when a user is forced to click back and forth repeatedly to find information. We’ve observed enough behaviour to spot these common patterns – I’m sure there is research out there for common print behaviours, but I didn’t check beforehand.
The feel: users commented on the way the paper felt in their hands (“it’s like bible paper”) and on how its transparency inhibited readability. Perhaps the tactile nature of the product, and the use of additional senses, made it easier for people to express their relationship to it.
The use: teachers who already use the booklet showed me their annotations in the margins. Seeing this made me wonder more about how limited digital products might be in other respects – you can’t just do what you want with a web page, but you can cut/rip/write on/bookmark paper in any way you like. Unexpected use might be easier to spot/guesstimate in the print world.
Emotional responses: when testing digital products, we know just how inclined people are to blame themselves for the problems they encounter: “I’m terrible with computers”. Although the frustration was at least partly caused by the design of the booklet, the negative emotional responses were expressed far more clearly than when I observe people using digital products. It was the first time I’ve watched participants sigh loudly in frustration, and at one point one shoved the booklet away!
I’m inclined to believe that people expect more from print because it is not historically perceived as complicated – and so they are much more critical of the product when having a problem.
Because of this difference, I plan to do more of this kind of thing in future, particularly to capture reactions to brand; it seems from this project that it’s easier to discern those reactions when the user is holding your product (and not a mouse)!
- Setting up CamTwist with QuickTime is pretty easy now that I’ve worked it out, but the camera issues need addressing. I might use a FireWire-connected video camera next time as the second camera.
- I will look for research into how people consume print before conducting testing.
- I think this might be interesting to combine with emotional response testing for a clearer assessment of a brand.