10c a day

‘Usability testing’ sounds daunting. It often is.

A lot of set-up and cat-herding needs to happen to get people to show up prepared. Then there’s the dread of the “I hate it” comments. This might be why testing tends to get put off until the end of a project, when it becomes almost obligatory: a box to tick.

We (at Future Super) have loosely adopted the ‘usability testing on 10c a day’ method from Steve Krug’s book Don’t Make Me Think. The chapter’s introductory quote gives you the idea:

Why didn’t we test this sooner?

—What everyone says at some point during the first usability test of their web site

The method calls for one round of testing each month with three participants. Observers note the top three usability issues from each session, then debrief together after lunch.

Why so few people? Why once a month? Why no more planning? In Krug’s words:

  • It keeps it simple so you’ll keep doing it
  • It gives you what you need
  • It frees you from deciding when to test

Our April session focussed on the upcoming Join form refresh. Three participants1 completed a dummy Join form while thinking aloud (after some prompting). Afterwards, everyone who attended debriefed on the major usability issues and ranked them from most serious to least.

[Image: a digital board with sticky notes]
Although a bit clunky, G Suite’s Jamboard was a quick and easy way for each observer to note the usability issues they spotted with the upcoming Join form refresh.

Some of these issues were fixed the same day. Everything else is now high up on our to-do list. We have faith they’ll get done, because many of us saw first-hand how problematic they were.

May’s session is booked and ready to go. We’ll be testing improvements to our Member Portal, however far along they are by then. Stay tuned for how that goes!

This post originally ran in Future Super’s Safe Team Brave Work blog as part of our fortnightly-ish release notes.

  1. Amongst other criteria, we chose participants based on which device they’d test with. Two mobile users and one desktop user roughly matched our web traffic. It worked well, so we’ll keep this pattern for future testing sessions. ↩︎