Common Sense and a Simple Approach

Are you ready to take the “one morning a month” usability challenge?

When I consider the theme of Users First!, the work of Steve Krug comes to mind.  In his two books, Don’t Make Me Think and the sequel Rocket Surgery Made Easy, he lays out a simple, lightweight, and effective method for quickly identifying usability problems.  In my own experience, a design team can spend a lot of time guessing at why a particular interface is frustrating users.  As Albert Einstein observed, “The formulation of the problem is often more essential than its solution…”  Steve Krug’s straightforward methods let you observe real users, in realistic situations, and quickly identify where the difficulties are encountered.  All he asks is “one morning a month” dedicated to usability.  If you are not already familiar with his work, take a moment now and head over to – go ahead, I’ll wait.

Welcome back!  A while back, at the School of Drama, Sarah Stevens-Morling and I decided to accept Krug’s challenge.  With only a rough idea of what we were doing, we set aside one morning each month through the academic year for usability testing on a range of the school’s websites.  We started with a review of the current sites.  In each session we set up 2–3 scenarios in which the user had a specific task to perform – e.g., purchase a ticket to a show or submit an accident report.  Each round of testing had 5–7 users, allowing us to complete a round in 2 hours.  With set-up and debrief, each session required 3 hours.  No special facilities were required.  We’d set up a testing area in a cubicle or conference room and have the observers in another room close by.  TechSmith’s Morae testing software allowed session recording, note taking, and remote observers.  The only pain was finding and scheduling the testers, although we got the hang of it after a couple of months.

And what did we learn?  A lot.  The biggest realization was how often our own professional guesses were wrong – and how obvious the real problems were once we watched several people struggle over the same hurdles.  In many cases resolving the usability issues was simple, such as relocating a button or reducing non-essential page content.  As we progressed through the months and gathered data, we began testing paper prototypes of page designs to confirm new ideas before investing in design and execution.  The analysis also revealed a pattern of systemic problems AND the actual data to quantify a loss of ticket income.  In the end, the results helped justify to leadership that a major site redesign was required – and that the investment would pay for itself.  The new ticket purchase sites, launched this fall, have already delivered a significant rise in web ticket purchases.

So, are you ready to take the “one morning a month” usability challenge?  What would it take to start a campus wide initiative and community of practice around this idea? 


Having served as a guinea pig in Randy and Sarah’s implementation of Krug’s challenge, I can say that it was a compelling and refreshing experience. How often are we allowed (and even encouraged!) to question the very nature of a system or an interface design with a sense of childlike uninhibitedness? Yet often these are the most valuable kinds of questions we can ask, as Drama’s end results have shown. 
We already have a [slightly dormant] Yale Usability community of practice, and these kinds of case studies would be a perfect accelerant to rekindle its flames. So yes, I’m ready!

Perhaps the idea of a usability community of practice (COP) with an action-based agenda – monthly open testing sessions, with applications rotating among members – would be of interest.  Let’s keep this in mind as a possible session for the conference.

Morae is a great platform for capturing usability testing sessions, but (1) it runs only on Windows, and (2) it’s very expensive (~$1500) and may not be affordable for the comparatively casual usability studies many of us need to perform.

CMI2 has historically been a Mac shop (from the days of Paul Lawrence), so when we began doing usability testing in 2009 we went with what was then a $50 application (now $69) called Silverback, which did the core things we needed.

What are the key things we want to capture when we record usability test sessions, and are there other applications that folks have used and would recommend?

It might be interesting to try the Echo360 personal capture tool.  If it can deliver a live stream and a recording, it would easily allow remote observers, which is really useful.  It is cross-platform.  And with Yale’s decision to make it the lecture capture standard, it will be available to the community.  I haven’t had a chance to try it out yet, but from the descriptions I’ve heard it seems like it should work.