How usability testing is helping improve desktop Linux.
During the past year, we at Novell have conducted hundreds of usability tests on different parts of the Linux desktop. We use two video cameras—one on the face, one on the hands—and a frame grabber to record everything the user does. We ask our subjects to perform five or six simple tasks with Linux, and we burn the result to a DVD.
This month, we've released our test results to the Linux community at openSUSE.org. Linux developers now have access to more than 200 user tests, along with our analysis. You can watch real users encounter real design problems and get a sense of their thought process.
We've all read about the benefits of usability testing, but until you actually try to sit through two hours of these videos, you don't viscerally understand why it's so important. Watching these videos is exciting and emotionally exhausting. You squirm. And it focuses you like a laser.
For example, we asked a woman to send mail to a friend. Against all odds, she started Evolution (nothing in the menus indicates that it's a mail program—something we hadn't realized before, but it was immediately obvious after watching her stalk through the menu items one by one, muttering to herself along the way).
The correct next step would have been for her to click on the New button that's in the upper left-hand corner of the window. This button didn't even register for her, however. Instead, because she wanted to “send” a mail, she clicked repeatedly on the “Send” part of the Send / Receive button just to the right. For about a minute.
This is easy to fix; we just need to change the labels to be more sensible (and then retest with five or six people to be reasonably confident that the new labels work). It was interesting to watch this video and instantly realize that the Send / Receive button is all about how Evolution works and not about what the user wants to do. I've been staring at that button for five years, and I never realized it was wrong until I saw that video.
Most industrial usability testing labs are impressive facilities: a one-way mirror separates the test subject from a video processing/observation room, where the video footage is recorded and where engineers and product managers can watch the test subjects in real time. The observation room usually has to be sound-proofed to muffle the anguished shouts of the engineers. These facilities are impressive, but they are also very expensive and somewhat intimidating to test subjects, who often feel like their abilities are being scrutinized in the stark light.
We built our labs for less than $1,000 US each with a couple of cheap Webcams, a video mixer, a DVD recorder and a small TV used as a monitor. It is feasible to get this cost down to $200 if the frame grabbing and video recording are implemented entirely in software. At these prices, usability testing becomes a new way for people to contribute to open-source projects: run a few tests and send the results to the authors.
In one test, we asked a woman to find a document she had created earlier and make some changes to it. There are a couple of ways to do this: she could use Beagle to search for the file, she could open the file manager and step through her directory structure until she found the file, or she could use the recent-documents facility in the OpenOffice.org File menu to find the document.
She paused briefly with a blank desktop in front of her, looking to see if the document she had created might be there. She opened the file manager and poked around a little bit. And then, finally, she ran OpenOffice.org and went directly to the File menu. “I know I'm doing this wrong”, she said, selecting the file by name from the menu, “but at least this will work.”
Technically, she succeeded in achieving her goal: she opened the file, and it didn't take her very long to do it. But she tried a few different avenues before finding one that worked. And she wasn't comfortable about it.
There's a difference between software that is usable and software that is a pleasure to use. Until you watch people using your software, it's hard to know how well you're doing.