Disclaimer: I haven’t done any usability testing on paper forms. If I were to, however, this is how I would (ideally) go about it:
1.) For recording purposes, I’d use both a digital voice recorder and a video camera to record all sessions. This gives you the option to review the audio and the video separately, a practice I’ve found helpful in the past. You’ll often hear intonations and inflections you would miss when they’re paired with visual cues, and you’ll notice body language and eye position better without audio distractions. A big part of in-person UX testing is being able to read the subject so you can time probing questions and dig deeper into issues; I would imagine this is even more important in paper testing.
2.) Tasks should still be similar to those used for testing web forms. Give your users a goal, and observe their efforts to achieve it. Whether that goal is filling out a particular section of the form, interacting with provided figures and tables, or simply completing the entire package without outside assistance, you’re still looking for many of the same things.
3.) Metrics to measure:
- Time on task
- Ease of use
- Task completion success
- Error recovery success (i.e., if they made a mistake, were they able to recover and complete the form without asking for a new copy?)
- Total number of errors committed
- Number of references to help documentation (if applicable and provided)
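If you’re tallying these metrics across participants, even a tiny script beats a spreadsheet for keeping the data consistent. Here’s a minimal sketch in Python; all the names (`SessionMetrics`, `completion_rate`, the sample participant IDs) are my own illustrations, not anything prescribed above:

```python
# Hypothetical sketch for recording the metrics listed above, one record
# per participant session. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SessionMetrics:
    participant_id: str
    time_on_task_seconds: float   # time on task
    ease_of_use_rating: int       # e.g. a 1-5 post-task rating
    task_completed: bool          # task completion success
    recovered_from_errors: bool   # finished without asking for a new copy
    error_count: int              # total number of errors committed
    help_lookups: int             # references to help documentation

def completion_rate(sessions):
    """Share of sessions in which the task was completed."""
    return sum(s.task_completed for s in sessions) / len(sessions)

# Example usage with made-up data:
sessions = [
    SessionMetrics("P1", 412.0, 4, True, True, 2, 1),
    SessionMetrics("P2", 515.5, 3, False, False, 5, 3),
]
print(completion_rate(sessions))  # 0.5
```

The same pattern extends naturally to averages for time on task or error counts once you have a handful of sessions recorded.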
I’m curious, though, how others would approach this problem.