Usability Testing at Bobst Library at New York University

Usability testing: Process, Procedure, and Results

Section One of this document provides information about the procedure we used for usability testing, as well as what we learned from the process of usability testing.

Section Two of this document provides the conclusions drawn from the usability testing, including a checklist based both directly and indirectly on the testing, to be used as we design future tutorials.

Section One

Background / Need
To make the most of the new position created at Bobst Library (Instructional Design Librarian), the Division of Libraries decided to identify what was most needed by the NYU community in terms of online Web instruction. The Instructional Services unit decided the best way to determine what was needed in these tutorials was to perform usability testing on our target audience. The questions we wanted to answer were:

  • How interactive should the tutorials be?
  • How linear in nature should the tutorials be?
  • How should the navigation in the tutorials be arranged to best suit our users?
  • How best can we test the knowledge our users have gained through using our tutorials?
  • Should the tutorials be primarily text-based, or should they be graphics and multimedia rich?
  • Should tutorials be pages of itemized information, from which users can easily navigate back to a clickable table of contents, or should users be required to move through each section in order, with no option of skipping sections?

We first had to attain exempt status from the University's Office of Human Subjects. To do this, we submitted an application explaining how we were planning to implement the testing, and on whom we were planning to test. In the application, we explained that we would test only NYU community members, aged 18 or older.

Factors to determine upon attaining exempt status:
1. Who would perform this testing?
2. How many testers would we use?
3. What kind of questions would we ask?
4. Would we only show our Web pages and sites, or others as well?
5. How would we advertise and schedule the testing?

1. Who would perform this testing?
Deciding who would perform this testing was straightforward. Marybeth, head of Instructional Services; Paula Feid, head of Undergraduate Services; and myself, Nadaleen Tempelman-Kluit, Instructional Design Librarian, were the logical testers, because we initiated the process and determined the need for testing. We also felt that if only the three of us performed the testing, the results would retain consistency. Initially we did not feel that all three of us would have to participate in each test, but, upon beta testing four students, we decided that it would be more effective if all of us were present: one to record the subject's comments; one to record the paths the students took and to time them; and one to read the script, ask the questions, and help the student navigate if they had problems.

2. How many students would we perform testing on?
As far as deciding how many students to test, the literature on usability testing concluded that as few as five subjects were enough to obtain the information desired. Much of the literature states that after five subjects, the information obtained becomes redundant. Despite this, we decided to test approximately fifteen students, so that we could draw very concrete conclusions.

We also decided to "beta" test four students to confirm that our methods were solid; our questions appropriate; and that the whole process ran smoothly. These four students were work-study students at the library. They were not recorded in the results.

3. What kind of questions would we ask?
We went through many versions of questions for the usability testing even before we beta tested. Upon beta testing, we refined the questions further. Our final document contained sixteen questions, which we believed would take the subjects between 45 minutes and an hour. The text in many of these questions prompted our subjects to think aloud throughout the testing. The following points reflect the reasons we performed usability testing:

  • How easy or hard the navigation was in the tutorials
  • How interactive students wanted / expected the tutorials to be
  • How graphics rich they wanted the tutorials to be
  • How multimedia rich they wanted the tutorials to be
Based on the above points, the questions we asked were:
      1. Where would you go from the Bobst Library home page to find the tutorial section of the Web site?
      2. What are your thoughts (please think aloud) on the home page to the Tutorials offered at Bobst Library?
      3. From the tutorial home page, please choose either "How to find a book" tutorial, or "How to find an article" tutorial. Please spend some time looking at the tutorial of your choice and let us know what your thoughts are on the tutorial.
      4. From the home page of the WWW tutorial, where would you go if you wanted to find out what a Meta search engine is?
      5. If you were writing an English paper and wanted to find out how to include a Web site you had used as a source in a bibliography, where would you go in this tutorial to find such information?
      6. When doing an online search, it is easy to get too many results, or "hits." In this tutorial, where would you go if you wanted to learn how to search more effectively, so that you wouldn't get so many results?
      7. From the page you are now on, please navigate to the home page of the WWW tutorial.
      8. Please tell us what you think of this Web site in terms of navigation and design and ease of use.
      9. Please navigate your way from the library's home page to the Boolean Tutorial. Spend some time looking at this tutorial. Please tell us your thoughts on this tutorial. What do you think of the design and navigation?
      10. Please view the Oasis site. Do you find the table of contents layout of this site clearer than the tutorial section of the Bobst Library site? Do you have any other thoughts on this site?
      11. Please look at the Actden tutorial. Can you tell us what you think of the look and feel of this site?
      12. What do you think of the quiz bytes? Do you think they would reinforce what you have learned in doing the tutorial?
      13. Can you tell us where you would go on this site to find out how to organize e-mail addresses?
      14. Of the tutorials you have looked at today, which most appeals to you?
      15. Of the tutorials you have looked at today, which do you think you would learn the most from?
      16. Do you have a favorite Web site? Could you please show us the Web site?

4. Would we only show NYU Libraries' pages and sites, or other institutions' as well?
While the existing online tutorials available through Bobst Library differ in their presentation (Boolean tutorial, How to find a book, etc.), we decided to present some other tutorials and arrangements of library instructional information to determine whether students preferred them. Not all of them were library sites and tutorials, and we made sure we showed online information that was quite diverse in presentation style and in look and feel.

5. How would we advertise?
We advertised within the library and rewarded our subjects for one hour of their time with a 20-dollar gift certificate to the book or computer store at NYU. We scheduled appointments with students by phone. We had determined times and dates for the testing in advance, all within a very concentrated block of time. We performed the usability testing over a short period, in part to ensure that we continued to hone the process, and in part for scheduling reasons.

Usability testing method conclusions
Though all the literature stated that after five testers the results would become redundant, we did not find this to be the case. There were some redundant results, but also some surprising differences.

What worked
Overall, the implementation and methods of testing were sound. We obtained valuable results and identified a number of improvements and changes to be made to the tutorials. The factors we felt worked well were:

Performing testing during a small concentration of time:
By scheduling students for testing over only a few days, we were able to observe differences between the students more easily and to create an efficient environment for the testing.

Performing beta testing:
This helped to determine which questions were working and which were not. Also, from the beta testing we decided to make printouts of several of the tutorials' home pages to show subjects toward the end of the testing, when we asked them which tutorial they liked best. The printouts helped to remind them which tutorials we had shown them.

Consistent testers:
By having the same three librarians perform the testing, we retained a consistency that would not have been possible with a wide variety of testers.

Generous monetary reward:
While 20 dollars for less than an hour of work seems generous, it served to make the students take the testing seriously, and it generated enough interest to recruit enough students in a short period of time.

Performing testing with over fifteen participants:
While the literature suggested this was overkill, we feel it was well worth the time to have that many testers. Each student offered unique thoughts.

Encouraging students to think aloud throughout testing:
This helped in that the students often said things that were not directly related to the questions we were asking, thereby giving us more information. Some students did not talk as much, so we periodically prompted them to tell us what they were thinking.

Informal discussion after scripted questions:
We discussed the Web sites on a more informal basis with the students and felt this was useful, if only because it relaxed the formality and loosened the students up, allowing them to say things they might not have said in the formal part of the process. We did not record these informal discussions in the results.

Informal discussion amongst librarians after each tester:
This was a valuable part of the testing, because it helped to clarify information that we might not have heard and recorded correctly; it was also useful in summarizing each tester's results.

Keeping the scope of the testing narrow:
Because we were asking the students to navigate from the home page to the instructional section of the site, it was tempting to discuss navigation not only in the tutorials but also in the whole Bobst Library site. We made concerted efforts to avoid broadening the scope of the testing. Had we included the whole site, the goals of the usability testing would have shifted, and the information obtained would not have been as focused.

What we'd do differently
Advertise outside of the library:
By advertising only inside the library, we recruited testers who tended to be somewhat familiar with the Bobst Library Web site.

Use a form to have students electronically schedule appointments:
This would cut down on the time it took to schedule.

Find a method to get a good cross section of NYU students:
While we were lucky in naturally getting a good cross section of undergraduate and graduate students, as well as international and national students, finding a way to ensure that the cross section is representative of the NYU student population is important.

Section Two

Section Two provides the conclusions drawn from the usability testing, including a checklist based both directly and indirectly on the testing, to be used before implementing future tutorials.

Usability Testing Results (general)

• Wording unclear: too much library lingo and obtuse language (e.g., citations, tutorials, Boolean)
• Too much text in some tutorials: some tutorials are too wordy, with text in large paragraphs not written in a Web-friendly format
• Users didn't like being locked in to tutorials (Boolean); they wanted a non-linear design structure
• Graphics distorted
• No motivation to go to the Tutorial section of the Web site (How to Use Bobst): make sure links to tutorials are placed in more relevant locations

Tutorial Checklist (consistent style)

• Make wording clear
• Write in a Web-friendly format: bullets, tips, chunking, etc.
• Use a consistent font
• State the objectives of the tutorial on its home page
• Group reference links together at the bottom of the document so as not to distract the user; put only relevant links in the body text
• Have links back to the home page visible and consistent on every page
• Include some auxiliary exercises, but make them optional
• Use navigational graphics that reflect the subject matter / are metaphorical
• Use a non-linear design structure
• Use an efficient hierarchy of information to minimize the number of steps for users
• Always include a link to the webmaster / designer so users can provide feedback
• Use a consistent pattern of modular units
• Use clean graphics
• Design for the "lowest common denominator": 640 x 480
• Provide a printer-friendly format for tutorials with a lot of content, so users can easily print it: 535 pixels in width to fit an 8.5 x 11 inch page
• Use a consistent approach to titles, subtitles, headings, and subheadings
• Publicize tutorials from sections of the Bobst Library site other than How to Use Bobst, and on Blackboard and other course Web sites
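The 535-pixel print width in the checklist can be sanity-checked with a quick calculation. Assuming the common screen convention of 72 pixels per inch and roughly half-inch left and right margins (both assumptions; the checklist does not state them):

```latex
\left(8.5\,\text{in} - 2 \times 0.5\,\text{in}\right) \times 72\,\tfrac{\text{px}}{\text{in}}
  = 7.5\,\text{in} \times 72\,\tfrac{\text{px}}{\text{in}}
  = 540\,\text{px}
```

The result, 540 px, is close to the 535 px target; keeping slightly under the computed width leaves a small margin of safety for printers that clip wider pages.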