This article was originally posted in the August 2004 issue (Vol 11, No. 1)


About the Author

Steven Streight spearheads Streight Site Systems, a web usability analysis, credibility assessment, and content writing provider. He also provides direct marketing support and advertising material. Steven has worked in database and direct response marketing since 1978. He may be contacted at astreight@msn.com or on his usability blog site.

Usability Interface

User Observation Testing: Forms and Procedures for an Information-Driven Web Site

by Steven Streight

The W.D. Boyce Council of the Boy Scouts of America was concerned about its staff Web site. Complaints from users indicated that the site might have usability problems. The W.D. Boyce web development committee decided to revise the design of the site, but wanted to identify usability deficiencies before the redesign.

As a new volunteer member of the committee, my responsibility consisted of advising the committee on usability and textual content issues. Here was my opportunity to make a significant contribution of expertise.

I suggested that my usability team observe typical users interacting with the Web site. Watching actual users attempt to accomplish tasks at the Web site would enable us to pinpoint specific usability problems. After I explained how a user observation test could be conducted, and what would be required of test subjects and testing facilities, the committee gave me the go-ahead.

www.wdboyce.org is an information-driven, text-dominant Web site with relatively complex information architecture. Vital information is often buried in non-obvious locations, requiring users to embark on "linking expeditions" (site-navigation-enabled information hunts) that are frustrating and time-consuming. Users are confronted with redundant links (differently labeled links that lead to the same content), pseudo-redundant links (similarly labeled links that lead to very different content), and faulty link nomenclature (link labels that are vague or inaccurate).

The organization wants its staff leaders and volunteers to use the site as their primary source of organization news, policy statements, activity updates, and downloadable forms, thereby reducing paper and postage costs for the organization.

W.D. Boyce web committee members understood that web text must be formatted differently from print media text. Readers of books, magazine articles, and personal letters, for example, tend to read in sequential order, from beginning to end. Some passages may be skipped over, but print reading is far more linear (a straight line from point A to point B and onward to point Z) than web viewing. It's a mistake to think that web users will interact with Web site text the same way they consume print media text.

The sheer abundance of online material, and the free and easy access to it, contributes to users racing through Web sites, even relevant ones, until they spot the exact material they want. Unfortunately, most of the text on the site was in a "print read" format.

To convert the print read text to "web scan" text, I proposed using the following techniques:

  • Shorter paragraphs
  • Copy chunking, with intrasite hypertext links
  • Bulleted and numbered lists replacing dense text blocks
  • More descriptive heads and subheads
  • Underlining of hypertext links
  • Selected and non-selected hypertext link color differentiation (blue for unselected, not yet visited links; purple for selected, already visited links)
  • Elimination of unnecessary articles and superfluous words
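
As a rough illustration of how such a conversion can be audited, the sketch below flags paragraphs that still read as dense "print read" blocks. The word-count threshold and the check itself are hypothetical, not part of our committee's actual process; Python is used purely for illustration.

    # Hypothetical "web scan" audit: flag dense paragraphs that should be
    # chunked or converted to lists. The 75-word threshold is illustrative.
    def audit_paragraphs(paragraphs):
        warnings = []
        for i, para in enumerate(paragraphs, start=1):
            count = len(para.split())
            if count > 75:
                warnings.append(f"Paragraph {i}: {count} words; consider "
                                "chunking, a bulleted list, or shorter sentences")
        return warnings

    sample = ["Short, scannable paragraph.", " ".join(["word"] * 120)]
    for warning in audit_paragraphs(sample):
        print(warning)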

The committee also agreed that a Site Index and some Multiple User-Segmented Site Maps (suggested paths through the site, based on user type) might be good additions. We suspected that users experience confusion upon arriving at the site: it is not clear where a specific type of user (leader, volunteer, donor, parent, or sponsor) should go, nor where certain items are found, e.g., application forms for an upcoming event.

However, even with these enhancements, the site still needed a complete overhaul. To get our bearings for constructing the new site, we conducted a User Observation Test on the current site. What follows are the forms and procedures we used.

Readers who are new to web usability analysis, or familiar with only certain aspects of it, may find that these materials clarify something previously mysterious.

The first thing we needed to do was recruit volunteers to be test subjects.

Computer Skills Level Telesurvey

We wanted two low-skill, two intermediate, and two high-skill computer users in the test. Prospective test subjects were phoned and qualified with a Computer Skills Level Telesurvey. There is no strict "If, then" methodology for this; it's more like "If, probably," as in: "If [such and such is true], probably [the user is at this skills level]."

The questions we used were:

  • What they generally used a computer for, with ten suggested uses, from email only (LOW) to professional IT occupational work (HIGH).
  • How they found Web sites to visit, from magazines with lists of hot sites (LOW) to multiple specialized search engines (HIGH).
  • How they primarily navigated a Web site, from navbars and main menu link listings (LOW) to site maps and advanced search boxes (HIGH).
  • How much time they spent on the computer, from less than one hour a day (LOW) to more than three hours a day (HIGH).
  • What their main use of a computer was, from pleasure or entertainment (LOW) to professional online forums and discussion lists (HIGH).
  • What skills level they perceived themselves to be at, from novice to computer expert.
  • What their favorite Web sites were, open ended, with AOL, Yahoo, eBay, and amazon.com ranked LOW and Slashdot, Wired News, and wikipedia ranked HIGH. (Note: this is no reflection on the quality or professionalism of the Web sites so tagged. Low-skills users tend to visit more popular, high-traffic sites, while high-skills users tend to visit more technical, specialized sites.)
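
Below is a minimal sketch of how the "If, probably" qualification might be tallied. We actually scored the calls by hand; the LOW/MID/HIGH point values and the cutoffs here are assumptions for illustration only.

    # Hypothetical tally for the Computer Skills Level Telesurvey.
    # Each answer is pre-coded LOW, MID, or HIGH by the interviewer;
    # point values and cutoffs are assumptions, not our actual rubric.
    POINTS = {"LOW": 0, "MID": 1, "HIGH": 2}

    def skills_level(answers):
        """Classify a respondent from seven coded answers."""
        score = sum(POINTS[a] for a in answers)   # 0 to 14 for seven answers
        if score <= 4:
            return "low"
        if score <= 9:
            return "intermediate"
        return "high"

    print(skills_level(["LOW", "MID", "MID", "HIGH", "MID", "MID", "LOW"]))
    # prints "intermediate" (score 6)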

Introductory Remarks to Test Subjects

Test subjects were thanked for coming. They were emphatically reassured that this evaluation was not to determine their intelligence or computer savvy, but to determine the efficiency and usability of the Web site. Gourmet deli cold cut sandwiches were available: easy to feast upon, not messy. Cans of cold soda were also provided.

Subjects were tested one at a time. Although the surveys were intended for post-test purposes, subjects filled out two of them while waiting to be tested, to decrease the amount of time they had to spend in this process. While the System Usability Scale Questionnaire had to be filled out after the usability test, whether the other surveys were filled out before or after the test was not a vital concern.

User Observation Test Procedures

Subjects were asked to sit in front of a computer monitor displaying the Home Page of the Web site being tested.

They were given a list of ten site task assignments. The test administrator read each assignment aloud before the subject performed the task, to make sure the subject understood what to do. When the administrator said, "Go," a stopwatch was started and the task performance was timed to the hundredth of a second.

Video recording of the testing is recommended, but was not available in this case. The administrator took notes on how the subject attempted to accomplish each task, what links were followed, what navigation tools were utilized, and what comments were made.
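
We used a physical stopwatch and handwritten notes, but a minimal logging script along these lines could stand in for both; the prompts and record fields below are invented for illustration.

    # Minimal sketch of a stopwatch-and-notes logger for observation
    # testing. Prompts and record fields are hypothetical.
    import time

    def run_task(description):
        input(f"TASK: {description}\nPress Enter as you say 'Go'...")
        start = time.perf_counter()
        input("Press Enter the moment the subject finishes or gives up...")
        elapsed = time.perf_counter() - start
        completed = input("Completed? (y/n): ").lower().startswith("y")
        notes = input("Observer notes (links followed, comments): ")
        return {"task": description,
                "seconds": round(elapsed, 2),   # hundredth-of-a-second precision
                "completed": completed,
                "notes": notes}

    print(run_task("Find the application form for an upcoming event"))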

When it was apparent that a subject was exasperated and would not be able to complete a task, the administrator said, "Give up? That's okay. This one's a tough one. Here's how you can find this." Test subjects were shown a site path leading to the information.

The administrator otherwise engaged in silent, non-invasive observation. No assistance was given while a task was underway, but, to keep the test from being emotionally cold and inhuman, friendly comments were made, such as: "It probably should have its own link on the Home Page" or "That's good thinking, but the site regretfully wasn't designed that way."

One test subject pulled the monitor closer to him because of his poor vision. Another was a bit awkward with the mouse: he kept clicking the right mouse button or pressing down on the mouse wheel, thereby activating unexpected functions. In this case, the administrator intervened, informing the subject of his operational errors.

Link Strategy Survey

This questionnaire was designed to take the place of "card sorting," in which the user arranges cards with link labels or page section titles printed on them, to convey the desired information architecture for a Web site.

Labels of top navbars, left column main menu links, and bottom of page text links are presented. Users are told, for example, "A top navbar is a horizontal navigation tool that runs across an upper region of the Home Page. Here are the links in that navbar. What information would you expect to be provided by these links?" Space is provided for them to describe what they think should be in those links.

We also listed links we were considering adding to the left column main menu. Test subjects were asked to check off those they agreed should be included. They were also asked whether they paid much attention to bottom-of-page text links, and whether they thought they would use a "Search This Site" search engine text entry box. Finally, we asked them, "For what purpose do you primarily use this Web site?"
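
Tallying which proposed links drew the most check-marks is straightforward. In the sketch below, the link labels and responses are invented examples, not our actual candidates or data.

    # Hypothetical tally of proposed main-menu links checked by subjects.
    from collections import Counter

    checked = [
        ["Forms", "Calendar", "Contact Us"],   # subject 1's check-marks
        ["Forms", "Calendar"],                 # subject 2
        ["Calendar", "Contact Us"],            # subject 3
    ]
    tally = Counter(label for subject in checked for label in subject)
    for label, votes in tally.most_common():
        print(f"{label}: {votes} of {len(checked)} subjects")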

Site Satisfaction Survey

This questionnaire, which determines if the Web site is meeting the organization's goals, was prefaced with, "Tell us what you really think, not what you assume we may be hoping to hear." Test subjects were asked:

  • Do you consider this Web site to be your #1 source of organization news, information, and forms?
  • Would you prefer another source for these things, and what would that be? (Email was mentioned by a test subject.)
  • Do you consider quick and easy access to organization news and information to be important?
  • Do you use your computer much, or visit other Web sites? Multiple choices ranging from "No...or I rarely visit _______ sites" to "Yes, online shopping" and "Yes, at my job I do computer work or visit Web sites."
  • Average time you spend on the computer: ______ hours per day.
  • What does this Web site do best? Not do a very good job at?
  • What could be done to improve this Web site?
  • Was anything missing from this Web site? Please specify.
  • Any comments you'd care to make about this Web site testing program?

System Usability Scale Questionnaire

The final form we asked test subjects to fill out is our modified version of the Digital Equipment Co. Ltd. System Usability Scale (SUS), which is in the public domain. We modified the scale to accommodate Web sites; in its original version, it is designed for assessing any software or hardware system. It can be seen at www.usabilitynet.org under Home > Tools & Methods (bottom text link) > Established Questionnaires > SUS.

Our modified version of this questionnaire, which is strongly recommended for usability tests of Web sites, is available via email to anyone who contacts me to request it.
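
For reference, standard SUS arithmetic converts ten 1-to-5 ratings into a 0-to-100 score: each odd-numbered (positively worded) item contributes its rating minus 1, each even-numbered (negatively worded) item contributes 5 minus its rating, and the sum is multiplied by 2.5. The sketch below assumes our modified questionnaire keeps that standard arithmetic.

    # Standard SUS scoring: ten items rated 1-5; odd items positively
    # worded, even items negatively worded. Yields a 0-100 score.
    def sus_score(responses):
        """responses: ten ratings for items 1-10, in order, each 1-5."""
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS needs ten ratings between 1 and 5")
        total = sum((r - 1) if i % 2 == 0 else (5 - r)   # i=0 is item 1 (odd)
                    for i, r in enumerate(responses))
        return total * 2.5

    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))   # prints 85.0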

Conclusion

Now the hard part begins: compiling the results into a concise report. Provide the organization with a brief explanation of the test methodology, a summary of the test results, and a prioritized list of recommended actions based on those results.
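
A per-task summary of completion rates and mean times is one concrete way to present the results. The sketch below uses invented task names and timings purely to show the shape of such a summary.

    # Hypothetical summary of task results across subjects: completion
    # rate and mean time per task. All data values are invented.
    from statistics import mean

    results = {
        "Find the event application form": [(142.37, True), (201.05, False),
                                            (98.12, True)],
        "Locate the council policy page":  [(45.80, True), (60.22, True),
                                            (51.07, True)],
    }

    for task, runs in results.items():
        times = [t for t, done in runs if done]
        rate = sum(done for _, done in runs) / len(runs)
        avg = f"{mean(times):.2f}s" if times else "n/a"
        print(f"{task}: {rate:.0%} completed, mean time {avg}")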

Let the organization know that no Web site is perfect. New web norms are being established at high-traffic sites, current users are improving their skills and expecting advanced functionality, and new users are continually coming online.

All these facts point to the reality that both the World Wide Web and the collective pool of users are constantly changing. Thus, usability evaluation should be a periodic, ongoing process, not a one time event.
