Is digital better than analog? Considerations for online card sort studies

Emily Ford

Emily Ford is urban and public affairs librarian at Portland State University's Branford P. Millar Library, e-mail: forder@pdx.edu. © 2013 Emily Ford

Digital card sorting poses opportunities for libraries to easily engage in usability research. In this article I discuss my experience conducting a Web-based card sorting activity, and outline the advantages and disadvantages for libraries adopting and using Web-based card sorting tools, in an attempt to answer the question: Is digital card sorting better than analog?

Libraries and librarians are no strangers to usability, and certainly not to low-cost usability testing. One such method is card sorting. In a card sort study, subjects arrange a list of items or concepts (usually written on index cards) into logical organizational categories. Researchers typically encourage participants to "talk aloud" during the study to hear their thought process. The end result is a rich set of data about how users organize information and how they think while organizing it. Card sorting has proven to be an effective and cheap1 protocol to inform libraries about the information architecture of their research guides, Web sites, and other resources.

Academic libraries have adopted card sort analyses in a variety of ways. These analyses help to direct library Web site organization and labeling in such a way as to match users' mental models. Card sorting is useful during Web site redesign projects,2 helps libraries assess the visual consistency of Web sites,3 provides librarians with insight into user understandings of resource categorization and organization,4 and informs libraries of student mental models of research guides.5

For the most part, libraries seem to be performing card sorts the analog way: using paper and pens. But as Internet technologies and the related user-experience market grow, so too have product availability and the flexibility to cheaply conduct digital card sorting activities. A plethora of digital card sorting tools exist, yet academic libraries have not embraced them. Digital card sorting activities may be conducted using a Web service or computer software. Articles that discuss the use of online card sorting services6 do not examine digital card sorting methods any further.

One case of online card sorting

Last summer, I collaborated with a colleague at Oregon Health & Science University (OHSU) to create a LibGuide geared toward Oregon Masters of Public Health (OMPH) students and faculty. The OMPH is a joint degree program between three institutions: Portland State University (PSU), OHSU, and Oregon State University (OSU). Students of the OMPH program may take classes at any of the three institutions, but most of the cross-enrollment occurs between PSU and OHSU, due to their proximity in Portland, Oregon. OSU is 90 miles south in Corvallis, Oregon. As librarians supporting students who may enroll in classes at any of the collaborating institutions, we identified a need for a library research guide to clarify the varying access issues and policies encountered by students at each library, and to showcase the differing and rich resources available to students. Because our users must navigate more than one library Web site, we felt it necessary to ensure that the guide would best match their mental models in terms of content organization and labeling.
We decided that a card sort activity would enable us to create a LibGuide better suited to OMPH student and faculty needs. Because the OMPH is a professional graduate program, many, if not most, of its students are nontraditional, returning, part-time students with full-time jobs, families, and generally busy lives. Classes in the program are offered mainly in the evenings and may also be taught by adjunct faculty members. Based on our experience offering targeted research workshops to OMPH students, which resulted in poor turnout, we knew that recruiting students to participate in an in-person card sorting activity would be challenging. Since OMPH students had previously expressed to the librarians a need to receive library services and instruction at their points of need and convenience, a remote and asynchronous card sorting tool would best match our user group's needs and ability to participate. As a result, we pursued online card sorting services to conduct the card sort study.

We chose the OptimalSort7 tool, a product of the online usability software suite Optimal Workshop.8 Using the free version of OptimalSort, we were able to conduct a small card sorting activity comprising 30 cards, with the capability for up to ten individuals to participate. (Using more cards or having more participants requires a subscription to the service.)

Setting up the card sort was a cinch. All we had to do was create an account with OptimalSort, answer a series of short questions about the sort, and enter our cards. We were even able to set up the study to record demographic information about users, such as their main university affiliation and status. What's more, the free service allowed us to capture qualitative data using open-ended survey questions in combination with the card sort activity.

Four days after we announced the study on the OMPH electronic list, the card sort automatically closed. Thirteen users started the card sort and ten completed it. Participants from each institution affiliated with the OMPH program were represented, as were both faculty and students. As we hoped, we received a mix of feedback regarding guide organization and label-naming conventions (the sort was structured as an open card sort, allowing users to name their own organizational categories).

When it came time to analyze the card sort data, the tools provided by OptimalSort made the job relatively painless. OptimalSort provided easy-to-crunch output data without requiring us to use expensive and robust statistical analysis software. Available downloads included spreadsheets loaded with data and preprogrammed macros that enabled us to easily and quickly normalize and standardize data. Furthermore, the site generated dendrograms, a similarity matrix, and participant-centric analysis, which are all visual tools that helped us analyze and understand the results.9
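For readers curious about what this analysis involves under the hood, the sketch below shows how two of those outputs, a card-by-card similarity matrix and a dendrogram, might be computed from raw open card sort results. It is only an illustration: the file name, the (participant, card, category) column layout, and the average-linkage clustering choice are my assumptions for demonstration purposes, not OptimalSort's actual export format or algorithm.

```python
# A sketch of the kind of analysis a card sorting tool automates:
# building a card-by-card similarity matrix from raw open card sort
# results and clustering it into a dendrogram. The file name, column
# layout, and clustering method here are illustrative assumptions.

import csv
from collections import defaultdict

import matplotlib.pyplot as plt
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import squareform

# Load raw results: one row per card placement by one participant.
placements = defaultdict(dict)  # participant -> {card: category}
with open("cardsort_results.csv", newline="") as f:
    for row in csv.DictReader(f):  # columns: participant,card,category
        placements[row["participant"]][row["card"]] = row["category"]

cards = sorted({card for sort in placements.values() for card in sort})
n = len(cards)

# Similarity = fraction of participants who placed each pair of cards
# in the same category. This is the matrix a similarity grid shows.
sim = np.zeros((n, n))
for sort in placements.values():
    for i, a in enumerate(cards):
        for j, b in enumerate(cards):
            if sort.get(a) is not None and sort.get(a) == sort.get(b):
                sim[i, j] += 1
sim /= len(placements)

# Convert similarity to distance, cluster, and draw the dendrogram.
condensed = squareform(1 - sim, checks=False)
tree = linkage(condensed, method="average")  # average-linkage clustering
dendrogram(tree, labels=cards, orientation="right")
plt.tight_layout()
plt.savefig("dendrogram.png")
```

Read this way, cards that merge low in the dendrogram were grouped together by most participants and are strong candidates to share a page or heading in the finished guide.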
Based on the data we collected using the online card sorting tool, we were able to create a LibGuide that directly addressed OMPH student and faculty needs as they navigate multiple libraries for their learning and teaching, respectively.

Online services

Plenty of online services exist. The following are the two mentioned in the LIS literature.

• OptimalSort.com provides free card sorting activities with a limit of ten participants and a limit of 30 cards. Beyond the free service, Optimal Workshop offers monthly and annual subscriptions to its usability tools and services. The company offers discounts on subscription services to nonprofit and educational institutions.

• WebSort.net offers a free version of card sorting for studies of up to ten participants. Its pricing model offers services in packs of studies, rather than by a monthly or yearly subscription. It offers discounts to educational institutions and nonprofit organizations.

Advantages and disadvantages

Although OptimalSort's free account worked well for our purposes, using an online tool for a card sorting activity has both advantages and disadvantages. Libraries considering card sorting methodologies for usability testing should weigh whether analog or digital methods will best suit their needs.

First, using a digital tool cuts down on the amount of time and energy needed to analyze data. Kari D. Weaver and Kimberly Babcock Mashek10 acknowledge the time intensity that analog card sorting demands of researchers, and others have noted that online card sorting tools dramatically reduce time spent crunching data.11 The task of entering each card sort participant's data by hand is painstaking. Imagine entering by hand just ten users' organization and labeling. (As a comparator, our raw data from ten participants created a spreadsheet with 300 rows.)

Arguably, the data received from an online card sort also benefits from users being comfortable in the study environment.12 Participants most likely use their own computers in the comfort of their own homes or work environments rather than working with index cards in an unfamiliar room with strangers watching. In this way, online card sort activities can alleviate anxieties users experience. Moreover, the remote and asynchronous nature of online card sorting studies enables librarians to target a wider user base than face-to-face studies. For librarians supporting part-time, returning, and/or distance students, online card sorting allows for feedback from users who are unable to visit the library to participate in a card sort study, or who could not otherwise be consulted face-to-face.

Despite the time saved in analyzing data and the ability to reach a wider user base, online card sorting does pose some disadvantages. Perhaps the largest is the loss of qualitative data. In analog card sorting activities, researchers ask participants to "think aloud." Thinking aloud gives researchers great insight into possible instances of conflict with organization, confusion with card labeling, and more. For our card sort, we realized we would lose this rich "think aloud" data, but we felt that we could still gather useful information without it.

Moreover, larger card sort studies (those with more than 30 cards) may become unwieldy online. In analog card sorts, participants may find it easier to conduct the activity by spreading out cards on a large table. Online, a long list of cards may force a user to scroll excessively in a browser to view all of the cards in the activity.

Another disadvantage of online card sorting may be attributed to service or software functionality. Some services and software may truncate card labels at a certain number of characters. As a result, a researcher may make suboptimal card labeling choices based on a service's limitations, rather than on what he or she wants a label to accurately represent.
The service or software may also not allow users to discard cards that they feel are not useful or do not make sense to include in their organization. This is a real loss, because discarded cards can provide valuable information to researchers about the content and organization of the study. Finally, free online services may restrict the number of participants, which may fall below the numbers needed to get statistically significant results. This disadvantage, however, is easily overcome either by purchasing software or by subscribing to a Web service that allows for unlimited test subjects. In the end, librarians should weigh the advantages and disadvantages of analog and digital card sorting methods to determine which method will best meet their needs.

Conclusion

Online card sorting activities can provide valuable information to librarians building smaller-scale information portals, such as research guides. This usability technique also enables distance learners to participate in the improvement of library Web sites and tools by providing valuable input on the information architecture of library-created objects. Libraries and librarians will need to suss out for themselves whether online card sorting techniques will meet the needs of their library, project, and patrons. For us, digital was better than analog. OptimalSort was the right tool at the right time and at the right price.

Notes

1. Erica Reynolds, "The Secret to Patron-centered Web Design: Cheap, Easy, and Powerful Usability Techniques," Computers in Libraries (June 2008).

2. Nina McHale, "Toward a User-Centered Library Home Page," Journal of Web Librarianship 2 (2/3): 139–76; Dominique Turnbow, Kris Kasianovitz, Lise Snyder, David Gilbert, and David Yamamoto, "Usability Testing for Web Redesign: A UCLA Case Study," OCLC Systems & Services 21 (3): 226–34; Laura Robbins, "What a User Wants: Redesigning a Library's Web Site Based on a Card-Sort Analysis," Journal of Web Librarianship 1 (4): 3–27; Richard Rogers and Hugh Preston, "Usability Analysis for Redesign of a Caribbean Academic Library Web Site: A Case Study," OCLC Systems & Services 25 (3): 200–211; Krystal M. Lewis and Peter Hepburn, "Open Card Sorting and Factor Analysis: A Usability Case Study," The Electronic Library 28 (3): 401–16.

3. Thea van der Geest and Nicole Loorbach, "Testing the Visual Consistency of Web Sites," Technical Communication 52 (1): 27–36.

4. Philip Hider, "Library Resource Categories and Their Possible Grouping," Australian Academic & Research Libraries 40 (2): 105–15.

5. Caroline Sinkinson, Stephanie Alexander, Alison Hicks, and Meredith Kahn, "Guiding Design: Exposing Librarian and Student Mental Models of Research Guides," portal: Libraries and the Academy 12 (1): 63–84.

6. Hider, Australian Academic & Research Libraries, 105–15.

7. OptimalSort, www.optimalsort.com.

8. Optimal Workshop, www.optimalworkshop.com.

9. Screenshots of the dendrograms, similarity matrix, and participant-centric analysis from the project are available at http://dr.archives.pdx.edu/xmlui/handle/psu/9246.

10. Kari D. Weaver and Kimberly Babcock Mashek, "Creating Usability Tests That Work for Your Web Site and Other Web Applications," in Brick and Click Libraries Symposium, 51–57.

11. Cassi Pretlow, "10 Web Tools to Create User-Friendly Sites," Computers in Libraries 28 (6): 14–17.

12. A. M. Wichansky, "Usability Testing in 2000 and Beyond," Ergonomics 43 (7): 998–1006.