Getting More Value from the LibQUAL+® Survey: The Merits of Qualitative Analysis and Importance-Satisfaction Matrices in Assessing Library Patron Comments

Brian Detlor and Kathryn Ball

Brian Detlor is Associate Professor in the DeGroote School of Business at McMaster University; e-mail: detlorb@mcmaster.ca. Kathryn Ball is Director, Assessment and Accountability, in Mills Memorial Library at McMaster University; e-mail: katball@mcmaster.ca. ©2015 Brian Detlor and Kathryn Ball, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC. doi:10.5860/crl.76.6.796

This paper examines the merit of conducting a qualitative analysis of LibQUAL+® survey comments as a means of leveraging quantitative LibQUAL+ results, and of using importance-satisfaction matrices to present and assess qualitative findings. Comments collected from the authors' institution's LibQUAL+ survey were analyzed using a codebook based on theoretical insights into customer satisfaction with library features. Qualitative findings extended the quantitative results and yielded key recommendations that were new or unclear from the quantitative results alone. Importance-satisfaction matrices were beneficial in pinpointing primary and secondary opportunities for improvement, areas to place continued emphasis, and areas where expectations were exceeded.

One of the best-known assessment tools used by academic libraries to measure service quality is the LibQUAL+ standardized test instrument administered by the Association of Research Libraries (ARL). The test polls library users, via the convenience of a Web survey, about the services a library provides. One of the advantages of the instrument is the ability for a library to compare its own aggregate scores with those from prior years, as well as with scores from peer libraries. Another is the relative ease with which the survey is administered. An individual library provides its users with a URL to the survey; from there, survey results are collected and stored in a centralized ARL LibQUAL+ database and then analyzed and presented back to the library in individualized reports describing its users' desired, perceived, and minimum expectations of service.1

The LibQUAL+ scale has been shown to be a reliable and valid survey instrument.2 To date, it has been used by more than 1,200 institutions across five continents.3 However, LibQUAL+ is not without its critics.
Some researchers find that LibQUAL+ respondents do not fully understand the three service levels asked about in the survey (that is, minimum, desired, and perceived).4 Some researchers point out that gap scores calculated on these service-level scores are moving targets, since respondents' expectations change with more experience or new developments.5 Further, some researchers conclude that those in libraries appointed to analyze LibQUAL+ data tend to find the use of multiple gap scores in the LibQUAL+ instrument confusing and not inherently insightful when trying to use these scores to plan or take appropriate actions to address library service issues.6 Others agree and caution about the difficulty of interpreting quantitative measurements alone when assessing library user satisfaction, recommending that libraries consider their own particular situations and contexts when doing so.7

In response, some libraries turn to analyzing the qualitative comments made by LibQUAL+ respondents as a means of getting more value out of the survey data and gaining greater insight into the perceptions library patrons have of their library's services and operations. Note that the LibQUAL+ survey allows respondents to answer one open-ended question: "Please enter any comments about library service in the box below." Respondents are free to discuss any topic of their choice; often they discuss items of importance to them. Approximately 40 percent of respondents provide textual comments.8 Many libraries find these comments helpful, since they provide a level of insight not captured by the closed-ended Likert-scaled questions in the LibQUAL+ survey.

Though several researchers have written about the analysis of quantitative LibQUAL+ data, very few have written about the analysis of qualitative LibQUAL+ data. More research on the utility of conducting a qualitative analysis of LibQUAL+ survey data is needed.9 Given the additional insight an understanding of LibQUAL+ comments could potentially provide library decision-makers, libraries would benefit from practical instructions on how to analyze comments from the open-ended question found in the LibQUAL+ survey. One potential avenue worthy of exploration is the utility of customer importance-satisfaction matrices to help decipher qualitative comments.

Importance-satisfaction matrices (such as importance-satisfaction charts or service attribute matrices) are popular information-dissemination tools used by marketing researchers to highlight customer perceptions of the importance of, and satisfaction with, an organization's services.10 These matrices are known to be low-cost, easily understood techniques that can yield insight into areas where organizations should devote more attention (in other words, areas of high importance where customers are the least satisfied).11 Specifically, these matrices are useful in pinpointing primary and secondary opportunities for improvement, areas to place continued emphasis, and areas where expectations are exceeded.12

It is important to recognize that, by recommending the use of importance-satisfaction matrices, the authors of this paper are not advocating the abandonment of a detailed review of individual comments. Indeed, it is only after a careful reading and analysis of the comments made by survey respondents that the importance-satisfaction matrices can be created.
Given this, several questions arise about conducting a qualitative analysis of LibQUAL+ data. How should libraries go about analyzing LibQUAL+ comments in a systematic, objective way that provides library decision-makers with valid information for decision-making purposes? What additional insight, if any, does a qualitative analysis of LibQUAL+ data provide over a quantitative analysis? Given that qualitative analysis tends to be a more time-consuming and energy-expending exercise than quantitative analysis procedures, is the extra effort of analyzing LibQUAL+ comments worth the insight gained? Finally, to what extent are importance-satisfaction matrices useful in assessing and deciphering qualitative LibQUAL+ comments as a means of identifying key priority areas of library service and operations that need to be addressed?

To find answers to these questions, the researchers conducted a qualitative analysis of their own university's most recently administered LibQUAL+ survey and presented their findings to senior library management using importance-satisfaction matrices. Senior library administrators at McMaster University supported the study, as they felt a qualitative analysis of LibQUAL+ comments would help the library identify ways to offer better service to its constituents in the future. It was hoped that analysis of the qualitative comments via importance-satisfaction matrices would lead to a better understanding of top priorities and yield better recommendations on improvements to library services in ways that more closely matched library patron needs, expectations, and priorities.

Background

Qualitative Analysis of LibQUAL+ Data

Few studies discuss the qualitative analysis of LibQUAL+ data.13 Of the ones reported in the literature, two stand out in terms of the amount of detailed description provided on how to conduct a qualitative analysis of LibQUAL+ comments. The first is a study by Bradford and Bower.14 These authors provide an excellent overview of qualitative analysis techniques and specifically explain how content analysis software can be used to analyze LibQUAL+ survey data.15 The second is a study by Begay et al.16 These authors describe a grounded-theory approach to coding and analyzing LibQUAL+ comments made by library patrons. Similar to the first study, these authors used text analysis software to help code library patron comments into categories and to explore relationships and possible associations between the coded comments.17

Both studies are similar in that the authors illustrate the importance of methodically analyzing library survey comments using well-known structured qualitative analysis techniques. For example, the two studies describe techniques, such as open and axial coding, commonly used by qualitative researchers.18 With open coding, the researcher breaks down, examines, compares, conceptualizes, and categorizes data. With axial coding, relationships between categories and subcategories of coded comments are established and used to merge categories back together. The result is a final "codebook" comprising a hierarchy of categories and a data set of qualitative comments "coded" to categories in the codebook. The two studies are also similar in their explanation of how to display and report coded comments to library administrators and work colleagues.
Both describe the use of "frequency lists" to display a rank order of the most commonly coded categories, and the merit of using frequency lists to communicate the relative importance of items on the list to one another. The underlying premise is that the greater the frequency of any particular category on the list, the more likely that category is of heightened concern to library patrons.

Two items from these studies are of particular relevance to this paper. The first is the structure of Bradford and Bower's frequency list. They devised their category-naming structure to embed attributed meaning. For example, by coding the category pertaining to collections as "collection-negative" and "collection-positive," the list provides a mechanism by which to easily discern how many comments about the library collection were positive or negative in nature when these categories were displayed. The second was how Begay et al. organized categories in their list according to demographic group (such as "undergraduate," "graduate," "faculty," "staff") so that comparisons could easily be made in terms of the frequency of coding categories by user group. Both of these points were taken into consideration for the current study. What was insightful was the need to present coded library comments in a way that showcased both the importance of coding categories along demographic lines and the extent to which coded comments were positive or negative in tone. This led to exploration of importance-satisfaction matrices as a potential mechanism to present LibQUAL+ qualitative survey findings.

Importance-Satisfaction Matrices

Importance-satisfaction matrices are a popular information-dissemination tool used by marketing researchers across a variety of industries to highlight customer perceptions of the importance of, and satisfaction with, an organization's services.19 With the help of these matrices, customer feedback can be organized into five areas:

1. primary areas of improvement (high importance/low satisfaction);
2. secondary areas of improvement (low importance/low satisfaction);
3. areas of continued emphasis (high importance/high satisfaction);
4. areas of exceeding expectations (low importance/high satisfaction); and
5. zones of indifference (moderate importance/moderate satisfaction).

Figure 1 illustrates the various areas and zones of an importance-satisfaction matrix.

[Figure 1: The Importance-Satisfaction Matrix]

Marketing researchers suggest that organizations focus attention on items in the "primary area of improvement" quadrant. Underperformance on activities identified in this area has the potential to alienate customers. Organizations should consider this area to be the "trouble zone," where immediate attention by management is required. Items that fall in the "secondary area of improvement" quadrant are not priorities requiring immediate attention. Customers view items in this area to be of lesser concern. Marketing researchers suggest that management should not devote too much energy to enhancing the performance of items in this area, as the return on investment is lower. Items found in the "area of continued emphasis" are considered to be part of an organization's success.
Marketing researchers suggest that organizations continue to invest in maintaining this high level of performance. Overall, customers think that the organization is currently doing a good job servicing items in this area and that these items are important. Customers are likely to think that continued (or increased) emphasis on items in this area is required. Items in the "area of exceeding expectations" are activities where customers are satisfied with how the items are serviced but do not consider these items as important as others. Items in this area positively affect the customer experience; however, their absence may not necessarily harm it. Marketing researchers suggest maintaining these items if they are low cost or can be indirectly tied to revenue. Conversely, organizations should consider reducing service for these items as a means of improving the overall profitability of the organization. Items in this area can be differentiators that help an organization distinguish itself from its competitors. Items in the "zone of indifference" should be monitored. Marketing researchers suggest that no immediate action is required to enhance service, as customers view items in this area as relatively important and performing moderately well.20

Methodology

The LibQUAL+ Lite survey administered in spring 2013 by McMaster University Library served as the study's data set. The survey was conducted between March 4 and March 29, 2013. Approximately 3,000 undergraduate students, 1,000 graduate students, and 600 faculty members were invited to complete the survey. In total, 620 valid surveys were received, compared to 473 completed surveys in 2010, when the survey was last run at McMaster. In terms of representativeness by user group, undergraduate students were underrepresented, while graduate student and faculty responses were overrepresented. However, this pattern is typical of other LibQUAL+ surveys administered at McMaster over the last several years. Overall, a fair representation across various academic disciplines and programs occurred. Of the 620 surveys, 275 (or 44.4%) contained qualitative comments in response to the one open-ended question in the LibQUAL+ survey.

The Coding Process

Prior to any coding of the data, a conceptual framework was developed to help structure the coding process and establish coding rigor. A conceptual framework was considered necessary to help set the boundaries of investigation and guide the coding process. To devise the conceptual framework, the researchers conducted a literature review of studies concerning performance measurement and assessment of library services. The researchers wanted to ensure that the conceptual framework was based on prior empirical work and contained as comprehensive a list of relevant theoretical constructs as possible. In the end, theoretical insights from Oliva and Moroni on customer satisfaction with library features informed the development of the study's conceptual framework.21 Oliva and Moroni identify six library features (opening times, spaces, staff, collections, services, and communication) and provide a list of 14 library services that can be assessed in terms of satisfaction with respect to the six library features identified.

The conceptual framework was useful in developing an initial codebook by which to categorize the qualitative data. One researcher (a librarian) went through all comments and coded them according to the guidelines of the codebook.
Questions/concerns were discussed with the other researcher (a faculty member), and changes to the codebook were made accordingly (new categories were identified, some categories were merged, others were deleted). The faculty researcher conducted a second round of coding, double-checking the validity and suitability of the coding made in the first round. Modifications were discussed between the two researchers, resulting in further changes to the codebook and the coding of data. In this way, the initial codebook evolved and dynamically changed over iterative rounds of data coding using open and axial coding techniques.

The final version of the codebook comprised three main divisions: 1) library features, 2) user satisfaction, and 3) user demographics (see table 1). In terms of library features, six discrete types were identified: 1) collections; 2) communication; 3) hours of operation; 4) personnel; 5) services; and 6) space. The majority of these library features were then further categorized in the codebook as comprising both an attribute and a type. Attributes pertained to quality characteristics of the library feature, while types pertained to specific instantiations of the library feature in question. For example, with respect to collections, attributes comprised categories such as "quantity," "ease of use," "usefulness," and "accessibility," while types pertained to categories such as "e-book," "catalogue," "website," and "interlibrary loan."

TABLE 1: Codebook

Features (the characteristics of the library as a whole; refers to any service or resource offered by the library), each described by attributes (characteristics of the library feature) and types (instantiations of the library feature):

Collections (refers to information items the library holds and/or provides access to)
• Attributes: Access (how easy it is to access the collection); Ease of Use (how easy it is to use the collection); Quality (the quality of the collection available); Quantity (the amount of material in the collection available); Usefulness (how useful the collection is)
• Types: Electronic Collections (Article Databases, Digital Collections, E-books, E-journals, E-maps & GIS Data, Institutional Repository, Theses); Physical Collections (Archives & Research Collections, Hardcopy Books, Hardcopy Journals, Theses)

Communication (refers to the various media tools the library uses to communicate and correspond with its users)
• Attributes: Access (how easy it is to access the media tool); Ease of Use (how easy it is to use the media tool; user-friendliness of the interface of the media tool); Quality (the quality of the information provided by the media tool); Quantity (the amount of information provided by the media tool); Usefulness (how useful the media tool is)
• Types: Bulletin Boards; E-mail; Exhibits; Newsletter; Social Media (such as Twitter, Facebook); Website

Hours of Operation (the hours the library is open)
Personnel (includes all library personnel)
• Attributes: Attitude (the personality, disposition, demeanor, friendliness, etc. of library personnel); Availability (how easy it is to access library personnel to get their help); Helpfulness (the ability of library personnel to resolve a user's problem); Skills (the competency of library personnel to do their work)
• Types: Librarians; Library Administrators/Senior Managers; Library Staff

Services (refers to the various services the library provides its customers)
• Attributes: Access (how easy it is to access the service); Ease of Use (how easy it is to use the service); Quality (the quality of the service); Quantity (the amount of service available); Usefulness (how useful the service is)
• Types: Circulation Services (Course Reserves, Interlibrary Loan); Finding Tools, that is, tools that allow users to find items housed in the library's collections (Catalogue, Databases, Subject Guides); Information Literacy Instruction; Information Technology Services (Computer Workstations, Electrical Outlets [electrical plugs], Internet, Laptop Lending Services); Photocopy and Printing Services; Reference Services (Virtual Reference, Face-to-Face Reference/Reference Desk); Room Booking Services

Spaces (the physical spaces in the library)
• Attributes: Access (access to the space); Amount (amount of space available); Cleanliness (how clean or untidy the space is); Food and Drink (food and drink in the library space); Noise (quietness/loudness of the space); Size (how small or big the space is)
• Types: Group Study Space; Individual Study Space; General Library Space (refers to the spaces in the library as a whole); Specific Library Space (refers to specific spaces in the library: Cafe Space, Elevators, Media Space, Prayer Space, Stacks, Washrooms)

Satisfaction (how satisfied a user is with a particular library feature)
• Possible Values: Unsatisfied (not satisfied; dissatisfied; has problems with); Satisfied (okay with; not displeased, but also not highly pleased; neutral); Highly Satisfied (very pleased)

Demographics (personal information about the user)
• Discipline (the user's academic discipline or primary area of affiliation): Arts & Science; Business; Engineering; Humanities; Health Sciences; Science; Social Sciences
• User Group (the academic cohort to which the user belongs): Faculty; Graduate Students; Undergraduate Students

In terms of user satisfaction, the codebook facilitated the recording of three levels of library patron satisfaction ("unsatisfied," "satisfied/neutral," "highly satisfied") with each comment made. Satisfaction levels were determined by the extent to which a comment was negative or positive in its tone and message. With respect to demographics, two sets of characteristics were incorporated. The first was "user group" (faculty, graduate students, and undergraduate students); the second was "discipline" (for example, business, engineering/computer science, humanities, science).
Other demographic descriptions were considered (such as age, sex, library building most frequently visited); however, these were deemed unnecessary, as they could be interpreted from the user group and discipline categories already established.

All 275 qualitative comments were coded using the codebook outlined in table 1. Each comment was coded with a library feature type, one or more library feature attributes, and a satisfaction score. Most comments contained multiple statements expressing library patron perceptions of library features. This yielded more than 700 coded qualitative statements (724 statements when coded by user group and 736 when coded by discipline).

Content Analysis Software

To facilitate the coding and analysis of comments, a content analysis software package, QSR NVivo, was used. Coding with NVivo occurred in two ways. Initially, NVivo was used to "auto-code" comments based on user demographics (that is, by user group and discipline). This saved a substantial amount of work and yielded immediate results. From there, the researchers used NVivo to manually code the comments. Analysis of the comments was facilitated by NVivo via the use of query matrices. Query matrices were generated and run for all user group, discipline, and satisfaction combinations, yielding 66 queries. Analysis of query results involved counting the frequency of coded comments for each query, as well as methodically examining query results and making constant comparisons. A Microsoft Word workbook was used to facilitate this process, ensuring that all possible queries were run, results were recorded, and interpretations were documented.

Use of Importance-Satisfaction Matrices

Once the data was coded and analyzed, the next step involved plotting the results onto importance-satisfaction matrices. An importance-satisfaction matrix was created for each significant category in the codebook (collections, communication, hours of operation, personnel, services, and spaces). Plotting the comments onto importance-satisfaction matrices involved scoring comments on satisfaction and importance scales.

Satisfaction was scored on a 0 to 100 percent scale. Satisfaction values were calculated by summing the number of comments in a category for a particular user group or discipline that were coded as "satisfied" or "highly satisfied" and converting that number to a percentage of the total number of satisfaction comments coded in that category for that particular user group or discipline (that is, the total number of "unsatisfied," "satisfied," or "highly satisfied" comments for that user group or discipline).

Importance was scored on a 0 to 40 percent scale. Since no category comprised more than 40 percent of the total number of comments coded for any particular user group or discipline, 40 percent was selected as the upper bound. Importance values were calculated by taking the number of comments coded for a category for a particular user group or discipline and converting that number to a percentage of the total number of comments received for that user group or discipline. Calculation of importance scores in this manner was based on the premise that the greater the number of comments made by a user group or discipline on a particular category, the higher the importance of that category to that user group or discipline.
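To make the scoring and quadrant assignment concrete, the following Python sketch computes importance and satisfaction values from a set of coded statements following the premises just described. It is a minimal illustration, not the study's actual NVivo and Microsoft Word workflow: the coded tuples are hypothetical, the quadrant midpoints (half of each axis) are assumptions of the sketch, and the central "zone of indifference" is omitted for brevity.

```python
# A minimal sketch of the scoring described above; all data is hypothetical.
coded = [  # (user_group, feature_category, satisfaction_level)
    ("faculty", "collections", "unsatisfied"),
    ("faculty", "collections", "unsatisfied"),
    ("faculty", "personnel", "highly satisfied"),
    ("faculty", "services", "satisfied"),
    ("faculty", "hours of operation", "unsatisfied"),
    ("faculty", "communication", "satisfied"),
]

def scores(coded, group, category):
    """Return (importance %, satisfaction %) for one group/category pair."""
    group_comments = [c for c in coded if c[0] == group]
    cat_comments = [c for c in group_comments if c[1] == category]
    # Importance: the category's share of all comments from this group
    # (more comments on a topic are read as higher importance).
    importance = 100.0 * len(cat_comments) / len(group_comments)
    # Satisfaction: the share of the category's comments coded as
    # "satisfied" or "highly satisfied".
    positive = sum(1 for c in cat_comments
                   if c[2] in ("satisfied", "highly satisfied"))
    satisfaction = 100.0 * positive / len(cat_comments)
    return importance, satisfaction

def quadrant(importance, satisfaction, imp_mid=20.0, sat_mid=50.0):
    """Assign a point to a matrix area. The midpoints (half of the 0-40%
    importance scale and the 0-100% satisfaction scale) are assumptions of
    this sketch; the central zone of indifference is omitted for brevity."""
    if satisfaction < sat_mid:
        return ("primary area of improvement" if importance >= imp_mid
                else "secondary area of improvement")
    return ("area of continued emphasis" if importance >= imp_mid
            else "area of exceeding expectations")

imp, sat = scores(coded, "faculty", "collections")
print(f"faculty/collections: importance {imp:.0f}%, satisfaction {sat:.0f}% "
      f"-> {quadrant(imp, sat)}")
# prints: faculty/collections: importance 33%, satisfaction 0%
#         -> primary area of improvement
```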
Findings

Since the purpose of this paper is not to communicate the results of the qualitative analysis of the McMaster LibQUAL+ survey per se, but rather to communicate the merit of conducting a qualitative analysis and of using importance-satisfaction matrices to assess library patron comments, only a subset of high-level findings from the McMaster case study are reported here to showcase the extent to which a qualitative analysis of LibQUAL+ comments and importance-satisfaction matrices are useful.

Faculty valued library collections the most, as the majority of comments made by this group were about the collections. Most comments about the collections concerned access and quantity. Faculty voiced displeasure with finding and keeping track of library information resources. Faculty placed a high value on, and appreciated, librarians and library staff. In terms of library spaces, many comments reflected a view that the library should be a place to work, not socialize. In general, faculty were not happy with the recent trend of increasing group work areas with relaxed food and drink policies at the expense of individual quiet study areas.

Graduate students also valued library collections the most, as comments made by this group were predominantly about the collections. Like faculty, most of their comments about the collections concerned access and quantity. Graduate students found the library website cumbersome to use. Graduate students valued library personnel. With respect to library spaces, about twice as many negative as positive comments were received; areas of concern were too much noise and a lack of individual study space. Positive comments reflected the value that graduate students place on the library as a place to work.

Undergraduate students valued library services and spaces the most. With respect to library services, many comments were about needed improvements in the physical services the library provides: wireless Internet connection, study space, electrical outlets, computers, and printing. Main concerns were about the amount of study space, in particular individual study space (more is needed), and noise issues (quiet areas are not quiet; they should be monitored or redesigned). Undergraduates stated that existing spaces need to be updated and refurbished with new furniture, more comfortable chairs, and improved lighting. Positive comments reflected the value that undergraduate students place on the library as a place to work. Overall, not many comments were made by this user group about the collections. Undergraduates expressed concern over accessing and finding library resources and, out of frustration, tended to turn to Google Scholar instead. Undergraduates were less appreciative of library personnel.

The above findings were plotted on an importance-satisfaction matrix (see figure 2). Plotting the results onto an importance-satisfaction matrix led to the identification of key recommendations to management.

First, items in the upper left-most quadrant were identified as needing immediate attention (these were collections for faculty and graduate students, and spaces for undergraduates). As described above, activities identified in this area have the potential to alienate customers.
Based on this, the following two recommendations were suggested:

• Recommendation #1: Upgrade the library collections for faculty and graduate students in terms of improvements in: 1) access to the collections and 2) the quantity and quality of the collections. Improvements in access could be facilitated through the development of more intuitive and easy-to-use interfaces to the library catalogue and website. Improvements in quantity and quality could be accomplished by purchasing additional backfiles to provide library patrons with more comprehensive and perpetual access to scholarly library materials.

• Recommendation #2: Address undergraduate concerns with physical library services and the physical library space. Improving wireless Internet access within the library, as well as remote access from home, will go a long way in satisfying this user population. Meeting demands for more electrical outlets, better computers, and faster and cheaper printing will also increase the general satisfaction of this user group.

[Figure 2: Sample Results Displayed on an Importance-Satisfaction Matrix]

Second, items that fell in the "secondary area of improvement" quadrant were identified as nonpriority items that did not require immediate attention. As described above, customers view items in this area to be of lesser concern, and management should not devote too much energy to enhancing the performance of items in this area, as the return on investment is lower. Based on this, the following recommendation was made:

• Recommendation #3: Take care when making improvements to the library website design so that only "primary area of improvement" revisions are made. The library website was identified as falling within the "secondary area of improvement" quadrant. As such, it would probably be wise and more cost-effective to improve the library website in ways that only help address concerns identified in the "primary areas of improvement" quadrant, rather than making other kinds of modifications to the website that yield low investment returns. Specifically, any enhancements to the library website that help improve access to the collections (such as improved searching and browsing functionality, or increased access to e-journals, e-books, and digital collections from the library website) should be made, as these would likely yield the greatest return on investment. Other modifications may not be worth the extra effort and expense.

Third, items found in the "area of continued emphasis" were considered to be part of an organization's success. As described above, marketing researchers suggest that organizations continue to invest in maintaining this high level of performance. Overall, customers think the organization is doing a good job servicing items in this area and that these items are important, and they are likely to think that continued (or increased) emphasis on items in this area is required. Based on this, the following recommendation was suggested:

• Recommendation #4: Continue investing in library personnel, especially in ways that faculty and graduate students want. For example, continue recent efforts to hire adequate numbers of highly skilled librarians and other professional staff to support faculty and graduate students' research and teaching needs.
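A chart in the style of figure 2 is straightforward to generate once the scores are computed. The following Python/matplotlib sketch shows one way to plot group/category scores on an importance-satisfaction matrix; the coordinates are hypothetical placeholders chosen only to illustrate the layout (they are not the study's actual values), and the axis midpoints used as quadrant boundaries are assumptions of the sketch.

```python
import matplotlib.pyplot as plt

# Hypothetical (importance %, satisfaction %) points, for illustration only.
points = {
    "Collections (faculty)": (30, 25),        # primary area of improvement
    "Spaces (undergraduates)": (28, 35),      # primary area of improvement
    "Website (graduate students)": (10, 30),  # secondary area of improvement
    "Personnel (faculty)": (25, 80),          # area of continued emphasis
}

fig, ax = plt.subplots()
for label, (imp, sat) in points.items():
    ax.scatter(sat, imp)
    ax.annotate(label, (sat, imp), textcoords="offset points", xytext=(5, 5))

# Quadrant boundaries at the midpoints of the 0-100% satisfaction axis and
# the 0-40% importance axis used in the study (an assumption of this sketch).
ax.axvline(50, linestyle="--")
ax.axhline(20, linestyle="--")
ax.set_xlim(0, 100)
ax.set_ylim(0, 40)
ax.set_xlabel("Satisfaction (%)")
ax.set_ylabel("Importance (%)")
ax.set_title("Importance-Satisfaction Matrix")
plt.show()
```

With satisfaction on the horizontal axis and importance on the vertical axis, the upper left-most quadrant holds the primary areas of improvement, matching the orientation described in the findings above.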
Discussion

The outset of this paper called for answers to several questions concerning a qualitative analysis of LibQUAL+ data. The first question asked how libraries should go about analyzing LibQUAL+ comments in a systematic, objective way that provides library decision-makers with valid information for decision-making purposes. A review of the literature, as well as this study, speaks to the utility of using structured qualitative analysis techniques such as open and axial coding, using content analysis software, and taking advantage of frequency lists of attributed coding categories along demographic lines to disseminate qualitative results to library decision-makers. In addition, this study provided evidence of the usefulness of a conceptual framework based on prior theory to structure an initial codebook design.

The second question asked what additional insight a qualitative analysis of the LibQUAL+ data provides over a quantitative analysis. Prior literature, and this study too, suggest that content analysis of LibQUAL+ comments offers a more in-depth understanding of service quality as perceived by library patrons than can be obtained and communicated from LibQUAL+ quantitative data alone. For example, at McMaster, though quantitative findings from the LibQUAL+ survey did indicate that improvements were needed in library spaces for undergraduates, the qualitative results provided much more contextual information in terms of specifics that could be addressed (such as providing better wireless Internet access in the library, more electrical outlets, and improved individual study space). The sample of McMaster findings presented in this paper showcases the greater detail and insight a qualitative analysis offers. Qualitative findings extended the quantitative results and yielded key recommendations that were new or unclear from the quantitative results alone.

The third question asked whether the extra effort needed to conduct a qualitative analysis of LibQUAL+ comments was worth the insight gained. Though any answer to this question is subjective, based on the researchers' experience at McMaster, the answer is a resounding "yes." The qualitative analysis at McMaster yielded detailed insights along demographic lines that library managers could sink their teeth into. As a result of this qualitative study, senior library management expressed that they had a better understanding of library patron perceptions and priorities that could be put into action, more so than the quantitative LibQUAL+ results alone provided.

The last question asked to what extent importance-satisfaction matrices were useful in assessing and deciphering qualitative LibQUAL+ comments. Overall, importance-satisfaction matrices were found to be beneficial in pinpointing primary and secondary opportunities for improvement, areas to place continued emphasis, and areas where expectations were exceeded. Specifically, the matrices were helpful in identifying services where the level of satisfaction was relatively low and the perceived importance of the service was relatively high.

Of interest, the reported viability of importance-satisfaction matrices in this study is largely due to the fact that importance and satisfaction scores were derived from a qualitative analysis of textual comments, rather than calculated quantitatively from Likert-scaled scores.
The researchers suggest that, when asked a single open-ended question inviting any comment about an organization's services (as is done in the LibQUAL+ survey), users tend to comment on service items that are truly important to them and to provide contextual information that expresses their true level of satisfaction with those services. This is probably a more effective strategy than presenting a list of services to customers and asking them to rate the importance of, and satisfaction with, those services quantitatively on Likert-scaled questions, since customers may tend to rate all attributes as highly important when explicitly asked. For example, asking customers whether a certain product or service being "green" (that is to say, environmentally friendly) is important to their satisfaction with that product or service may yield an affirmative response, when in actuality the degree to which a product or service is green may play little part in how satisfied they actually are with it.

It should be noted that many of the above findings are supported by recent results reported in the library and information science literature. For example, recent Ithaka Faculty Survey results also highlight the importance of library collections and access to research materials to faculty members.22 Further, studies using anthropological and ethnographic research methods to understand the research practices of undergraduate and graduate students also report that undergraduate students value "physical services" (such as strong wireless Internet signals and electrical outlets) and are generally unaware of the services and expertise of library staff available to assist them, and that graduate students have a strong desire for separate, quiet study space.23

Conclusion

This paper examined the merit of conducting a qualitative analysis of LibQUAL+ survey data as a means of leveraging quantitative survey results, and of using importance-satisfaction matrices to present LibQUAL+ qualitative findings. To facilitate this, textual comments collected from the authors' home institution's most recent LibQUAL+ survey were analyzed and compared to the survey's quantitative findings. The researchers found that the qualitative analysis extended the survey's results in meaningful ways. Of importance, the qualitative analysis led to the identification of key recommendations that were either new or not obviously clear from the quantitative results alone. The researchers conclude that the qualitative analysis, though costly in terms of the extra time and energy required, does indeed add needed value. Overall, importance-satisfaction matrices were found to be beneficial in communicating areas of library service and operations that need attention. Library managers at McMaster were pleased with and supportive of the study's methodology and findings.

Recall that, by recommending the use of importance-satisfaction matrices, the authors of this paper are not advocating the abandonment of a detailed review of individual comments. Indeed, it is only after a careful reading and analysis of the comments made by survey respondents that the importance-satisfaction matrices can be created.
In fact, the authors feel the matrices could be useful in distinguishing which subset of comments by LibQUAL+ survey respondents is worth further investigation and/or more careful rereading and attention. It is the authors' opinion that the matrices help prioritize which comments should be taken more seriously and prevent those who are reading the comments from being distracted or overly persuaded by less important comments that happen to command attention simply because they are lengthy, well articulated, or eloquently written.

It is important to note that the study offers both theoretical and practical contributions. In terms of theory, the conceptual framework and final version of the codebook are useful in furthering collective knowledge of library features (attributes and types) to consider when assessing a library's services. With respect to practice, the utility of conducting a qualitative analysis of LibQUAL+ comments and using importance-satisfaction matrices to disseminate results and generate recommendations to management is a value-add.

In conclusion, this paper is beneficial to those who are responsible for the analysis of qualitative LibQUAL+ library patron comments in their libraries. The paper proposes a structured and rigorous method by which to present a qualitative analysis of LibQUAL+ comments in a visual manner that library managers may find appealing and useful in terms of helping them make informed decisions concerning improvements to library operations and services.

Notes

1. Ben Hunter and Robert Perret, "Can Money Buy Happiness? A Statistical Analysis of Predictors for User Satisfaction," Journal of Academic Librarianship 37, no. 5 (2011): 402–08.

2. Martha Kyrillidou, Toni Olshen, Fred Heath, Claude Bonnelly, and Jean-Pierre Cote, "Cross-Cultural Implementation of LibQUAL+™: The French Language Experience," 5th Northumbria International Conference Proceedings, Durham, U.K. (2003); Miguel Morales, Riadh Ladhari, Javier Reynoso, Rosario Toro, and Cesar Sepulveda, "An Independent Assessment of the Unidimensionality, Reliability, Validity and Factor Structure of the LibQUAL+™ Scale," Service Industries Journal 32, no. 16 (2012): 2585–605; Bruce Thompson, Colleen Cook, and Martha Kyrillidou, "Concurrent Validity of LibQUAL+™ Scores: What Do LibQUAL+™ Scores Measure?" Journal of Academic Librarianship 31, no. 6 (2005): 517–52; Bruce Thompson, Colleen Cook, and Russel L. Thompson, "Reliability and Structure of LibQUAL+™ Scores: Measuring Perceived Library Service Quality," portal: Libraries and the Academy 2, no. 1 (2002): 3–12.

3. Association of Research Libraries, "General Information – LibQUAL+," available online at www.libqual.org/about/about_lq/general_info [accessed 2 April 2014].

4. Bruce Thompson, Colleen Cook, and Fred Heath, "The LibQUAL+ Gap Measurement Model: The Bad, the Ugly, and the Good of Gap Measurement," Performance Measurement and Metrics 1, no. 3 (2000): 165–78.

5. Michael J. Roszkowski, John S. Baky, and David B. Jones, "So Which Score on the LibQUAL+ Tells Me if Library Users Are Satisfied?" Library & Information Science Research 27, no. 4 (2005): 424–39.

6. Tim Bower and Dennis Bradford, "How to Get More from Your Quantitative LibQUAL+™ Dataset: Making Results Practical," Performance Measurement and Metrics 8, no. 2 (2007): 110–26.

7. Hunter and Perret, "Can Money Buy Happiness?" 407.
8. David Green and Martha Kyrillidou, "LibQUAL+ Procedures Manual," available online at www.libqual.org/documents/libqual/publications/2011_proceduresmanual.pdf [accessed 28 April 2014].

9. Wendy Begay, Daniel R. Lee, Jim Martin, and Michael Ray, "Quantifying Qualitative Data: Using LibQUAL+™ Comments for Library-Wide Planning Activities at the University of Arizona," Journal of Library Administration 40, no. 3/4 (2004): 111–19; W. Dennis Bradford and Tim Bower, "Using Content Analysis Software to Analyze Survey Comments," portal: Libraries and the Academy 8, no. 4 (2008): 423–37.

10. Ernest Azzopardi and Robert Nash, "A Critical Evaluation of Importance Performance Analysis," Tourism Management 35 (2013): 222–33.

11. John Martilla and John James, "Importance-Performance Analysis," Journal of Marketing 41, no. 1 (1977): 77–79.

12. Richard L. Oliver, Satisfaction: A Behavioral Perspective on the Consumer (Boston, Mass.: Irwin McGraw-Hill, 1997).

13. Bradford and Bower, "Using Content Analysis Software," 424.

14. Bradford and Bower, "Using Content Analysis Software," 423–27.

15. Ibid.

16. Begay et al., "Quantifying Qualitative Data," 111–19.

17. Ibid.

18. Anselm L. Strauss and Juliet M. Corbin, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, 2nd ed. (Thousand Oaks, Calif.: Sage Publications, 1998).

19. Karl Albrecht and Lawrence J. Bradford, The Service Advantage: How to Identify and Fulfill Customer Needs (Homewood, Ill.: Dow Jones Irwin, 1990).

20. Barbara A. Almanza, William Jaffe, and Lingchun Lin, "Use of the Service Attribute Matrix to Measure Customer Satisfaction," Hospitality Research Journal 17, no. 2 (1994): 63–75; Ahmet Aktas, A. Akin Aksu, and Beykan Cizel, "Destination Choice: An Important-Satisfaction Analysis," Quality & Quantity 41 (2007): 265–73.

21. Laura Oliva, "La rilevazione della customer satisfaction in biblioteca: il caso di Milano-Bicocca" [Measuring customer satisfaction in the library: the case of Milano-Bicocca] (PhD diss., 2013), available online at http://hdl.handle.net/10760/18631 [accessed 15 February 2014]; Ilaria Moroni, "User Satisfaction Surveys in Two Italian University Libraries: Model, Results and Good Practices," 5th International Conference on Qualitative and Quantitative Methods in Libraries, June 4–7, 2013, La Sapienza University, Rome, Italy [accessed 2 April 2014].

22. R. Housewright, R.C. Schonfeld, and K. Wulfson, Ithaka S+R Faculty Survey 2012 (2013), available online at www.sr.ithaka.org/research-publications/us-faculty-survey-2012 [accessed 6 October 2014].

23. S.G. Gibbons, "Techniques to Understand the Changing Needs of Library Users," IFLA Journal 39, no. 2 (2013): 162–67; E. Yoo-Lee, T.H. Lee, and L. Velez, "Planning Library Spaces and Services for Millennials: An Evidence-Based Approach," Library Management 34, no. 6/7 (2013): 498–511.