Experiencing Evidence-Based Library and Information Practice (EBLIP): Academic Librarians’ Perspective

Lili Luo*

This study investigates practitioners’ involvement in Evidence-Based Library and Information Practice (EBLIP) at an academic library. Through focus group interviews, the study reveals that most evidence-based decisions in academic library practice are of the “Know-what (works)” type and serve an “instrumental” purpose, seeking to determine what actions will lead to desired outcomes in addressing a specific problem. Practitioners use a wide range of evidence sources to support their decision making. The challenges they encounter in EBLIP relate to time, mentoring/training, availability and accessibility of evidence, organizational culture, and personality. Study findings will help increase the awareness of evidence-based practice in academic libraries, deepen the professional understanding of EBLIP, enrich the literature on the topic, and identify important issues pertinent to EBLIP for further exploration.

Introduction

Evidence-Based Library and Information Practice (EBLIP) refers to the practice that “promotes the collection, interpretation and integration of valid, important and applicable user-reported, librarian-observed, and research-derived evidence.”1 EBLIP allows librarians to apply the best available evidence, moderated by user needs and preferences, to improve the quality of professional judgment. It is a movement to make library practice more research-based, and it requires a paradigm shift in the profession. EBLIP signifies the incorporation of research as a means to improve the quality of librarians’ day-to-day decision making.2

After reaching out to library directors and other thought leaders nationwide, Library Journal identified 11 essential skills that librarians are expected to master in the next 20 years.3 One such skill is the ability to determine the data needed to make decisions; to collect, analyze, and gain insight from that data; and to present the accompanying narrative that explains it to others. This skill indicates the necessity and importance of librarians engaging in EBLIP, because data is a crucial type of evidence that can be applied to improve library practice.

* Lili Luo is Associate Professor in the School of Information at San Jose State University; e-mail: lili.luo@sjsu.edu. The author would like to extend her gratitude to the Statewide California Electronic Library Consortium (SCELC) for supporting this study, and to Kristine Brancolini, Marie Kennedy, and Susan Gardner for their generous support in the study process. ©2018 Lili Luo, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC. doi:10.5860/crl.79.4.554

This article presents a study that investigates EBLIP in an academic library setting, seeking to answer the following research questions: 1) what types of decisions are being supported by evidence; 2) how evidence is used in supporting decision making; and 3) what the challenges are in the EBLIP process.
Focus group interviews were conducted to provide an in-depth examination of how academic librarians and library staff employ the best available evidence to arrive at sound decisions about practical problems. Through this exploration of their engagement in EBLIP, we hope to promote the awareness of EBLIP in academic libraries and ultimately contribute to preparing librarians and library staff for successful EBLIP.

Literature Review

The evidence-based practice movement originated in medicine in the early 1990s. Evidence-based medicine (EBM) focuses on merging what is learned from the literature with what is observed in daily practice, all to produce a better-informed outcome for patients.4 Over the years, this evidence-based approach has been applied to other fields, including librarianship. EBLIP involves “methods for resolving daily problems in the profession through the integration of experience and research. It involves asking questions, finding information to answer them (or conducting one’s own research) and applying that knowledge to our practice.”5

Booth and Brice6 established that the EBLIP process contains five steps: 1) define the problem or formulate the question; 2) find the evidence; 3) critically appraise the evidence; 4) apply the appraised evidence to the problem; and 5) quality assurance—evaluate the plan. The process is also referred to as the 5A model because it goes through the five stages of Ask, Acquire, Appraise, Apply, and Assess. Later, Booth7 refined this model and proposed a more corporate and collaborative 5A model of Articulate (the problem), Assemble (the evidence base), Assess (the evidence), Agree (the actions), and Adapt (the implementation), to better accommodate the complexity of library problems and the iterative nature of the EBLIP process.

As the key component of EBLIP, evidence has been the focus of the literature. Drawing upon EBM traditions, Eldredge8 proposed a nine-level hierarchy of evidence in EBLIP: 1) systematic reviews of multiple rigorous research studies; 2) systematic reviews of multiple, but less rigorous, research studies, such as case studies and qualitative methods; 3) randomized controlled trials; 4) controlled-comparison studies; 5) cohort studies; 6) descriptive surveys; 7) case studies; 8) decision analysis; and 9) qualitative research. This hierarchy only includes research evidence and is arranged in descending order of the evidence’s methodological soundness. Crumley and Koufogiannakis9 challenged the evidence hierarchy by stating that librarianship, as a profession, “tends to reflect more qualitative, social sciences/humanities in its research methods and study types which tend to be less rigorous and more prone to bias.” They argued that randomized controlled trials were minimal in library research and should not be placed as top-level evidence. In a later study, Koufogiannakis10 further discovered that librarians’ conceptualization of evidence was more inclusive than the research literature alone. Her findings indicated that librarians used both hard evidence and soft evidence in EBLIP. Hard evidence consists of published literature, statistics, local research and evaluation, nonscholarly publications, and facts. Soft evidence comprises input from colleagues, tacit knowledge, feedback from users, and anecdotal evidence. Lewis11 supported the idea of using anecdotes as evidence in EBLIP, but she cautioned that anecdotal evidence should not be used in isolation as the basis for major changes.
She explained that anecdotes can, however, be used to inform further investigation. Gillespie et al.12 examined EBLIP among Australian librarians and reached conclusions similar to Koufogiannakis,13 revealing that librarians perceived evidence to be more encompassing and to include a wide variety of sources.

Crumley and Koufogiannakis14 identified six domains of library practice where EBLIP can be applied: reference/inquiries, education, collections, management, information access and retrieval, and marketing/promotion. To effectively implement EBLIP in these domains, they suggested that librarians keep online lists of questions that have already been studied and those that need to be investigated. Then, librarians should explore which study designs and methods (such as case study, experimental design, or survey) best answer the questions from a particular domain, and compile appropriate resources and search terms that match each of the six domains to facilitate the process of searching and locating evidence in the literature.15 They also highlighted the importance of educating new librarians to take an evidence-based approach to their profession, teaching them research skills and cultivating an appreciation of research among them.

It is well acknowledged that research constitutes an essential source of evidence in EBLIP. However, there have been complaints that the research evidence base is not large enough and is often of poor quality.16 The library profession has been criticized for being overly focused on practice and lacking research-mindedness. One study analyzed the contents of 1,880 articles in library and information science journals and found that only 16 percent “qualified as research.”17 The gap between research and practice is so concerning that the 2016 Annual Conference of the Association for Library and Information Science Education dedicated its prestigious President’s Panel to this issue.18 Researchers have increasingly called for more endeavors to explore how to gradually bridge the research-practice gap in librarianship.19

Another related barrier to EBLIP is that librarians lack competencies for critically appraising research evidence and applying it to practice.20 The main cause of this barrier is inadequate research education and training for librarians. This reinforces the suggestion from Crumley and Koufogiannakis21 that library schools must play a major role in equipping future librarians with a solid understanding of what research means and entails, and with mastery of the skills and knowledge necessary to design, conduct, and disseminate quality research. Meanwhile, continuing education is also pivotal to strengthening librarians’ research competencies and capability in conducting EBLIP.
The Institute for Research Design in Librarianship (IRDL), a federally funded program that provides research methods training for academic librarians, witnessed significant improvement in librarians’ research confidence and skills after they completed the training program.22

Other obstacles to EBLIP include negative organizational dynamics, such as poor leadership or an organizational culture that does not value evidence; time constraints; negative personal outlooks such as self-doubt or a fear of asking for help; and lack of access to needed evidence.23 Identification of these obstacles could help librarians understand how to make progress toward being more evidence based.24

It is worth noting that evidence-based practice is widely applied in a variety of disciplines, including audiology, speech-language pathology, dentistry, nursing, psychology, social work, and education. It has also moved increasingly to the center of political and policy debates. Tseng25 explained that “much of federal focus on building and using research evidence has embraced a What Works agenda.” There have been various initiatives to use research evidence of what works in administering federal programs, and executive departments and agencies are encouraged to apply behavioral science insights in their work. Advocates have suggested legislative language for defining “evidence-based” in the reauthorization of the Elementary and Secondary Education Act. The Evidence-Based Policymaking Commission Act has identified ways that data and research can improve public policy. Given the growing trend of evidence-based practice, library practitioners need to more actively promote and engage in EBLIP. This study builds on the existing EBLIP literature and explores practitioners’ perception of and involvement in EBLIP in an academic library setting.

Study Procedures

Focus group interviews were conducted among library practitioners at a mid-sized university library in California. The library serves a population of 561 full-time faculty and 8,187 students and employs 26 full-time librarians with MLIS degrees and 22 nondegreed library staff members.

A focus group generally involves 6 to 12 individuals discussing a particular topic under the direction of a moderator who promotes interaction and ensures that the discussion remains on the topic of interest.26 The basic purpose of the focused interview is to gather qualitative data from individuals about their experience of or attitude toward some particular concrete situation, which serves as the focus of the interview.27 Focus groups are commonly used for research that is exploratory, clinical, or phenomenological.28 In this study, practitioners’ perceptions of and involvement in EBLIP were the “particular concrete situation” that required investigation. The study was relatively singular in focus and exploratory in nature; thus, the focus group interview was determined to be a proper instrument for data collection.

With IRB approval, three focus groups were conducted with three different populations: library managers (7 participants), librarians (8 participants), and nondegreed library staff (7 participants). Each focus group lasted about 90 minutes, and participants received a $50 Amazon.com gift card for their participation.

The validity of focus group interviews is usually affected by the extent to which participants feel comfortable openly communicating their ideas, views, or opinions.
Stewart and Shamdasani29 summarized the variables that influence group dynamics into three broad categories: individual differences, interpersonal factors, and environmental factors. In this study, we made conscious efforts to minimize these influences. Each focus group contained only participants of the same rank, which enabled them to speak freely without being concerned about contradicting others. Since the participants already had an existing relationship, they were able to accommodate individual differences and interact with each other in a friendly and respectful manner. Meanwhile, the topic under study was not a sensitive issue that could have provoked strong emotional responses, which helped contribute to a positive group dynamic. All participants were encouraged to partake in the discussion, and the moderator directed the conversation to avoid group conformity, allowing the more vocal participants to fully express themselves but not dominate the discussion, and inviting the more reticent ones to share their input without feeling pressured or excluded. All of the focus group interviews took place in a secure, well-lit conference room in the library with which all participants were familiar. Pizza and beverages were served to ensure a comfortable environment for the group discussion.

The focus group interview guide was developed based on the research questions, focusing on three main areas of inquiry: types of decisions supported by evidence, the way evidence is used in supporting decision making, and challenges in the EBLIP process. All three focus groups were audio-recorded and transcribed. When coding the transcripts, a combination of deductive coding and inductive coding was applied. For deductive coding, a coding scheme (shown in table 1) was developed based on two studies: the summary of the use of research evidence in education by Maciolek30 and the investigation of librarians’ definition and use of evidence in practice by Koufogiannakis.31 Inductive coding followed a three-step process:32 open coding for initial classification and labeling of codes, axial coding to identify the core concepts, and selective coding to determine the relationships between codes and uncover the central themes. Two coders coded the transcripts, resolving their conflicts in the coding process and arriving at 100 percent agreement.
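Although the coding itself was interpretive work, the bookkeeping behind it is easy to automate. The minimal Python sketch below, which uses invented excerpt labels rather than the study’s actual transcripts, shows how two coders’ labels might be checked for simple percent agreement and how reconciled codes could be tallied into shares like those reported in the Results section.

```python
from collections import Counter

# Hypothetical coded excerpts: excerpt id -> code assigned by each coder.
# The code labels come from the deductive scheme in table 1 (decision types).
coder_a = {1: "Know-what", 2: "Know-about", 3: "Know-what", 4: "Know-how", 5: "Know-what"}
coder_b = {1: "Know-what", 2: "Know-about", 3: "Know-what", 4: "Know-who", 5: "Know-what"}

# Percent agreement: share of excerpts both coders labeled identically.
# (In the study, disagreements were discussed until fully resolved.)
shared = coder_a.keys() & coder_b.keys()
agreement = sum(coder_a[i] == coder_b[i] for i in shared) / len(shared)
print(f"Intercoder agreement: {agreement:.0%}")

# After reconciliation, tally code frequencies to report shares such as
# "60% of examples were Know-what (works)". Here coder_a stands in for
# the reconciled code assignments.
final_codes = Counter(coder_a.values())
total = sum(final_codes.values())
for code, n in final_codes.most_common():
    print(f"{code}: {n}/{total} ({n / total:.0%})")
```

Percent agreement is the simplest intercoder check; chance-corrected statistics such as Cohen’s kappa are more conservative alternatives.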
Results

The study intended to answer three research questions: 1) what types of decisions are being supported by evidence; 2) how evidence is used in supporting decision making; and 3) what the challenges are in the EBLIP process. It should be noted that each of the three focus groups involved a distinct population. Library managers and librarians had more decision-making power than nondegreed library staff. Thus, in the focus group discussions, managers and librarians primarily shared their own involvement in evidence-based decision making, while staff reflected more on their role in assisting their supervisors in making decisions.

TABLE 1. Coding Scheme for Analyzing Data from the Focus Group Interviews

Types of decisions supported by evidence:
- Know-why: to understand why a certain action is required
- Know-what (works): to determine what actions will lead to desired outcomes at acceptable costs and without unwanted consequences
- Know-who (to involve): to identify the stakeholders that need to be involved for potential actions
- Know-about (problems): to understand the nature, history, and characteristics of existing problems/phenomena/situations in context
- Know-how (to put into practice): to investigate how to perform an action or implement a solution effectively

Purposes of using evidence to support decision making:
- Instrumental use: evidence is used to directly influence a specific decision, or a solution to a specific problem
- Strategic or tactical use: evidence is used as an instrument of persuasion to support or challenge an existing position
- Imposed use: evidence is used as a requirement imposed by others, such as a funding agency
- Conceptual use: evidence is used to impact the knowledge, understanding, and attitudes of practitioners and decision makers

Sources of evidence:
- Hard evidence: tangible evidence, including published literature, existing statistics, local research findings, and nonscholarly publications
- Soft evidence: intangible evidence, including human knowledge/input and anecdotes

Types of Decisions Supported by Evidence

The coding scheme derived from Maciolek33 was applied to define the types of decisions supported by evidence. The majority of evidence-based decisions (60% of the examples shared in the focus groups) fall under the category “Know-what (works).” Such decisions seek to determine what actions will lead to desired outcomes at acceptable costs and without unwanted consequences. As shown below, participants used evidence to help them decide what would work so that they could take proper actions to address a particular issue.

“Recently we subscribed to a streaming video website, where we were allowed to choose 150 titles that we wanted to put on the streaming site. And I wasn’t sure how to go about that. So, I searched our reserves catalog. I took like three years’ worth of reserves and found which films were put on reserve most often. And then, I did a circulation search to see what films, you know, circulated most often. And that’s how I made my decisions for the 150 films.”
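The kind of analysis behind this decision, ranking titles by how often they were placed on reserve and how often they circulated, can be sketched briefly. In the Python snippet below, the titles, the counts, and the double weighting of reserves are all invented assumptions; a real version would read exported reserves and circulation data rather than hard-coded values.

```python
from collections import Counter

# Hypothetical exports: how often each film appeared on course reserve
# over three years, and how often it circulated in the same period.
reserves = Counter({"Film A": 12, "Film B": 9, "Film C": 2, "Film D": 7})
circulation = Counter({"Film A": 30, "Film B": 5, "Film C": 40, "Film D": 8})

# Score each title by combining both signals; the weighting is a judgment
# call -- here reserves count double since they reflect course demand.
def score(title: str) -> int:
    return 2 * reserves[title] + circulation[title]

# Rank every title seen in either export and keep the top N
# (150 in the study; 3 here for the toy data).
titles = set(reserves) | set(circulation)
ranked = sorted(titles, key=score, reverse=True)
N = 3
print(ranked[:N])
```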
The second most popular type of evidence-based decision (30% of the examples shared in the focus groups) can be characterized as “Know-about (problems),” where evidence is sought to help participants understand the nature, history, and characteristics of an existing problem/phenomenon/situation in context. The quote below demonstrates how a participant used evidence to help him conceptualize digital scholarship and its related services.

“Over the past few years I’ve been sort of identifying what it means to do digital scholarship here. I began by looking at other institutions similar to us. How I selected the institutions mainly had to do with their mission and the type of school they were. We’re bigger than a typical liberal arts school. However, our mission is very similar to other liberal arts schools in that we emphasize teaching. So, I looked at liberal arts schools which often have like 1,500 students. I looked at the services they are offering and the way that they frame them. Beyond that is reading literature. There are a lot of different types of literature out there. Digital scholarship is so new that there’s not necessarily a lot of rich peer-reviewed information out there. So you kind of have to look at websites and blogs and that kind of thing. But I did find certain things that had been written, not necessarily about libraries, but digital scholarship in general. So, for example, there’s a chapter in a book that I read that was specifically about digital communities within liberal arts institutions. It wasn’t library-centric but it was liberal arts-centric and that was very helpful.”

Two other types of evidence-based decisions were also mentioned, but only minimally.

- Know-how (to put into practice). Evidence is gathered to investigate how to perform an action or implement a solution effectively. As shown in this quote, “So I’ve looked at literature to see how the scope of the institutional repository could be expanded because getting faculty to provide their articles is difficult. But then I read about the different ways in which an institutional repository can be used to build an archive for different sorts of things, whether it’s student works or student journals, or research data sets. So, I’ve tried to work towards that,” the librarian consulted evidence to determine how to expand the institutional repository.

- Know-who (to involve). The decision is to identify the stakeholders that need to be involved for potential actions, as portrayed by this quote: “This is a silly one, but for me it’s kind of a big deal. I can’t show up for this one class where I check on students’ ability to use a certain digital tool, and I don’t have people I can rely on to do that for me. Or so I thought. Then I realize I do and it’s one of our librarian residents and I was like, why don’t I just send her, which sounds really simple except that I needed to make sure she understands what she’s talking to the students about and everything. So, I took her with me today to a class where she saw what I was doing and simultaneously, I actually want her to learn this tool. For me, the problem-solving aspect was obviously thinking of somebody who could take the place, but also in the long run having her learn that tool so that she can help me create tutorials.”

How Evidence Is Used in Supporting Decision Making

Content in this section is examined from three aspects: purposes of evidence use, sources of evidence, and selection of evidence. Collectively, they present a full depiction of how evidence is used in supporting decision making.

Purposes of Evidence Use

The majority of participants’ evidence use (84% of the examples shared in the focus groups) served the purpose of directly influencing a specific decision, or a solution to a specific problem, which was labeled “Instrumental use” by Maciolek.34 The three quotes below represent three incidents where evidence was consulted to inform specific decisions related to collection weeding, budget modeling, and equipment and space modification.

“Well the collection development coordinator and I needed to work on weeding books out of the basement. And there wasn’t enough time to actually review all the books before decisions were made. So, we decided to try to base it upon the usage of broad subject categories. Since we have academic subject headings on our records for the books, I was able to compile the usage data for each category and then we analyzed that. And the books in the subject categories that had lower usage were reviewed for possible weeding whereas the books in the subject categories that had higher usage were moved into the new storage area.”

“I do budget modeling and trying to forecast where we’re going to be at different points of the year with our budget. We built that forecast model based on looking at this year’s data patterns because the spending ebbs and flows throughout the year. And for different budget lines, it ebbs and flows at different points. You have to account for these fluctuations. So, I did that just looking at previous years’ patterns.”

“We did a multifaceted study of user behavior. It was sort of based on the Rochester study but we added a quantitative survey component to the qualitative study. We used the results to add some equipment that we didn’t even realize we needed, like public scanners, and to upgrade some other equipment and to create dedicated quiet study space in the library and stuff like that.”
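The budget forecast in the second quote can be read, under simple assumptions, as a seasonal-share projection: average each month’s share of annual spend across past years, then scale the current year’s spend to date. The Python sketch below uses invented figures and a hypothetical July–June fiscal year; it is one plausible reading of the approach the participant describes, not a description of the actual model.

```python
# Hypothetical monthly spending for one budget line over two past fiscal
# years (12 months each, July-June). The participant's point: spending
# "ebbs and flows" at different times, so a flat average would mislead.
history = {
    "FY16": [10, 8, 12, 9, 7, 14, 6, 5, 9, 8, 6, 6],
    "FY17": [11, 9, 11, 10, 8, 13, 7, 6, 8, 7, 5, 5],
}

# Average share of each year's total spent in each month.
shares = []
for m in range(12):
    month_share = sum(year[m] / sum(year) for year in history.values()) / len(history)
    shares.append(month_share)

# Suppose four months of the current year have elapsed with this spending:
spent_so_far = [12, 9, 13, 10]
elapsed_share = sum(shares[:4])  # expected fraction of the year spent by now

# Projected full-year total, assuming this year follows the same pattern.
projection = sum(spent_so_far) / elapsed_share
print(f"Projected annual spend: {projection:.1f}")
```

The same monthly shares can also flag budget lines that are running ahead of or behind their usual pattern at any point in the year.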
Two other purposes of evidence use were also discussed in the focus groups, but with far less frequency.

- Conceptual use. Evidence is used to impact the knowledge, understanding, and attitudes of practitioners and decision makers. As indicated in this quote, “I need to find out what coming changes are happening in cataloging—like with linked data. That’s what we’re moving towards now. I’ve been reading scholarly articles talking about how it may go or how things may develop to help me determine what my department should do in reaction to that or how we may need to change how we do our work to accommodate linked data in the future,” the participant relied on evidence to strengthen knowledge of trends in cataloging. In another example, “I use a RSS feed reader to maintain a peripheral awareness of a lot of things. I read scholarly literature, but also blogs and librarian listservs. It’s usually not clear at the time when I’m browsing whatever it is when it would be helpful. But I pick up those pieces and a lot of times, I use them down the line,” the participant reviewed evidence regularly and habitually to advance professional knowledge in general.

- Strategic or tactical use. Evidence can be used as an instrument of persuasion to support or challenge an existing position. This example, “in my previous position at an art college, students started this campaign for us to have extended hours. There’s an art college consortium and I went on there and looked at the hours of all the other art schools. And we were open more hours than any other college in the country. So, I was able to nip it in the bud,” shows the success of using evidence to counter an argument.

Sources of Evidence

The types of evidence sources shared in the focus groups were similar to the findings of Koufogiannakis35 (as shown in table 2). However, there are two major differences. First, Koufogiannakis listed “Facts” as a source of evidence, defined as “[things] that the majority of people agree to be true.” In this study, participants raised concerns about that, as indicated by the following comments: “I’m leery of facts. Just look on Facebook to see how wrong a lot of people can be but be in agreement about something,” and “We often teach the students in the classroom to question what’s generally accepted to be true and show them what was generally accepted as fact 150 years ago versus what we know now. So, I like to debunk that word.” Second, a new category, “Analysis of virtual or physical artifacts,” emerged from this study that was not reported in Koufogiannakis’s study.
This source of evidence refers to examinations of library collections or other related artifacts, as exemplified in this quote: “My archivists took a standard instrument available online to assess the deterioration of our audio visual tapes sitting inside the vault in our collections. That helped them identify the number of materials that we should be really worried about and come up with some strategies as to how to prioritize any kind of salvage or digitization efforts we might want to consider making.”

It is worth noting that some evidence was purposefully generated for a particular decision (for example, a needs assessment survey conducted to inform a decision about library space redesign), whereas other evidence already existed and was consulted if relevant to decision making (for example, existing reference service usage statistics that could help optimize staffing arrangements at the reference desk). The former is similar to what Maciolek36 described as the “process” emphasis in evidence-based practice, where practitioners engage in the evidence generation process rather than simply using existing evidence. Maciolek believed that such engagement in “process” can lead to “changes in ways of thinking and in ways of behaving among individuals and throughout organizations.”

Selection of Evidence (Why Use One Type but Not the Other)

When asked how they select the evidence they need among the multiple sources, participants’ responses fell into three patterns:

- Hard evidence is prioritized over soft evidence. Participants prefer to consider hard evidence such as literature or existing statistics first; and, if they fail to find what they need from the hard evidence sources, they then consult trusted colleagues who may have the necessary expertise.

- The source of evidence to consider is influenced by the nature of the practice—certain types of library practice may rely on some sources of evidence more than others. For example, a participant (an outreach/marketing librarian) explained his penchant for statistics in this quote: “There are certainly areas of my work that lend themselves more to statistics. We do social media analysis to find out whether or not we’re posting the right type of content and getting the attention of the right user. Social media statistics are very helpful for determining what we should do next week, next month, next year, and where we should be spending our time and resources because there’s just so much data that you can pull from those numbers.”

- Accessibility, convenience, and timeliness of evidence constitute an important factor. For instance, a participant used RSVP data from previous years (people who RSVP’d versus people who attended events) to determine whether to continue requiring people to RSVP for events.
He resorted to existing statistics over other evidence sources because they were readily available, as indicated in his comment: “I guess we could have considered talking and doing interviews with attendees of events, interviews with people who did not attend our events, and try to find out maybe from a qualitative standpoint, why they may have not RSVP’d. But the data was readily at hand and it was the easiest thing to analyze given the time we had.” Another participant acknowledged the frequent need for the most up-to-date information; such information usually does not exist in the published literature but is found in less scholarly sources like social media.

TABLE 2. Sources of Evidence

Hard evidence:
- Published literature: peer-reviewed journal articles (e.g., articles in College & Research Libraries)
- Original research: research conducted through a systematic process, using proper methods such as surveys and in-depth interviews in data collection and analysis to answer a specific research question (e.g., a carefully designed study that examines the efficacy of a library’s information literacy instruction)
- Analysis of virtual or physical artifacts: examination of the content or condition of library-related artifacts (e.g., inspecting wear and tear of rare books, quality of catalog records, and deterioration of audio visual tapes)
- Internal statistics: automatically generated statistics related to library resources/programs/services (e.g., circulation and reference service usage statistics)
- External statistics: statistics generated by entities other than the library (e.g., other campus units, or professional organizations like the Association of College & Research Libraries)
- Publicly available documents: unpublished documents that are publicly available (e.g., reports of professional organizations and policy documents of other libraries/institutions)
- Blogs and social media: information on blogs, Twitter, Facebook, and other social media platforms that is relevant to library practice
- Conference presentations, proceedings, and posters: professional conferences in library and information science or other fields

Soft evidence:
- Input from internal colleagues: ideas, advice, and suggestions from colleagues in the same library (e.g., a new instruction librarian asking colleagues for input on his lesson plan)
- Input from external colleagues: ideas, advice, and suggestions from external colleagues via venues like email, listservs, and professional meetings (e.g., posting a message to the listserv “libref-l” asking for input on choosing virtual reference software)
- Input from user community: feedback, ideas, and suggestions from library users (e.g., comments users leave through the feedback form on the library website)
- Anecdotes: anecdotal observations and communications with library users regarding library spaces/resources/services/programs (e.g., conversing with a student about the library study rooms while waiting for the elevator together)

Challenges in EBLIP

A total of five challenges in the EBLIP process were identified in the focus group discussions. Lack of time was the foremost challenge recognized by all participants. The time constraint limits their capability to thoroughly engage in EBLIP even when there are adequate resources and administrative support.
In particular, generating evidence from original research is time consuming because, as a participant commented, “if you want to do it well and do it right, it takes a lot of planning before you even get to the start point.” Participants also acknowledged that librarians have different positions and different expectations within each position, and that in some positions it would be easier to carve out the time to produce/collect, evaluate, and apply evidence. Meanwhile, as one participant noted, “the librarians here do not have faculty status; if they had faculty status it would be a much easier thing to say so many hours a year you can take and do research. I think the lack of having the faculty status at this particular institution makes doing that problematic because then suddenly HR’s concerned about what you’re doing. If it was faculty, they wouldn’t be.” The lack of faculty status makes it difficult for librarians to obtain release time to conduct research, which further aggravates the time challenge.

Second, participants agreed that “if collectively in the work as an organization, if we wanted that [EBLIP] to be happening, there’d have to be mentoring for new librarians.” They believed that mentoring and training for new librarians is critical in helping them appreciate the importance of EBLIP and understand its methods and procedures. This would ultimately contribute to fostering an EBLIP culture in an organization.

Third, availability and accessibility of evidence can be restricted, presenting roadblocks in the process of locating evidence. For instance, while the published literature contains research results from other institutions, those results are not applicable if librarians’ home institution is not comparable, in scale and size, to the institutions reported in the literature. In another example, data collected by other campus units, such as information technology, student services, student affairs, and the registrar’s office, could be potentially beneficial and yet is inaccessible to the library. Such data could help contextualize students’ use of the library and reveal how that use is related to their academic life. However, there are privacy concerns, and it can be challenging for librarians to be granted access to the data if they request it.

The fourth challenge is organizational culture: without a supportive administration and effective communication within the organization, it is unlikely that EBLIP will succeed. A participant observed that currently “evidence-based practice is applied inconsistently and it’s not always communicated clearly and widely. And so, it’s hard to know how to build that into our own practices.” Library administrators and managers may have different management styles, and a style conducive to fostering evidence-based practice includes the following characteristics: 1) explaining the rationale behind each decision and encouraging people to ask why a decision is made; 2) involving staff in the decision-making process, particularly in evidence gathering; and 3) avoiding micromanaging and offering staff adequate autonomy in making decisions within their own job responsibilities.

Finally, personality plays a role in how people embrace EBLIP. For evidence-based practice to be fully integrated into librarians’ daily work, it is important to keep an open mind and be willing to change.
Discussion

Findings of the study reveal that academic librarians’ evidence use primarily served the “instrumental” purpose: they employed evidence to influence a specific decision or a solution to a specific problem. The dominance of this pattern of evidence use is likely attributable to the practical nature of librarianship, where librarians’ daily work involves numerous specific decisions and actions. This evidence use pattern fits the “Research-Based Practitioner Model” described by Nutley, Walter, and Davies,37 where research evidence use is the responsibility of individual practitioners in addressing practical issues. Key factors in supporting such evidence use are “professional education and training as well as enabling practitioners to access good quality research evidence and developing their ability to critically appraise the evidence.”38 Given the prevalence of the “instrumental” use of evidence in academic libraries, it is crucial that academic librarians be well equipped with knowledge about the research process and methods so that they can properly identify and evaluate published research and conduct original research in their evidence-based practice. It is worth noting that lack of mentoring/training was acknowledged as a challenge in EBLIP by the participants in this study. Insufficient training leads to unfamiliarity with the research process and lack of confidence in research, which have also been deemed barriers to EBLIP.39

The library profession has been making efforts to provide research-focused education and training opportunities. About 62.3 percent of ALA-accredited Library and Information Science (LIS) degree programs list research methods as a required course in their curriculum. Professional associations such as the Medical Library Association offer continuing education courses or webinars that aim to enhance practitioners’ research skills. Library consortia are in a good position to provide such training opportunities too; an example is the Statewide California Electronic Library Consortium, whose “Research Day” is an annual event that provides research methods training for its member and affiliate librarians. Furthermore, the aforementioned IRDL and the Librarians’ Research Institute in Canada are both nationwide professional development programs that provide research methods training for academic and research librarians. While these efforts are laudable, they are not enough to make EBLIP widespread and inherent in librarians’ daily practice. It is necessary to continue to raise awareness of EBLIP in the profession, reinforcing the importance of librarians’ research knowledge and skills and advocating for more research-focused professional development opportunities for them.

Regarding the sources of evidence, findings of this study echoed Gillespie et al. and Koufogiannakis,40 confirming that librarians use a wide variety of evidence to support decision making, ranging from hard evidence such as published literature, original research, and internal and external statistics, to soft evidence such as input from colleagues and users, as well as anecdotes. Given the abundance of evidence sources, librarians can combine them to fully inform a particular decision when needed.
As Maciolek41 pointed out, complex problems can be better addressed when “research evidence is used in combination with different types of knowledge, including professional expertise, practice wisdom, and personal experience from a variety of sources.” In addition to evidence sources, this study also explored how librarians selected evidence. The factors influencing evidence selection, such as source preferences and the nature of practice, could provide useful insights for professionally preparing librarians for EBLIP. For example, if certain evidence sources are favored in a particular area of librarianship, education and training can be customized to help librarians understand how to engage more effectively in evidence-based practice in that area.

The challenges in EBLIP identified in this study were consistent with Koufogiannakis’s study.42 Effective methods to overcome these challenges should be an important consideration in efforts that seek to foster or enhance EBLIP. For instance, regarding the limited availability and accessibility of evidence, academic libraries may: 1) strive to ease access barriers to data held by other campus units by building relationships with them and advocating for a transparent process of collecting and releasing institutional statistics; and 2) form collaborative initiatives to glean and share professional data, similar to public libraries’ Measures that Matter, which surveys the current state of public library data, assessing current strengths and weaknesses, developing a greater understanding of how data should be collected, stored, and made accessible, and formulating a plan for future action.43

Organizational culture is also a determinant of EBLIP’s success. This finding echoed what Farkas, Hinchliffe, and Houk44 discovered in their study: close to 60 percent of academic librarians believed that “library leadership uses assessment data systematically in decision making” and “library leadership offers explicit support to get faculty/staff involved in assessment” were factors influencing assessment culture in academic libraries. One of the models of research evidence use developed by Nutley, Walter, and Davies,45 the Organizational Excellence Model, holds that the key to successful research evidence use lies in the development of appropriate structures, processes, and cultures within an organization. Leaders and managers are responsible for developing an organizational culture that is research-minded and evidence-oriented. Helpful approaches include embedding evidence use in systems and processes by way of standards, policies, procedures, and tools, and providing ongoing opportunities to learn about, apply, discuss, and reflect on EBLIP.

To ultimately overcome the challenges and seamlessly integrate EBLIP into librarianship, what Booth46 described as a “paradigm shift” is required, and there need to be coordinated attempts to develop a climate that enables this shift. Examples of efforts creating such a climate for EBLIP to thrive include the open access journal Evidence Based Library and Information Practice and the biennial conference of the same name. The ACRL Value of Academic Libraries and Assessment in Action initiatives, the ARL Library Assessment Conference, and the Northumbria Conference on library performance management are also important efforts contributing toward this climate.
More research endeavors are also needed to engage the profession in active conversations about evidence-based practice. Future research may consider examining the efficacy of EBLIP education and training, and exploring consensus building in EBLIP, as consensus about the quality, reliability, and implications of evidence has an impact on evidence use. It is important to understand whether the consensus is about the evidence or about the values associated with the decision, whether and how consensus about one drives consensus about the other, and whether consensus occurs differently for different types of decisions (for example, decisions related to “know what [works]” and those concerning “know about [problems]”).47

Conclusion

This study provides an in-depth exploration of academic librarians’ experience of EBLIP, focusing on the types of decisions supported by evidence, the ways in which evidence is used in supporting decision making, and the challenges in the EBLIP process. Findings of the study will help increase the awareness of evidence-based practice, deepen the professional understanding of EBLIP, enrich the literature on the topic, and identify important issues pertinent to EBLIP for further exploration. Practitioners and educators may also draw upon this study to develop education and training programs that equip librarians with the competencies and confidence to successfully engage in EBLIP. It is worth noting that the study was conducted at one academic library, and limited generalizability is an inherent limitation of qualitative research. Findings may be used to identify key variables related to EBLIP that could be further measured in a larger-scale quantitative study.

Notes

1. Andrew Booth and Anne Brice, Evidence-Based Practice for Information Professionals (London, U.K.: Facet Publishing, 2004).
2. Rick Wallace and Nakia Carter, “Evidence Based Library & Information Practice,” Tennessee Libraries 58, no. 1 (2008): 2–4; Andrew Booth, “Australian Supermodel? A Practical Example of Evidence-Based Library and Information Practice (EBLIP),” Health Information & Libraries Journal 23, no. 1 (2006): 69–72.
3. Meredith Schwartz, “Top Skills for Tomorrow’s Librarians” (2016), available online at http://lj.libraryjournal.com/2016/03/careers/top-skills-for-tomorrows-librarians-careers-2016/ [accessed 12 January 2017].
4. David L. Sackett, Evidence-Based Medicine: How to Practice and Teach EBM (Philadelphia, Pa.: WB Saunders Company, 1997).
5. Denise Koufogiannakis, Linda Slater, and Ellen Crumley, “A Content Analysis of Librarianship Research,” Journal of Information Science 30, no. 3 (2004): 227–39.
6. Booth and Brice, Evidence-Based Practice for Information Professionals.
7. Andrew Booth, “EBLIP Five-Point-Zero: Towards a Collaborative Model of Evidence-Based Practice,” Health Information & Libraries Journal 26, no. 4 (2009): 341–44.
8. Jonathan D. Eldredge, “Evidence-Based Librarianship: An Overview,” Bulletin of the Medical Library Association 88, no. 4 (2000): 289–302.
9. Ellen Crumley and Denise Koufogiannakis, “Developing Evidence-Based Librarianship: Practical Steps for Implementation,” Health Information & Libraries Journal 19, no. 2 (2002): 61–70.
10. Denise Koufogiannakis, “Academic Librarians’ Conception and Use of Evidence Sources in Practice,” Evidence Based Library and Information Practice 7, no. 4 (2012): 5–24.
11. Suzanne Lewis, “Reflections on Using Patrons’ Stories as Practice-Based Evidence,” Evidence Based Library and Information Practice 11, no. 1 (2016): 107–10.
12. Ann Gillespie, Faye Miller, Helen Partridge, Christine Bruce, and Alisa Howlett, “What Do Australian Library and Information Professionals Experience as Evidence?” Evidence Based Library and Information Practice 12, no. 1 (2017): 97–108.
13. Koufogiannakis, “Academic Librarians’ Conception and Use of Evidence Sources in Practice,” 5–24.
14. Crumley and Koufogiannakis, “Developing Evidence-Based Librarianship: Practical Steps for Implementation,” 61–70.
15. Ibid., 65.
16. Heather J. Pretty, “Barriers to Evidence-Based Library and Information Practice,” Feliciter 53, no. 1 (2007): 30–32, available online at http://research.library.mun.ca/27/3/C_-_Pretty_2006.pdf, archived at https://perma.cc/S3N3-SV5G [accessed 28 October 2016].
17. Mirna E. Turcios, Naresh Kumar Agarwal, and Linda Watkins, “How Much of Library and Information Science Literature Qualifies as Research?” Journal of Academic Librarianship 40, no. 5 (2014): 473–79.
18. June Abbas, Martin Garnar, Marie Kennedy, Brian Kenney, Lili Luo, and Michael Stephens, “Bridging the Divide: Exploring LIS Research and Practice in a Panel Discussion at the ALISE ’16 Conference,” Journal of Education for Library and Information Science 57, no. 2 (2016): 94.
19. Heting Chu, “Research Methods in Library and Information Science: A Content Analysis,” Library & Information Science Research 37, no. 1 (2015): 36–41.
20. Pretty, “Barriers to Evidence-Based Library and Information Practice,” 30–32.
21. Crumley and Koufogiannakis, “Developing Evidence-Based Librarianship: Practical Steps for Implementation,” 61–70.
22. Lili Luo, Marie Kennedy, and Kristine Brancolini, “Institute for Research Design in Librarianship: Impact on Information Literacy Research and Practice” (paper presented at the European Conference on Information Literacy, 2016).
23. Steve Hiller, Martha Kyrillidou, and Jim Self, “When the Evidence Is Not Enough: Organizational Factors That Influence Effective and Successful Library Assessment,” Performance Measurement and Metrics 9, no. 3 (2008): 223–30.
24. Denise Koufogiannakis, “Determinants of Evidence Use in Academic Librarian Decision Making,” College & Research Libraries 76, no. 1 (2015): 100–14.
25. Vivian Tseng, Evidence at the Crossroads Pt. 1: What Works, Tiered Evidence, and the Future of Evidence-Based Policy (New York, N.Y.: William T. Grant Foundation, 2015), available online at http://wtgrantfoundation.org/evidence-at-the-crossroads-pt-1-what-works-tiered-evidence-and-the-future-of-evidence-based-policy [accessed 28 October 2016].
26. David W. Stewart and Prem N. Shamdasani, Focus Groups: Theory and Practice, vol. 20 (Thousand Oaks, Calif.: Sage Publications, 2014).
27. Robert K. Merton, “The Focused Interview and Focus Groups: Continuities and Discontinuities,” Public Opinion Quarterly 51, no. 4 (1987): 550–66.
28. Bobby J. Calder, “Focus Groups and the Nature of Qualitative Marketing Research,” Journal of Marketing Research (1977): 353–64.
29. Stewart and Shamdasani, Focus Groups: Theory and Practice.
30. S. Maciolek, Use of Research Evidence: Social Services Portfolio (New York, N.Y.: William T. Grant Foundation, 2015), available online at http://wtgrantfoundation.org/library/uploads/2015/09/Use-of-Research-Evidence-Social-Services-Portfolio.pdf [accessed 28 October 2016].
31. Koufogiannakis, “Academic Librarians’ Conception and Use of Evidence Sources in Practice,” 5–24.
32. Earl R. Babbie, The Basics of Social Research (Chicago, Ill.: Cengage Learning, 2013).
33. Maciolek, Use of Research Evidence: Social Services Portfolio.
34. Ibid.
35. Koufogiannakis, “Academic Librarians’ Conception and Use of Evidence Sources in Practice,” 5–24.
36. Maciolek, Use of Research Evidence: Social Services Portfolio.
37. Sandra M. Nutley, Isabel Walter, and Huw T.O. Davies, Using Evidence: How Research Can Inform Public Services (Bristol, U.K.: Policy Press, 2007).
38. Maciolek, Use of Research Evidence: Social Services Portfolio.
39. Koufogiannakis, Slater, and Crumley, “A Content Analysis of Librarianship Research,” 227–39.
40. Koufogiannakis, “Academic Librarians’ Conception and Use of Evidence Sources in Practice,” 5–24; Gillespie, Miller, Partridge, Bruce, and Howlett, “What Do Australian Library and Information Professionals Experience as Evidence?” 97–108.
41. Maciolek, Use of Research Evidence: Social Services Portfolio.
42. Koufogiannakis, “Determinants of Evidence Use in Academic Librarian Decision Making,” 100–14.
43. Jennifer A. Dixon, “IMLS, COSLA Launch: Measures That Matter” (New York, N.Y.: Reed Business Information, 2016), available online at http://lj.libraryjournal.com/2016/11/managing-libraries/imls-cosla-launch-measures-that-matter [accessed 12 January 2017].
44. Meredith Farkas, Lisa Janicke Hinchliffe, and Amy Harris Houk, “Bridges and Barriers: Factors Influencing a Culture of Assessment in Academic Libraries,” College & Research Libraries 76, no. 2 (2015): 150–69.
45. Nutley, Walter, and Davies, Using Evidence.
46. Booth, “Australian Supermodel?” 69–72.
47. Maciolek, Use of Research Evidence: Social Services Portfolio.