21 Unheard Voices: Institutional Repository End-Users Beth St. Jean, Soo Young Rieh, Elizabeth Yakel, and Karen Markey Beth St. Jean is a doctoral candidate, Soo Young Rieh and Elizabeth Yakel are Associate Professors, and Karen Markey is a Professor in the School of Information at University of Michigan; e-mail: bstjean@umich. edu, rieh@umich.edu, yakel@umich.edu, and ylime@umich.edu, respectively. Support for the MIRACLE (Making Institutional Repositories A Collaborative Learning Environment) Project (http://miracle.si.umich. edu) is provided by the Institute of Museum and Library Services (IMLS) through its National Leadership Grant Program (LG-06-05-0126-05). We would like to thank the IR managers and staff members who assisted us in our recruitment efforts, as well as the twenty end-users who participated in this study. We also wish to thank Raya Samet and Jihyun Kim for their assistance in conducting and coding interviews. © Beth St. Jean, Soo Young Rieh, Elizabeth Yakel, and Karen Markey This exploratory study investigates the perceptions and experiences of a group of institutional repository (IR) stakeholders seldom heard from: end-users. We interviewed twenty IR end-users recruited through five IRs to discover how they characterize the IR, how/why they use the IR, their credibility judgments in relation to the IR, and their willingness to return to and/or recommend the IR. Despite our small sample size, we were able to ascertain that IR end-users, although not yet loyal IR devotees, recognize their value and unique nature. Our findings also revealed sev- eral areas for improvement, such as lack of visibility and transparency. lthough some of the earli- est institutional repositories (IRs) such as the University of California’s eScholarship and MIT’s DSpace have now been in operation for more than seven years, we know very little about who is actually searching and retrieving items from IRs (“IR end-users”) or their motivations for turning to IRs. Much of the IR literature to date has focused on the need for and difficulties with content recruitment, pay- ing little attention to IR end-users. As IRs approach the close of their first decade, there is a need to shift some of the focus from contributors and content toward end-users and use. The chicken-and-egg problem (“Users will not use the archive until there is a [sic] sufficient content but they won’t contribute content until they use it”1) can only be solved by learning about and attempting to tailor the IR to the interests and needs of both contribu- tors and end-users. This article attempts to fill this gap in the literature by report- ing the findings from an exploratory study consisting of interviews with 20 end-users recruited through five differ- ent IRs. The research questions driving this study are: 1. How do end-users characterize the IR? 2. What approaches do end-users take to accessing and using IRs? 3. For what purposes do end-users use IRs? crl-71r1 22  College & Research Libraries  January 2011 4. To what extent do end-users per- ceive the information from IRs to be credible, relative to information from other sources? 5. To what extent are end-users willing to return to the IR and/ or to recommend the IR to their peers? 6. How do IRs fit into end-users’ information-seeking behavior landscapes? 
Literature Review Background Various definitions of the term “insti- tutional repository” have been posited, each with a differing focus ranging from the content housed in IRs to the services offered by them to the potential they offer for transforming the traditional scholarly communication system. Raym Crow of- fered one of the earliest definitions, de- scribing an IR as “A digital archive of the intellectual product created by the faculty, research staff, and students of an institu- tion and accessible to end users both within and outside of the institution, with few if any barriers to access.”2 Clifford Lynch offers a more services-focused defi- nition, characterizing a university-based IR as “a set of services that a university offers to the members of its community for the management and dissemination of digital materials created by the insti- tution and its community members.”3 Lynch stresses that an IR is fundamentally “an organizational commitment to the stewardship of these digital materials, including long-term preservation where appropriate, as well as organization and access or distribution.”4 Many potential benefits of IRs have been touted in the literature. Crow men- tions increased visibility and impact for faculty members and other research- ers, increased visibility and relevance for academic libraries, and broadened accessibility that government agencies and other funding sources often seek or demand for the products of the research programs that they fund.5 Susan Gibbons lists several additional benefits associated with IRs, including guaranteed long-term preservation, increased efficiency for the institution through centralization of the distribution efforts of individual faculty members, an opportunity to showcase faculty and student work and thereby establish a scholarly reputation for the institution, and a way in which institu- tions can respond to the scholarly com- munication crisis.6 Crow mentions an additional very important far-reaching benefit: “Progress in most academic disciplines relies largely on the amount of available information.… Thus the abil- ity to locate and retrieve more relevant research more quickly and easily online will improve scholarly communication and advance scholarly research.”7 IR End-Use One of the main functions of an IR is to widen access to faculty work within, across, and beyond university walls and to thereby increase the potential research impact of this work.8 End-users are often recognized as vital to the ultimate success of IRs. It has been emphasized that the ability to recruit content for an IR funda- mentally relies on the ability to provide evidence that contributing content will have an impact,9 that an assessment of an IR’s value must take all types of us- age into account including end-use,10 and that open access (that is, free online availability without restrictions11) cannot be achieved without users who are able to find and use the scholarly content ar- chived in IRs.12 Furthermore, awareness (or lack thereof) of end-use can affect an IR’s very sustainability.13 Despite the widespread recognition of the central importance of end-users to the ultimate success of an IR, we know very little about end-users. 
In 2007, Dana McKay pointed out that “There are no known reports of actual usage of any IR” and, furthermore, that “virtually nothing is known about IR end-users.”14 In an earlier study conducted by the authors, it was discovered that many IR manag- Unheard Voices: Institutional Repository End-Users  23 ers have little familiarity with who is accessing the materials in the IR.15 Other studies similarly report that IR managers are concerned about how difficult it is to gather evidence of end-use and to find out who is actually downloading materials from the IR.16 It has been noted that investigations into the usage of e-print archives tend to focus on depositing by authors, rather than use by end-users.17 Much of IR literature to date focuses on the more immediate problem of recruiting content, often employing the term “user” to refer to contributors rather than end-users. Nancy Fried Foster and Susan Gibbons point out, “Without content, an IR is just a set of empty shelves”;18 however, an IR with content but no end-users is similarly pointless. In fact, IRs with evidence of end-use may actually be easier to sell to potential contributors. However, what would be useful for end-users seems to have gone largely uninvestigated and unconsidered. Dorothea Salo observes, “[The institutional repository] is like a roach motel. Data goes in, but it doesn’t come out.”19 She further asserts, “what institutional repositories offer is not per- ceived to be useful, and what is perceived to be useful, institutional repositories do not offer.”20 Although it appears that Salo had contributors rather than end-users in mind, this assertion could be equally applicable to end-users. In his predictions for 2008, Peter Suber predicts, “the rate of spontaneous self- archiving will start to rise significantly when the volume of OA literature on deposit in repositories reaches a critical mass. The mass will be critical when researchers routinely search repositories, or routinely find what they seek in re- positories. Only by using repositories as readers will they appreciate the value of using them as authors.”21 This prediction raises the importance of end-use to that routinely accorded content recruitment. Eugenio Pelizzari makes the important point that users may be simultaneously unwilling to use e-print archives until there is sufficient content and unwilling to contribute content until they use the ar- chives themselves.22 This chicken-and-egg problem can only be solved by recogniz- ing the crucial importance of both of these interdependent aspects. McKay’s23 statement that there are no known studies of actual IR end-use remains largely the case today. Very little is known about end-users to date, with the exception that IR usage statistics have consistently revealed that the majority of IR end-users reach the IR via Google and Google Scholar.24 Although this informa- tion is of interest, it adds little to our knowledge of who IR end-users are and why they are accessing the IR. Empirical Studies of IR End-Use Despite the lack of studies of IR end-users, several studies have investigated potential end-use of open access (OA) materials and proposed IRs. Many of these stud- ies purport a great deal of interest on the part of potential IR end-users. 
For example, two studies report that faculty and graduate students are interested in accessing a wide variety of content for an array of different purposes.25 Focusing on “research students” (graduate students pursuing a Master of Philosophy or Doc- tor of Philosophy degree) as an important group of potential users (both contribu- tors and end-users) for IRs, Margaret Pickton and Cliff McKnight26 report that their interviewees expressed interest in being able to obtain materials from the IR that they would not have deposited themselves. They recommend that the IR and search engines covering OA material be included in user education sessions and that the IR offer value-added services for both contributors and end-users. For end-users, these recommended services include quality indicators, the ability to browse through subject-based collec- tions, the inclusion of supplementary materials, the provision of links to cited material, and the ability to cross-search both internal and external repository collections. Pickton and McKnight point 24  College & Research Libraries  January 2011 out, “students’ experiences as readers [of OA materials] are likely to colour their attitudes as authors.”27 Another study by Jack Maness, Tomasz Miaskiewicz, and Tamara Sumner28 found that the actual goals and needs of potential IR users were quite different from what those planning a new IR at the University of Colorado at Boulder had originally assumed. While they had believed that IR users would primarily want to have access to a collec- tion of the published research materials of faculty and graduate students, their study indicated that potential users want to have a network in which they can share teaching and learning materials, find po- tential collaborators, and promote their research to colleagues. Alesia Zuccala, Charles Oppenheim, and Rajveen Dhiensa29 surveyed users and nonusers of five different public repositories in the United Kingdom, including e-Prints Soton (University of Southampton’s IR) about their percep- tions, usage, and experience related to one of these repositories. They indicate that 45 percent of the e-Prints Soton users had heard about the IR from a colleague or friend and that 51 percent of these us- ers indicated that they usually found the material available in this IR to be relevant to their needs. Another study conducted by Pickton and McKnight30 involved a survey of repository managers at 60 universities in the United Kingdom and elsewhere. Just 14 of their respondents commented on the use that research stu- dents were making use of the repository as readers. Of these 14, 13 indicated that they had no data on which to base their comments in this regard, often because the only data they had were download counts that did not show who was using the IR. The one remaining IR indicated that they had anecdotal evidence of IR end-use by doctoral students. They con- clude that IR managers know a lot more about research students as contributors than about research students as readers and are concerned about the difficulty of gathering evidence of IR end-use. 
It is noted that we know far more about authors and contributors than about any other type of IR user.31 In an earlier study, it was discovered that, although nearly half of the surveyed IR managers reported gathering user counts to assess the success of their IR, just 10 percent stated that they had interviewed their users and only 4 percent stated that they had surveyed their users.32 We do not know where IR end-users are coming from, how they find the IR, what they look for in the IR, and how they use the IR functionality available.33 We also do not know who end-users are, how they are using IR content, and how satisfied they are with the quality of IR content.34 In the following sections, the results from our interviews with diverse IR end-users will be presented with the hope that these findings will be useful in beginning to fill this gap in the literature and in acquainting IR managers with end-users in a less quantitative and more actionable way. Methods This qualitative study entailed semis- tructured interviews conducted over the telephone with 20 IR end-users contacted through five different IRs. This section discusses the methods used to recruit participants, conduct interviews, and analyze the resulting data. Participant Recruitment We initially sought to recruit people who had used an IR multiple times to find in- formation. At the outset, this seemed like a simple matter; however, it turned out to be difficult to identify people who fit this criterion. We contacted the managers of five different IRs and requested their assistance in recruiting end-users. We knew from prior reports that the bulk of IR end-users do not enter through an IR homepage but through search engines, such as Google. However, we were not able to identify end-users who arrived at the IR through a search engine because, according to the IR managers, there was Unheard Voices: Institutional Repository End-Users  25 simply no way we could capture those users. Thus, we employed the following two methods to recruit participants: 1) We asked IR managers to place a link to a recruitment form on the IR homepage; and 2) We asked IR managers to iden- tify active IR end-users. We recruited 17 interviewees via the forms placed on IR homepages and three interviewees through an IR manager. Data Collection The purpose of the form placed on the IR homepages was to identify end-users and obtain their contact information (name, institution, and e-mail address), as well as to find out about their position (under- graduate, master’s, or doctoral student, postdoctoral fellow, faculty member, research professor/scientist, library staff, archives staff, other university staff, or other), their field or discipline, and the number of times they had used the IR to find information. Each participant was offered a $20 Amazon.com gift certificate. Semistructured interviews were con- ducted over the telephone during the first half of 2008. Through the different Table 1 Topics Covered by Interview Questions Research Question Related Topics Covered by Interview Questions (1) How do end-users characterize the IR? • User characterizations of the IR • User characterizations of IR content • User familiarity with the term “institutional repository” • User experience with using other IRs (2) What approaches do end-users take to accessing and using IRs? 
• How users initially found out about the IR • How users tend to get to the IR (e.g., via Google, the IR home page, their university’s library home page) • How users interact with the IR (e.g., browse, search by author, search by topic) • User perceived success in finding what they seek in the IR (3) For what purposes do end-users use IRs? • Types of IR content sought/accessed by users • User motivations for accessing the IR • User application of IR content (4) To what extent do end-users perceive the information from IRs to be credible, relative to information from other sources? • User perceptions about the relative trustworthiness of library resources, IRs, general Web search engines such as Google, and Google Scholar (5) To what extent are end- users willing to return to the IR and/or to recommend the IR to their peers? • User awareness of other users of the IR • User perceptions regarding possible reasons for non-use of the IR • Likelihood that users will use the IR again (and why/ why not) • Likelihood that users will recommend the IR to their peers (and why/why not) (6) How do IRs fit into end- users’ information seeking behavior landscapes? • How users decide where to begin a search for information (e.g., Google, Google Scholar, library databases) • User perceptions about the specific benefits associated with using the IR 26  College & Research Libraries  January 2011 methods, we recruited 43 potential in- terviewees and interviewed 20 of them. Twenty-three interviewees were excluded either because we were unable to reach them or because we learned that they were IR contributors and were not using the IR to find information. Interviews ranged in duration from 17 to 60 minutes, with the average interview lasting 34 min- utes. Table 1 shows each of the individual topics that were covered by the interview questions, in relation to each of our afore- mentioned six research questions. Data Analysis Interview recordings were transcribed in full, checked and corrected for any omis- sions or errors, and then imported into qualitative data analysis software, NVivo 7. Coding categories followed both from our original research questions for this study and from our ongoing analysis of the interviewee transcripts. The coding scheme (table 2) was developed iteratively, undergoing revisions until we had reached a Holsti Coefficient of Reliability35 of 0.732. We learned two important lessons during the recruitment process: 1) It is difficult to identify IR end-users; and 2) The line between an IR end-user and an IR contributor is often fuzzy. Just as much of the IR literature employs the term “user” to refer to contributors rather than end- users, five of the 20 interviewees who par- ticipated in this study were both actual or potential contributors and end-users. This aspect of IRs appears to be rather unique relative to other information systems. Findings Study Participants Although our interviewee pool was small (n=20) and certainly not representative of all IR end-users, it was heterogeneous along a number of different dimensions. 
The interviewees included six under- graduate, four master’s, and three doc- toral students, five faculty members, one Table 2 Coding Categories Category Subcategory Perceptions Finding out about IR User description of IR Reasons for non-use Familiarity with term ‘IR’ Benefits of IR Motivations Use incident Content types Purposes/Motivations for using the IR Trustworthiness Trustworthiness Use and Search Success/Failure Use of IR materials IR access path Length of time as IR user Frequency of IR use Modes of searching/browsing IR content Alternative information systems Other IR users Other IRs used Deciding where to begin information search Willingness to Use and Recommend Likelihood of using IR again Likelihood of recommending IR Unheard Voices: Institutional Repository End-Users  27 library staff member, and one museum staff member. Although these interview- ees were recruited through just five dif- ferent IRs, they actually represent ten different institutions, as only 15 of our 20 interviewees were recruited through their own institution’s IR. The students and faculty members who participated in our study represent disciplines from many areas, including Arts/Humanities (n=5), Sciences/Health Sciences (n=10), and So- cial Sciences (n=5). All interviewees except two had used the IR through which we recruited them five times or less—eight interviewees had used the IR one or two times and seven had used the IR three to five times. Table 3 shows the geographic region and Basic Carnegie Classification36 of each of the five institutions (labeled A through E) that participated in this study, along with the year their IR became op- erational, the software their IR is running on, and the number of interviewees who were recruited through that particular IR. How Do End-Users Characterize the IR? In the beginning of each interview, we asked interviewees to describe the IR. We purposefully avoided using the phrase “institutional repository” or the abbreviation “IR.” We simply asked them, “Could you please describe [name of IR] to me? How would you characterize it? What types of content do you think that it contains?” A subsequent interview question probed for interviewees’ degree of familiarity with and understanding of the term “institutional repository” and whether interviewees had ever used any other institution’s institutional repository. Characterizations of the IR Although several interviewees did use the generic term “repository” in their characterizations, only one interviewee (D09, who is an LIS faculty member) used the specific term “institutional re- pository.” Interviewees provided a wide array of similes and metaphors for the IR, including database, drawer, receptacle, gateway, interface, place, server, promo, and online forum. Several interviewees conceived of the IR simply as a repository or database. However, two interviewees specifically likened the IR to Wikipedia. 
Interviewee C07 said that the IR is “kind of like a static Wikipedia… that requires more bureaucracy.” Similarly, interviewee D13 described the IR as his university’s “equivalent of Wikipedia” because “peo- ple have contributed to it… [people who] are known to be good in that field.” Table 3 Participating IRs Institution Information IR Information IR Region basic Carnegie Classification approximate Year IR became Operational IR System # of Interviewees A East North Central Research Universities 2006 DSpace 3 B West North Central Research Universities 2007 DSpace 1 C East North Central Research Universities 2007 DSpace 3 D East North Central Research Universities 2005 DSpace 7 E New England Research Universities 2007 bepress Digital Commons 6 28  College & Research Libraries  January 2011 While some interviewees pointed out that the audience for the IR was limited to the university community, others felt that the IR was a way to showcase the intellectual output of the university. Interviewee E19 stated, “It looked to me like the online part of a university… like I guess it’s more intended for students and stuff to be able to access the information about the university or whatever papers and things like that. It didn’t seem to me that it was intended for the general pub- lic.” In contrast, interviewee E17 opined, “It looks like it’s kind of a promo for the intellectual property at [name of univer- sity].” Interviewee A03 described the IR as a “gold mine for administrators… an opportunity for administrators to track the progress, development, and… impact of those publications… essentially how the [name of university]’s research stacks up against other institutions.” Characterizations of IR Content Many interviewees believed the IR housed a wide variety of content. Inter- viewee B04 described, “There’s a wide variety… from things that people outside of the university would want to look at and things for people inside the university to look at. It’s really kind of a database for a lot of important public information. The types of information range from… there’s like maps of local topography and soil climates and there’s also documents with minutes from board meetings… archives from different schools and institutes and centers throughout the university.” Simi- larly, interviewee D10 pointed out that his university’s IR “contains a wide variety of information.… I mean it’s more than just the senior theses. It’s got a bunch of papers from different topics and ideas. It’s a very diverse database of information.” Displaying extensive knowledge of the contents of her university’s IR, interview- ee D09 explained that it “contains material that is either now or in the future might be of research interest… articles, images, un- dergraduate theses, presentations, video and audio of presentations, slides from presentations, abstracts... images, techni- cal reports, not so many working papers I don’t think, memos, newsletters, the accompanying materials from seminars and symposia and conferences aside from just the presentations, student posters and the student like award-winning projects.” Familiarity with the Term “Institutional Repository” Toward the end of each interview, we asked participants whether they were familiar with the term “institutional repository” and whether they had used any other institution’s institutional reposi- tory. Just six of the 20 interviewees had never heard the term before. However, interviewees’ degree of familiarity with, and understandings of, the term varied greatly. 
Interviewee A02 stated, “I guess it is what it says. The institution makes a place to store things” while interviewee C07 explained, “I’ve seen it in conjunc- tion with the [name of IR] website.… Just from context… I would just think it was a database of research.” Interviewee E18, who had heard the term IR before, said, “I’m a little unclear about exactly what a repository is. Like I picture one of those… big metal drawer things that you put documents in, but I know it’s electronic now… sort of like ongoing library in that it keeps things even when they’re out- dated, which some libraries actually get rid of stuff when it becomes outdated, and that it represents probably the documents produced by that institution.” User Experience with Using Other IRs When interviewees indicated that they were unfamiliar with the term “institu- tional repository,” we provided the fol- lowing broad definition: “Institutional repositories are digital collections of research and learning materials produced by members of the academic community at each institution.” We then asked inter- viewees if they had ever used any other institution’s institutional repository. Their responses to this question provided fur- ther confirmation that their understand- Unheard Voices: Institutional Repository End-Users  29 ings of the term “institutional repository” were quite diverse. Many interviewees were confused about whether library databases (such as ERIC and JSTOR), faculty and departmental Web pages, open courseware sites, and/or space on university servers would count as IRs. Interviewee A02 stated, “I’m not sure if ARTstor is an institutional repository or JSTOR—would those count?... They’re not run by a specific institution but they kind of are more of a database I guess.” Interviewee D10 pointed out that he may have used an IR “unknowingly.” When attempting to discern whether they had used any other IRs, some interviewees focused on the “institutional” part of the phrase “IR” and inferred that if an information resource did not cover the entire university, then it was not an IR. Interviewee A03 described, “[Name of IR] is the only one that I’m aware of that’s an actual university repository. I use 10 or 15 different library sites but they’ve never been advertised as a repository for every- thing in the university.” Similarly, inter- viewee E17 responded, “I really haven’t. The closest thing I could come up with would be like the WACC [Writing Across the Curriculum Clearinghouse] which is not… I mean it’s sanctioned by [name of institution] but it’s just that one area of [name of institution] so I don’t think I’ve known of anything at all like this.” Our findings suggest that, although IR end-users did not spontaneously use the terminology “institutional reposi- tory,” they felt that they had a basic un- derstanding of what the IR is. However, those understandings varied quite a bit. Many interviewees clearly recognized the relationship between the IR and its host institution. Interviewees’ descriptions of the IR were often very insightful and cre- ative, employing similes and metaphors that helped to illuminate their unique perceptions of their institution’s IR. In fact, interviewees expressed a diverse set of notions about what makes an IR an IR, ranging from the simple provision of storage space to housing content “for everything in the university” (A03). It was apparent that many of them were uncer- tain about what exactly constitutes an IR. What Approaches Do End-Users Take to Accessing and Using IRs? 
We asked interviewees how they initially learned of the IR, how they usually access the IR, what methods they use when look- ing at or looking for content in the IR, and how well those methods work for them. How Users Initially Found Out about the IR Interviewees described many different ways they first learned about the IR. These included library workshops; suggestions from advisors, professors, colleagues, or university administrators; and notices from the university regarding the require- ment that students deposit their theses/ dissertations in the IR. Interviewee C05 stated, “I first got to know simply because I attended a workshop. We have all kinds of workshops by the library and I attend- ed it and the professor there recommend us, ‘hey you might to check [name of IR] you might get new ideas there’ and then okay oh I never heard of it and I tried.” Some interviewees mentioned that they noticed a link to the IR on their library’s homepage and decided to explore. Inter- viewee D13 explained, “I hadn’t heard of it from word-of-mouth, but from the library’s Web page… I really explored that page a bit so that’s from where I found [name of IR].” Other interviewees first happened upon the IR simply because a Google search had landed them there. Still others indicated that they had first learned of the IR because they were look- ing for a way to store and/or disseminate their own materials. How Users Reach the IR The most common method that inter- viewees reported using to reach the IR was to select the link on their university library’s homepage. Most interviewees, such as D09, were satisfied with using this method of reaching the IR; however, 30  College & Research Libraries  January 2011 interviewee D08 pointed out that the link showing the IR name was insufficient for them to know what clicking on it might achieve, saying “The phrase [name of IR] wasn’t helpful in finding the honors the- ses.... It just says like [name of IR] instead of directly going to the honors theses.” The next most common means was to enter the IR either purposefully or unin- tentionally through Google. Interviewee C07 explained, “I would Google [name of IR] and specify ‘site:[abbreviation of institution].edu’.” Some less commonly mentioned ways included clicking on a direct link into a specific item in an IR that had been e-mailed to them, going directly to the homepage of the IR either by manu- ally typing the URL of the IR which they had memorized into the address bar or by using the history function of the browser, and searching for the name of the IR from the university’s homepage. How Users Interact with the IR Upon reaching the IR, interviewees who were looking for known items navigated directly to the item or used the IR’s search function. They mentioned a wide variety of searching strategies, such as searching by author, title, subject, geographic loca- tion, and keyword and limiting searches to particular collections within the IR. In contrast, interviewees were more likely to browse when they were unsure of exactly what they are looking for. As interviewee C05 explained, “I don’t know exactly how to do a search especially in [name of IR] but… when I know the concept I know the keywords and then I just type the key- words and try to find something useful.… If I just browse it, it’s for fun.… I really don’t care if I can find something. 
It’s not a goal or desire.” Interviewees also men- tioned various browsing strategies, such as browsing by author, title, subject, and date, as well as browsing through communities and collections, researcher pages, and lists of featured collections, most popular items, and items most recently added to the IR. Interviewees were divided about the merits of both searching and browsing. A few interviewees pointed out that brows- ing works better than searching within their IR. Interviewee D13 explained, “I haven’t searched. I have only browsed because… when I was looking for specific things there wasn’t much going on with those in particular so browse seemed a much better option for me.” In contrast, however, some interviewees explained that browsing was difficult to do in the IR. Interviewee C07 stated, “Normally, I search.… I think the only time I actually browsed was when I looked at the most popular items. That was on the front page to the right.… I couldn’t find a way to get the [Web site] to present information to me in a way that was conducive to ideal browsing.” User Perceived Success in Finding Content in the IR Interviewees gave mixed responses when we asked whether they were able to find what they were looking for in the IR. Several interviewees indicated that they either had not been looking for anything specific or were, in fact, able to find what they had been looking for. However, many interviewees mentioned encounter- ing various barriers to being able to find what they were looking for, including problems with the functionality of the IR Web site, a lack of visibility of the IR itself, and a lack of content in the IR. Several interviewees mentioned dif- ficulties with the IR interface, ranging from disappointment with the layout and organization of the content to lack of features present in more modern Web sites. Interviewee D10 felt that the research in his IR was not organized in a clear way. Interviewee D13 described trying to browse the IR like he browses Wikipedia, but concluded, “I wasn’t able to navigate really well around that thing [the IR].” Interviewee C07 explained, “Maybe I’m spoiled because I’m used to Web 2.0 applications and interfaces but the [Web site] itself is kind of dated look- ing. And it’s difficult to browse… the one metric they give you to browse that’s sort Unheard Voices: Institutional Repository End-Users  31 D11 stated, “I was more interested in seeing what people are researching right here, you know, the people that you could actually contact and talk to about their research as opposed to just reading someone’s article and someone you have no connection to.” Similarly, interviewee E18 explained, “I like the fact that it is lim- ited to just [name of university] because that’s helpful for me in terms of seeing what kind of faculty are working on what kinds of projects.” Other interviewees used the IR to look for models of particular kinds of work, such as theses and dissertations, that had been done at their university. Interviewee D12, an undergraduate working on an honors thesis, used his university’s IR to “see if anyone else had done anything like me” and to see how other students had formatted their honors theses. Simi- larly, interviewee E18, a doctoral student, described looking at dissertations in the IR—“what does a [name of university] dissertation look like? 
What is it com- posed of?” Interviewee E17, a professor in an English department, recounted re- ferring students in his research methods class to the IR so that they could see good examples of research projects and disser- tations to “see how these things look and feel and smell like.” Another purpose for which IRs are uniquely positioned is networking. When asked to describe her university’s IR, interviewee D11 explained, “What I was drawn to was the fact that it’s within the university so you can theoretically find people in different departments re- searching something similar or at least have some connection whether it’s really an objective clear-cut connection or it’s a meta-connection but you could find people doing similar work across the different departments.” Interviewee E18, a doctoral student in sociology, explained that her university’s IR is more relevant and useful to her than other IRs “because it actually involves the faculty and the grad students in the school I’m affiliated with.” This interviewee specifically men- of like recommendation is popularity but they don’t have the ability to rate articles or promote articles.” The third barrier that interviewees mentioned was the lack of content in the IR. Interviewee comments in this regard ranged from pointing out that they were unable to find what they were looking for in the IR or that there wasn’t much in their particular field in the IR. Interviewee B04 stated, “Well, there’s a lot of informa- tion that’s not here so it would be nice to have more collections I think.… I feel like it’s fairly new or young in its process of collecting information even though it already has quite a bit.” Several other interviewees were pleased with what they were able to find in the IR. Interviewee E17 described, “It’s been very serendipi- tous so far. I can’t say that I looked for a specific thing and found it. Every single time I looked for something specific, I ended up going with something else and forgetting what I looked for initially… sometimes I do find what I’m looking for but 9 times out of 10 I find something new that was published in some obscure place that I never heard of… and I’m like ‘Wow, I need to read that’.” For What Purposes Do End-Users Use IRs? We asked interviewees to describe in detail what they were doing when they clicked on the link to the recruitment form on the IR homepage or when they were last using the IR—what they were looking for and what they ultimately did with anything they had found. Ad- ditionally, we asked questions about their experience with using the IR, the various purposes for which they have ever used the IR, and the different uses they have made of IR content. Motivations for Accessing the IR Interviewees identified various purposes for which IRs are particularly well-suited. First, several described using the IR spe- cifically to find out what research is going on at their own university. Interviewee 32  College & Research Libraries  January 2011 tioned using her university’s IR to identify potential members for her dissertation committee. 
She explained, “It’s been use- ful in that way to find out what faculty members have actually gone through the whole process of sitting on a dissertation committee.” Finally, interviewees identified the IR as a means of gaining access to materials that are not available through any other channel, particularly unpublished works such as conference papers, recent works produced by a particular author, and the raw data underlying the findings pre- sented in published articles. Interviewee E15 explained, “I could imagine then you might find an article or chapter, an unpublished work that you didn’t know existed through [name of IR] that you wouldn’t find through, say, PsycINFO or some other kind of published database.” Interviewee E17 also described using a particular researcher’s page within the IR to find articles that had been published in places with which she was not familiar. Several interviewees described the benefits of being able to access the raw data underlying research projects. In- terviewee C07 explained, “I like how the [name of IR] projects I’ve looked at can give you the data and then they give an analysis because I can take the data and arrive at a different analysis of the situation.” Interviewee E15, a Com- munications professor, emphasized the value of raw data, such as videotapes and transcripts, for pedagogical purposes. She stated, “We’re all looking for ways to help students get interested in research… they respond really well to being able to personalize and tell them stories about the person who did the work or show them video clips or show them the data and help them kind of make that process come to life so it’s not just received dry facts but there’s an interactive component to it. So I think for me, [name of IR] was one tool to be able to do that.” Interviewees brought up purposes that pertained not just to scholarly endeavors, but to everyday life information needs as well. Interviewees B04 and D13 described using the IR to find out information about their local area. Another interviewee men- tioned that it was nice to be able to use the IR to “see what your friends did for their senior research so you have something to talk about” (D10). Interviewee E18, a doctoral student, described using her university’s IR for inspiration (and pro- crastination) by accessing dissertations to read the acknowledgements sections. Several interviewees indicated that they had used, or will use, the IR “for fun” (C05), “for my own personal benefit” (E19), “for my own enjoyment” (B04), and “for my own pleasure” (C07). Use of IR Content Interviewees mentioned that IR content had helped them to keep current in their area, to brainstorm, to structure their own writing, and to help their students. Interviewee D08—an undergraduate student—described using information from her university’s IR to brainstorm and figure out how to structure, format, and finalize her senior honors thesis. Interviewee A03, a museum curator, explained that his use of the IR is part of his ongoing literature survey and a way of “trying to keep up with the current literature in my area.” Interviewee E15 described showing IR content in class and then expressed some concern about whether this was legal, eventually con- cluding that this was an example of fair use. 
Interviewees also identified specific purposes for which IRs are uniquely well-suited, such as locating content out- side the traditional scholarly publishing process, looking at models of materials such as theses and dissertations previ- ously accepted by their university, and identifying potential collaborators. To What Extent Do End-Users Perceive the Information from IRs to Be Credible, Rela- tive to Information from Other Sources? Toward the end of each interview, we asked interviewees to talk about the relative trustworthiness of information Unheard Voices: Institutional Repository End-Users  33 obtained from several different sources— the IR, their university library’s Web site and catalog, general Web search engines such as Google, and Google Scholar. Overall, many interviewees described IRs as more trustworthy than Google and Google Scholar, interpreting the term “trustworthy” in many different ways such as factual, legitimate, reliable, repu- table, professional, comprehensive, up- dated, and verifiable. These discussions often revealed the rationales underlying interviewees’ judgments of the relative trustworthiness of various information sources. Interviewees’ assumptions about the extent, status (that is, peer-reviewed or not), and specific creator of the content accessible through an information system clearly influenced their judgments about the trustworthiness of information they obtained from a particular system. Many interviewees also brought up the impor- tance of verifying information they found by checking with multiple information sources. Perceived Credibility of Institution One factor many interviewees pointed out is that the IR’s tie with the institution positively influences their perceptions about the trustworthiness of the content they find in the IR. Interviewee A02 explained, “I trust [name of institution] doesn’t allow anything on [name of IR]… I’m sure the people who produce things for [name of IR] aren’t just, you know, it’s not like a casual choice.” Similarly, inter- viewee E17 stated, “If it was just [name of institution], I would say it’s probably good quality because [name of institu- tion] is putting its name on the line.… I know that it’s going to be a little bit more uniform than a Google search because it’s all sanctioned by [name of institution], so it must be good.… I would hope that any state-supported or private university wouldn’t have something in there that wasn’t at least as reputable as their faculty members… here I would get stuff that’s either been published or presented, I’m assuming, or accepted by the university.” Perceived Credibility of Content Creator Several interviewees pointed out that they base their trustworthiness judgments not on where they retrieve an item from (such as the IR), but on the creator of the infor- mation. Interviewees mentioned relying on criteria such as the name and/or affili- ations of the author, the publisher, and the provider (that is, the university) of the information. Interviewee E15 explained, “I would be less concerned with the gate- way where I found it and more concerned whether it was published or unpublished and if published, in what journal? ... 
[I]t would have very little for me to do with the gateway through which I accessed it and it would have everything to do with what it was I was looking at and my best guess as to who produced it and why they produced it and whether anybody else had vetted it through the process.” Perceived Status of Content in the IR Interviewees held conflicting assump- tions as to whether IR content has been peer-reviewed or vetted in some way. While some interviewees expressed relatively greater trust in IR content because they believed that it had been peer-reviewed or at least passed through some sort of official approval process to enable it to be placed in the IR, other interviewees expressed concern about the trustworthiness of IR content due to their perception that anybody can put anything into an IR. Interviewees D08 and D10 both felt that IR content is more trustworthy because it “has at least been approved by a lot of other people” (D08) and it has been “reviewed by probably a primary investigator through research or something like that” (D10). In contrast, however, interviewees A02 and D14 were less confident about IR content. While interviewee A02 described IR content as “sort of personally proofed content that you decide to put online yourself,” interviewee D14 explained that she doesn’t trust anything that has not been peer-reviewed so “with an institutional repository, it’s hit and miss. There’s some 34  College & Research Libraries  January 2011 peer-reviewed articles on it but most of them are not.” Perceived Extent of Content in the IR Perceived lack of comprehensiveness in terms of content negatively influenced interviewees’ trustworthiness judgments about IRs. Conversely, information over- load negatively influenced trustwor- thiness judgments about Google and Google Scholar. Interviewee A01 stated, “I wouldn’t rate it [the IR] as highly.… There’s not as much information in there” while interviewee D14 explained, “Google is like throwing darts. Sometimes you get good stuff and 99% of the time, you get a lot of garbage too. You have to be discern- ing.” However, a couple of interviewees expressed different opinions in this regard. Interviewee E18 expressed the opinion that Google Scholar is the least trustworthy because she has found it to be less compre- hensive. Additionally, several interviewees pointed out that some types of IR content are more trustworthy than others: “I think there are things that are trustworthy but the things that the students have are maybe less trustworthy” (D08). Past Experience with an Information Resource Another factor that strongly influenced several interviewees’ perceptions about the relative trustworthiness of informa- tion from various resources is their own prior experience with each of them. Interviewee A02 explained, “I trust it [the library catalog] a lot because I can practically… whenever I use it, I usually get what I need so for that sort of type of trust it works.” Interviewee E20 similarly explained, “I pretty implicitly trust the database portals and things like that that I use. I guess I don’t have any real reason to, but they’ve never really steered me wrong.” Specifically with regard to the IR, interviewee D13 explained, “When I started using [name of institution], I needed to look through it for some time before I started trusting it. As of now, I wouldn’t. But as I go on using it, I would.” Use of Multiple Sources for Verification Many interviewees brought up the need to verify information across multiple sources. 
This need was felt not only in relation to information retrieved through the open Web, but also to information retrieved through libraries and IRs. Interviewee C07 explained, “I guess I see libraries as a very mutual place and then the resources that they are affiliated with like EBSCO, like JSTOR, they’re just repositories. I mean anyone can submit articles… and you can find varying opin- ions, but that’s the beauty of it because you can cross-reference… it’s not about finding one source that says something to support your argument, it’s about finding multiple sources, multiple convincing sources are more important than having one source that says something… you can just cross- reference, verify information from one source with another source.” Interviewee D14 similarly explained that when she comes across IR content that has not been peer-reviewed, she looks at it and says, “That’s interesting—I need to verify it.” To What Extent Are End-Users Willing to Return to the IR and/or to Recommend the IR to Their Peers? We asked interviewees whether they knew of any other IR users, whether they could think of any reasons that people may not be using the IR, whether they were likely to use the IR again, and wheth- er they were likely to recommend the IR to their peers. We sought to capture not only end-users’ perceptions about these issues, but also any rationales underlying these perceptions. User Awareness of Other Users of the IR Very few interviewees knew of other people using the IR. The people who did often mentioned contributors rather than end-users. Interviewee E16 offered, “There are a couple people actually. Our Vice Provost… actually has a Website and an online journal and so he’s pretty excited about it. I know he uses it. There are a couple of people from [name of institution] Unheard Voices: Institutional Repository End-Users  35 who have their thing on [name of IR].” However, when asked whether he knew of anyone who is using the IR to search for items, he replied, “I don’t know that actually I don’t know.” Interviewees also often mentioned people with whom they share(d) a specific context in which they became familiar with the IR, such as learn- ing about it in a library workshop or being required to deposit their theses in the IR. Several interviewees mentioned that they believed that lack of visibility or awareness of the IR contributed to their not knowing anyone who uses the IR. Interviewee C06 said, “I think there aren’t many students know about [name of IR] because I was talking about [name of IR] to my friend and she didn’t know.” Interviewee A02 similarly responded, “I don’t really know anyone… I don’t want to give a general ‘no’ for undergrads but I don’t think it’s pretty well publicized anywhere.” User Perceptions Regarding Possible Reasons for Non-Use of the IR When asked if they knew of any reasons that people would not be using the IR, nearly two-thirds of our interviewees suggested that this may be due to a lack of visibility or awareness of the IR. In- terviewee C07 explained, “Well, there’s the obvious problem that nobody knows about it.” Interviewee E16 similarly stated, “Well, first thing, as a faculty at [name of institution] I did not know that this existed and so that’s the first point.” Interviewee A01 described the IR as a “well-kept secret,” and explained that the only people likely to know about it are librarians and contributors. 
Several interviewees pointed out that people may not be using the IR because it’s not readily apparent what exactly is in the IR and it appears that the content is tailored or specialized in such a way that it is of limited use to them. Interviewee D08 explained, “I don’t really know about all the other stuff online [in the IR]. I haven’t really used it. It just seems like it’s really narrow.” Some interviewees explained that it may be hard for them to recommend the IR since they don’t really know what is in it. Interviewee A01 ex- plained, “I guess it’s hard to recommend using [name of IR] as a researcher without really having a better understanding of what it is people are putting in here.… What is happening here—I have no idea.” Interviewee D13 explained that the appeal of Google is its ability to retrieve “global information” and that the reason IRs are not well known may be the restricted nature of their content. Interviewee E17 pointed out that she can find the informa- tion she cares about via Google and that the IR would not be her first thought if she was looking for something in her field. Some interviewees suggested that people might not be using the IR due to a lack of content. Interviewees D13 and E20 both described their future use as contingent on growth of the content in the IR. Interviewee D13 stated, “I expect to be using more of it soon provided that it gives me more information.” Several interviewees brought up a lack of content in the IR when asked if they would recom- mend the IR to their peers. Interviewee C05 said, “I would be more than happy to recommend this [the IR] to them [in- terviewee’s friends] but I have to say I will warn them, you know, try to explore if you can, but do not expect too much.” A handful of interviewees thought that people might not be using the IR due to dissatisfaction with the look or functional- ity of the IR. Interviewee D11 described, “I think the usability interface could be somewhat better. It kind of looks a lot more like a database… and really not as aesthetically interesting like a Website… just a lot of links and hyperlinks and text and not really much helping you sort through all of that.” Interviewee D10 simi- larly explained that it was easier to find something using Google than to take the extra step of going to the IR to search for it. Likelihood That Users Will Use the IR Again The majority of our interviewees indi- cated that they are likely to use the IR 36  College & Research Libraries  January 2011 again. Interviewee A02, an undergraduate student, explained that he will return to the IR to look at the theses “since now it’s a lot more convenient.” One inter- viewee (A03) explained that he has the IR bookmarked and that he will definitely return to it since each university has its own strength. Another interviewee (D13) said that he will return to the IR because “it sounds like a good place… to keep in touch.” Some interviewees explained that they will return to the IR because it gives them access to content that they may not be able to access in any other way. In fact, several interviewees used words like “in- teresting,” “neat,” and “cool” to describe IR content. Interviewee E20 said, “I would like to make more use of it [the IR] and I would also like to see other folks make more use of it and I think as it gets more widely disseminated as a place for people to put information up, I think it will be- come a really valuable rich resource for interdisciplinary collaboration and things like that. 
Right now, I feel like it’s just starting out and it hasn’t sort of reached its full potential yet.” Likelihood That Users Will Recommend the IR to Their Peers Nearly all interviewees indicated that they would recommend the IR to their peers. Many interviewees mentioned spe- cific advantages of using the IR to locate content, such as being able to access con- tent that is not available elsewhere, being able to more efficiently locate information, and being able to find reputable informa- tion. Several interviewees indicated that they would recommend the IR specifically for things that are not in the “standard channels of literature searching” (A01), such as items that are not in print. Inter- viewee B04 described searching the IR as more efficient than using a search engine: “it’s very user-friendly and it’s easy to find what you’re looking for… it won’t take you down a rabbit hole.… This is a very convenient and efficient method of trying to find what you’re looking for.” Interviewee D10 described the IR as “a pretty good source of information that’s reputable.” How Do IRs Fit into End-Users’ Information-Seeking Behavior Landscapes? We sought to identify where IRs fit into end-users’ overall information-seeking landscapes. To this end, we asked inter- viewees how they decide where to begin when they are looking for information and whether they feel that the IR helps them to find more information and/or better information. How Users Decide Where to Begin a Search for Information When asked where they tend to begin a search for information, many interview- ees pointed out that it depends on what they are looking for. As interviewee D14 explained, “It depends what I’m looking for. If I’m looking for a book, I’ll probably go to WorldCat first… If I’m looking for an article and I have some general idea where I’m going to find it… I’ll probably go to an electronic database first. If I’m looking to get started and get some ideas, I’ll probably go to Google first.” Google was often the first choice of interview- ees, especially those looking for general information. Interviewee A01 stated, “In my personal curious ramblings… I use Google a lot. It’s a first impulse.” Several interviewees mentioned using Google first “just to get a sense of what’s out there” (E17) or to get a “surface level gloss of a particular term or a topic or a citation” (E20). Interviewees often described turning to library databases if they were looking for more scholarly materials; however, they were not always sure which specific databases to turn to. As interviewee A02 explained, “I usually go to Google first.… [Y]ou have to know what each database offers and there are many databases I don’t know about.” A couple of inter- viewees mentioned that they would turn to a colleague. Interviewee E15 described, “I also just contact people. My field is a fairly small one and so if I’m looking for Unheard Voices: Institutional Repository End-Users  37 someone within my field, the chances are that I might have a passing acquaintance with them or know someone who knows them.… I would Google them to get their e-mail and then e-mail them and ask them for what I need.” Just three interviewees mentioned IRs when talking about where they would begin a search for informa- tion. 
Interviewee D10 (an undergraduate student) explained that he would look in an academic database or in the IR if he were working on a research article or report, while interviewee D09 (a faculty member) explained that she might turn to the IR to track down something mentioned in a listserv post. In contrast, however, interviewee C06 (a master's student) saw a much larger role for the IR in her information-seeking behavior. Her assessment: "Recently I tend to use [name of IR] because I think the articles is more recent to me… the other advantage of it is I can search by author… the articles of the author and in [name of IR] will be showing, so I think it is good facts… it's easier to use [name of IR] than other online databases because I have to link from my library gateway and to the online database… [name of IR], I think, is easier to use."

Benefits Associated with Using the IR

When asked whether the IR enables them to access more information and/or better information, interviewees identified a wide range of benefits associated with using the IR. These benefits can be roughly classified into the following categories: 1) increased availability/accessibility/convenience; 2) access to content more quickly after it is produced and access to content that is not usually available through the traditional publishing channels; 3) the ability to visit just one place to look at all the work produced by an author or university; and 4) the ability to identify potential networking opportunities, especially opportunities to engage in some form of collaboration, whether intradepartmental, interdepartmental, cross-disciplinary, and/or cross-institutional.

Several interviewees pointed out that the IR provides them with much more convenient access than was previously available. Both interviewees A02 and E16 appreciated the greatly increased convenience of being able to access theses and dissertations without having to travel somewhere or use a microfiche. Interviewee E20 was able to avoid a trip to the library because she was able to access an electronic copy of one of her professor's books through the IR. Interviewee A03 stated, "No one library can ever have everything you need but it's… making it electronically available and getting more and more of that information out to make it as accessible is a big thing and also it makes the world a whole lot smaller when you're looking for information in Australia and you can dial it up on the Web." Interviewee E17 pointed out, "I think this is where we need to be heading and that's instead of having this elite publishing… it seems to me that we ought to find other ways of disseminating information other than print because having it available means I'm using it."

More specifically, many interviewees mentioned that the IR gives them access to content closer to the time when it was actually produced and/or that it gives them access to types of content that are not usually available through the traditional publishing channels, such as conference papers, theses, and dissertations. Interviewee A01 stated, "It's quicker. I think that's the main thing. With a repository of this type, you can bypass the standard publishing process."
Interviewee E17 similarly explained, "If I have to wait for the library to get me a copy of something, I may not have time to do that and it's so nice to have it available so I can use it." Interviewee D14 stated, "I think in a larger sense, yes, institutional repositories will be useful.… I think intellectually and on a pragmatic level, yes, they're great things and we need to have them because they capture materials we're not going to find anyplace else like lectures or lecture notes and visiting speakers and things like that. Documenting pictures from historical portions of the library's past or the university's past. There's just a lot of cool stuff on them."

Several interviewees mentioned that using the IR is especially advantageous in that it enables them to visit just one place to look at all the work by one particular author or at all the work being produced by one particular university. Interviewee E17 explained, "One of the things I like about the [name of author] site [within the IR] is he's doing it, he's saying 'this is all my work'. A lot of it I had never seen before because it was published in places I wasn't familiar with." She described directing her students to this site, saying, "Hey, why not get it from the man himself? Why not see what all he's put up and see what kinds of things you can get?… Here's a cool scholar who does really groundbreaking stuff and he lets you download his stuff." Interviewee A01 similarly described the benefit of being able to access all of the work produced by one particular university in one place. He stated, "I think this is a great place if you're curious about the [name of university] and types of research that's going on here, it's a great place to just come and browse and see what the heck's going on here."

Another commonly mentioned advantage of the IR is its potential to enable collaboration of various forms, including intradepartmental, interdepartmental, cross-disciplinary, and cross-institutional. Interviewee D11 explained, "What I was drawn to is the fact that it's within the university so you can theoretically find people in different departments researching something similar or at least have some connection whether it's really an objective clearcut connection or it's a meta-connection but you could find people doing similar work across the different departments." Interviewee E17 similarly asserted that the IR is "A great way to interpollinate or cross-pollinate between disciplines and departments and things." She pointed out, "Wouldn't it be cool to look at the kinds of things that were overlapping and see if we couldn't get grants together or whatever and do more cross-disciplinary work.… We could do something not just cross-disciplinary but cross-institutional.… I would love to see every single university come up with something like this."

Discussion

One of the major findings of this study is that, like IRs themselves, end-users' understandings of them are very diverse. This is reflected in the fact that end-users report turning to IRs for a wide variety of content types, spanning many different genres, topics, and purposes. Confirming Pickton and McKnight's37 assertion that graduate students are an important group of potential IR users, three interviewees participating in this study were doctoral students and four were master's students.
Our findings also echoed what Pickton and McKnight found in terms of the types of content that end-users hope to find in IRs (that is, journal articles, conference papers, and theses and dissertations). However, in contrast to Pickton and McKnight's participants, many interviewees in our study were interested in being able to access unpublished works and raw data. Additionally, several interviewees expressed an interest in accessing items representing the everyday intellectual life of the university, such as lectures, presentations, and newsletters. Interviewees' interest in a wide array of content suggests that end-users will particularly value IRs with broader collection scopes.

Interviewees clearly did not view the IR as a stand-alone information system that served one well-defined purpose. Interviewees' methods and reasons for visiting the IR were very diverse. Although our study design does not allow us to confirm that most end-users tend to reach the IR via Google, we found that interviewees reached the IR through various paths, such as clicking on a link to the IR from their university library's homepage or searching for the name of the IR on their university's homepage. The purposes for which interviewees used the IR echoed some of those already mentioned in the literature, such as Maness, Miaskiewicz, and Sumner's38 findings that potential IR end-users wanted to access course content for use in designing their own courses, to access raw data from research projects, and to identify faculty members and graduate students with similar research interests. However, interviewees in our study mentioned other motivations as well, such as using the IR to access content that they could use as models for their own work and to help with everyday life information needs. Additionally, several interviewees mentioned using the IR simply for fun or their own enjoyment. Many interviewees also mentioned using the IR as a way to find out what research is being conducted at a particular university and to identify potential opportunities for networking and collaboration. The capacity of IRs to facilitate and support networking and collaboration could help to improve their future development and sustainability.

The findings of this study indicate that end-users are still uncertain about the scope of IRs. The questions of what IRs are and what they offer remain largely unsettled. Interviewees expressed a great deal of confusion about whether an IR is different from a library database. Interviewees were also confused about whether IR content is peer-reviewed and/or whether it has gone through some sort of approval process before appearing in the IR. Because of this lack of knowledge, end-users may be hesitant to trust some or all IR content. Clear definitions and delineations of scope on IR homepages may help in this regard. Furthermore, end-users are likely to find it helpful if IR policies are posted, especially those regarding what content is allowed in the IR and how such decisions are made.

Overall, interviewees did consider the IR and its content to be more trustworthy than Google and Google Scholar, because the IR's association with the library and with the university as a whole leads end-users to automatically confer some degree of credibility on the IR. However, the credibility judgment processes they described were rather complicated.
Several interviewees explained that they evaluate an item's trustworthiness not based on where they retrieved it from, but on the creator (that is, the author and his/her affiliations and the publisher) of the item. Interviewees' credibility judgments of IR content were also based on how much content was in the IR, whether they believed that IR content had been peer-reviewed or had passed some other type of approval process, and how they felt about the institution with which the IR is associated. Their credibility judgments, in general, were also strongly influenced by their past experience with an information source (such as a library catalog, search engine, or IR) and by their ability to cross-verify information using multiple sources. Although IRs were designed to at the very least supplement, and potentially even replace, the traditional scholarly publishing paradigm, end-users remain uncertain about whether any parallel form of peer review has been implemented within the IR.

Although most interviewees have not yet developed a strong sense of loyalty toward the IR, they recognize the value and unique nature of IRs. They are willing to return to the IR and to recommend the IR to their peers. Many interviewees pointed out unique benefits associated with using the IR, such as gaining access to types of content that have traditionally been difficult to locate (such as conference papers, theses and dissertations, and raw data sets), gaining prompt access to materials that they would like to use in their own work, being able to access all the work of one particular author, one particular department, and/or one particular university in one place, and having the opportunity to identify potential collaborators within their own or other departments, disciplines, and universities. Factors influencing interviewees' judgments about whether they would return to and/or recommend the IR included the visibility of the IR, the unique nature and limitations of the content available through the IR, the perceived quality of IR content, the look and functionality of the IR, and any interactions between the extent of content in the IR and its functionality.

Interviewees also brought up several areas for improvement for IRs. Interviewees would like the IR to be more visible and transparent so that they know not only that it exists, but also what types of content they are likely to find there. They are also interested in increased breadth and depth of content and in improved functionality in terms of the appearance and organization of the IR and its ease of use. Improvements in visibility can help IRs not only to reach the attention of potential new contributors and end-users, but also to retain the attention of existing contributors and end-users. Increases in content and in transparency regarding the types of content housed in IRs will similarly help in this regard. Enhancements to the appearance and functionality of the IR will also help to ensure that end-users continue to be willing to use and recommend the IR.

Crow's sentiment that the increased availability and accessibility offered through IRs will result in improved scholarly communication and better scholarship39 was, in fact, echoed throughout several interviewees' comments. End-users are interested in what IRs can offer, and they are eager to use information systems with which they have had prior success.
As Suber stated in his open access predictions for 2008, "Scholars who find articles in repositories must be led to realize that they are finding them in repositories. They need to see and credit the role of the repositories, not just the role of Google or OAIster or the search engine that brought them there."40 The opportunity to be instrumental in improving the scholarly communication system to such a degree that it leads to better scholarship is an exciting prospect for IRs. Interviewees recognized and appreciated the ability of the IR to provide access to published materials much closer to the time when they were actually produced and to provide access to materials that tend to be unavailable under the current scholarly communication system (such as raw data files and unpublished studies). Harnessing this opportunity can enable IRs to become more valuable to potential contributors and end-users.

Conclusion

This exploratory study of IR end-users has identified several important areas for future IR development: increased publicizing of the IR, increased content recruitment, and improved appearance and functionality. Extrapolating further from interviewees' comments, it may be possible to improve the perceived trustworthiness of IR content by clearly stating the criteria, if any, for accepting content into the IR and by attaching watermarks or metadata to each item to indicate whether it has undergone peer review and/or been previously published. One important finding generated by this study is that IR end-users may be assuming that the reputation of the university applies uniformly to all work in the IR. For this reason, making clear the criteria for acceptance into the IR is of particular importance. Another potential area for future IR development is attempting to tailor the IR to better fit some of the perhaps previously unanticipated uses of the IR, such as brainstorming, networking, and collaborating, through adaptations such as the implementation of Web 2.0 functionalities like the ability to rate and promote items. As IRs and end-use of IRs continue to evolve, it will be imperative to continue to identify and solicit up-to-date feedback from this important group of IR stakeholders and to incorporate this feedback into plans for future development of the IR.

Our study also suggests both the need for, and the potential value of, future studies involving IR end-users. Ideally, future studies could progress beyond the exploratory nature of our study and aim to include not only more participants, but also participants who are more representative of typical IR end-users. As technological advances continue, identifying IR end-users for possible participation in studies such as these is likely to become easier. For example, some IRs have begun to use frames on their Web sites in such a way that even if an end-user enters the IR through Google, the IR identification is prominently visible. These frames could be used to place announcements of research studies that would then be seen by far more, and more diverse, IR end-users. Knowing up front that the line between IR contributors and end-users is quite fuzzy, and being able to recruit a much more representative sample of IR end-users, can help researchers to conduct more rigorous studies in this area.
Our experience in both conducting this exploratory study and reviewing the related literature leads us to emphasize the twin needs for IRs to be made more apparent to end-users and for end-users to be made more visible to IR managers. Both contributors and end-users will benefit from increased attention and considered response to IR end-users. Awareness of end-users' perceptions, motivations, and uses of the IR will enable IR managers to better tailor the IR to both contributors and end-users. Furthermore, such attention to the end-user's experience can lead to increased end-use, which, in turn, may help to motivate potential contributors who would like to have a priori evidence that their contributions will get used by others and that the IR truly is an efficient way to increase the reach and impact of their work.

Notes

1. Eugenio Pelizzari, "Harvesting for Disseminating: Open Archives and the Role of Academic Libraries," Acquisitions Librarian 17, no. 33/34 (2005): 47.
2. Raym Crow, The Case for Institutional Repositories: A SPARC Position Paper (Washington, D.C.: The Scholarly Publishing & Academic Resource Coalition, 2002), 16. Available online at www.arl.org/sparc/bm~doc/ir_final_release_102.pdf. [Accessed 2 July 2009].
3. Clifford A. Lynch, "Institutional Repositories: Essential Infrastructure for Scholarship in the Digital Age," portal: Libraries and the Academy 3, no. 2 (Apr. 2003): 328.
4. Ibid.
5. Crow, The Case for Institutional Repositories, 20–27.
6. Susan Gibbons, "Benefits of an Institutional Repository," Library Technology Reports 40, no. 4 (Jul./Aug. 2004): 11–16.
7. Crow, The Case for Institutional Repositories, 23.
8. Joseph Branin, "Institutional Repositories," in Encyclopedia of Library and Information Science, ed. Miriam A. Drake (Boca Raton, Fla.: Taylor & Francis Group, LLC, 2005), 237–48, draft available online at https://kb.osu.edu/dspace/bitstream/1811/441/1/inst_repos.pdf [accessed 2 July 2009]; Leslie Chan, "Supporting and Enhancing Scholarship in the Digital Age: The Role of Open-Access Institutional Repositories," Canadian Journal of Communication 29, no. 3 & 4: 277–300, available online at http://eprints.rclis.org/archive/00002590/01/Chan_CJC_IR.pdf [accessed 2 July 2009]; Crow, The Case for Institutional Repositories; Gibbons, "Benefits of an Institutional Repository"; Lynch, "Institutional Repositories."
9. Steve Hitchcock, Tim Brody, Christopher Gutteridge, Les Carr, and Stevan Harnad, "The Impact of OAI-Based Search on Access to Research Journal Papers," Serials 16, no. 3 (Nov. 2003): 255–60.
10. Margaret Pickton and Cliff McKnight, "Is There a Role for Research Students in an Institutional Repository? Some Repository Managers' Views," Journal of Librarianship and Information Science 39, no. 3 (2007): 153–61; Margaret Markland, "Institutional Repositories in the UK: What Can the Google User Find There?" Journal of Librarianship and Information Science 38, no. 4: 221–28.
11. "Budapest Open Access Initiative" (Feb. 14, 2002). Available online at www.soros.org/openaccess/read.shtml. [Accessed 7 August 2009].
12. Markland, "Institutional Repositories in the UK," 222.
13. Dawn Schmitz, The Seamless Cyberinfrastructure: The Challenges of Studying Users of Mass Digitization and Institutional Repositories (Washington, D.C.: Council on Library and Information Resources, 2008), 1. Available online at www.clir.org/pubs/archives/schmitz.pdf. [Accessed 2 July 2009].
14. Dana McKay, "Institutional Repositories and Their Other Users: Usability beyond Authors," Ariadne, no. 52 (July 2007). Available online at www.ariadne.ac.uk/issue52/mckay/. [Accessed 2 July 2009].
15. Soo Young Rieh, Beth St. Jean, Elizabeth Yakel, Karen Markey, and Jihyun Kim, "Perceptions and Experiences of Staff in the Planning and Implementation of Institutional Repositories," Library Trends 57, no. 2 (Institutional Repositories: Current State and Future, ed. S.L. Shreeves & M.H. Cragin) (Fall 2008): 168–90.
16. Pickton and McKnight, "Is There a Role for Research Students in an Institutional Repository?" 156.
17. Elizabeth Gadd, Charles Oppenheim, and Steve Probets, "RoMEO Studies 3: How Academics Expect to Use Open-Access Research Papers," Journal of Librarianship and Information Science 35, no. 3 (Sept. 2003): 172.
18. Nancy Fried Foster and Susan Gibbons, "Understanding Faculty to Improve Content Recruitment for Institutional Repositories," D-Lib Magazine 11, no. 1 (Jan. 2005). Available online at www.dlib.org/dlib/january05/foster/01foster.html. [Accessed 2 July 2009].
19. Dorothea Salo, "Innkeeper at the Roach Motel," Library Trends 57, no. 2 (Fall 2008): 98.
20. Ibid., 103.
21. Peter Suber, "Predictions for 2008," SPARC Open Access Newsletter, no. 116. Available online at www.earlham.edu/~peters/fos/newsletter/12-02-07.htm#predictions. [Accessed 2 July 2009].
22. Pelizzari, "Harvesting for Disseminating," 47.
23. McKay, "Institutional Repositories and Their Other Users."
24. M. Madhan, Y. Srinivasa Rao, and Shipra Awasthi, "Institutional Repository Enhances Visibility and Prestige of the Institute: The Case of National Institute of Technology, Rourkela," paper presented at the National Conference on Information Management in Digital Libraries, Indian Institute of Technology, Kharagpur, August 2–4, 2006, available online at http://dspace.nitrkl.ac.in/dspace/bitstream/2080/310/1/madhan1.pdf [accessed 2 July 2009]; Michael Organ, "Download Statistics: What Do They Tell Us? The Example of Research Online, the Open Access Institutional Repository at the University of Wollongong, Australia," D-Lib Magazine 12, no. 11 (Nov. 2006), available online at www.dlib.org/dlib/november06/organ/11organ.html [accessed 2 July 2009]; Carol Ann Hughes, "eScholarship at the University of California: A Case Study in Sustainable Innovation for Open Access," New Library World 105, no. 1198/1199: 122.
25. Margaret Pickton and Cliff McKnight, "Research Students and the Loughborough Institutional Repository," Journal of Librarianship and Information Science 38, no. 4 (2006): 203–19; Jack M. Maness, Tomasz Miaskiewicz, and Tamara Sumner, "Using Personas to Understand the Needs and Goals of Institutional Repository Users," D-Lib Magazine 14, no. 9/10 (Sept./Oct. 2008), available online at www.dlib.org/dlib/september08/maness/09maness.html [accessed 2 July 2009].
26. Pickton and McKnight, "Research Students."
27. Ibid., 215.
28. Maness, Miaskiewicz, and Sumner, "Using Personas."
29. Alesia Zuccala, Charles Oppenheim, and Rajveen Dhiensa, "Managing and Evaluating Digital Repositories," Information Research 13, no. 1 (Mar. 2008). Available online at http://informationr.net/ir/13-1/paper333.html. [Accessed 2 July 2009].
30. Pickton and McKnight, "Is There a Role for Research Students in an Institutional Repository?"
31. McKay, "Institutional Repositories and Their Other Users"; Schmitz, The Seamless Cyberinfrastructure.
32. Karen Markey, Soo Young Rieh, Beth St. Jean, Jihyun Kim, and Elizabeth Yakel, Census of Institutional Repositories in the United States: MIRACLE Project Research Findings (Washington, D.C.: Council on Library and Information Resources, Feb. 2007). Available online at www.clir.org/pubs/reports/pub140/pub140.pdf. [Accessed 2 July 2009].
33. McKay, "Institutional Repositories and Their Other Users."
34. Schmitz, The Seamless Cyberinfrastructure, 15.
35. Ole R. Holsti, Content Analysis for the Social Sciences and Humanities (Reading, Mass.: Addison-Wesley Publishing, 1969).
36. "Carnegie Classifications: Lookup and Listings," The Carnegie Foundation for the Advancement of Teaching, 2009. Available online at www.carnegiefoundation.org/classifications/index.asp?key=807. [Accessed 18 June 2009].
37. Pickton and McKnight, "Research Students."
38. Maness, Miaskiewicz, and Sumner, "Using Personas."
39. Crow, The Case for Institutional Repositories, 23.
40. Suber, "Predictions for 2008."