Library Impact Data Project: Looking for the Link between Library Usage and Student Attainment

Graham Stone and Bryony Ramsden

Graham Stone is Information Resources Manager for Computing and Library Services and Bryony Ramsden is Subject Librarian for Computing and Library Services at the University of Huddersfield; e-mail: g.stone@hud.ac.uk, b.j.ramsden@hud.ac.uk. © 2013 Graham Stone and Bryony Ramsden, Attribution CC BY (http://creativecommons.org/licenses/by/3.0/)

The Library Impact Data Project was a six-month project funded by Jisc and managed by the University of Huddersfield to investigate this hypothesis: "There is a statistically significant correlation across a number of universities between library activity data and student attainment." E-resources usage, library borrowing statistics, and library gate entries were measured against final degree award for 33,074 undergraduate students across eight U.K. universities. The research successfully demonstrated a statistically significant relationship between library resource use and level of degree result; however, any conclusions drawn are not indicators that library usage and student attainment have a causal relationship.

The current financial climate has had a major impact on resource allocation to libraries. In the U.K., the recent Comprehensive Public Spending Review,1 the Browne Review of Higher Education Funding and Student Finance,2 and the increases in university fees have focused attention on the need for more critical evaluation of university quality in terms of teaching and provision. As a result, academic libraries need to work increasingly toward demonstrating value and excellence to students and funding bodies3 while providing students with high-quality facilities and support at less cost to the university.

Based on original research at the University of Huddersfield, which investigated the non/low use of library resources, the Library Impact Data Project was a six-month project funded by Jisc to investigate the hypothesis that "There is a statistically significant correlation across a number of universities between library activity data and student attainment." The project looked at usage data for 33,074 undergraduate students across eight U.K. universities. E-resources usage, library borrowing statistics, and library gate entry were measured against final degree award. The research successfully demonstrated a statistically significant relationship between library resource use and level of degree result; however, any conclusions drawn are not indicators that library usage and student attainment have a causal relationship. The article also discusses issues that need to be considered when looking at the data in more depth and examines further research that could be undertaken.

Literature Review

Investigations into the relationship between library use and undergraduate student attainment in higher education have, until recently, been uncommon. Much of the research relating to low library resource use and its potential impact was undertaken in the 1960s and 1970s, with key analyses by the likes of Barkey,4 Lubans,5 and Mann,6 and with Knapp7 reporting on devising a way of analyzing and embedding library usage into the college student culture. Current research is predominantly based around school library use linked to student achievement.
In a sample of 50,000 elementary school students, the Ontario Library Association8 looked for a link between school library resources, reading tests, and standardized tests, finding a correlation between library staffing and reading achievement. Additionally, they found that a reduction in library staff correlated with students engaging less with reading. Farmer9 examined 60 Southern California schools, using student standardized reading scores against library training provision, and found that library training offered in information access had a strong relationship with reading scores. Dent found similar relationships between school library use and achievement in her work in Uganda,10 discovering that library access resulted in students attaining higher scores in some subjects than those with no access, despite the time spent on reading being similar overall.

Researchers have also considered the relationship between library usage and successful outcomes for academics. This is often part of an effort to reinforce the importance of the library: for example, Tenopir11 emphasizes the need to consider measuring the value of libraries, rather than merely marketing them as important, to remind users of what libraries can do for them as a population and as investors in their costs (be it via taxes for public libraries or tuition fees for academic libraries). Tenopir has more recently been involved in specifically examining academic libraries,12 surveying faculty to measure the link between citation use, reading, and information seeking and grant-related activities, finding that the library supports key academic research activities and thus can be considered to make a vital contribution to university value.13 Additionally, Tenopir and Volentine14 have demonstrated that academic libraries supply extensive provision of materials for their university's research staff, with a particular focus on those who have received rewards or recognition for their work. Just under one half of all materials read by highly successful academics, including two thirds of all journal articles, are retrieved via the university library: 17 percent of materials obtained would not have been available elsewhere, with the library providing time-saving search software and an extensive online collection, thus allowing staff to concentrate on reading rather than finding.

Some studies have begun to look at the relationship between university library usage and undergraduate student outcomes, but they have been limited by a lack of data on e-resource usage. De Jager focused on the borrowing of books, including specific collection types (short loan and standard stock).15 Some courses were found to correlate borrowing with final passing grade, but further investigation of high-achieving students identified discrepancies in usage between specific courses: science high achievers borrowed very little from the standard stock, while humanities high achievers borrowed at high levels. De Jager points out that further investigation is necessary to discover where electronic resources play a part in achievement. Han et al.16 have also looked into academic library use, comparing usage with grade point averages (GPA) at the Hong Kong Baptist University. They examined the borrowing habits of 8,701 graduates between 2007 and 2009, specifically books and AV materials, finding a positive significant relationship between borrowing and GPAs.
Emmons and Wilkinson17 used a sample of 99 U.S. academic libraries to investigate the impact of libraries on student performance. They demonstrated that the ratio of professional library staff to full-time students had a statistically significant relationship to both student retention and graduation rates. However, both Han et al. and Emmons and Wilkinson lack information on e-resource use.

Over the past few years, research has been gathered at Huddersfield that suggests a relationship between overall library use and attainment, including e-resource usage; however, this research lacked statistical confirmation of that relationship.18 Pattern19 additionally conducted an initial basic analysis of usage, suggesting that e-resource access at moderate levels does not always lead to higher-level degree attainment. Work at the University of Wollongong20 has also been investigating the link between attainment and library resource usage, with early results suggesting there is a link.21

In the United States, Megan Oakleaf's work for the Association of College and Research Libraries22 emphasizes the importance of utilizing, among many other measures, student achievement in relationship to library resource provision, information skills teaching, and qualified staffing levels. The report embraces the use of evidence-based practice in libraries and advocates the use of cross-campus collaborations to gather data on scores and registrar records. Academic libraries in particular are considered in terms of financial value and impact on research and learning; but, as new students emerge, service is also becoming a key consideration.23 Value is shifting toward librarian expertise and experience rather than the collections the library houses, but there is also a shift toward how the library experience, and interactions with staff and resources, change the information seeker, modify their knowledge, and help them achieve something in the process.

It is important to note that other considerations need to be factored in when examining the link between libraries and degree results. The relationship cannot be considered a causal one; however, early work by both Huddersfield and Wollongong suggests the link is worthy of further investigation.

Background

The University of Huddersfield is a medium-sized university in the north of England with around 23,000 students and more than 800 academic staff spread across seven schools and two campuses. The university has a strong history of widening participation and a growing international research portfolio. Computing and Library Services (CLS) at the University of Huddersfield has undertaken a number of studies investigating the usage of library resources over the past ten years, in addition to analyzing usage through exercises such as the annual SCONUL statistics questionnaire return24 and as a means of measuring value for money for e-resources, such as cost per use.

In 2009, a project group was formed at the University of Huddersfield to revisit work that had originally been undertaken as part of an equality impact assessment, which looked at usage of library resources.
The project group's remit was to investigate non/low usage of library resources; as such, the team looked at three main indicators:
• book loans, using data from the Horizon library management system;
• access to e-resources, using click-throughs from MetaLib, which was Huddersfield's e-resource system at the time of the initial research; and
• access to the library building, using statistics from the Sentry gate entry system.

The results of this analysis showed that, for all three indicators above, non/low usage appears to range from 30 to 50 percent over a four-year period. Similarly to the research findings of Bridges,25 the study found that some disciplines used library resources less than others; figure 1 shows one of the original Huddersfield non/low usage charts for the School of Human and Health Sciences. This led the project group to consider that resources previously thought to be good value for money (for example, e-journals, aggregated content, and the like) could be made to work much harder if non/low users could be engaged.

Figure 1. Non/Low-Usage Data Chart for the School of Human and Health Sciences

It was suggested that it would be interesting to see if there was a relationship between the usage shown above and final student grade, and it was agreed to combine these data with final grades for full-time undergraduate students. The group looked at student attainment and usage for students between 2005/2006 and 2008/2009. To eliminate potential anomalies, the project discounted distance learners, postgraduates, part-time students, sandwich courses, short courses, and courses with low numbers where anonymity could not be guaranteed. At this very early stage, the team noticed what appeared to be a relationship between usage and attainment, for both e-resources usage and library borrowing.

Data were produced for each course in the university and then presented to the schools' Teaching and Learning Committees for discussion. This was seen as a potentially sensitive issue, and it was stated that the data did not show a cause-and-effect relationship: for example, a number of other circumstances will affect student attainment, not least the quality of the teaching. However, academics were very supportive and, in some cases, used the data with students to encourage more use of the library's resources. These data were then presented at the 2010 UKSG Conference in Edinburgh,27 where colleagues in other universities were asked for comment. While this presentation attracted a great deal of interest, with a number of universities approaching Huddersfield to benchmark against the data, it was also suggested that the data had not yet been tested for statistical significance. It was therefore not yet known if the experience at Huddersfield was a function of the sample data used, rather than a true reflection of a relationship existing in the wider population.
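As a purely illustrative sketch of the kind of data handling this exercise implies, the Python snippet below joins hypothetical exports from the three systems named above (Horizon loans, MetaLib click-throughs, and Sentry gate entries) to final awards and flags non/low users. The file names, column names, and the lower-quartile threshold are assumptions made for this example, not the project's actual extraction routine.

import pandas as pd

# Hypothetical exports, one row per student in each file
loans   = pd.read_csv("horizon_loans.csv")    # student_id, items_borrowed
eres    = pd.read_csv("metalib_clicks.csv")   # student_id, clickthroughs
entries = pd.read_csv("sentry_entries.csv")   # student_id, gate_entries
grades  = pd.read_csv("final_awards.csv")     # student_id, course, award

usage = (grades
         .merge(loans, on="student_id", how="left")
         .merge(eres, on="student_id", how="left")
         .merge(entries, on="student_id", how="left")
         .fillna(0))

# Flag non/low users per indicator; the lower quartile is one possible cut-off
for col in ["items_borrowed", "clickthroughs", "gate_entries"]:
    usage[f"low_{col}"] = usage[col] <= usage[col].quantile(0.25)

# Median usage by final award gives a first, purely descriptive view
print(usage.groupby("award")[["items_borrowed", "clickthroughs", "gate_entries"]].median())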
In late 2010, as part of the Jisc Information Environment Programme 2009–2011,28 the University of Huddersfield, along with seven partners—University of Bradford, De Montfort University, University of Exeter, University of Lincoln, Liverpool John Moores University, University of Salford, and Teesside University—was awarded funding for the Library Impact Data Project (LIDP), which aimed to support the hypothesis that "There is a statistically significant correlation across a number of universities between library activity data and student attainment."

Method

Aims and Objectives

By supporting the hypothesis, the LIDP aimed to give a greater understanding of the link between library activity data and student attainment, which would show a tangible benefit to the higher education (HE) community. However, as stated above, it is important to note that any relationship between use and attainment is not yet proven to be a causal relationship, and there will be other factors that influence student attainment. Table 1 shows the four work packages that the project undertook. The LIDP reported in a series of blog posts under eight prearranged tags and in a final report;29 this method of reporting allowed the project to update continuously on its progress.

Table 1. Library Impact Data Project Work Packages
1. Project reports and outputs: in guidance issued from the programme manager for the Activity Data strand, all projects are required to create a number of blog posts throughout the project.
2. Data collection: to supply partners with details of activity data required; to seek advice from Jisc Legal regarding open data and data protection; for partners to supply activity data for collation; release of data under an Open Data Commons licence.
3. Analysis of data: analysis of data from partners; collation of focus group data.
4. Evaluation: business plan for future work; issues and recommendations report.

Legal Issues

From the outset of the project, data protection issues were seen as a potential risk and were discussed with Jisc Legal and the University of Huddersfield's Legal and Data Protection Officers. The primary aims were to ensure that the data, given their sensitive nature, remained anonymous and that they were obtained in a way that abided by legal and university regulations, with notice provided to students that their resource use might be measured. The data have been fully anonymized and made available for use as part of an open data agreement. Small courses, where the cohort is smaller than 35 or where only 5 or fewer students attained a specific degree result, were excluded from the data to prevent identification.

Quantitative Data

Due to the short timescale of the project, potential issues with data were anticipated at the proposal stage. A minimum requirement for data was defined as two out of the three indicators of e-resource use, book borrowing statistics, and library entry. It was felt that a minimum of two requirements (table 2) would reduce risk to the project, and it was hoped that, if participants did run into difficulties, they would be able to provide at least one set of data versus attainment.

Table 2. Data Requirements for Project Partners (all data required for at least one academic year, e.g. 2009/10)
Mandatory data:
• academic year of graduation, e.g. 2009/10
• course title
• length of course in years
• type of course, e.g. undergraduate
• grade achieved30
• school/academic department
At least two of the following sets of data are mandatory:
• number of items borrowed from the library (excluding renewals): either the total number borrowed by that student or separate values for each academic year
• number of visits to the library: either the total number of visits by that student or separate values for each academic year
• number of logins to e-resources (or some other measure of e-resource usage): either the total number of logins made by that student or separate values for each academic year

As anticipated, a number of partners did run into some difficulties with the data. In addition, the capture of the data itself took a lot longer than anticipated. However, all partners were able to provide at least one set of data across multiple years; one partner was also able to provide computer log-on data.

Due to the nature of the data provided by the partners (that is, degree classifications rather than percentage scores), it was not possible to run tests such as regression analysis or ANOVA, which require continuous or interval data. Therefore, degree results were considered as groups of students, allowing them to be compared for relationships using the Kruskal-Wallis (KW) test. While analyzing the data in this way does not prove a correlation, it does test for the presence of a relationship, and this was considered sufficient for the purposes of the research.

The analytical process involved several steps to measure whether a significant relationship exists between result and library use (see table 3). The process was run for each set of data (that is, library entries, electronic resource access, and book borrowing), for each institution, as well as for all institutions' data combined, comparing each set with degree results.

Table 3. Stages of Analysis for Measuring Whether a Relationship Exists between Degree Result and Library Usage
Kolmogorov-Smirnov test: shows that the data do not have a normal distribution, and thus that the KW test is a suitable measure of relationship.
Kruskal-Wallis test: states whether there is a difference between groups of results, i.e. between degree results.
Boxplot analysis: provides visual data in order to plan comparison of specific groups.
Mann-Whitney test: tests for a difference between specific groups.
Calculation of effect size: measures how large the difference is between those groups.

As the data were provided in large samples, the Monte Carlo estimate was used to test simulated samples of the data repeatedly to ensure a significant result. All analysis except the Mann-Whitney test was measured at a significance level of 95% (p = 0.05). The Mann-Whitney test was measured at a significance of p = 0.05 divided by the number of times it was conducted for each set of data (for example, if three comparisons of book borrowing levels were made, the level required to produce a significant result for each comparison would be 0.0167).31
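For readers who wish to run a comparable analysis outside SPSS, the following is a minimal sketch of the stages in table 3 using SciPy. The input file and column names are assumptions for illustration, the boxplot stage and SPSS's Monte Carlo estimate of significance are only noted in comments, and the effect size follows the r = Z/√N convention from Field.31

import numpy as np
import pandas as pd
from itertools import combinations
from scipy import stats

df = pd.read_csv("lidp_usage.csv")   # hypothetical extract: one row per graduate
usage = "loans"                      # e.g. total items borrowed; repeat for other indicators
groups = {g: d[usage].to_numpy() for g, d in df.groupby("grade")}

# Stage 1: Kolmogorov-Smirnov test against a normal distribution fitted to the sample;
# a significant result (p < 0.05) supports the use of non-parametric tests.
x = df[usage].to_numpy()
ks_stat, ks_p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))

# Stage 2: Kruskal-Wallis test for any difference between degree-result groups.
# (SPSS's Monte Carlo estimate of significance is not reproduced here.)
kw_stat, kw_p = stats.kruskal(*groups.values())

# Stage 3: boxplots for planning comparisons, e.g. df.boxplot(column=usage, by="grade").

# Stages 4 and 5: pairwise Mann-Whitney tests at an adjusted significance level
# (0.05 divided by the number of comparisons), with effect size r = Z / sqrt(N).
pairs = list(combinations(groups, 2))
alpha = 0.05 / len(pairs)
for a, b in pairs:
    ga, gb = groups[a], groups[b]
    u, p = stats.mannwhitneyu(ga, gb, alternative="two-sided")
    n1, n2 = len(ga), len(gb)
    # normal approximation to the U distribution (no tie correction)
    z = (u - n1 * n2 / 2) / np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    r = z / np.sqrt(n1 + n2)
    print(f"{a} vs {b}: U={u:.0f}, p={p:.4f} (alpha={alpha:.4f}), r={r:.2f}")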
Qualitative Data

Qualitative data collection was designed to gauge what obstacles discouraged use and what provisions/support encouraged use, to understand further how to engage more with students and perhaps thereby aid them in attainment. Each institution was asked to run focus groups to gather information about why students may or may not choose to use library facilities or resources. A set of questions was designed to gather data on how and when students use the library, whether they had any difficulties doing so, how they felt their usage compared to others on their course, and whether they felt the library resources and environment met their study needs. Guidelines were provided with an introductory speech, as well as ethical information for attendees and consent forms. Each institution was allowed to modify questions to reflect its own resource provision and to ask additional questions for its own benefit.32 Data gathered from the groups were coded in a style based on grounded theory: the transcripts were initially examined for emerging themes, and the themes were refined into more specific classifications throughout several readings. Codes were then assigned on a final reading, with either single or multiple codes applied to each statement. Time restrictions meant that only a comparatively basic analysis of qualitative data could be conducted, with coding assessed on the basis of frequency of appearance.

Additionally, each student attending a focus group completed a brief questionnaire33 to aid qualification of issues within the group, including questions on how often students visited the library, the main purposes of their visits, the number of items they might borrow per month on average, and how many they purchase.

Coding of focus groups was found to be useful in spite of the restricted processing and analysis. A representative (fictional) statement is provided here as an example of the coding process:

Student: I like to use the library for the Macs in the silent area. I use the design software, but I like how they are near the interior design books as it makes it easy to find stuff I need if I suddenly realize I'm missing something.

A comment of this nature would be tagged with library resource use with regard to technology and books, as well as ease of use/proximity. Had the student repeatedly referred to a specific issue, it would have been counted each time it was raised to represent its importance to the speaker.

Results

Quantitative Data

Statistical analysis demonstrated that, at a cross-institutional level, there is a positive relationship between book borrowing and degree result, and between electronic resource access and degree result, but not between library entries and degree result. Thus, the more a book or e-resource is used, the more likely a student is to have attained a higher-level degree result. At an institutional level, where institutions were able to provide data, they demonstrated relationships in the same way. The example in table 4 is taken from the combined data analysis of all institutions providing loan data, comparing borrowing levels between degree results.

Table 4. Mann-Whitney U Test Analyzing the Difference in Borrowing Levels between First-Class and Third/"Pass as Ordinary" Degrees
Ranks (total loans):
First class: N = 4,207; mean rank = 3,680.41; sum of ranks = 15,483,477.50
Third class/pass (ordinary): N = 2,417; mean rank = 2,672.12; sum of ranks = 6,458,522.50
Total: N = 6,624
Test statistics (total loans):34
Mann-Whitney U = 3,536,369.500; Wilcoxon W = 6,458,522.500; Z = -20.710
Asymp. Sig. (2-tailed) = .000
Monte Carlo Sig. (2-tailed)35 = .000 (99% confidence interval: .000 to .000)
Monte Carlo Sig. (1-tailed)36 = .000 (99% confidence interval: .000 to .000)

Table 4 indicates that the difference is highly significant, as the significance level is very close to zero, even in the use of the Monte Carlo calculation. The effect size is small to medium at –0.25 (a medium effect size is 0.3), indicating a drop in borrowing from first-class to third-class degrees.
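The effect size quoted above can be checked directly from the figures in table 4, assuming the r = Z/√N convention from Field;31 this is only a back-of-the-envelope verification, not part of the project's published output.

import math

z = -20.710       # Z statistic from table 4
n = 4207 + 2417   # first-class plus third/pass graduates in the comparison

r = z / math.sqrt(n)
print(round(r, 2))  # -0.25, the small-to-medium effect size reported above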
Some individual institutions additionally demonstrated small, specific relationships between library entries and degree result, particularly in one institution where, in three years of data, there were significant differences between first-class degrees and ordinary or third-class degrees, but no difference between upper-level degrees overall. Most results showed effect sizes of small or medium levels (see table 5). (It should be noted here that a small effect size is still a significant result, indicating that there are differences between groups.)

Table 5. Summary of Data Analysis Results from All Institutions (columns: Athens/e-resource logins; loans; library entries)
Institute 1: √
Institute 2: √ X
Institute 3: √
Institute 4: √ √
Institute 5: √ √ √
Institute 6: √ √ X
Institute 7: √ √ √
Institute 8: √ √ √
Key: X = no relationship/minimal relationship; √ = relationship. Where there is a gap in the table, no data were provided. One institute provided problematic data, leading to it being excluded from analysis. It should also be noted that, even though library entries often show relationships in this table, differences only appear between two degrees or are very small in one analysis only.

Qualitative Data

When asked what they felt led to a good degree result, attendees described a combination of personal qualities and, overall, referred to resources, suggesting that students did realize that their use of resources was linked to attainment, but indicating that they did not necessarily always appreciate the varying quality of resources. Responses varied between institutions, but attendees overall indicated that library resources were of great importance to them, regardless of what they could obtain freely on the Internet. The library was regarded as a resource in itself: a place in which not only to find information but also to use as a learning/technology space, or as a way to meet up with others on the course to discuss their coursework. Some identified the library as a space that impaired their learning, due to noise levels being too high or too low, or because they preferred the proximity of home comforts. Many attendees discussed a formal process of finding the information they required, regardless of the source of information, some with a systematic way of moving between types of resources, and often seeking information away from reading list provision. Technical issues of both access to information and general technology problems were frequently raised, and students did refer to staff for technical and resource support.

Discussion

Data Format

While the research has successfully demonstrated a statistically significant relationship between library resource use and level of degree result, there are several issues that need to be considered. Had data been available in a continuous format, something that was not available from data resources at the time, a full analysis of correlation could have been conducted.
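Purely as an illustration of what such an analysis might look like if continuous marks were available, the sketch below runs parametric and rank correlations on a hypothetical extract; the file and column names ("mark" as a final percentage, "loans" as total borrowing) are invented for this example and do not correspond to any LIDP dataset.

import pandas as pd
from scipy import stats

df = pd.read_csv("lidp_with_marks.csv")   # hypothetical: one row per graduate

r, p = stats.pearsonr(df["mark"], df["loans"])          # linear correlation
rho, p_rho = stats.spearmanr(df["mark"], df["loans"])   # rank-based alternative
print(f"Pearson r = {r:.2f} (p = {p:.4f}); Spearman rho = {rho:.2f} (p = {p_rho:.4f})")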
The nature of the data obtainable will depend on data protection laws, as well as regulations set out by the institution: similar work conducted at the University of Wollongong37 allowed researchers there to access average marks, but some of the LIDP project partners had problems obtaining full data sets due to data retention and deletion policies at their institutions. The data the project did obtain required extensive work to format them appropriately for analysis in SPSS, with the labeling of degree results sometimes varying between partners depending on the student data software used.

Data Reliability

The project partners are very aware that it is increasingly problematic to understand usage levels fully from electronic resource data. Neither borrowing a book nor logging onto an electronic resource guarantees that the item has been read, understood, and referenced. However, the issue is more complex with electronic resources, as several clicks to different databases may only return a single document, and heavy usage does not equate to high information-seeking or academic skills. Some courses embed information literacy skills into classes, leading to an initial spike in usage that is not matched as studying progresses. Additionally, students on particular courses, such as history, may be using more primary materials only available outside of library resources: nonuse of library resources does not mean students are using poor-quality information.

The amount of data used to demonstrate a relationship is very large and is thus more susceptible to demonstrating a relationship; data will be analyzed in the future to measure for relationships at a school or course level. Smaller-scale data will allow for more collaboration with academics to direct student support and education more appropriately. While identifying a relationship is of great importance both in academic library use and in considering the importance of maintaining a public library service, identifying specific groups of high or low users of resources and their level of achievement will provide data that can be used more extensively to the benefit of library users.

One area where statistical significance was not found was library gate entry data, although there does appear to be a difference between students awarded a first and those awarded a third. This result was perhaps unsurprising: students enter the library building for many reasons, as such buildings commonly contain group study facilities, lecture theatres, cafes, social spaces, and student services; therefore, a student is just as likely to be entering the building for these reasons, which may or may not have an influence on final grade.

Qualitative Data

With regard to qualitative data, time constraints meant that the method was simplistic and inevitably raised issues with compartmentalizing data into generalized labels and converting them into numeric data. However, it is still of use in gauging what might be considered of particular importance to students at the time of the group meeting. Group attendees are more likely to be "good" students, those who are interested in engaging with library staff, while those who are poor users are less likely to be motivated to attend meetings regardless of the offer of compensation. Groups may not be representative of the variety of courses offered, and some voices may be louder than others, thus skewing the responses.
556 College & Research Libraries November 2013 released under an Open Data license.38 The data have been made available in Excel, comma separated and plain text, and contain final grade and library usage figures for 33,074 students studying un- dergraduate degrees at the eight partner universities. To ensure complete anonym- ity for the partners, they are listed as LIB1 to LIB8. The names of the schools and/or departments at each university have been replaced by randomly generated Ids, and some courses have been “generalized” to remove elements that may identify the institution. Table 6 shows further informa- tion from the data. A further output from the project was a toolkit,39 which provides instructions for libraries to extract their own data and benchmark it against the anonymized project data described above. The toolkit discusses the extraction of the data and gives tips for statistical analysis and sug- gestions for further investigation. Data have already proven useful in library teaching for one partner institu- tion, where LIDP data have been used to engage student interest in inductions, pointing out that their use of library re- sources will impact on their final result and directing them to quality materials to curb use of poor-quality nonlibrary resources. Huddersfield is also using the data in a poster campaign. Lessons Learned A major issue for one of the partners was the retention of data within the university. It is vital for any project that wishes to use data for these purposes to include forward planning for the retention of data. To achieve this, all internal systems and departments need to communicate with each other. Data should never be deleted without first checking the implications of doing this on other departments within the univer- sity. Partners found that this was often based on arbitrary decisions rather than university policy. When examining e-resources usage data, the project has always noted that Conclusion and Further Research Project Aims and Objectives The Library Impact Data Project had a relatively straightforward aim, but a very short timescale in which to achieve it. One risk to the whole project was in getting eight universities to work to a common goal in a short space of time; the overall success of the project was very much down to the contributions of all the partners who made every deadline and, in many cases, provided additional information over and above the project’s specification. The project’s hypothesis was: Is there is a statistically significant cor- relation across a number of universities between library activity data and student attainment? As previously discussed, the project cannot support a correlation due to noncontinuous data for degree results. However, the project has successfully demonstrated that there is a statistically significant relationship between student attainment and two of the indicators— e-resources use and book borrowing statistics—and that this relationship has been shown to be true across all eight partners in the project that provided data for these indicators. It is true to say that, in some cases, there is less significance than in others; but one of the overall aims of the project, which was to test whether the original set of Huddersfield data was an anomaly, has been fully achieved. It is critical at this stage to reiterate that the results and any conclusions drawn from the project are not indicators that library usage and student attainment is a causal relationship. 
The project is keen to note that other factors will have an influ- ence on students’ achievements. Project Outputs Huddersfield composed several reports for each partner including a complete set of data and analysis of their own data. After consultation with the partners, the release of an anonymized set of data has been agreed. These data have now been Library Impact Data Project 557 the way these data are collected may be questionable; however, it is the only comparable data that can be collected and traced back to an individual. Al- though data from COUNTER reports are far more reliable, there is no way that these data can be attributed to an individual. Different institutions collect different data in this respect (for instance, EZProxy, Shibboleth, or Athens logins); however, many institutions do not collect these data at all. The project found that it underesti- mated the time taken to analyze the data; collection and analysis of the data took up four months of the six-month project. It is recommended that institutions take this into account before initiating this process internally. F i n a l l y, i t s h o u l d b e n o t e d t h a t project data were managed according to English law and that institutions in other countries need to make their own considerations in their data extraction/ analysis. Further Research In November 2011, the University of Hud- dersfield was approached by Jisc to submit a proposal for an extension to the original project. In December 2011, funding was approved to take this forward into phase II of the project. The aim of phase II will be to look at additional data such as gender, age, ethnicity, and country of origin to enrich the quality of data and identify some possible causal links. It is hoped that these data could also provide better management in- formation to show that value-added impact of libraries, university entry points, and fi- nal percentage mark, rather than grade, will be used to measure this. Phase II will also use some of the additional data described above to hold a number of case studies to better understand student behavior. Given the extent to which the findings from LIDP can influence teaching, staffing time, and resource selection, academic libraries can only continue to demonstrate and improve on their value for students and academics alike. TabLe 6 Notes from the Data release grades The awarded degree has been mapped to the following code: • A = first (1) • B = upper second (2:1) • C = lower second (2:2) • D = third (3) • E = pass without honours Library usage Where supplied by the project partner, the following library usage data measures are included: • ISSUES = total number of items borrowed from the library by that student (n.b. this may include renewals) • ERES = a measure of e-resource/database usage, e.g. total number of logins to Met- aLib or Athens by that student • VISITS = total number of times that student visited the library Other Notes Each graduate has been allocated an randomly generated unique ID • Where the course/school/department name was not supplied, it has been replaced with N/A • Where the measure of library usage was not supplied by the partner, the value is blank/empty 558 College & Research Libraries November 2013 Acknowledgements The Library Impact Data Project would like to thank Jisc for the project funding, in particular Andy McGregor for his support as Programme Manager. Special thanks go to Dave Pattern at the University of Huddersfield for his work on the original concept. 
The success of the project is down to the significant contribution of all project team members: Phil Adams, Leo Appleton, Iain Baird, Polly Dawes, Regina Ferguson, Pia Krogh, Marie Letzgus, Dominic Marsh, Habby Matharoo, Kate Newell, Sarah Robbins, and Paul Stainthorp. Details of all members of the project team can be found on the Library Impact Data Project blog.

Notes

1. HM Treasury, Spending Review 2010, CM 7942 (London: The Stationery Office, 2010), available online at http://cdn.hm-treasury.gov.uk/sr2010_completereport.pdf [accessed 10 June 2012].
2. Lord Browne of Madingley, Securing a Sustainable Future for Higher Education: An Independent Review of Higher Education Funding & Student Finance (Independent Report, 2010), available online at http://hereview.independent.gov.uk/hereview/ [accessed 10 June 2012].
3. James G. Neal, "Stop the Madness: The Insanity of ROI and the Need for New Qualitative Measures of Academic Library Success," in Proceedings of the ACRL 2011 Conference: A Declaration of Interdependence, Philadelphia, 30 March–2 April 2011, available online at https://www.ala.org/ala/mgrps/divs/acrl/conferences/confsandpreconfs/national/2011/papers/stop_the_madness.pdf [accessed 10 June 2012].
4. Patrick Barkey, "Patterns of Student Use of a College Library," College and Research Libraries 20, no. 2 (1965): 115–18.
5. J. Lubans, "Nonuse of an Academic Library," College and Research Libraries 32, no. 5 (1971): 362–67.
6. Peter H. Mann, Students and Books (London: Routledge and Kegan Paul, 1974).
7. Patricia B. Knapp, Monteith College Library Experiment (New York: Scarecrow Press, 1966).
8. Ontario Library Association, School Libraries and Student Achievement in Ontario (Ontario: Ontario Library Association, 2006), available online at www.peopleforeducation.com/school-libraries/2006 [accessed 10 June 2012].
9. Lesley S. J. Farmer, "Library Media Program Implementation and Student Achievement," Journal of Librarianship and Information Science 38, no. 1 (2006): 21–32; doi:10.1177/0961000606060957.
10. Valeda F. Dent, "Observations of School Library Impact at Two Rural Ugandan Schools," New Library World 107, no. 9/10 (2006): 403–21; doi:10.1108/03074800610702598.
11. Carol Tenopir, "The Value Gap," Library Journal 134, no. 12 (2009): 20.
12. Carol Tenopir, Don W. King, Regina Mays, Lei Wu, and Andrea Baer, "Measuring Value and Return on Investment of Academic Libraries," Serials 23, no. 3 (2010): 182–90; doi:10.1629/23182.
13. Judy Luther, University Investment in the Library: What's the Return? A Case Study at the University of Illinois at Urbana-Champaign (London: Elsevier, 2008), available online at http://libraryconnect.elsevier.com/whitepapers/0108/lcwp0101.pdf [accessed 10 June 2012].
14. Carol Tenopir and Rachel Volentine, UK Scholarly Reading and the Value of Library Resources: Summary Results of the Study Conducted Spring 2011 (Knoxville: University of Tennessee Center for Information and Communication Studies, 2012), available online at www.jisc-collections.ac.uk/Reports/ukscholarlyreadingreport/ [accessed 10 June 2012].
15. Karin De Jager, "Impacts and Outcomes: Searching for the Most Elusive Indicators of Academic Library Performance," in Meaningful Measures for Emerging Realities, Proceedings of the 4th Northumbria International Conference on Performance Measurement in Libraries and Information Services (Washington, D.C.: Association of Research Libraries, 2002), 291–97; Karin De Jager, "Successful Students: Does the Library Make a Difference?" Performance Measurement and Metrics 3, no. 3 (2002): 140–44; doi:10.1108/14678040210453564.
16. Shun Han, Rebekah Wong, and T.D. Webb, "Uncovering Meaningful Correlation between Student Academic Performance and Library Material Usage," College and Research Libraries 72, no. 4 (2011): 361–70.
17. Mark Emmons and Frances C. Wilkinson, "The Academic Library Impact on Student Persistence," College & Research Libraries 72, no. 2 (2011): 128–49.
18. Sue White and Graham Stone, "Maximising Use of Library Resources at the University of Huddersfield," Serials 23, no. 2 (2010): 83–90; doi:10.1629/2383; Deborah Goodall and David Pattern, "Academic Library Non/Low Use and Undergraduate Student Achievement: A Preliminary Report of Research in Progress," Library Management 32, no. 3 (2011): 159–70.
19. David Pattern, "Non/Low Library Usage and Final Grades," Self-plagiarism Is Style Blog (2010), available online at www.daveyp.com/blog/archives/1385 [accessed 10 June 2012].
20. Margie H. Jantti and Brian Cox, "Measuring the Value of Library Resources and Student Academic Performance through Relational Datasets," in Proceedings of the Library Assessment Conference: Building Effective, Sustainable, Practical Assessment, Baltimore, Maryland, 25–27 October 2010, available online at http://ro.uow.edu.au/cgi/viewcontent.cgi?article=1120&context=asdpapers [accessed 10 June 2012].
21. Brian Cox and Margie H. Jantti, "Capturing Business Intelligence Required for Targeted Marketing, Demonstrating Value, and Driving Process Improvement," Library and Information Science Research 34, no. 4 (2012): 308–16; doi:10.1016/j.lisr.2012.06.002.
22. Association of College and Research Libraries, Value of Academic Libraries: A Comprehensive Research Review and Report, res. Megan Oakleaf (Chicago: Association of College and Research Libraries, 2010), available online at www.ala.org/ala/mgrps/divs/acrl/issues/value/val_report.pdf [accessed 10 June 2012].
23. Ibid., 23.
24. "SCONUL Statistical Questionnaire—Performance Portal" (2012), available online at http://vamp.diglib.shrivenham.cranfield.ac.uk/statistics/sconul-statistical-questionnaire [accessed 10 June 2012].
25. Laurie M. Bridges, "Who Is Not Using the Library? A Comparison of Undergraduate Academic Disciplines and Library Use," portal: Libraries and the Academy 8, no. 2 (2008): 187–96; doi:10.1353/pla.2008.0023.
26. Sue White and Graham Stone, "Maximising Use of Library Resources at the University of Huddersfield," in UKSG 33rd Annual Conference and Exhibition, 12–14 April 2010, Edinburgh International Conference Centre, available online at http://eprints.hud.ac.uk/7248/ [accessed 10 June 2012].
27. Ibid.
28. "Jisc Activity Data" (2011), available online at www.jisc.ac.uk/whatwedo/programmes/inf11/activitydata.aspx [accessed 10 June 2012].
29. "Library Impact Data Project Final Blog Post" (2011), available online at http://library.hud.ac.uk/blogs/projects/lidp/2011/07/21/the-final-blog-post/ [accessed 10 June 2012].
30. The U.K. uses degree classifications: First, Upper Second, Lower Second, Third, and Pass.
There is no official conversion to U.S. Grade Point Average (GPA); however, the Fulbright Commission provides an unofficial chart with approximate grade conversions between U.K. classifications and U.S. GPA: "Transcript—Postgraduate Study, US-UK Fulbright Commission: How Do I Convert a UK Result to a US GPA?" available online at www.fulbright.org.uk/study-in-the-usa/postgraduate-study/applying/transcript#how%20do%20i%20convert [accessed 15 August 2012].
31. Andy P. Field, Discovering Statistics Using SPSS: And Sex and Drugs and Rock 'n' Roll, 3rd ed. (London: Sage, 2009).
32. "Library Impact Data Project Toolkit" (2011), 13, available online at http://eprints.hud.ac.uk/11571/ [accessed 10 June 2012].
33. Ibid., 14.
34. Based on 10,000 sampled tables with starting seed 926214481.
35. Grouping variable: degree result.
36. Ibid.
37. Cox and Jantti, "Capturing Business Intelligence."
38. "Library Impact Data Project Dataset," available online at http://eprints.hud.ac.uk/11543/ [accessed 10 June 2012].
39. "Library Impact Data Project Toolkit" (2011).