Making assessment less scary
Academic libraries collaborate on an information literacy assessment model

Nancy Goebel, Jessica Knoch, Michelle Edwards Thomson, Rebekah Willson, and Sara Sharun

Nancy Goebel is head librarian at the University of Alberta-Augustana, e-mail: nancy.goebel@ualberta.ca; Jessica Knoch is reference librarian at Grant MacEwan University, e-mail: knochj@macewan.ca; Michelle Edwards Thomson is librarian at Red Deer College's Learning Commons, e-mail: michelle.edwards.thomson@rdc.ab.ca; Rebekah Willson is librarian at Mount Royal University Library, e-mail: bwillson@mtroyal.ca; Sara Sharun is librarian at Mount Royal University, e-mail: ssharun@mtroyal.ca

© 2013 Nancy Goebel, Jessica Knoch, Michelle Edwards Thomson, Rebekah Willson, and Sara Sharun

Information literacy (IL) is widely recognized as a necessary skill for the information age, and post-secondary institutions and libraries spend large amounts of time and resources on information literacy instruction (ILI) programs. With tightening post-secondary budgets and increasing emphasis on meeting institutional learning outcomes, there have been continued calls from librarians, educators, academics, and library organizations to assess ILI.1 More than simply a means of being accountable to stakeholders, ILI assessment is a way of demonstrating librarians' contributions to student learning, gaining feedback to improve librarian teaching, bolstering instructional program performance, and increasing student learning.2

Assessing IL is particularly difficult, however, as IL skills are integrated in multiple ways into different courses, and measurable learning outcomes are not always the result of formal ILI alone. Information literacy instruction is also difficult to assess because it often consists of one-shot sessions, requiring all information to be communicated in a short time frame. Despite these challenges, librarians are increasingly attempting to meaningfully assess teaching and learning in ILI sessions.

While IL learning outcomes vary widely due to various research assignments, institution-specific contexts, and other considerations, many ILI programs have a set of common learning outcomes that describe "what and how a student is expected to learn after exposure to teaching"3 and that are guided by ACRL's Information Literacy Competency Standards for Higher Education.4 These standards and their related outcomes are useful guides for the creation of questions for tools that assess IL.5

The library and information studies literature is full of examples of local tests and assessment tools being developed, with more or less rigour, that map their questions to the ACRL standards. At least three standardized tests for assessing IL have been developed: SAILS,6 ILT,7 and iSkills.8 Each of these tests has been carefully constructed and checked for both reliability and validity, and they have demonstrated utility in assessing IL. However, there are disadvantages to using these standardized tests, including cost, lack of flexibility, and the length of time required to administer them.

Recognizing these problems in the existing tools and seeing a need for a local solution, librarians from four Alberta post-secondary institutions launched the Information Literacy in Alberta Assessment Pilot (ILAAP), a pilot project to create a custom assessment tool that responds to the unique needs of local institutions and provides a more appropriate model for promoting and assessing IL skills among Alberta students.

Background
The pilot project grew out of a discussion among a small group of librarians and their shared acknowledgement that they were becoming increasingly engaged in assessment of student learning, and its associated accountability, at their institutions. This group decided to start a pilot involving librarians from a few colleges and universities in Alberta that have a focus on IL instruction. Participating institutions included the Augustana Campus Library of the University of Alberta (Camrose, Alberta), Red Deer College (Red Deer, Alberta), Mount Royal University (Calgary, Alberta), and Grant MacEwan University (Edmonton, Alberta). Each participating library has a solid history of undergraduate IL instruction, but varying levels of experience assessing that instruction. The team members decided to build a custom tool, since each institution had experienced frustration with either a less-than-adequate local tool or disappointing engagement with existing standardized IL tests.

Process
The assessment tool emerged as a post-test questionnaire, chosen because it would be brief, easy to distribute and collect, and suited to a multitude of instructional styles across four different institutions. Designed for assessment of one-shot sessions, the questionnaire comprised three sections. First were two demographic questions that gathered data on the student's program and year of study. Second was a pool of 17 summative, evaluative multiple-choice questions, from which participating librarians were encouraged to pick the two to three questions that best matched the learning outcomes of their courses. The questionnaire ended with two formative, open-ended questions asking students to describe the most useful takeaway and to note any lack of clarity following the session. All questions were created collaboratively by the four partners and were mapped to outcomes from the ACRL standards. In addition, questions were written to focus on skills for first- and second-year students and were developed to address a wide variety of IL skills and knowledge delivered by librarians across the institutions. Librarian colleagues were recruited to use the tool in their first- and second-year ILI sessions, and documentation was written for use across all four institutions to provide information to librarians who wished to use the tool in their classes.

The team used WASSAIL,9 locally produced, open source IL assessment software created to manage question and response data in a library environment. Once entered into WASSAIL, the post-test questions could easily be used by team members to generate customized questionnaires requested by participating librarians. Questionnaires were delivered to students via a Web link during the last five minutes of class. Librarians were able to request their individual class scores, which not only provided some incentive for librarians to participate in the pilot, but also made it easy for individuals to act upon their assessment data immediately.
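The article does not describe WASSAIL's internals, but the workflow above — a shared pool of multiple-choice questions tagged with ACRL outcomes, assembled on request into short, three-section post-tests — can be pictured with a minimal sketch. The names and structures below (Question, QuestionBank, build_questionnaire) are illustrative assumptions, not WASSAIL's actual data model or API.

```python
# A minimal, hypothetical sketch of the question-bank workflow described
# above. These class and function names are illustrative assumptions,
# not WASSAIL's actual data model or API.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    choices: list[str]    # multiple-choice options
    answer: int           # index of the correct choice
    acrl_outcome: str     # e.g., "2.3.b" (locating a book on the shelf)

@dataclass
class QuestionBank:
    pool: list[Question] = field(default_factory=list)

    def matching(self, outcomes: set[str]) -> list[Question]:
        """Return pool questions mapped to any of the session's outcomes."""
        return [q for q in self.pool if q.acrl_outcome in outcomes]

def build_questionnaire(bank: QuestionBank, session_outcomes: set[str],
                        n_summative: int = 3) -> list[str]:
    """Assemble a post-test in the three-section shape the pilot used:
    two demographic questions, two to three summative questions matched
    to the session's learning outcomes, and two open-ended questions."""
    demographic = ["What is your program of study?",
                   "What is your year of study?"]
    summative = [q.text for q in bank.matching(session_outcomes)[:n_summative]]
    formative = ["What was the most useful thing you learned today?",
                 "What, if anything, is still unclear after this session?"]
    return demographic + summative + formative
```

A librarian teaching a session on finding books, for example, might request a questionnaire built from outcomes such as {"2.3.a", "2.3.b"}; in the pilot, the resulting questionnaire was then delivered to students as a Web link during the last five minutes of class.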
Initial results
During the eight months of the pilot, responses from 918 individuals were collected from 77 classes assessed across the four institutions. While issues with question design were raised over the course of the pilot, and questions requiring substantial revision were identified, strong themes nonetheless emerged from the data, providing some interesting preliminary information about students' IL skills.

Questions with the most correct responses revolved around selection and evaluation of resources. For example, 98% of students surveyed were able to determine the best type of sources for their research paper (ACRL 3.4.a), and 92% of students were able to describe criteria used to determine if an article was scholarly (ACRL 1.4.b). Questions about access received a lower number of correct responses. For example, only 67% of students knew where to find a book on the shelf (ACRL 2.3.b), an issue that was also reflected in the qualitative comments.

Common themes from the qualitative comments about the ILI sessions included minimized library anxiety and relief about the ability to get further assistance at the reference desk, if needed. The qualitative data also indicated a slight prevalence of students who felt uncertain about citation, evaluation, and access at the end of the sessions. ILAAP team members will use these findings to inform the review of question design and quality before drawing conclusions or making changes to curriculum or pedagogy based on students' answers to these questions.

Lessons learned
The ILAAP team has learned a great deal over the course of the pilot year, both about the design of the research project and about the practicalities of administering the instrument. One of the biggest lessons learned was the importance of constant communication. The team members quickly discovered that they needed to stay in close contact with one another: if one discovered an issue that needed to be resolved or an administrative task that had to be completed at one institution, the others were very likely facing a similar situation. The team avoided duplication of work by sharing reports and solutions with one another and building on each other's work. The team also had to communicate regularly what was happening with the project, both to their senior administration and to the other librarians at their institutions who were helping collect data. This regular communication meant that, as interest in the project grew within the Alberta library community, each institution's director was able to knowledgeably field inquiries that came her way and advocate on behalf of the project. It also meant that the librarians involved in administering the questionnaires and gathering much of the data remained invested in the project.

Benefits
Working on this project has allowed team members, and therefore their institutions, to establish a shared vision of what ILI assessment might look like when implemented across multiple institutions, a process enriched by the variety of perspectives brought to the table by team members. The lengthy discussions involved in creating this shared vision allowed team members to see how other individuals interpreted and addressed each of the ACRL standards in their ILI sessions. These discussions also allowed the team members a rare glimpse into the classrooms of other institutions.

Rather than choosing one of the standardized tests on the market, the ILAAP team was able to develop a local solution that best meets the needs of local students, and because the institutions shared implementation costs, the tool became a far more cost-effective solution than some of the commercially available ILI assessment tools. ILAAP members now have a locally produced tool that is applicable to how they teach, what they teach, and how they articulate their results. Already, they have been able to report initial findings to their administrations.

Future directions
As the pilot phase of the project nears completion, the team will be working to revise and then validate the questions in order to ensure that the tool provides an accurate indication of student learning. The tool will be assessed for gaps in scope, and eventually the pool of questions will be expanded so that there are more options for addressing each standard, as well as options to assess students at a higher level of learning. Analysis of the quantitative and qualitative data will continue, including efforts to define statistical relevance in the results.
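As one hedged illustration of what defining statistical relevance could involve, the sketch below computes a 95% Wilson score confidence interval for a single question's proportion of correct responses. The per-question sample size is invented for the example: the article reports only the overall totals (918 respondents across 77 classes), and per-question counts varied because librarians chose two to three questions per class.

```python
# Illustrative only: a 95% Wilson score confidence interval for one
# question's proportion of correct responses. The sample size below is
# hypothetical; the pilot's real per-question counts are not reported.
from math import sqrt

def wilson_interval(correct: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion (z = 1.96 -> 95%)."""
    p = correct / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# If, say, 201 of 300 respondents (67%) answered the shelf-location
# question (ACRL 2.3.b) correctly:
low, high = wilson_interval(correct=201, n=300)
print(f"67% correct, 95% CI: {low:.1%} to {high:.1%}")  # ~61.5% to ~72.1%
```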
In the long term, the goal is to make the tool widely available to academic libraries throughout Alberta. The data generated by this project will document the state of basic IL learning across Alberta, and will be used by the ILAAP team to bring the project back to its original goal: to build a model for promoting and assessing IL skills required for student success across Alberta.

Notes
1. ACRL Instruction Section Research and Scholarship Committee, "Research Agenda for Library Instruction and Information Literacy," Library & Information Science Research 25 (2003): 479-487. doi:10.1016/S0740-8188(03)00056-2.
2. Megan Oakleaf, "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills," Journal of Documentation 65, no. 4 (2009): 539-560. doi:10.1108/00220410910970249.
3. John Biggs and Catherine Tang, "Designing Intended Learning Outcomes," in Teaching for Quality Learning at University: What the Student Does (Maidenhead, UK: McGraw-Hill Education, 2007), 64.
4. "Information Literacy Competency Standards for Higher Education," ACRL, www.ala.org/acrl/standards/informationliteracycompetency (accessed December 6, 2012).
5. Ada Emmett and Judith Emde, "Assessing Information Literacy Skills Using the ACRL Standards as a Guide," Reference Services Review 35, no. 2 (2007): 210-229. doi:10.1108/00907320710749146.
6. Lisa G. O'Connor, Carolyn J. Radcliff, and Julie A. Gedeon, "Applying Systems Design and Item Response Theory to the Problem of Measuring Information Literacy Skills," College & Research Libraries 63, no. 6 (2002): 528-543.
7. Lynn Cameron, Steven L. Wise, and Susan M. Lottridge, "The Development and Validation of the Information Literacy Test," College & Research Libraries 68, no. 3 (2007): 229-237.
8. Irvin R. Katz, "Testing Information Literacy in Digital Environments: ETS's iSkills Assessment," Information Technology & Libraries 26, no. 3 (2007): 3-12.
9. WASSAIL, www.library.ualberta.ca/augustana/infolit/wassail/.