Beyond the Library: Using Multiple, Mixed Measures Simultaneously in a College-Wide Assessment of Information Literacy

Brandy Whitlock and Nassim Ebrahimi

Brandy Whitlock is Professor and Instruction Librarian at Anne Arundel Community College, e-mail: bmwhitlock@aacc.edu; Dr. Nassim Ebrahimi is currently Associate Vice President of Institutional Research, Effectiveness and Planning at Baltimore City Community College, formerly Director of Learning Outcomes Assessment at Anne Arundel Community College, e-mail: nebrahimi@bccc.edu. © 2016 Brandy Whitlock and Dr. Nassim Ebrahimi, Attribution-NonCommercial 3.0 (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC 3.0. doi:10.5860/crl.77.2.236

To get the best sense of how graduating students demonstrate information literacy skills and how the institution can improve student learning, the Assessment in Action (AiA) project at Anne Arundel Community College (AACC) deployed a combination of indirect measures and authentic assessment of student work, utilizing assessment tools flexible enough to be deployed across the college. The results of AACC's AiA project have provided college practitioners and stakeholders with evidence of the extent to which graduating students demonstrate crucial information literacy skills and with data that can inform decisions about how to foster more effective teaching and learning.

Introduction

The college-wide core competencies at Anne Arundel Community College (AACC) are described as "fundamental learning outcomes" that are "vital to success in work and in life" and are defined as "learning and life skills from the college experience beyond the specific content each class provides."1 A process at AACC for assessing these core learning outcomes began in the 2012–2013 academic year. As one of AACC's college-wide core competencies, information literacy was slated for assessment beginning in the 2013–2014 academic year. That same year, as a participant in the Association of College & Research Libraries' Assessment in Action (AiA) program, a team contributed to the assessment of students' information literacy by further investigating the mechanisms at the college meant to develop and assess this competency, from program curricula to the teaching strategies and research assignments deployed by faculty.

AACC's AiA project provides an example of "action research," outlined by Reason and Bradbury as a "participatory, democratic process" that is "concerned with developing practical knowing in the pursuit of worthwhile human purposes," a process that takes an "emergent, developmental form" and "seeks to bring together action and reflection, theory and practice, in participation with others, in the pursuit of practical solutions to issues of pressing concern."2 For action research, communities of practice produce communities of inquiry that value many forms of knowledge in the pursuit of practical outcomes.
Establishing timelines for the AiA project, developing assessment tools and processes, and identifying strategies for improvement have required consultation and collaboration among many college practitioners and other stakeholders, most notably AACC's Office of Learning Outcomes Assessment (LOA); the Committee on Teaching and Learning's LOA Subcommittee—a subcommittee mandated to include representation from across the college; AACC's Assessment Fellows, faculty members who work in their departments to encourage and share results of assessment efforts; and Andrew G. Truxal Library's administration and faculty. The AiA team included the Instruction Librarian at AACC, a faculty member who leads information literacy programming at the college; AACC's Director of Learning Outcomes Assessment, whose office collaborates with practitioners and other stakeholders to spearhead and document learning assessment at the college, including assessment of AACC's college-wide core competencies; and a faculty member who teaches and assesses information literacy skills in credit-bearing courses and who has served as one of AACC's Assessment Fellows. The Director of LOA serves as the co-chair of the LOA Subcommittee, and a librarian faculty member is required to serve on the subcommittee in an advisory role, a role the Instruction Librarian filled for the duration of the AiA project.

All degree-bearing programs at the college have been tasked with mapping in their curricula when AACC's college-wide core competencies are assessed, resulting in a robust curriculum map. Once learning outcomes and a curriculum for achieving them have been established, the goals of assessing student learning are to gauge the results of student participation in a curriculum and to identify ways to improve student learning. By reflecting on the assessment process, the college community can more closely align learning outcomes, learning activities, and learning assessments. After establishing a set of college-wide core competencies and creating a curriculum map to identify when AACC's degree-bearing programs should develop and assess these competencies, it was critical to assess the extent to which students who have moved through a program's curriculum could demonstrate these competencies, all the while working to refine learning outcomes and to identify ways to revise and support program curricula to improve student proficiencies.

In conjunction with the Office of LOA, the LOA Subcommittee helps create, revise, and formally approve the processes and tools used for assessing AACC's college-wide core competencies. AACC's LOA Plan outlines a systematic approach for assessing AACC's ten college-wide core competencies, wherein two core competencies are assessed over a two-year period, allowing one fall semester for planning, one spring semester for testing and preliminary data gathering, and then more data gathering during the next fall and spring semesters.3 Although each pair of core competencies is assessed over a two-year period, those periods overlap so that in most academic years, assessments of four core competencies transpire simultaneously. During the 2013–2014 academic year, for instance, communication and technology fluency were in the second year of their assessment, and information literacy and personal wellness were in the first year of their assessment.
Because each college-wide core competency is assessed during a two-year period, many practitioners and other stakeholders can be involved in assessment designs and deployments by contributing to the development of assessment questions, procedures, and tools. AACC's LOA Plan establishes a cyclical process, so each college-wide core competency is slated for assessment every five years or so, allowing for revisions to how AACC's core competencies are defined, how competencies are manifested through curricula, and how learning is assessed. The processes for describing, refining, and assessing college-wide core competencies at AACC continue to evolve.

Since every degree-bearing program at AACC has identified when each of AACC's college-wide core competencies is scheduled to be assessed in its required curriculum, studying graduating students provides the best information to ascertain the success of AACC's programs in leading students to achieve the college's core competencies. It can be very difficult to capture data from other student cohorts because students often do not follow a similar path through a program's curriculum to graduation. Community colleges, especially, draw many students who attend part-time, who transfer, who experience breaks in formal education, or who decide to change programs of study, so the only dependable college-wide learning assessments of AACC's core competencies can occur just before graduation. It's the only time that a substantial cohort of students in any particular program, much less across programs, can be expected to be able to demonstrate competency in all college-wide learning outcomes.

As one of AACC's college-wide core competencies, information literacy is described as "recognizing when information is needed and locating, evaluating and using information appropriately."4 Like all of AACC's college-wide core competencies, information literacy is understood to be integral to meaningful civic engagement and to academic, professional, and personal enrichment. Because AACC's vision is to educate people who "are among the best-prepared citizens and workers of the world," graduating students must be able to demonstrate that they can find, evaluate, and utilize information effectively.5 So the pressing concern is to determine if sufficient mechanisms are in place at the college to lead students to demonstrate appropriate, vital information literacy skills by the time they graduate.

Literature Review

In assessing information literacy skills as learning outcomes, much literature exists that examines why and how academic librarians should establish and improve the contributions that academic libraries make toward student learning.6 Most literature on assessing students' information literacy skills also focuses on evaluating students engaged in particular disciplines, courses, or assignments,7 prompting Hernon to ask: "Will libraries embrace broader assessment and carve out a role, or will they continue to focus on course-level assessment?"8 Far less literature exists that explicates methods for assessing simultaneously the sufficiency of multiple instructional mechanisms, both inside and outside the library's purview, in leading students to demonstrate desired information literacy competencies.
In college-wide summative assessments of information literacy, or summative assessments of information literacy across many disciplines, researchers have generally used indirect or direct methods, but not a combination of both.9 Dugan and Hernon, however, encourage the use of multiple, mixed measures, especially when their use can help tie learning outcomes assessments to institutional learning goals.10 They argue for academic librarians "to develop knowledge, measures, and data collection techniques that cut across" differing perspectives—perspectives like "the user in the life of the library, the user and the library in the life of the institution, and the library and institution in the life of the user"—in order to achieve "a more complete view" of how and to what extent students demonstrate the information literacy skills they've learned, with the resulting perspective resembling more "of a 'jig saw' puzzle entitled 'the library as a partner and contribution to achievement of the institutional mission.'"11 Likewise, a number of library advocates have started to promote using multiple, mixed measures and tying information literacy learning outcomes to institutional learning outcomes.12

Examples of methodologies of college-wide information literacy assessment especially relevant for AACC's AiA project included assessments of information literacy skills at the California State Universities (CSU) and Berkeley College. In the early 2000s, CSU began a multiyear, multiphase assessment of information literacy skills. Among those phases and subsequent efforts at CSU, the deployment of quantitative and qualitative measures is described, as well as the use of direct and indirect measures, involving many constituencies across multiple institutions in the assessment of information literacy skills.13 Charles outlined the process used at Berkeley College to create an information literacy curriculum map and mentioned a number of assessment tools and objects used and gathered throughout a student's information literacy curriculum, namely "rubrics (including discussion board rubrics), research journals, pre and post-tests, concept maps, website evaluation scorecards, audience response systems, peer-assessment, and bibliographies," which assisted librarians "in measuring IL competencies as skills or enduring traits, such as lifelong learning skills."14 Additionally, departments had identified courses within their majors from which information literacy artifacts could be culled, and "'assessment days' were scheduled on an annual basis …for representative faculty to work together in looking at the samples of student work and to allow the findings to illuminate changes that should be made to the curriculum and teaching strategies."15

Using a mix of direct and indirect measures to investigate multiple learning mechanisms makes it more difficult to pinpoint causations and correlations, but the aim of learning outcomes assessment is to gather evidence that will inform changes to academic goals and experiences, not to test hypotheses.16 Ratteray stresses that agreeing on what assessment data mean does not complete the cycle of assessment; the cycle closes only when improvements have been implemented.17 It's rigorous educational research that endeavors to establish causations and correlations, but assessment efforts are concerned primarily with measuring the achievement of learning outcomes and, in keeping with action research, with being able to use assessment information to improve learning, not with proving that a particular learning activity or curriculum was entirely responsible for yielding those results.18
In fact, causation and correlation are often difficult to prove in learning outcomes assessment, especially over the span of degree-bearing programs. After all, students don't learn only through engagement with formal academic curricula: "Many variables exist within a population of students that might affect their information literacy learning outcomes."19 Students learn, as well, by engaging in extracurricular school activities, work-related pursuits, and personal interests, and these kinds of learning experiences are often extraordinarily difficult to document, much less measure.

Methodology

Little more than a decade ago, Serban noted a lack of "comprehensive models that would guide community colleges in developing an assessment approach that coherently integrates all levels, from courses and programs to the overall institution," but such models are uncommon in higher education overall.20 AACC's AiA project is uniquely positioned to speak from a community college perspective about information literacy competency assessment in a way that may be relevant to any stakeholder in higher education because it has employed multiple and mixed assessment tools simultaneously: a robust curriculum map, which ties together institutional, programmatic, and course-level learning outcomes; a faculty survey, which has examined how faculty members have developed and assessed students' information literacy skills in the context of particular course curricula; results of authentic assessment of student artifacts, culled from their coursework, which librarians scored with an institutionally created rubric; and a review of the content of the assignments that produced those student artifacts.

In fall of 2013, the LOA Subcommittee and the AiA team considered a variety of assessment tools from a myriad of organizations and institutions to measure information literacy skills, including standardized assessments (specifically, Project SAILS and iSkills), as well as surveys and scoring rubrics. During the previous academic year, the LOA Subcommittee had devised and used a rubric for scoring student artifacts for communication skills and found the process informative and rewarding. The rubric brought forward by the LOA Subcommittee and ultimately used in scoring student artifacts for information literacy (see appendix A) was adapted from the University of Maryland University College's Graduate School of Management and Technology's Information Literacy Rubric for Outcomes Assessment and from St. John's University's Information Literacy Rubric. One row heading was used from St. John's University's rubric, "choice of sources," which AACC's rubric describes as follows: the student "chooses appropriate sources and content of information." Three row headings and respective cell descriptions were used from the University of Maryland University College's rubric: evaluation (student "critically assesses sources and content of information"); incorporation (student "uses information to accomplish a specific purpose"); and ethical use (student "understands and complies with institutional policies related to access and use of information, demonstrating an understanding of academic integrity").
Although librarians would not be able to score most student artifacts for evaluation of sources, evaluation is a critical component of information literacy and so was included among the scoring criteria for those artifacts that could provide evidence of evaluation, such as annotated bibliographies. Column headings were consistent with AACC's rubric for assessing communication skills and with Miami Dade College's Information Literacy rubric. AACC librarians provided additional review and suggested minor revisions for clarity and consistency.

To minimize disruption to the teaching and learning environment, while maximizing the ability to assess and use assessment information to improve learning, works from a random sample of students who had applied by the deadline to graduate with an associate's degree were collected for purposes of scoring. In spring of 2014 (SP 14), fall of 2014 (FA 14), and spring of 2015 (SP 15), the Registrar's office helped identify all students who had applied by the deadline to graduate with an associate's degree (SP 14: N = 1,026; FA 14: N = 606; SP 15: N = 1,030). A random sample was selected, equaling 40 percent of the total (SP 14: n = 410; FA 14: n = 242; SP 15: n = 411). The Director of LOA sent an e-mail to all identified students, notifying them of the upcoming assessments and providing directions for opting out. Each semester, a number of students chose not to participate and were removed from the assessment (SP 14: 7 students opted out, N = 1,019; FA 14: 9 students opted out, N = 597; SP 15: 11 students opted out, N = 1,019). For the students remaining in the random sample, the Director of LOA examined their schedules in consultation with the curriculum map to identify courses that were aligned with information literacy and that could provide an artifact of student work for scoring. Of the 410 students in the spring 2014 random sample, 50 had completed their coursework and were not enrolled in any courses in the spring semester, so 360 students (35 percent of those who had applied by the deadline to graduate with an associate's degree) remained in the study and constituted the random sample. Of the 242 students in the fall 2014 random sample, 31 were not enrolled in any courses that semester, so 211 students (35 percent) constituted the random sample. Of the 411 students in the spring 2015 random sample, 48 were not enrolled in any courses, so 363 (36 percent) constituted the random sample. In the event that the identified course had been canceled, the next course in the student record was considered. This methodology was repeated until a target course was identified for each randomly selected student.

The Director of LOA sent an e-mail to department chairs to share with them the process and to identify the instructors of students who were randomly selected to participate. The Director of LOA then contacted instructors of the target courses, along with their respective department chairs or directors, providing them with the information literacy scoring rubric and asking them to send one relevant sample of a target student's coursework to the Office of LOA by the end of the semester. An e-mail reminder was sent to all remaining instructors a few weeks before the deadline. In fall 2014 and spring 2015, department chairs or directors were also copied on the reminder e-mail.
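The semester-by-semester sampling routine described above follows a simple, repeatable recipe. The sketch below is only an illustration of that logic, with hypothetical field names and data shapes (it is not the Office of LOA's actual script): it draws a 40 percent random sample of graduation applicants, removes opt-outs, and walks each remaining student's schedule against the curriculum map to find a target course.

```python
import random

def draw_information_literacy_sample(applicants, opted_out, curriculum_map,
                                     enrollments, canceled=frozenset(),
                                     sample_rate=0.40, seed=None):
    """Illustrative sketch of the semester sampling procedure (hypothetical data shapes).

    applicants:     list of student IDs who applied by the deadline to graduate
    opted_out:      set of student IDs who asked to be excluded
    curriculum_map: dict mapping course number -> set of mapped core competencies
    enrollments:    dict mapping student ID -> list of course numbers, in registration order
    canceled:       set of course numbers whose sections were canceled this semester
    """
    rng = random.Random(seed)

    # Draw a 40 percent random sample of graduation applicants (e.g., SP 14: 1,026 -> 410).
    sample = rng.sample(applicants, k=round(len(applicants) * sample_rate))

    # Remove students who opted out after the notification e-mail.
    sample = [s for s in sample if s not in opted_out]

    targets = {}
    for student in sample:
        courses = enrollments.get(student, [])
        if not courses:
            continue  # completed coursework / not enrolled this semester: drops out of the sample
        for course in courses:
            if course in canceled:
                continue  # canceled section: consider the next course in the student's record
            # Keep the first active course that the curriculum map aligns with information literacy.
            if "information literacy" in curriculum_map.get(course, set()):
                targets[student] = course
                break

    return targets  # student ID -> course from which to request one artifact
```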
Once the Office of LOA received a student artifact, all student, course, and instructor identifiers were removed; each work was assigned a random identification number; and a rubric was attached to each artifact in preparation for scoring. Electronic copies of student work were stored in a password-protected file. If an instructor indicated that the selected student was not currently enrolled in his or her course, the Director of LOA contacted the instructor of the next active course in the student's record, along with the respective department chair or director.

In spring of 2014, 20 faculty members submitted 78 student artifacts (7.5 percent of the total) for scoring. The initial goal of 15 percent of the sample was not met, so the Director of LOA re-examined student works submitted for assessment of communication skills in spring and fall of 2013, selecting artifacts for inclusion in the information literacy assessment if they could be scored for information literacy skills (SP 2013: n = 30; FA 2013: n = 21). In fall of 2014, 64 faculty members submitted 76 student artifacts (13 percent of the total). In spring of 2015, 105 faculty members submitted 117 samples of student work (10 percent of the total). Each semester, though the target sample size was not met, all schools at the college were represented among the student artifacts submitted.

Scoring sessions took place on June 2, 9, and 13, 2014; January 9 and February 13, 2015; and June 11 and 30 and July 24, 2015. Each semester, four library faculty volunteer evaluators were trained to utilize the assessment tools and were then asked to score the assignments and student works. Evaluators scored independently of one another, and, to help increase reliability, two separate evaluators scored one-third of the student artifacts and the assignments.

The assessment of student artifacts was designed to ascertain whether graduating students were demonstrating appropriate information literacy skills in their coursework, while an assignment checklist (see appendix B) and faculty survey (see appendix C) did more to assess the sufficiency of different instructional mechanisms at AACC to facilitate the development of students' information literacy skills, mechanisms like the college's curriculum map, the library's instructional efforts, and course-embedded research assignments. The checklist used to score assignments was adapted from materials originally developed for a study of AACC students' use of the library's online resources. Faculty submitting student works for information literacy assessment were asked to reflect on information literacy in the selected course and to respond to the survey. The faculty survey was deployed to better understand how information literacy skills are taught and assessed, and how the library's resources and services are used to support and deploy curricula.
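The anonymization and partial double-scoring described above lend themselves to a small routing step before each scoring session. The following sketch is a hypothetical illustration, not the project's actual workflow: artifacts receive random identification numbers with identifiers stripped, and roughly one-third are routed to a second evaluator so that agreement can later be checked.

```python
import random
from itertools import cycle

def prepare_scoring_packets(artifacts, evaluators, double_score_fraction=1/3, seed=None):
    """Sketch of the blind-scoring setup (hypothetical data shapes).

    artifacts:  list of dicts with keys like "student", "course", "instructor", "file"
    evaluators: list of evaluator names (four library faculty each semester)
    Returns (anonymized_artifacts, assignments), where assignments maps
    random ID -> list of evaluators asked to score that artifact.
    """
    rng = random.Random(seed)

    # Strip student, course, and instructor identifiers and assign random identification numbers.
    ids = rng.sample(range(10_000, 99_999), k=len(artifacts))
    anonymized = [{"id": rid, "file": art["file"]} for rid, art in zip(ids, artifacts)]

    # Roughly one-third of artifacts are scored by two evaluators; the rest by one.
    shuffled = anonymized[:]
    rng.shuffle(shuffled)
    n_double = round(len(shuffled) * double_score_fraction)

    rotation = cycle(evaluators)
    assignments = {}
    for i, art in enumerate(shuffled):
        raters = [next(rotation)]
        if i < n_double:
            raters.append(next(rotation))  # a different evaluator provides the second, independent score
        assignments[art["id"]] = raters
    return anonymized, assignments
```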
It is difficult to assess a process fully, like teaching and learning information literacy skills, by studying only the products of that process, like research papers, portfolios, presentations, and so on. Studying products can rarely address all aspects of the processes that generated them. For example, a student's research paper (a product) doesn't usually contain much evidence about how the student conducted research (a process). The decision to use the assignment checklist, along with the deployment of the faculty survey, provided additional insight into aspects of AACC's instructional mechanisms that could not be gleaned from a review of student artifacts. These indirect measures helped to document how students were being asked to develop and demonstrate information literacy skills.

Results

Student Artifact Scoring Results

Samples of student work were scored on a 4-point Likert-type scale (Emerging [1] to Exemplary [4]) in three categories: choice of sources, incorporation, and ethical use. Raters also had the option to score student works as "does not meet emerging" (0). Since instructors were not specifically asked to submit works for information literacy scoring in spring and fall 2013, results for spring 2013 and fall 2013 were combined. Student works, across four instructional units, were submitted from 100-level courses (143 samples [45 percent]) and 200-level courses (172 samples [55 percent]). See table 1.

TABLE 1
Score Distributions—Choice of Sources, Incorporation and Ethical Use
(columns: All, n = 322 | SP & FA13, n = 51 | SP14, n = 78 | FA14, n = 76 | SP15, n = 117)

Choice of Sources
Exemplary (4): 20 (6%) | 1 (2%) | 8 (10%) | 2 (3%) | 9 (8%)
(3.5): 9 (3%) | 1 (2%) | 2 (3%) | 2 (3%) | 4 (3%)
Proficient (3): 129 (40%) | 16 (31%) | 35 (45%) | 24 (32%) | 54 (46%)
(2.5): 30 (9%) | 3 (6%) | 7 (9%) | 11 (14%) | 8 (7%)
Developing (2): 78 (24%) | 14 (27%) | 19 (24%) | 17 (22%) | 27 (23%)
(1.5): 9 (3%) | 2 (4%) | 2 (3%) | 4 (5%) | 2 (2%)
Emerging (1): 22 (7%) | 12 (24%) | 3 (4%) | 3 (4%) | 5 (4%)
(0.5): 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%)
Does Not Meet Emerging: 5 (1%) | 0 (0%) | 1 (1%) | 1 (1%) | 3 (3%)
Missing (Not Scored): 20 (7%) | 2 (4%) | 1 (1%) | 12 (16%) | 5 (4%)

Incorporation
Exemplary (4): 20 (6%) | 2 (4%) | 8 (10%) | 2 (3%) | 8 (7%)
(3.5): 8 (2%) | 1 (2%) | 2 (3%) | 2 (3%) | 3 (3%)
Proficient (3): 98 (30%) | 15 (29%) | 34 (44%) | 17 (22%) | 32 (27%)
(2.5): 17 (5%) | 5 (10%) | 2 (3%) | 6 (8%) | 4 (3%)
Developing (2): 93 (29%) | 11 (22%) | 21 (27%) | 21 (28%) | 40 (34%)
(1.5): 15 (5%) | 3 (6%) | 1 (1%) | 2 (3%) | 9 (8%)
Emerging (1): 46 (14%) | 10 (20%) | 9 (12%) | 12 (16%) | 16 (14%)
(0.5): 3 (1%) | 0 (0%) | 0 (0%) | 2 (3%) | 1 (1%)
Does Not Meet Emerging: 7 (2%) | 2 (4%) | 0 (0%) | 1 (1%) | 4 (3%)
Missing (Not Scored): 15 (5%) | 2 (4%) | 1 (1%) | 11 (14%) | 0 (0%)

Ethical Use
Exemplary (4): 14 (4%) | 1 (2%) | 6 (8%) | 1 (1%) | 6 (5%)
(3.5): 8 (2%) | 1 (2%) | 2 (3%) | 2 (3%) | 3 (3%)
Proficient (3): 85 (26%) | 9 (18%) | 19 (24%) | 22 (29%) | 35 (30%)
(2.5): 21 (7%) | 4 (8%) | 3 (4%) | 10 (13%) | 4 (3%)
Developing (2): 100 (31%) | 17 (33%) | 31 (40%) | 13 (17%) | 39 (33%)
(1.5): 16 (5%) | 3 (6%) | 4 (5%) | 2 (3%) | 7 (6%)
Emerging (1): 54 (17%) | 9 (18%) | 12 (15%) | 14 (18%) | 20 (17%)
(0.5): 2 (1%) | 2 (4%) | 0 (0%) | 0 (0%) | 0 (0%)
Does Not Meet Emerging: 8 (2%) | 3 (6%) | 0 (0%) | 2 (3%) | 3 (3%)
Missing (Not Scored): 14 (4%) | 2 (4%) | 1 (1%) | 10 (13%) | 0 (0%)

Percent agreement for student works that were double-scored reached acceptable levels when scale categories were combined. See tables 2 and 3. Therefore, average scores across double-raters were used for analyses. All average scores across the three categories were below Proficient (3.0), with students scoring highest in choice of sources. See figure 1. For all semesters assessed, 100-level course score distributions were consistent across incorporation and ethical use. See table 4. Score distributions were higher for choice of sources, with more than 50 percent of students scoring 2.5 or higher for all semesters assessed.
Score distributions from 200-level courses varied across all three categories. See table 5.

TABLE 2
Inter-rater Reliability, All Samples (exact agreement: 4–4, 3–3, 2–2, 1–1)
(columns: SP13, FA13 & SP14 percent agreement, n = 76 | FA14 percent agreement, n = 76 | SP15 percent agreement, n = 117)

Choice of Sources: 43.4% | 29.6% | 51.35%
Incorporation: 53.8% | 18.5% | 29.73%
Ethical Use: 41.0% | 40.7% | 32.43%
*Choice of Sources n = 112

TABLE 3
Inter-rater Reliability, All Samples, Combined Categories (4/3–4/3 & 2/1/0–2/1/0)
(columns: SP13, FA13 & SP14 percent agreement, n = 76 | FA14 percent agreement, n = 76 | SP15 percent agreement, n = 117)

Choice of Sources: 64.1% | 51.9% | 64.86%
Incorporation: 74.3% | 44.4% | 70.27%
Ethical Use: 69.2% | 55.6% | 56.76%
*Choice of Sources n = 112

FIGURE 1
Average Scores—Full Sample
[Bar chart, 0–4 scale, comparing average scores in Choice of Sources, Incorporation, and Ethical Use for All (n = 322), SP & FA13 (n = 51), SP14 (n = 78), FA14 (n = 76), and SP15 (n = 117). Data labels as extracted: 2.53, 2.28, 2.18, 2.16, 2.15, 1.90, 2.67, 2.58, 2.28, 2.53, 2.19, 2.25, 2.63, 2.21, 2.21.]

TABLE 4
Score Distributions — 100-Level Courses
(columns: SP13, FA13 & SP14, n = 48 | FA14, n = 35 | SP15, n = 61)

Choice of Sources
Exemplary (4): 1 (2%) | 1 (3%) | 3 (5%)
(3.5): 2 (4%) | 0 (0%) | 3 (5%)
Proficient (3): 19 (38%) | 10 (29%) | 26 (43%)
(2.5): 3 (6%) | 1 (3%) | 4 (4%)
Developing (2): 13 (26%) | 12 (34%) | 15 (25%)
(1.5): 3 (6%) | 2 (6%) | 1 (2%)
Emerging (1): 7 (14%) | 1 (3%) | 4 (4%)
(0.5): 0 (0%) | 0 (0%) | 0 (0%)
Does Not Meet Emerging: 0 (0%) | 0 (0%) | 2 (3%)
Missing (Not Scored): 2 (4%) | 8 (23%) | 3 (5%)

Incorporation
Exemplary (4): 5 (10%) | 1 (3%) | 4 (7%)
(3.5): 3 (6%) | 1 (3%) | 1 (2%)
Proficient (3): 11 (22%) | 9 (26%) | 16 (26%)
(2.5): 3 (6%) | 1 (3%) | 4 (7%)
Developing (2): 12 (24%) | 9 (26%) | 23 (38%)
(1.5): 2 (4%) | 0 (0%) | 3 (5%)
Emerging (1): 11 (22%) | 6 (17%) | 6 (10%)
(0.5): 0 (0%) | 1 (3%) | 1 (2%)
Does Not Meet Emerging: 1 (2%) | 0 (0%) | 3 (5%)
Missing (Not Scored): 2 (4%) | 7 (7%) | 0 (0%)

Ethical Use
Exemplary (4): 3 (6%) | 0 (0%) | 3 (5%)
(3.5): 3 (6%) | 1 (3%) | 2 (3%)
Proficient (3): 12 (24%) | 10 (29%) | 17 (28%)
(2.5): 2 (4%) | 0 (0%) | 3 (5%)
Developing (2): 12 (24%) | 8 (23%) | 19 (31%)
(1.5): 5 (10%) | 1 (1%) | 4 (7%)
Emerging (1): 8 (16%) | 8 (23%) | 11 (18%)
(0.5): 1 (2%) | 0 (0%) | 0 (0%)
Does Not Meet Emerging: 2 (4%) | 1 (3%) | 2 (3%)
Missing (Not Scored): 2 (4%) | 6 (17%) | 0 (0%)

FIGURE 2
Percentage of Students Earning Exemplary/3.5/Proficient/2.5 by Course Level
[Bar chart: 100-level (n = 143): Choice of Sources 52%, Incorporation 41%, Ethical Use 39%; 200-level (n = 172): Choice of Sources 66%, Incorporation 49%, Ethical Use 42%.]

TABLE 5
Score Distributions — 200-Level Courses
(columns: SP13, FA13 & SP14, n = 78 | FA14, n = 41 | SP15, n = 56)

Choice of Sources
Exemplary (4): 8 (10%) | 1 (2%) | 6 (11%)
(3.5): 1 (1%) | 2 (5%) | 1 (2%)
Proficient (3): 32 (41%) | 14 (34%) | 28 (50%)
(2.5): 7 (9%) | 10 (24%) | 4 (7%)
Developing (2): 20 (25%) | 5 (12%) | 12 (21%)
(1.5): 1 (1%) | 2 (5%) | 1 (2%)
Emerging (1): 8 (10%) | 2 (5%) | 1 (2%)
(0.5): 0 (0%) | 0 (0%) | 0 (0%)
Does Not Meet Emerging: 1 (1%) | 1 (2%) | 1 (2%)
Missing (Not Scored): 1 (1%) | 4 (10%) | 2 (4%)

Incorporation
Exemplary (4): 5 (6%) | 1 (2%) | 4 (7%)
(3.5): 0 (0%) | 1 (2%) | 2 (4%)
Proficient (3): 38 (48%) | 8 (20%) | 16 (29%)
(2.5): 4 (5%) | 5 (12%) | 0 (0%)
Developing (2): 20 (25%) | 12 (29%) | 17 (30%)
(1.5): 2 (3%) | 2 (5%) | 6 (11%)
Emerging (1): 8 (10%) | 6 (15%) | 10 (18%)
(0.5): 0 (0%) | 1 (2%) | 0 (0%)
Does Not Meet Emerging: 1 (1%) | 1 (2%) | 1 (2%)
Missing (Not Scored): 1 (1%) | 4 (10%) | 0 (0%)

Ethical Use
Exemplary (4): 4 (5%) | 1 (2%) | 3 (5%)
(3.5): 0 (0%) | 1 (2%) | 1 (2%)
Proficient (3): 16 (20%) | 12 (29%) | 18 (32%)
(2.5): 5 (6%) | 10 (12%) | 1 (2%)
Developing (2): 36 (46%) | 5 (12%) | 20 (36%)
(1.5): 2 (3%) | 1 (2%) | 3 (5%)
Emerging (1): 13 (16%) | 6 | 9 (16%)
(0.5): 1 (1%) | 0 (0%) | 0 (0%)
Does Not Meet Emerging: 1 (1%) | 1 (2%) | 1 (2%)
Missing (Not Scored): 1 (1%) | 4 (10%) | 0 (0%)
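The agreement figures in tables 2, 3, and 7 amount to simple proportions over the double-scored works: exact agreement counts only identical ratings, while the combined calculation collapses the rubric scale into a high band (4/3) and a low band (2/1/0) before comparing. A minimal sketch of both calculations, along with the double-rater averaging used for the analyses, is shown below (illustrative only; the data layout is assumed, not taken from the project):

```python
def percent_agreement(pairs):
    """Exact agreement: the share of double-scored artifacts where both raters gave the same score."""
    matches = sum(1 for a, b in pairs if a == b)
    return 100 * matches / len(pairs)

def percent_agreement_combined(pairs):
    """Combined-category agreement: scores of 3 or 4 count as one band, 0-2 as the other."""
    band = lambda score: "high" if score >= 3 else "low"
    matches = sum(1 for a, b in pairs if band(a) == band(b))
    return 100 * matches / len(pairs)

def analysis_score(scores):
    """Average across raters for a double-scored artifact; averages were used for the analyses."""
    return sum(scores) / len(scores)

# Example: three double-scored artifacts rated on the 0-4 rubric scale.
pairs = [(3, 3), (4, 3), (2, 3)]
print(percent_agreement(pairs))           # 33.3...  (only the first pair matches exactly)
print(percent_agreement_combined(pairs))  # 66.6...  (the first two pairs agree within the combined bands)
print(analysis_score((4, 3)))             # 3.5
```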
More than 60 percent of students scored 2.5 or higher in choice of sources, and fewer than 50 percent of students scored 2.5 or higher in incorporation and ethical use. Scores from both 100-level and 200-level courses were lower for ethical use (see figure 2). Course-level differences were found in distributions in all three categories, with a greater percentage of students with artifacts from 200-level courses scoring 2.5 or higher in every category. Average scores for students with artifacts submitted from 200-level courses were consistently higher in choice of sources, incorporation, and ethical use, but the differences were not statistically significant. See figure 3.

FIGURE 3
Average Scores by Course Level
[Bar chart, 0–4 scale: 100-level (n = 143): Choice of Sources 2.44, Incorporation 2.24, Ethical Use 2.15; 200-level (n = 172): Choice of Sources 2.61, Incorporation 2.32, Ethical Use 2.20.]

Assignment Checklist Scoring Results

Assignments that required or allowed students to demonstrate information literacy skills were evaluated on the inclusion of instructions from a total of eight categories: specifying number of sources students were expected to find, use, and cite; specifying the types of sources that were acceptable; requiring that students know how to evaluate sources for bias and credibility; requiring source variety to ensure students know how to get information from different types of sources; requiring intermediary steps in the research process; distinguishing between direct and indirect quotations; specifying a documentation style; and specifying requirements about the currency of information used. See table 6.

TABLE 6
Percent Included in Assignment, All Samples
(columns: Full Sample, n = 322 | SP13 submitted assignments, n = 30 | FA13, n = 21 | SP14, n = 78 | FA14, n = 76 | SP15, n = 117)

Number of Sources: 44.2% | 43.3% | 52.4% | 42.3% | 20.7% | 80.6%
Type of Sources: 42.6% | 43.3% | 33.3% | 44.9% | 58.6% | 40.9%
Evaluate Sources: 7% | 6.7% | 9.5% | 6.4% | 24.1% | 9.6%
Source Variety: 11.6% | 16.7% | 9.5% | 10.3% | 10.3% | 11.8%
Intermediary Steps: 34.9% | 30% | 9.5% | 43.6% | 17.2% | 14.0%
Direct and Indirect: 2.3% | 0% | 0% | 3.8% | 17.2% | 8.6%
Documentation Style: 46.5% | 43.3% | 42.9% | 48.7% | 13.8% | 48.4%
Currency of Info: 2.3% | 0% | 9.5% | 1.3% | 31.0% | 5.4%

To help increase reliability, two separate evaluators scored one-third of the assignments. See table 7.

Faculty Survey Data

In the spring and fall semesters of 2014 and in the spring of 2015, a survey was administered to a total of 57 instructors teaching 28 different courses in order to ascertain the instructors' familiarity with information literacy, as well as their teaching methods, expectations of students, and use of library resources and services. For all three semesters assessed, the majority of respondents taught traditional face-to-face classes (53 percent), with a course duration of 15 weeks (68 percent). Respondents were primarily full-time faculty (60 percent). Full-time faculty rank varied, with the highest number of responses being provided by tenured (25 percent) and assistant professors (23 percent). Years in service to AACC also varied, with most faculty members working at AACC for one to four years (25 percent).
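Tables 8 through 16 summarize the survey in two ways: frequency counts with percentages of each semester's respondents for the categorical items, and means on a four-point scale for the attitude and practice items. The sketch below illustrates both summaries under the assumption, not stated explicitly in the article, that responses map to 4 (always/strongly agree) through 1 (never/strongly disagree) and that "N/A" and missing responses are excluded from the means; the field names and example data are hypothetical.

```python
from collections import Counter

LIKERT = {"Always": 4, "Often": 3, "Rarely": 2, "Never": 1,
          "Strongly Agree": 4, "Agree": 3, "Disagree": 2, "Strongly Disagree": 1}

def frequency_table(responses):
    """Counts and percentages for one categorical item (e.g., faculty rank), as in tables 8-12."""
    counts = Counter(responses)
    n = len(responses)
    return {option: (count, round(100 * count / n)) for option, count in counts.items()}

def likert_mean(responses):
    """Mean on the 1-4 scale for one item, leaving out N/A and blank responses (tables 14 and 16)."""
    values = [LIKERT[r] for r in responses if r in LIKERT]
    return round(sum(values) / len(values), 2) if values else None

# Example: one survey item answered by five instructors.
item = ["Always", "Often", "Often", "N/A", "Rarely"]
print(frequency_table(item))  # {'Always': (1, 20), 'Often': (2, 40), 'N/A': (1, 20), 'Rarely': (1, 20)}
print(likert_mean(item))      # 3.0 -> (4 + 3 + 3 + 2) / 4
```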
Discussion The data collected in spring of 2013, fall of 2014, and spring of 2015 from scoring as- signments and student works reveals, as a whole, a large proportion of graduating students did not demonstrate appropriate information literacy skills, specifically in TABLE 7 Inter-rater Reliability, All Samples (n = 322) Percent Agreement (0-0 and 1-1) SP13, FA13, & SP14 FA14 SP15 Number of Sources 88.9% 82.8% 86.8% Type of Sources 72.2% 41.4% 73.7% Evaluate Sources 83.3% 75.9% 89.5% Source Variety 86.1% 89.7% 92.1% Intermediary Steps 86.1% 82.8% 97.4% Direct and Indirect 91.7% 82.8% 89.5% Documentation Style 88.9% 86.2% 92.1% Currency of Info 97.1% 65.5% 100.0% FIGURE 4 Percentage of Students Earning Exemplary/3.5/Proficient/2.5—All Samples 58% 44% 40%41% 45% 29% 67% 59% 38% 53% 36% 46% 64% 40% 41% 0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100% Choice of Sources Incorporation Ethical Use All (n=322) SP & FA13 (n=51) SP14 (n=78) FA14 (n=76) SP15 (n=117) 248 College & Research Libraries March 2016 TABLE 9 Course Delivery Mode (n = 57) SP14 (n = 22) FA14 (n = 23) SP15 (n = 12) Total Traditional 12 (54%) 11 (48%) 7 (58%) 30 (53%) Online 7 (32%) 11 (48%) 5 (42%) 23 (40%) Hybrid 3 (14%) 1 (4%) 0 (0%) 4 (7%) TABLE 10 Course Duration (n = 57) SP14 (n = 22) FA14 (n = 23) SP15 (n = 12) Total 15 weeks 14 (64%) 16 (70%) 9 (75%) 39 (68%) 13 weeks 0 (0%) 1 (4%) 1 (8%) 2 (4%) 8 weeks 7 (32%) 5 (22%) 1 (8%) 13 (23%) Other 1 (4%) 1 (4%) 1 (8%) 3 (5%) TABLE 11 Faculty Rank (n = 57) SP14 (n = 26) FA14 (n = 19) SP15 (n = 12) Total Tenured 5 (23%) 5 (22%) 4 (33%) 14 (25%) Non-Tenured 6 (27%) 3 (13%) 0 (0%) 9 (16%) Professor 1 (4%) 2 (9%) 2 (16%) 5 (9%) Associate Professor 5 (23%) 2 (9%) 1 (8%) 8 (14%) Assistant Professor 7 (32%) 4 (17%) 2 (16%) 13 (23%) Instructor 2 (9%) 2 (9%) 1 (8%) 5 (9%) Did Not Respond 0 (0%) 1 (4%) 2 (16%) 3 (5%) TABLE 8 Full-Time/Part-Time Status (n = 57) SP14 (n = 22) FA14 (n = 23) SP15 (n = 12) Total Full-time 15 (68%) 11 (48%) 8 (66%) 34 (60%) Adjunct 4 (18%) 6 (26%) 2 (16%) 12 (21%) Did Not Respond 3 (14%) 6 (26%) 2 (16%) 11 (19%) TABLE 12 Faculty Years in Service (n = 57) SP14 (n = 22) FA14 (n = 23) SP15 (n = 12) Total Less than 1 Year 2 (9%) 1 (4%) 0 (0%) 3 (5%) 1–4 Years 9 (41%) 2 (9%) 3 (25%) 14 (25%) 5–9 Years 1 (4%) 6 (26%) 2 (16%) 9 (16%) 10–14 Years 5 (23%) 4 (17%) 2 (16%) 11 (19%) 15–19 Years 2 (9%) 1 (4%) 3 (25%) 6 (11%) Over 20 Years 1 (4%) 3 (13%) 0 (0%) 4 (7%) Did Not Respond 2 (9%) 6 (26%) 2 (16%) 10 (18%) TABLE 13 Instructor Information Literacy Level Strongly Agree Agree Disagree Strongly Disagree Not Applicable (N/A) Did not Respond SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) I am confident that I know the definition of information literacy. 12 (55%) 8 (35%) 4 (33%) 9 (41%) 11 (48%) 7 (58%) 1 (5%) 4 (17%) 0 (0%) 0 (0%) 0 (0%) 1 (8%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 20 (44%) 20 (44%) 4 (9%) 1 (2%) 0 (0%) 0 (0%) Information literacy should be one of AACC’s core competencies. 16 (73%) 10 (43%) 5 (42%) 6 (27%) 13 (57%) 7 (58%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 26 (58%) 19 (42%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) The course entered at the beginning of the survey should be mapped to information literacy in AACC’s curriculum map. 
12 (55%) 9 (39%) 3 (25%) 10 (46%) 3 (13%) 6 (50%) 0 (0%) 1 (4%) 1 (8%) 0 (0%) 0 (0%) 2 (16%) 0 (0%) 3 (13%) 0 (0%) 0 (0%) 2 (9%) 0 (0%) 21 (47%) 18 (40%) 1 (2%) 0 (0%) 3 (7%) 2 (4%) I am confident that I can assess students’ information literacy skills. 11 (50%) 8 (35%) 2 (16%) 11 (50%) 10 (43%) 8 (66%) 0 (0%) 4 (17%) 2 (16%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 1 (4%) 0 (0%) 19 (42%) 21 (47%) 4 (9%) 0 (0%) 0 (0%) 1 (2%) I am confident that I can assess whether an information source that a student uses in a paper of presentation is appropriate. 11 (50%) 10 (43%) 5 (42%) 11 (50%) 9 (39%) 7 (58%) 0 (0%) 2 (9%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 1 (4%) 0 (0%) 0 (0%) 1 (4%) 0 (0%) 21 (47%) 20 (44%) 2 (4%) 0 (0%) 1 (2%) 1 (2%) TABLE 14 Instructor Average Information Literacy Responses (n = 57) Minimum Maximum Mean SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) I am confident that I know the definition of information literacy. 1 2 1 4 4 4 3.45 3.17 3.17 1 4 3.31 Information literacy should be one of AACC’s core competencies. 3 3 3 4 4 4 3.73 3.43 3.42 3 4 3.58 The course entered at the beginning of the survey should be mapped to information literacy in AACC’s curriculum map. 3 2 1 4 4 4 3.55 3.44 2.83 1 4 3.50 I am confident that I can assess students’ information literacy skills. 3 2 2 4 4 4 3.50 3.18 3.00 2 4 3.34 I am confident that I can assess whether an information source that a student uses in a paper of presentation is appropriate. 3 2 3 4 4 4 3.50 3.38 3.42 2 4 3.44 When students complete my course successfully, I am confident that they have demonstrated relevant information literacy skills. 1 2 1 4 4 4 3.27 3.09 2.73 1 4 3.18 TABLE 15 Instructor Opinion for Students (n = 57) Always Often Rarely Never Not Applicable (N/A) Did not Respond SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) In this course, students are expected to do research independently and incorporate information, apart from what is provided in class, into graded course assignments. 12 (57%) 5 (22%) 4 (33%) 8 (38%) 11 (48%) 5 (42%) 1 (5%) 3 (13%) 1 (8%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 17 (38%) 19 (42%) 4 (9%) 0 (0%) 0 (0%) 5 (11%) When creating or revising a research assignment for students, I consult or collaborate with a librarian. 2 (10%) 1 (4%) 0 (0%) 4 (19%) 2 (9%) 2 (17%) 5 (24%) 8 (35%) 5 (42%) 10 (48%) 6 (26%) 3 (25%) 0 (0%) 2 (9%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 3 (7%) 6 (13%) 13 (29%) 16 (35%) 2 (4%) 5 (11%) To prepare students for their research assignments, I teach students how to find appropriate resources. 8 (38%) 5 (22%) 2 (17%) 8 (38%) 7 (30%) 7 (58%) 3 (14%) 3 (13%) 1 (8%) 2 (10%) 3 (13%) 0 (0%) 0 (0%) 1 (4%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 13 (29%) 15 (33%) 6 (13%) 5 (11%) 1 (2%) 5 (11%) To prepare students for their research assignments, I ask a librarian to teach a face-to- face session. 2 (10%) 2 (9%) 0 (0%) 2 (10%) 1 (4%) 2 (17%) 2 (10%) 3 (13%) 4 (33%) 12 (57%) 9 (39%) 4 (33%) 3 (14%) 4 (17%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 4 (9%) 3 (7%) 5 (11%) 21 (47%) 7 (15%) 5 (11%) I ask that a librarian be embedded in my course online. 
5 (24%) 2 (9%) 3 (25%) 1 (5%) 2 (9%) 0 (0%) 2 (10%) 1 (4%) 2 (17%) 9 (43%) 11 (48%) 4 (33%) 4 (19%) 3 (13%) 1 (8%) 1 (5%) 4 (17%) 2 (17%) 7 (15%) 3 (7%) 3 (7%) 20 (44%) 7 (15%) 5 (11%) I encourage students to visit the library’s reference desk or to make an appointment with particular librarians to address questions about research. 5 (24%) 5 (22%) 3 (25%) 6 (29%) 3 (13%) 2 (17%) 6 (29%) 3 (13%) 2 (17%) 4 (19%) 7 (30%) 3 (25%) 0 (0%) 1 (4%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 10 (22%) 9 (20%) 9 (20%) 11 (24%) 1 (2%) 5 (11%) I point students to open-access online instructional tools (webpages, videos, etc.) that teach information literacy skills and/or concepts. 7 (33%) 3 (13%) 3 (25%) 9 (43%) 6 (26%) 2 (17%) 3 (14%) 7 (30%) 3 (25%) 2 (10%) 2 (9%) 2 (17%) 0 (0%) 1 (4%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 10 (22%) 15 (33%) 10 (22%) 4 (9%) 1 (2%) 5 (11%) I require or encourage students to use resources provided or vetted by the library in order to complete research assignments successfully. 7 (33%) 4 (17%) 2 (17%) 11 (52%) 5 (22%) 5 (42%) 1 (5%) 4 (17%) 1 (8%) 2 (10%) 5 (22%) 2 (17%) 0 (0%) 1 (4%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 11 (24%) 16 (35%) 5 (11%) 7 (15%) 1 (2%) 5 (11%) TABLE 15 (CONTINUED) Instructor Opinion for Students (n = 57) Always Often Rarely Never Not Applicable (N/A) Did not Respond SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) I require students to use particular kinds of resources (e.g., books, journal articles, newspaper articles, multimedia resources, etc.) to complete research assignments successfully. 11 (52%) 10 (43%) 4 (33%) 6 (29%) 6 (26%) 3 (25%) 4 (19%) 3 (13%) 1 (8%) 0 (0%) 0 (0%) 2 (17%) 0 (0%) 0 (0%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 21 (47%) 12 (27%) 7 (15%) 0 (0%) 0 (0%) 5 (11%) I stage research assignments so that student work can be assessed at multiple points in the research process. 4 (19%) 2 (9%) 2 (17%) 11 (52%) 8 (35%) 5 (42%) 4 (19%) 2 (9%) 1 (8%) 2 (10%) 7 (30%) 1 (8%) 0 (0%) 0 (0%) 1 (8%) 1 (5%) 4 (17%) 2 (17%) 6 (13%) 19 (42%) 6 (13%) 9 (20%) 0 (0%) 5 (11%) I assess the research my students gather before they incorporate that research into their other coursework (papers, presentations, debates, portfolios, etc.). 5 (24%) 3 (13%) 2 (17%) 7 (33%) 4 (17%) 0 (0%) 5 (24%) 5 (22%) 5 (42%) 4 (19%) 6 (26%) 2 (17%) 0 (0%) 1 (4%) 1 (8%) 1 (5%) 4 (17%) 2 (17%) 8 (18%) 11 (24%) 10 (22%) 10 (22%) 1 (2%) 5 (11%) I require that students show me how they have evaluated the research that they want to cite. 2 (10%) 0 (0%) 1 (8%) 3 (14%) 8 (35%) 3 (25%) 11 (52%) 4 (17%) 4 (33%) 5 (24%) 6 (26%) 1 (8%) 0 (0%) 1 (4%) 1 (8%) 1 (5%) 4 (17%) 2 (17%) 2 (4%) 11 (24%) 15 (33%) 11 (24%) 1 (2%) 5 (11%) I evaluate students on their ability to create accurate citations that are formatted in a particular style (e.g., MLA, APA, Chicago Style, etc.). 10 (48%) 9 (39%) 6 (50%) 7 (33%) 2 (9%) 4 (33%) 1 (5%) 4 (17%) 0 (0%) 3 (14%) 4 (17%) 0 (0%) 0 (0%) 0 (0%) 0 (0%) 1 (5%) 4 (17%) 2 (17%) 19 (42%) 9 (20%) 5 (11%) 7 (15%) 0 (0%) 5 (11%) I use rubrics when assessing students’ information literacy skills. 
11 (52%) 6 (26%) 7 (58%) 7 (33%) 8 (35%) 2 (17%) 3 (14%) 3 (13%) 0 (0%) 0 (0%) 2 (9%) 0 (0%) 0 (0%) 0 (0%) 0 (8%) 1 (5%) 4 (17%) 2 (17%) 17 (38%) 15 (33%) 6 (13%) 2 (4%) 0 (0%) 5 (11%) TABLE 16 Average Instructor Opinion for Students (n = 57) Minimum Maximum Mean SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) SP 14 (n = 2 2) FA 14 (n = 2 3) SP 15 (n = 1 2) In this course, students are expected to do research independently and incorporate information, apart from what is provided in class, into graded course assignments. 2 2 2 4 4 4 3.52 3.11 3.30 2 4 3.33 When creating or revising a research assignment for students, I consult or collaborate with a librarian. 1 1 1 4 3 3 1.90 1.88 1.90 1 4 1.89 To prepare students for their research assignments, I teach students how to find appropriate resources. 1 1 2 4 4 4 3.05 2.78 3.1 1 4 2.92 To prepare students for their research assignments, I ask a librarian to teach a face-to-face session. 1 1 1 4 4 3 1.67 1.73 1.8 1 4 1.70 I ask that a librarian be embedded in my course online. 1 1 1 4 4 4 2.12 1.69 2.22 1 4 1.90 I encourage students to visit the library’s reference desk or to make an appointment with particular librarians to address questions about research. 1 1 1 4 4 4 2.57 2.33 2.50 1 4 2.46 I point students to open-access online instructional tools (webpages, videos, etc.) that teach information literacy skills and/or concepts. 1 1 1 4 4 4 3.00 2.56 2.60 1 4 2.79 I require or encourage students to use resources provided or vetted by the library in order to complete research assignments successfully. 1 1 1 4 4 4 3.10 2.44 2.70 1 4 2.79 I require students to use particular kinds of resources (e.g., books, journal articles, newspaper articles, multimedia resources, etc.) to complete research assignments successfully. 2 2 1 4 4 4 3.33 3.37 2.90 1 4 3.35 I stage research assignments so that student work can be assessed at multiple points in the research process. 1 1 1 4 4 4 2.81 2.26 2.89 1 4 2.55 I assess the research my students gather before they incorporate that research into their other coursework (papers, presentations, debates, portfolios, etc.). 1 1 1 4 4 4 2.62 2.22 2.22 1 4 2.44 I require that students show me how they have evaluated the research that they want to cite. 1 1 1 4 3 4 2.10 2.11 2.44 1 4 2.10 I evaluate students on their ability to create accurate citations that are formatted in a particular style (e.g., MLA, APA, Chicago Style, etc.). 1 1 3 4 4 4 3.14 2.84 3.6 1 4 3.00 I use rubrics when assessing students’ information literacy skills. 2 1 3 4 4 4 3.38 2.95 3.78 2 4 3.18 Beyond the Library 253 ethical use. However, the data is not meant to be punitive. Instead, the data is being used to begin concerted college-wide and departmental dialogues around information literacy and student learning. The data may, in some areas of the college, lend itself to more detailed evaluations of student progress. These dialogues and efforts are critical to improving the quality of the learning experiences of our students. Score distributions for student artifacts submitted in spring and fall semesters of 2013 were similar across choice of sources and incorporation, with more than 40 percent of students scoring 2.5 or higher in both categories. See figure 4. Score distributions for works submitted in the spring semester of 2014 were similar across choice of sources and incorporation, with more than 50 percent of students scoring 2.5 or higher in both categories. 
The number of submitted works was larger in spring 2014, when a specific request for information literacy samples was made, which likely resulted in the submission of more works appropriate for information literacy scoring. Asking for information literacy samples may also have resulted in higher scores for works submitted in spring of 2014. In fall of 2014 and spring of 2015, score distributions were similar across incorporation and ethical use, with fewer than 50 percent of students scoring 2.5 or higher in both categories, and with students scoring highest in choice of sources. For all semesters assessed, average scores across the three categories were below Proficient (3), with students on average scoring highest in choice of sources and lowest in ethical use.

The faculty survey results (see tables 8–16) revealed that faculty members who were teaching courses already mapped to information literacy generally felt confident that they understood information literacy and that they could teach and assess information literacy skills. All faculty members surveyed agreed or strongly agreed that information literacy should be a college-wide core competency at AACC. Almost all agreed or strongly agreed that their courses should continue to be mapped to information literacy. The survey results confirm that AACC faculty members value information literacy as a learning outcome at the institution and in their courses.

Other data from the faculty survey, along with data from the assignment checklist, indicated that, overall, in the directions provided to students for completing research projects, faculty do not regularly express specific expectations about how students should find, incorporate, and cite information. Though 59 percent of survey participants responded that they always or often encouraged or required students to use sources provided or vetted by the library, only 20 percent responded that they always or often consulted or collaborated with librarians when creating or revising research assignments; 28 percent responded that they always or often taught students how to find appropriate sources; 16 percent always or often asked a librarian to teach in face-to-face classes; and 10 percent always or often had a librarian embedded in their online courses. While 25 percent of participants responded that they always or often pointed students toward open-access instructional resources, only 19 percent encouraged students to visit the library's reference desk or to make an appointment with particular librarians to address questions about research.

Though 62 percent of respondents evaluated students' adherence to a particular citation style, only 42 percent always or often assessed students' research before it was incorporated into other coursework, and only 28 percent always or often required students to show how they evaluated information sources. Assignment checklist data showed that very few of the submitted assignment directions specified that students should evaluate sources for credibility or bias (7 percent), and fewer than half of the assignments submitted detailed the number of sources students should use, the types of sources acceptable for students to cite, or a particular citation style that students should employ. See table 6.
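The assignment checklist figures quoted here and reported in table 6 are straightforward proportions: for each of the eight checklist elements, the share of submitted assignment directions that included it. A minimal sketch of that tally follows (hypothetical field names, not the project's scoring instrument):

```python
from collections import Counter

CHECKLIST_ELEMENTS = [
    "number_of_sources", "type_of_sources", "evaluate_sources", "source_variety",
    "intermediary_steps", "direct_and_indirect", "documentation_style", "currency_of_info",
]

def percent_included(scored_assignments):
    """scored_assignments: list of dicts mapping each checklist element to True/False."""
    counts = Counter()
    for assignment in scored_assignments:
        for element in CHECKLIST_ELEMENTS:
            if assignment.get(element):
                counts[element] += 1
    n = len(scored_assignments)
    return {element: round(100 * counts[element] / n, 1) for element in CHECKLIST_ELEMENTS}

# Example with two scored assignments: only the first specifies a documentation style.
sample = [
    {"number_of_sources": True, "documentation_style": True},
    {"number_of_sources": True, "evaluate_sources": False},
]
print(percent_included(sample)["number_of_sources"])    # 100.0
print(percent_included(sample)["documentation_style"])  # 50.0
```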
Next Steps

At the beginning of the fall 2015 semester, finalized data from the college-wide assessment of information literacy were shared with the college community to facilitate further dialogue about ways to foster student learning, but many strategies for improvement were already underway. Noticing that a number of faculty members had submitted artifacts that were not appropriate for assessing information literacy skills, AACC's Instruction Librarian and Director of LOA led breakout sessions at AACC's January 2015 faculty orientation and that same month at AACC's adjunct faculty conference, sessions titled "Information Literacy: What It Is, What It Isn't, and Tips and Strategies to Improve Student Learning." Especially because the preliminary data showed that AACC's graduating students regularly scored below proficient levels, information literacy was the theme of AACC's 2015 Summer Institute, a faculty professional development opportunity co-sponsored by AACC's Office of LOA every May. Dozens of faculty members participated in all or part of the daylong program, which included a keynote speaker who contextualized and problematized the phenomenon of information proliferation and the evolution of information literacy as a competency. Some breakout sessions featured panels of faculty members who discussed the iterative processes they've used for creating, deploying, and revising their research assignments. In other breakout sessions, the Instruction Librarian led workshops for improving research assignments so that they might help students to experience more meaningful engagement with information and to develop more sophisticated information literacy skills. The Instruction Librarian also led breakout sessions during AACC's fall 2015 faculty orientation, "High Impact Practices for Developing and Assessing Information Literacy," and provided another session for the 2015 cohort of new AACC faculty later in the fall semester.

Additionally, the LOA Subcommittee has begun discussions for improving the college's curriculum map. When departments first mapped the college-wide core competencies to courses in their programs, they worked from a general definition of information literacy, but they did not have the benefit of resources that evolved from the first college-wide assessment of that core competency, namely, the rubric used to score student artifacts and the checklist used to score corresponding assignment directions. These tools will help LOA Subcommittee members and AACC Assessment Fellows spur discussions in their departments to tighten alignments of program and course curricula with information literacy learning outcomes. Once the curriculum map has been improved, those courses still aligned to information literacy can be targeted for additional institutional resources and support, including from the library. For courses still aligned with information literacy learning outcomes, as well as those no longer aligned, departments will likely need to revise course descriptions, course outlines, or shared teaching materials, like research assignments common across all or many sections of a course.

One of the primary contributions of AACC's AiA project has been to reinforce for the college community the library's centrality to the development and assessment of information literacy skills. Librarians were consulted, along with others across the college, in the creation of the rubric used to assess student learning.
In light of the data produced by the project, librarians decided to delay releasing a new online, general student library tutorial and to revise the tutorial again to incorporate aspects of the information literacy scoring rubric. Librarians collaborated with each other, along with the AiA team, on the revision of the assignment checklist and on the creation of the faculty survey deployed for the project. Librarians alone scored student artifacts, and the scoring sessions fostered invaluable discussions among librarians about how to define, document, and improve information literacy competencies, discussions that librarians have continued with other practitioners and stakeholders at the college through formalized professional development opportunities for college faculty and staff, in one-on-one collaborations with fellow practitioners, and through college committee work. After consulting with AACC's Associate Vice President for Learning, AACC's Library Director and Instruction Librarian are drafting an Information Literacy Plan, subject to the LOA Subcommittee's approval, in which the library will continue to lead the campus in addressing the information literacy assessment data and in working to close the assessment loop. The library is poised to support faculty and staff improvement efforts across the curriculum and will document any further improvement efforts before the next college-wide assessment of graduating students' information literacy skills, slated to begin in 2018.

Appendix A. Information Literacy Assessment Tool

Information Literacy Rubric

Choice of Sources — Chooses appropriate sources and content of information.
Exemplary (4): Chooses scholarly, discipline-specific information, including primary sources where appropriate.
Proficient (3): Chooses reliable information, including primary sources where appropriate.
Developing (2): Chooses adequate information, including primary sources where appropriate.
Emerging (1): Chooses irrelevant and/or unreliable sources unsuited to academic research.

Evaluation* — Critically assesses sources and content of information.
Exemplary (4): Thoroughly analyzes information sources for currency, relevance, accuracy, authority and objectivity.
Proficient (3): Sufficiently analyzes information sources for currency, relevance, accuracy, authority and objectivity.
Developing (2): Partially analyzes information sources for currency, relevance, accuracy, authority and objectivity.
Emerging (1): Insufficiently analyzes information sources for currency, relevance, accuracy, authority and objectivity.

Incorporation — Uses information to accomplish a specific purpose.
Exemplary (4): Expertly synthesizes and presents information to fully achieve a specific purpose with clarity and depth.
Proficient (3): Sufficiently synthesizes and presents information to fully achieve a specific purpose with some clarity and depth.
Developing (2): Partially synthesizes and presents information with little clarity or depth.
Emerging (1): Inadequately synthesizes and presents information with little or no clarity or depth.

Ethical Use — Understands and complies with institutional policies related to access and use of information, demonstrating an understanding of academic integrity.
Exemplary (4): Fully demonstrates understanding of ethical and legal guidelines for published, confidential and proprietary information.
Proficient (3): Mostly demonstrates understanding of ethical and legal guidelines for published, confidential and proprietary information.
Developing (2): Partially demonstrates understanding of ethical and legal guidelines for published, confidential and proprietary information.
Emerging (1): Fails to demonstrate understanding of ethical and legal guidelines for published, confidential and proprietary information.

*This criterion will be added only if we are able to obtain samples of student work that also capture process.
**Evaluators are encouraged to assign a zero to any work sample or collection of work that does not meet "Emerging" level performance.

Appendix B. Assignment Checklist

Does the Assignment… (Yes / No / N/A)

Specify the number of sources you expect students to find, use and cite
• Ask students to distinguish between primary, secondary and tertiary sources
• Clearly indicate which level of sources is appropriate for this assignment
• Require students to analyze and synthesize multiple sources
• Adhere to the general rule of 1 or 2 sources per page (i.e., 5–10 sources per 5-page paper)

Specify the types of sources that are acceptable
• Peer-reviewed journals
• Any journal, magazine or newspaper article from a library database
• Websites with certain domains (e.g., .edu or .gov)
• Any site found on the open web
• Wikipedia or similar open modification sites
• Blogs
• Opinion pieces
• How-to books or articles

Require that students know how to evaluate sources for bias and credibility
• Clarify expectations for reliability of sources
• Allow you to assess the quality of sources students use

Require source variety to ensure students know how to get information from different types of sources

Require intermediary steps between assignment and final due date
• Provide a timeline for submission of steps
• Ask for submission of a thesis sentence or paragraph
• Ask for submission of a written proposal
• Require an annotated bibliography

Distinguish for your students between direct and indirect quotations
• Ensure students know how to indicate the source of ideas as well as the source of words of others

Specify a documentation style that must be used in the assignment
• MLA
• APA
• Chicago
• Other

Specify requirements about the currency of information cited

Adapted from materials used by J. Lathrop & J. Rabin (2011) to study AACC students' use of online library resources.

Appendix C. Instructor Survey

(The instructor survey is reproduced in the original as four pages of images, Page 1 of 4 through Page 4 of 4.)

Notes

1. "AACC's Core Competencies," Anne Arundel Community College, accessed , http://www.aacc.edu/loa/corecompetencies.cfm.
2. Peter Reason and Hilary Bradbury, Handbook of Action Research (London: SAGE Publications, 2006), 3.
3. "Learning Outcomes Assessment (LOA) Plan," Anne Arundel Community College, accessed , http://www.aacc.edu/loa/plan.cfm.
4. "AACC's Core Competencies."
5. "Mission and Vision Statements," Anne Arundel Community College, accessed , http://www.aacc.edu/aboutaacc/vision.cfm.
6. See Richard W. Meyer, "Focusing Library Vision on Educational Outcomes," College & Research Libraries News 56, no. 5 (1995): 335–37; Sarah M. Pritchard, "Determining Quality in Academic Libraries," Library Trends 44, no. 3 (1996): 572–94; Bonnie Gratch Lindauer, "Defining and Measuring the Library's Impact on Campuswide Outcomes," College & Research Libraries 59, no.
6 (1998): 546–70, http://crl.acrl.org/content/59/6/546.full.pdf; Patricia Iannuzzi, "We Are Teaching but Are They Learning: Accountability, Productivity, and Assessment," Journal of Academic Librarianship 25, no. 4 (1999): 304–5, doi:10.1016/S0099-1333(99)80031-7; Kenneth R. Smith, "New Roles and Responsibilities for the University Library: Advancing Student Learning through Outcomes Assessment," Journal of Library Administration 35, no. 4 (2002): 29–36, doi:10.1300/J111v35n04_07; Bruce T. Fraser, Charles R. McClure, and Emily H. Leahy, "Toward a Framework for Assessing Library and Institutional Outcomes," portal: Libraries and the Academy 2, no. 4 (2002): 505–28, doi:10.1353/pla.2002.0077; Peggy L. Maki, "Developing an Assessment Plan to Learn about Student Learning," Journal of Academic Librarianship 28, no. 1–2 (2002): 8–13, doi:10.1016/S0099-1333(01)00295-6; Edward K. Owusu-Ansah, "Information Literacy and Higher Education: Placing the Academic Library in the Center of a Comprehensive Solution," Journal of Academic Librarianship 30, no. 1 (2004): 3–16, doi:10.1016/j.jal.2003.11.002; Peter Brophy, Measuring Library Performance: Principles and Techniques (London: Facet, 2006); Joseph R. Matthews, The Evaluation and Measurement of Library Services (Westport, CT: Libraries Unlimited, 2007); Nancy O'Hanlon, "Information Literacy in the University Curriculum: Challenges for Outcomes Assessment," portal: Libraries and the Academy 7, no. 2 (2007): 169–89, doi:10.1353/pla.2007.0021; Randall Schroeder and Kimberly Babcock Mashek, "Building a Case for the Teaching Library: Using a Culture of Assessment to Reassure Converted Campus Partners While Persuading the Reluctant," Public Services Quarterly 3, no. 1–2 (2007): 83–110; Megan Oakleaf, The Value of Academic Libraries: A Comprehensive Research Review and Report (Chicago: Association of College & Research Libraries, 2010); Megan Oakleaf, Michelle S. Millet, and Leah Kraus, "All Together Now: Getting Faculty, Administrators, and Staff Engaged in Information Literacy Assessment," portal: Libraries and the Academy 11, no. 3 (2011): 831–52, doi:10.1353/pla.2011.0035; Christopher Stewart, "Measuring Information Literacy: Beyond the Case Study," Journal of Academic Librarianship 37, no. 3 (2011): 270–72, doi:10.1016/j.acalib.2011.03.003; Peter Hernon, "Outcomes Assessment Today: An Overview," in Higher Education Outcomes Assessment for the Twenty-First Century, ed. Peter Hernon, Robert E. Dugan, and Candy Schwartz (Santa Barbara, CA: ABC-CLIO, 2013), 1–17; Laura Saunders, "Information Literacy as a Student Learning Outcome: Institutional Accreditation," in Higher Education Outcomes Assessment for the Twenty-First Century, ed. Peter Hernon, Robert E. Dugan, and Candy Schwartz (Santa Barbara, CA: ABC-CLIO, 2013), 127–41.
7. See Stephanie Sterling Brasley, "Effective Librarian Discipline Faculty Collaboration Models for Integrating Information Literacy into the Fabric of an Academic Institution," New Directions for Teaching and Learning 114 (2008): 71–88, doi:10.1002/tl.318; Dorothy Ann Warner, A Disciplinary Blueprint for the Assessment of Information Literacy (Westport, CT: Libraries Unlimited, 2008); Oakleaf, Value of Academic Libraries.
8. Peter Hernon, "Library Engagement in Outcomes Assessment," in Higher Education Outcomes Assessment for the Twenty-First Century, ed. Peter Hernon, Robert E. Dugan, and Candy Schwartz (Santa Barbara, CA: ABC-CLIO, 2013), 161.
9. See Lisa G. O'Connor, Carolyn J. Radcliff, and Julie A. Gedeon, "Applying Systems Design and Item Response Theory to the Problem of Measuring Information Literacy Skills," College & Research Libraries 63, no. 6 (2002): 528–43, doi:10.5860/crl.63.6.528; O'Hanlon, "Information Literacy in the University Curriculum"; Sue Samson, "Information Literacy Learning Outcomes and Student Success," Journal of Academic Librarianship 36, no. 3 (2010): 202–10, doi:10.1016/j.acalib.2010.03.002; Eleonora Dubicki, "Faculty Perceptions of Students' Information Literacy Skills Competencies," Journal of Information Literacy 7, no. 2 (2013): 97–125, doi:10.11645/7.2.1852; J. B. Hill, Carol Macheak, and John Siegel, "Assessing Undergraduate Information Literacy Skills Using Project SAILS," Codex 2, no. 3 (2013): 23–27, http://acrlla.org/journal/index.php/codex/article/view/77; David A. Hubert and Kati J. Lewis, "A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios," International Journal of ePortfolio 4, no. 1 (2014): 61–71, http://www.theijep.com/pdf/IJEP130.pdf; Wendy Holliday, Betty Dance, Erin Davis, Britt Fagerheim, Anne Hedrich, Kacy Lundstrom, and Pamela Martin, "An Information Literacy Snapshot: Authentic Assessment across the Curriculum," College & Research Libraries 76, no. 2 (2015): 170–87, doi:10.5860/crl.76.2.170.
10. Robert E. Dugan and Peter Hernon, "Outcomes Assessment: Not Synonymous with Input and Outputs," Journal of Academic Librarianship 28, no. 6 (2002): 376–80, doi:10.1016/S0099-1333(02)00339-7.
11. Ibid., 380.
12. See Maki, "Developing an Assessment Plan"; Shaun Jackson, Carol Hansen, and Lauren Fowler, "Using Selected Assessment Data to Inform Information Literacy Planning with Campus Partners," Research Strategies 20, no. 12 (2004): 44–56, doi:10.1016/j.resstr.2005.10.004; Bonnie Gratch Lindauer, "The Three Arenas of Information Literacy Assessment," Reference & User Services Quarterly 44, no. 2 (2004): 122–29; Thomas P. Mackey and Trudi E. Jacobson, "Developing an Integrated Strategy for Information Literacy Assessment in General Education," Journal of General Education 56, no. 2 (2007): 93–104; Matthews, Evaluation and Measurement; Brasley, "Effective Librarian Discipline Faculty"; Oakleaf, Value of Academic Libraries; Melissa Bowles-Terry, "Library Instruction and Academic Success: A Mixed-Methods Assessment of a Library Instruction Program," Evidence Based Library and Information Practice 7, no. 1 (2012): 82–95, available online at http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/12373.
13. See Patricia Davitt Maughan, "Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience," College & Research Libraries 62, no. 1 (2011): 71–85, doi:10.5860/crl.62.1.71; Kathleen Dunn, "Assessing Information Literacy Skills in the California State University: A Progress Report," Journal of Academic Librarianship 28, no. 1/2 (2002): 26–35, doi:10.1016/S0099-1333(01)00281-6; Ilene F. Rockman, "Strengthening Connections between Information Literacy, General Education, and Assessment Efforts," Library Trends 51, no. 2 (2002): 185–98; Mary M. Somerville, Lynn D. Lampert, Katherine S. Dabbour, Sallie Harlan, and Barbara Schader, "Toward Large Scale Assessment of Information and Communication Technology Literacy: Implementation Considerations for the ETS ICT Literacy Instrument," Reference Services Review 35, no. 1 (2007): 8–20, doi:10.1108/00907320710729337.
14. Leslin H. Charles, "Using an Information Literacy Curriculum Map as a Means of Communication and Accountability for Stakeholders in Higher Education," Journal of Information Literacy 9, no. 1 (2015): 55–56, doi:10.11645/9.1.1959.
15. Ibid., 56.
16. Richard P. Keeling, Andrew F. Wall, Ric Underhile, and Gwendolyn J. Dungy, Assessment Reconsidered: Institutional Effectiveness for Student Success (Washington, DC: International Center for Student Success and Institutional Accountability, 2008), 28.
17. Oswald M. T. Ratteray, "The Strategic Triad Supporting Information Literacy Assessment," in Outcomes Assessment in Higher Education, ed. Peter Hernon and Robert E. Dugan (Westport, CT: Libraries Unlimited, 2004), 146.
18. Keeling et al., Assessment Reconsidered, 35.
19. Samson, "Information Literacy Learning Outcomes," 207.
20. Andreea M. Serban, "Assessment of Student Learning Outcomes at the Institutional Level," New Directions for Community Colleges 2004, no. 126 (2004): 26, doi:10.1002/cc.151.