C&RL News, December 2018

Juggling the value of performance-based assessment of student information literacy competencies with the limited time and resources required to do this type of assessment remains an ongoing challenge for many librarians. This article chronicles our initial foray into content analysis, a fairly labor-intensive methodology, but one that allowed us to examine student approaches to the research process as narrated in their own words in the form of a prefocus essay. Our goal was to gather data that would inform our university library's information literacy curriculum. What follows documents our process, methodology, results, and lessons learned in order to aid those at other institutions in their assessment planning.

Colgate University is a selective, private liberal arts institution with a student body of approximately 2,900. The information literacy program is well established, with an ongoing presence in the Core Curriculum and First-Year Seminar courses, as well as frequent requests for library instruction in upper-level courses. For more than a decade, professors at Colgate have worked in conjunction with library faculty to support a prefocus essay assignment, which asks students to report their process of navigating library resources as they work to refine a broad research topic into a more focused research question.1

Overall, the main objective is for the student to identify an existing scholarly conversation in the literature and, from that conversation, articulate a focused topic. Through the examination of various resources, they move from a broad initial topic (e.g., "I'd like to explore women artists in Mexico") to a more focused research question (e.g., "After performing these searches, I have decided to settle my research project on Remedios Varo and the influence of French surrealism in Mexico City after World War II").
The body of the essay should delineate the process that led the student to their focused conclusion, identifying resources consulted, search terms tried, search strategies attempted, and so on. These student narratives seemed a particularly rich source of data for librarians to plumb for potential insights into student information literacy skills. Librarians are rarely afforded the opportunity to see an explanation of research processes directly from the student's perspective. In the fall of 2016, we set out to organize a content analysis of prefocus essays that would span disciplines and course levels to give the Colgate librarians an overview of how students conduct research and where they stumble in the process.

Exploring information literacy assessment: Content analysis of student prefocus essays
Jesi Buell and Lynne Kvinnesland

Jesi Buell is instructional design and web librarian, email: jbuell@colgate.edu, and Lynne Kvinnesland is information literacy librarian, email: lkvinnesland@colgate.edu, at Colgate University in Hamilton, New York. © 2018 Jesi Buell and Lynne Kvinnesland

Process
Historically, Colgate librarians have used the one-minute paper as the primary means of assessing library instruction sessions. Results from this evaluation effort consistently indicate that students find library instruction valuable in helping them build and strengthen their information literacy skills. Although it is good to have this annual data, we were ready to explore a more robust and direct method of assessing student information literacy competencies and understandings.
Although neither of us had experience with content analysis as a methodology, it was an appealing way to "gain direct information from study participants without imposing preconceived categories or theoretical perspectives."2 Planning and implementation of this study progressed over the course of one year, from July 2016 through May 2017.

First, we sought and received institutional review board approval for our study. A review of the literature identified a number of previous studies in which the researchers used a qualitative methodology that included coding of student narratives or interview transcripts.3 These were helpful to us in exploring the potential of this assessment method. Although the results of these previous studies provided some indication of what our study would reveal, we felt that the uniqueness of the prefocus essay assignment, which targets just the topic-formulation component of the research process, might shed new light on the approach to research and the challenges encountered by Colgate students in particular.

During the fall 2016 semester, student prefocus essays were solicited from faculty who had assigned them. From the 90 student essays collected, an anonymized, stratified random sampling yielded a final set of 40 essays from eight different courses in the arts and social sciences. Additionally, we gathered copies of the assignment prompts from each of the participating course instructors, since there was some variation in how faculty set up this assignment for their students. All classes received a library instruction session, although not always from the same librarian. The student sample broke down as follows: 15 first-years, 18 sophomores, 1 junior, and 6 seniors.

The tool we used for our analysis was MAXQDA. Although we were fairly unfamiliar with the software, it was recommended by campus faculty who used qualitative analysis in their own research and teaching.
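For readers curious about the mechanics, a stratified draw like the one described above can be sketched in a few lines of Python. This is only an illustration under assumed data: the essay records, field names, and the `stratified_sample` helper are hypothetical, not artifacts of the study.

```python
import random
from collections import defaultdict

def stratified_sample(records, stratum_key, total):
    """Group records by stratum, then draw a random sample of
    roughly `total` records, allocated proportionally by stratum."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[stratum_key]].append(rec)
    sample = []
    for members in groups.values():
        # Each stratum contributes in proportion to its share of the
        # pool, with at least one record per stratum.
        k = max(1, round(total * len(members) / len(records)))
        sample.extend(random.sample(members, min(k, len(members))))
    return sample[:total]

# Hypothetical pool: 90 essays spread across 8 courses
essays = [{"id": i, "course": f"course{i % 8}"} for i in range(90)]
subset = stratified_sample(essays, "course", 40)
print(len(subset))  # 40
```

Proportional allocation keeps each course's share of the sample close to its share of the full pool; a real design might stratify on class year as well.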
With the assistance of two sociology professors,4 we acquired familiarity with the functionality of this tool and, after doing some practice coding on sample student essays written in prior semesters, we dove in, developing a coding scheme and testing intercoder reliability.

Results
Highlights of our results include:

• Library resources. Over half of all students turned to library resources (the catalog or databases) as the first step in their research, and 100% of these students mentioned being aware of library tools due to library instruction.

• Problems. In their essays, 80% of students mentioned problems, including anxiety, difficulty choosing a topic, and time management. One frequent problem was the inability to narrow a broad, overarching topic to something more specific. Of those surveyed, 32.5% did not narrow their topics from a very broad original topic, citing confusion and mismanagement of time. Forty-three percent of those who were unsuccessful thought they had narrowed their topic but, in reality, still had a very broad one.

Another significant issue (60%) was ineffective or limited search strategies. Students are not spending enough time with the search tools to learn how to use them well. Instead, they made decisions about their results based on quick and superficial searches. In terms of keyword choice, the prefocus essays showed students repeating the same limited terms in multiple databases or using very complex or narrowly focused initial search terms. Students perceived these ineffective search strategies as a paucity of information on a topic rather than as a weakness in their search skills, which often quickly led them to turn to Google as an alternative search tool.

• Emotions. Affective elements also played a significant role in the students' approach to the research process.
A quarter of the students expressed some negative emotion (anxiety, anxious, worried, daunting, intimidated, intimidating, overwhelming, frustrating, frustrated, nervous, hesitant, hesitated).

• Scholarly conversation. Upperclassmen were twice as likely to use bibliographies to gather more resources. Students also acknowledged the authors of resources more often as they progressed in academic level.

• The human component. While 80% of the students reported experiencing difficulties, only 34% reached out to librarians for assistance. Upperclassmen more readily sought assistance from either a librarian or a faculty member.

Lessons learned
Looking back on whether this high-effort assessment generated correspondingly valuable results, our thoughts are predominantly positive. Our study yielded many benefits, including direct insight into student research practices, as well as baseline data upon which to build possible future replications of the study.

Retrospectively, we recognize some weaknesses in our study, such as the small sample size, unequal representation across the first-year-to-senior spectrum, and the lack of a control group. Nevertheless, the results gave us a number of things to consider regarding how we conduct our library instruction sessions:

• How can we allay feelings of anxiety and/or the perception of the research process as overwhelming?

• How can we encourage more students to request assistance from a librarian?

• How can we promote student understanding of the two "frames" most relevant to the prefocus essay (i.e., Research as Inquiry and Scholarship as Conversation)?

• How can we leverage the prefocus essay in more information literacy instruction sessions, given that results indicated students found this assignment valuable in helping them understand research as a process that takes time?
• Because we often saw openly available tools like Google Scholar used, should we incorporate more instruction on these tools, since their use is prevalent anyway?

For anyone considering using content analysis of student work for information literacy assessment, we would recommend a less-is-more approach. The scale of our original vision was much too large given the amount of time it takes to develop a coding scheme, read each student artifact at least twice, do the actual coding in a thoughtful manner, and then compile the results. If we opt to replicate this assessment effort, we will use fewer codes and tighten our focus, perhaps targeting only the problems that students encounter when conducting research. It is also important to stay organized (in terms of labels, file versions, etc.) and to create your coding scheme carefully. Our assessment would have benefitted from more practice with MAXQDA and the methodology prior to beginning the formal analysis. We now better understand how to temper our ambitions and structure the schema to address specific issues.

Even if our results echo earlier studies, the value of this project was that it exposed unique issues that affect our student body. Colgate librarians will use these results to inform future assessment initiatives and to structure discussions around our pedagogy.

Notes
1. An example prompt is available at https://libguides.colgate.edu/prefocus. The
authors would like to credit Mary Jane Petrowski and Padma Kaimal for developing this assignment at Colgate.

2. Hsiu-Fang Hsieh and Sarah E. Shannon, "Three Approaches to Qualitative Content Analysis," Qualitative Health Research 15, no. 9 (2005): 1279.

3. Claire McGuinness and Michelle Brien, "Using Reflective Journals to Assess the Research Process," Reference Services Review 35, no. 1 (2007): 21–40; Anna Hulseberg and Michelle Twait, "Sophomores Speaking: An Exploratory Study of Student Research Practices," College & Undergraduate Libraries 23, no. 2 (2016): 130–50; Paula R. Dempsey and Heather Jagman, "'I Felt Like Such a Freshman': First-Year Students Crossing the Library Threshold," portal: Libraries and the Academy 16, no. 1 (2016): 89–107; Erin Rinto, Melissa Bowles-Terry, and Ariel J. Santos, "Assessing the Scope and Feasibility of First-Year Students' Research Paper Topics," College & Research Libraries 77, no. 6 (2016); Eleonora Dubicki, "Writing a Research Paper: Students Explain Their Process," Reference Services Review 43, no. 4 (2015): 673–88.

4. The authors would like to express their gratitude to Professors Chris Henke and Alicia Simmons for their invaluable assistance in learning MAXQDA and how to develop a coding scheme.