Assessing library instruction sessions: A pilot project at the University of Connecticut Libraries

Jennifer Lanzing and Anna Kijas

Jennifer Lanzing is history and political science librarian, e-mail: jennifer.lanzing@lib.uconn.edu, and Anna Kijas is music and dramatic arts librarian, e-mail: anna.kijas@uconn.edu, at the University of Connecticut. © 2013 Jennifer Lanzing and Anna Kijas

The Library Research Services (LRS) program area at the University of Connecticut is integral to the instruction of students, both undergraduate and graduate, and LRS members spend much of their time planning and teaching classes designed to increase students' information literacy and research skills. LRS members are subject specialist librarians, and they teach one- or multi-session classes geared toward research in a specific discipline. Sessions may include an introduction to library resources, database instruction, citation instruction, an overview of library services, and other appropriate topics.

In the summer of 2010, the LRS team implemented a pilot program that produced two surveys, one for students and one for faculty, to assess the effectiveness and value of these instruction sessions. Data from each survey were correlated and distributed to the appropriate subject librarians following their instruction sessions. The goal of this program was not to test the information literacy of the students. The results were given to the individual librarians who taught the sessions so that they could use them as they saw fit to refine their instruction sessions and provide better service to their constituencies.

This was the first attempt by the LRS area to evaluate our instruction sessions in a standardized way. To develop an understanding of assessment programs for library instruction, we conducted an environmental scan of the instruction assessment tools developed by other institutions, as well as a literature review to familiarize ourselves with current methods and findings.

Researchers have stressed that before an actual survey or other assessment tool can be created, the librarians involved must first decide exactly what they hope to accomplish by implementing it.1 We took this to heart and began by meeting with the LRS subject librarians to discuss what questions they would like to see included and what questions they did not think this survey would be able to answer in a meaningful way.

Method

We entered our finalized survey questions into Survey Monkey. For both the student survey and the faculty survey, we developed a set of statements followed by a five-point Likert scale, with 1=strongly disagree and 5=strongly agree. We also included several open-ended questions asking respondents to elaborate on issues such as what they felt was the most valuable part of the session, what they would like to see included in a future session, and whether they felt the session was worthwhile and appropriate to their course.

The surveys were ready for distribution, via e-mail, at the end of September 2011, after classes had already begun. Following an instruction session, the individual liaison would send the links to both surveys to the faculty member with a request to share the student link with the class and to fill out the faculty survey. Surveys were accepted until the close of the fall semester.

We gathered the data in Survey Monkey and used Excel to sort and analyze it. We then shared aggregate data with the liaisons as a whole, as well as individual survey results with the appropriate liaison. The goals of the analysis were to learn where students and faculty felt that more instruction was needed and to discover whether there were variations or similarities among the different class levels, from freshmen to graduate students.
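For those who script their analysis rather than working in a spreadsheet, the kind of summary we built in Excel could be approximated in a few lines of Python. The sketch below is illustrative only; the file name and the class_level and q1 through q5 columns are assumptions made for the example, not our actual Survey Monkey export.

# Minimal sketch of summarizing Likert-scale survey results, assuming a CSV
# export with a "class_level" column (freshman through graduate) and columns
# q1-q5 holding the 1-5 ratings for each statement. These names are assumptions.
import pandas as pd

responses = pd.read_csv("student_survey_export.csv")
likert_cols = ["q1", "q2", "q3", "q4", "q5"]

# Mean rating for each statement, broken out by class level.
means_by_level = responses.groupby("class_level")[likert_cols].mean().round(2)

# Share of respondents who agreed or strongly agreed (rated 4 or 5) with each statement.
percent_agree = (responses[likert_cols] >= 4).mean().mul(100).round(1)

print(means_by_level)
print(percent_agree)

A grouped summary along these lines makes it straightforward to compare, for example, freshmen against graduate students on the same statements.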
At the end of the fall semester, we decided to continue the pilot program through the spring semester, partly because of our delay in making the surveys available in the fall and partly to maximize our data pool. Some minor edits were made to the surveys. For example, the original surveys did not ask the survey taker to provide the name of the librarian who had taught the session, which made matching surveys with the correct librarian difficult and time-consuming. To protect the confidentiality of the liaisons, we wanted to share survey results only with the specific librarian who had taught that session, and asking for the librarian's name on the survey itself would greatly help with this. This change was made in the revised survey. Also, because of the varied answers given when respondents were asked to provide the course name, we clarified this question in the revised survey by including an example of a course number (e.g., ENGL 1010).

The distribution method for the spring semester also differed slightly, giving the liaisons a couple of options for getting the links to students and faculty. One option was to give students five minutes at the end of a library session to fill out the survey, if possible. The option to e-mail the links to the faculty remained. The surveys closed at the end of the spring semester in May 2012.

Results

The results from both the student and faculty surveys were overwhelmingly positive in both fall 2011 and spring 2012. Almost all respondents found the instruction sessions to be useful and worthwhile. When we analyzed the results, a few patterns became clear.

The most common answer to "Question 7: What would you like to see in a future session?" was "Nothing," or some variation on that idea. However, in fall 2011, 13.1% of respondents said that they would like more focus on RefWorks instruction. This surprised us, because we offer several workshops throughout the year specifically geared toward RefWorks and teaching students its various features. This finding may suggest that the RefWorks workshops are not adequately advertised and students are unaware that they are available, or that additional workshops need to be scheduled. For spring 2012, this number was down to 6.3%. Based on the fall 2011 results, some subject specialists decided to hold RefWorks workshops specifically for the academic departments with which they worked, which may account for part of the decrease in respondents requesting more RefWorks assistance in the spring.

Several students expressed a desire to see more databases covered in the instruction session. Often a librarian will focus on several databases important to the specific discipline or relevant to a course assignment or project. In fall 2011, 10.7% of student respondents wished a greater number of databases had been covered in the session.

A surprising number of students (9.8% in fall 2011, 9.4% in spring 2012) said that they wished the session had included a section on the simple logistics of using a university library.
This includes information on how to find books in the stacks, the tutoring services provided in the library, and basic computer skills. Freshmen usually receive basic library orientation information in their English classes, but many students do not retain those lessons by the time they need to apply them to their own research. After seeing the survey results, some subject specialists have incorporated basic library information into their one-shot instruction sessions.

Graduate students were the only group to ask for longer sessions in the future. They were also less likely than undergraduates to need additional help with basic library skills, such as finding books on the shelves or navigating the library Web site. Instead, their responses indicated that they were mostly concerned with learning about relevant databases and where to find information specific to their research. They wanted more in-depth, high-level research help.

Most liaison librarians discuss the session with the faculty member in advance so that both parties understand the intended instructional goals. The faculty responses were overwhelmingly positive and included only a few suggestions for content that could be covered in a future session. One comment expressed an interest in having someone from the Writing Center co-teach an instruction session with the liaison librarian, while another said that next time he or she would want to have the session after a paper is assigned rather than before, so that students can better envision how to apply what they are learning.

One answer that came up in about 12.5% of the spring 2012 surveys, but was virtually absent from fall 2011, was a request for more focus on searching techniques. Also, 10.2% of students asked for more instruction on the basic research paper process, which likewise did not come up in the fall 2011 surveys. One factor that might account for these differences between semesters is a higher proportion of respondents from the humanities and social sciences in spring 2012, departments that typically require term papers, at least at the upper undergraduate levels. The fall 2011 surveys had a higher percentage of respondents from the hard sciences and engineering, where a research paper at the end of the semester is not often assigned. One way individual subject specialists have addressed this request is to include more interactive exercises in their sessions, allowing students to gain hands-on experience with different search strategies and databases.

Conclusion

We encountered some problems in the analysis of the survey results. One issue was the lack of clarity in some of our survey questions. Students gave wildly different answers to questions such as "What is this session for?" and "Where did this session take place?" In the next iteration of the survey, we knew we needed to be clearer about exactly what we were asking of the students, so we added an example of a course name to the former question and an example of a classroom name to the latter.

Other problems revolved around the limitations of the Survey Monkey software. We would need to go into the results manually and pull the data for each individual liaison, which we realized was far too time-consuming for anyone to make part of their day-to-day workflow. We needed a product that would allow each liaison to log on and retrieve only his or her survey results.
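The per-liaison split could, in principle, be scripted against an exported spreadsheet, as in the hypothetical sketch below (the file name and the librarian column are assumptions, not Survey Monkey's actual export format), but someone would still have to run the script and distribute the files by hand.

# Hypothetical sketch of splitting exported survey results by librarian so that
# each liaison receives only his or her own rows. The file name and the
# "librarian" column are assumptions, not Survey Monkey's actual schema.
import pandas as pd

responses = pd.read_csv("survey_export.csv")

for librarian, rows in responses.groupby("librarian"):
    filename = "results_" + librarian.replace(" ", "_") + ".csv"
    rows.to_csv(filename, index=False)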
Survey Monkey does not let each liaison log in and see only his or her own results, and relying on a third party to pull and distribute the data would compromise the promise of confidentiality we made to the liaisons and the respondents. To get the functionality we required, we decided to try out different survey software called Qualtrics.

Qualtrics is now the standard tool that our subject specialists use to evaluate their instruction sessions. Each librarian is able to send a URL link to the appropriate faculty member and ask him or her to fill out the faculty survey and to share the student version with the class. The Qualtrics software allowed us to create an account with the survey template for each librarian, so an individual librarian is able to log in and see only his or her own survey results, while the director of the LRS area is able to view all results. This protects anonymity and eliminates the need for a third party to determine the appropriate librarian and distribute results.
Each librarian is responsible for keeping track of his or her survey results and making changes to his or her instruction sessions when necessary. The survey provides another tool for instruction librarians to use in evaluating their sessions and improving the overall quality of the library instruction program.

Note

1. Lawrie H. Merz and Beth L. Mark, Assessment in College Library Instruction Programs (Chicago: Association of College and Research Libraries, 2002); Diana D. Shonrock, Evaluating Library Instruction: Sample Questions, Forms, and Strategies for Practical Use (Chicago: American Library Association, 1996).