Suzanne Julian and Kimball Benson

Clicking your way to library instruction assessment

Using a Personal Response System at Brigham Young University

Suzanne Julian is information literacy librarian, e-mail: suzanne_julian@byu.edu, and Kimball Benson is First-Year writing manager, e-mail: kimball_benson@byu.edu, at Brigham Young University. © 2008 Suzanne Julian and Kimball Benson

A Personal Response System (PRS), or clickers, is an effective method for gathering assessment data during library instruction sessions. As students respond to predetermined questions, data is gathered and stored for analysis. Questions can check students' progress, comprehension, and self-assessment. Clickers gather data during the instruction session itself rather than relying on students to fill out an evaluation form after the class.

No matter what type of instruction is used to teach information literacy skills, assessing student learning is critical to the success of the student and the instruction program. A PRS is a useful tool for gathering data and customizing instruction to student needs.

Effective use of clickers provides meaningful data on students' progress, comprehension, and opinions or self-assessment. Many instruction programs currently use paper or online assessment tools to evaluate these aspects of their library sessions. Clickers collect the same data as these other methods, with the added benefit of being flexible and interactive.

The PRS collects and downloads clicker responses, allowing the teacher to review the stored data after the session. The software supports true/false, multiple-choice, and Likert-scale questions. The system stores the data and provides reports or a downloadable version of the results, ready for analyzing a single session or a group of sessions.

The BYU experience

During fall semester 2006, a PRS was purchased for the library instruction program at Brigham Young University (BYU). The primary purpose of purchasing the system was to create interaction during library instruction sessions and capture the attention of students. By serendipity, we soon discovered that clickers also provided valuable data on how students thought and were a useful measure of students' learning and opinions. The first few semesters of use have been a process of trial and error in designing questions and using the data.

The BYU General Education program requires students to take a First-Year writing class, which includes a research unit toward the end of the semester. As part of the research unit, students are brought to the library for two instruction sessions during their scheduled class times. Librarians provide instruction for those two sessions, which are weighted toward demonstration and hands-on practice. Because they spend considerable time watching demonstrations, some students lose focus during the session and miss important information. Clickers seem to increase attention through active participation and provide valuable feedback during the session.

Three library instructors began using the clickers in fall 2006. As the semester progressed, three additional teachers used the clickers in their sessions. All clicker results were stored and downloaded to a spreadsheet. After all session data was tallied, we had asked 29 questions and collected 1,544 responses, an average of 5.4 clicker questions per session.
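Such tallies are easy to compute once the responses are in a spreadsheet. The following is a minimal Python sketch of that bookkeeping, assuming a hypothetical CSV export with one row per student response and columns named session_id and question_id; actual PRS software will use its own export format and field names.

```python
import csv
from collections import defaultdict

def summarize(path):
    """Tally an exported clicker spreadsheet: distinct questions,
    total responses, and average questions per session."""
    questions = set()             # distinct question ids seen anywhere
    sessions = defaultdict(set)   # session_id -> question ids asked there
    responses = 0                 # one row per student click

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            questions.add(row["question_id"])   # assumed column names
            sessions[row["session_id"]].add(row["question_id"])
            responses += 1

    avg = sum(len(q) for q in sessions.values()) / len(sessions)
    print(f"{len(questions)} questions, {responses} responses")
    print(f"average clicker questions per session: {avg:.1f}")

summarize("clicker_responses.csv")  # hypothetical file name
```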
Of the 29 questions asked, only four provided valuable session and program information. Two determined where students were in the research process, and two tested their knowledge of credible sources and Boolean connectors. This information helped us focus on three areas of assessment: students' progress, comprehension, and self-assessment.

Student progress

One advantage the PRS offers over other methods of information gathering is the ability to instantly assess student progress in areas that require preparation, such as selecting research questions or appropriate articles. The question that was most informative for our teachers was: "Where are you in the research process?" The four responses were:

1. Have a general idea of my topic.
2. Have narrowed my topic into a research question.
3. Have started searching for information.
4. Haven't started.

With the clickers, 53 percent of the students reported that they "haven't started" their research project, and 29 percent indicated only a "general idea" of their topic. In sessions where this question was answered by a show of hands instead, the three instructors observed that very few students would admit to "haven't started." The teachers realized that, when raising their hands, some students may feel pressure to give the pleasing answer rather than the honest one. The data from these questions influenced where the teacher started his or her instruction and set the pace for the session.

Student comprehension

A second advantage of the clickers is the ability to test learning comprehension. Having students demonstrate their abilities by performing a task, such as finding a peer-reviewed article, is the best method for checking comprehension, but it is not always practical in large classes with time constraints. Carefully prepared clicker questions can provide a quick survey of students' comprehension. For example, showing images of magazines and journals and asking students to identify the scholarly items can quickly determine whether students have learned the concept and can distinguish different types of sources. The results can lead to further discussion that clarifies misunderstandings, provides examples of the principle being taught, and gives students confidence in their knowledge.

Student opinions or self-assessment

The third benefit of using clickers as an assessment tool is the ability to gauge students' opinions of their own learning. They can self-report on the usefulness of the information presented, their skills, and their attitudes. Instructors can create questions that solicit opinions on the information presented, such as: "Do you understand [the concept] more now than when we started?" "How will you use it in the future?" "What was the most valuable information you learned today?" Responses are limited to true/false or multiple-choice formats, which force answers into a given set and can limit the amount of information collected. However, carefully worded questions can still reveal trends and possible areas of concern.

Students can also demonstrate the skills they have learned during the class session. A quick series of slides in which students identify appropriate Boolean connectors, truncation, and other search techniques taught during the session can present a snapshot of which concepts students misunderstood, did not learn, or need additional training to master.
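Tabulating those answer slides as percent-correct per question, broken out by class section, makes such snapshots comparable. Here is a minimal sketch of that comparison, again assuming the hypothetical export layout above plus class_id and correct columns that a given PRS export may or may not include.

```python
import csv
from collections import defaultdict

def correct_rates(path):
    """Percent of correct answers per question, broken out by class,
    to spot concepts that need reteaching in particular sections."""
    tallies = defaultdict(lambda: [0, 0])  # (class, question) -> [right, total]

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["class_id"], row["question_id"])  # assumed column names
            tallies[key][1] += 1
            if row["correct"] == "1":       # assumed 0/1 flag in the export
                tallies[key][0] += 1

    for (cls, q), (right, total) in sorted(tallies.items()):
        print(f"class {cls}, question {q}: {100 * right / total:.0f}% correct")

correct_rates("clicker_responses.csv")  # hypothetical file name
```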
When these results are compared across classes or across teachers, they can indicate trends in student learning or areas needing improved instruction.

Lessons learned

Although clicker technology is an engaging classroom tool and an effective method for gathering assessment data, there are cautions in designing and using clicker questions. The two areas that affected our results the most were the wording of the questions and the frequency with which clickers were used during an instruction session.

The types of clicker questions asked have a major impact on the effectiveness of a class session. Carefully worded questions that fit the context of the instruction provide the best experience for students and the most valuable data for teachers. We found that questions such as "Are you glad to be here?" were not as effective as "What do you want to learn from the session today?" Questions with fun answers can be used occasionally to interject humor, give students a moment of mental rest, and bring their attention back to the information being presented. One question that generated some fun discussion was "Why are you in college?" The four responses students could choose were:

1. Fancy diploma.
2. Future "big bucks."
3. Find that "someone special."
4. To learn cool things.

Of course, these questions were not included in the statistical data used for evaluation.

The frequency of clicker questions used during a class affects the quality of the instruction session. New technology is entertaining but can quickly become the focus of a session if not used as part of an appropriate learning activity. Another important consideration when incorporating clickers is time: it takes three to five minutes to help students test their clickers at the beginning of class, and about the same amount of time to open a poll, close it, and discuss the results. We discovered it was important to monitor carefully how much time was devoted to clicker questions and to evaluate whether their use enhanced the instruction.

The last lesson we learned is to plan your assessment before beginning the instruction program for the semester. Determining what you want to evaluate, constructing well-written questions, and deciding who (teachers, students, administrators) will be included in the assessment ensures reliable results that can be used to effectively assess the information literacy program.

Conclusion

Much of our focus in teaching is on actively involving students in learning. Clicker technology affords a useful means of encouraging all students, including reluctant ones, to be active participants in a library instruction session. It also gives library teachers and administrators a nonintrusive way to gather meaningful data on student progress, comprehension, and opinions or self-assessment. This information can provide program-level assessment without relying on teacher observation or students' responses to an after-the-instruction electronic or paper survey.