Bonnie Gratch-Lindauer

Information literacy-related student behaviors: Results from the NSSE items

Bonnie Gratch-Lindauer is coordinator of library instructional services/information competency at the City College of San Francisco, e-mail: bgratch@ccsf.edu. © 2007 Bonnie Gratch-Lindauer

In the November 2005 C&RL News, the Institute for Information Literacy's College Students Surveys Project Group1 reported its activities and progress in developing information literacy-related items for inclusion on the 2006 National Survey of Student Engagement (NSSE) as experimental items. Ten items were included and administered to 12,044 students at 33 institutions on the 2006 NSSE. The purpose of these experimental items was to test them on a wide array of institutions and students, to determine whether the questions were sound and what lessons might be learned from them.

Frequency data were provided in late summer 2006, and the analysis and correlation findings were released to the project group members just in time for the 2007 ALA Midwinter Meeting in Seattle, where they discussed the nine tables of data and the observations made by Robert Gonyea (associate director and project manager at the Indiana University Center for Postsecondary Research) and formulated follow-up questions. After receiving answers to those questions, the group feels ready to share the findings. First, though, a brief review of the project group's activities and purpose for those unfamiliar with its work.

Project background

The project grew out of interest from some ACRL members in studying the national college student engagement surveys for items related to information literacy. The charge evolved to include an analysis of seven national standardized college student surveys. After initial investigation of the seven surveys, the project group decided to concentrate on one survey, the NSSE, and examine it in depth. Survey items on NSSE "represent empirically confirmed 'good practices' in undergraduate education. That is, they reflect behaviors by students and institutions that are associated with desired outcomes of college."2 At the 2005 ALA Midwinter Meeting, the project group's discussion centered on NSSE director George Kuh's suggestion to focus item development on student behaviors that contribute to what we define as information literacy, in addition to student interactions with librarians and their library experiences.

Since there was plenty of time to prepare the items for the 2006 NSSE, the project group members decided to seek broader input. This input was gathered through a six-month adapted Delphi process that polled library and information science educators and practitioners, yielding a ranked list of behaviors and activities that was reviewed at the project group's summer meeting. Members reviewed the findings from the field, along with the items then on the 2005 NSSE, to determine the final items for submission. The resulting items were reviewed and endorsed by the Executive Committee of the Institute for Information Literacy and the ACRL Executive Board. In mid-August, six new items were submitted to the NSSE staff for consideration. After NSSE staff review and revisions, four items were used on the 2006 NSSE for analysis (see sidebar).

Information literacy-related items used on the 2006 National Survey of Student Engagement

1. In your experience at your institution during the current school year, about how often have you done each of the following? (Response options included very often, often, sometimes, and never.)
a. Asked a librarian for help (in person, e-mail, chat, etc.)
b. Gone to a campus library to do academic research
c. Used your institution's Web-based library resources in completing class assignments

2. Which of the following have you done or do you plan to do before you graduate from your institution? (Response options included done, plan to do, do not plan to do, and have not decided.)
a. Participate in an instructional session led by a librarian or other library staff member
b. Participate in an online library tutorial

3. To what extent does your institution emphasize each of the following? (Response options included very much, quite a bit, some, and very little.)
a. Developing critical, analytical abilities
b. Developing the ability to obtain and effectively use information for problem-solving
c. Developing the ability to evaluate the quality of information available from various media sources (TV, radio, newspapers, magazines, etc.)

4. To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas? (Response options included very much, quite a bit, some, and very little.)
a. Evaluating the quality of information
b. Understanding how to ethically use information in academic work (proper citation use, not plagiarizing, etc.)
Findings

The findings are encouraging: overall, they support modest to high significant positive relationships between the two information literacy scales and eight scales derived from NSSE items, particularly for seniors' gains in practical competence and general education. Of course, none of these findings imply causality. However, as Gonyea explained:

    The correlations between the information literacy scales and the other NSSE measures are as good or nearly as good as other scales on NSSE. What this indicates is that all these behaviors and perceptions go together, as roughly the same students that use the campus library resources actively also report that they are "deep learners," "collaborative learners," and so on. The difficult question is to try to understand the unique contributions of engagement with learning information literacy and certain outcomes. In other words, if I were to put deep learning, active and collaborative learning, and student-faculty interaction into a regression model and control for them, would "active learning in information literacy" still have a strong significant contribution?3
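Gonyea's question maps onto an ordinary multiple regression. The sketch below shows one way such an analysis might be run; the CSV file and every column name are hypothetical stand-ins for student-level NSSE scale scores, which are not distributed in this form.

```python
# Hypothetical sketch: does the Active Learning in Information Literacy
# scale still predict an outcome once the established engagement scales
# are controlled for? File and column names are invented for illustration.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nsse_2006_scores.csv")  # hypothetical student-level extract

model = smf.ols(
    "gains_general_education ~ deep_learning"
    " + active_collaborative_learning"
    " + student_faculty_interaction"
    " + active_learning_info_literacy",
    data=df,
).fit()

# A positive, significant coefficient on active_learning_info_literacy,
# with the other scales held constant, would indicate a unique
# contribution beyond deep, active, and collaborative learning.
print(model.summary())
```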
The findings were reported in nine tables, which can only be briefly described here. The "Information literacy-related activities" sidebar lists the ten items by number and can be consulted when specific items are referenced in this article.4

Table 1, "Information Literacy Item Frequencies by Class Rank," reports the frequency with which first-year students and seniors engaged in various activities. Perhaps as expected, seniors report doing these activities a bit more often than first-year students and also report greater gains related to information literacy. Project group members were a bit disappointed, however, with the lower percentages for use of online tutorials and noted the need to consider revising the wording of that item, perhaps even consolidating it into the fourth item. A revised item dealing with instruction and online tutorials that would connect more directly with course work might be worded as:

Which of the following have you done or do you plan to do before you graduate from your institution?
a) Participate in a research skills instructional session conducted by a librarian or complete an online research skills tutorial connected to your course work.

Information literacy-related activities

Table 1 reports the frequency with which students engaged or planned to engage in specific information literacy-related activities.
1. How often: Asked a librarian for help (in person, e-mail, chat, etc.)
2. How often: Gone to a campus library to do research for a course assignment
3. How often: Used your institution's Web-based library resources when completing class assignments
4. Have done or plan to do before graduation: Participate in an instructional session led by a librarian or other library staff member
5. Have done or plan to do before graduation: Participate in an online library tutorial
6. Institutional emphasis: Developing critical thinking and analytical abilities
7. Institutional emphasis: Developing the ability to obtain and effectively use information for problem-solving
8. Institutional emphasis: Developing the ability to evaluate the quality of information available from various media sources (TV, radio, newspapers, magazines, etc.)
9. Institutional contribution: Evaluating the quality of information
10. Institutional contribution: Ethical use of information sources in academic work (proper citation use, not plagiarizing, etc.)

Table 2, "Information Literacy Scales—Reliability Statistics and Component Items," reports the items used to construct the two information literacy scales, along with the reliability statistics for each scale. The Active Learning in Information Literacy scale consists of the first three items listed in the sidebar, and the Institutional Emphasis and Contributions in Information Literacy scale is composed of items six through ten. The items asking about participation in an instructional session and in an online library tutorial were not included in either scale because they did not correlate well with the first three items.

Table 3, "Information Literacy Scales—Descriptive Statistics by Class Rank," provides the descriptive statistics for each scale, comparing first-year students to seniors. On the Active Learning in Information Literacy scale, where a mean of 2 is equivalent to an average response of "sometimes" and a mean of 3 to "often," the mean for first-year students is 2.3 and for seniors 2.5. On the Institutional Emphasis and Contributions in Information Literacy scale, the mean of 3 for first-year students and 3.1 for seniors is essentially equivalent to an average response of "quite a bit."
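For readers who want to replicate the Table 2 and Table 3 statistics with a local NSSE data file, the sketch below shows one way to form the two scales, compute Cronbach's alpha for each, and compare scale means by class rank. The CSV file and column names are hypothetical, and item responses are assumed to be coded 1 through 4 (never/very little = 1 up to very often/very much = 4).

```python
# Hypothetical sketch: build the two IL scales from item-level responses,
# report their reliability, and compare means for first-year students
# and seniors. File and column names are invented for illustration.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

df = pd.read_csv("nsse_2006_items.csv")  # hypothetical item-level extract

# Items 1-3 and 6-10 from the "Information literacy-related activities" sidebar
active_items = ["asked_librarian", "library_research", "web_resources"]
emphasis_items = ["critical_thinking", "use_information", "evaluate_media",
                  "evaluate_quality", "ethical_use"]

print("Active Learning in IL alpha:", cronbach_alpha(df[active_items]))
print("Emphasis/Contributions in IL alpha:", cronbach_alpha(df[emphasis_items]))

# Scale score = mean of component items, so a scale mean of 2 reads as
# "sometimes" and 3 as "often" (or "quite a bit" on the emphasis scale).
df["active_il"] = df[active_items].mean(axis=1)
df["emphasis_il"] = df[emphasis_items].mean(axis=1)
print(df.groupby("class_rank")[["active_il", "emphasis_il"]].mean().round(1))
```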
Table 4 shows the basic relationship between the two information literacy scales and eight other NSSE scales, including four of the five benchmarks of effective educational practice, a "deep learning" scale, and three scales that measure students' self-reported gains in knowledge, skills, and personal development. The results demonstrate modest to high positive significant correlations, which means "as scores increase in the information literacy scales they also increase in benchmarks and gains."5 The four NSSE benchmarks are: 1) level of academic challenge, 2) active and collaborative learning, 3) student interactions with faculty members, and 4) supportive campus environment.

The Deep Learning scale is composed of twelve items:

• worked on a paper or project that required integrating ideas or information from various sources,
• included diverse perspectives (different races, religions, genders, political beliefs, etc.) in class discussions or writing assignments,
• put together ideas or concepts from different courses when completing assignments or during class discussions,
• discussed ideas from your readings or classes with faculty members outside of class,
• discussed ideas from your readings or classes with others outside of class (students, family members, co-workers, etc.),
• synthesizing and organizing ideas, information, or experiences into new, more complex interpretations and relationships,
• analyzing the basic elements of an idea, experience, or theory, such as examining a particular case or situation in depth and considering its components,
• making judgments about the value of information, arguments, or methods, such as examining how others gathered and interpreted data and assessing the soundness of their conclusions,
• applying theories or concepts to practical problems or in new situations,
• examined the strengths and weaknesses of your own views on a topic or issue,
• tried to better understand someone else's views by imagining how an issue looks from his or her perspective, and
• learned something that changed the way you understand an issue or concept.

Table 4 shows that the highest positive correlations for the Institutional Emphasis and Contributions in Information Literacy scale are with the NSSE Gains in General Education scale (.67 for seniors) and the Gains in Practical Competency scale (.63 for seniors); a sketch of this computation follows the scale definitions below.

The Gains in Practical Competency scale is composed of five items:

• acquiring job or work-related knowledge and skills,
• working effectively with others,
• using computing and information technology,
• analyzing quantitative problems, and
• solving complex real-world problems.

The Gains in General Education scale is composed of four items:

• writing clearly and effectively,
• speaking clearly and effectively,
• acquiring a broad general education, and
• thinking critically and analytically.

The Gains in Personal and Social Development scale is composed of seven items:

• developing a personal code of values and ethics,
• understanding yourself,
• understanding people of other racial and ethnic backgrounds,
• voting in local, state, or national elections,
• learning effectively on your own,
• contributing to the welfare of your community, and
• developing a deepened sense of spirituality.
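Once the scale scores exist, a Table 4-style analysis is a straightforward correlation matrix. The following minimal sketch, again with hypothetical column names, computes the Pearson correlations between the two information literacy scales and the three gains scales for seniors.

```python
# Hypothetical sketch: correlate the IL scales with the gains scales
# for seniors, mirroring the Table 4 analysis. Column names are invented.
import pandas as pd

df = pd.read_csv("nsse_2006_scores.csv")  # hypothetical student-level extract
seniors = df[df["class_rank"] == "senior"]

il_scales = ["active_il", "emphasis_il"]
gains_scales = ["gains_general_education", "gains_practical_competency",
                "gains_personal_social"]

# .corr() returns the full Pearson correlation matrix; slice the IL rows
# against the gains columns (values like .67 and .63 would appear here
# for data comparable to the 2006 administration).
corr = seniors[il_scales + gains_scales].corr().loc[il_scales, gains_scales]
print(corr.round(2))
```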
Table 5, "Frequencies by Major for Seniors," illustrates that some majors show more frequent engagement or higher ratings. For example, the first two items show that arts and humanities majors are more actively engaged with the library, while business majors are less engaged. For the last item, which asks the extent to which their experience at their institution has contributed to their knowledge, skills, and personal development in the ethical use of information sources in academic work (proper citation use, not plagiarizing, etc.), seniors in the social sciences display the highest percentage of "quite a bit" and "very much" responses.

Table 6, "Information Literacy Item Frequencies by Living Arrangement (First-Year Students Only)," confirms the expectation that students living on campus use library resources more often, and Table 7, "Information Literacy Item Frequencies by Enrollment Status and Class Rank," reveals that full-time students are more engaged with library services and resources than part-time students.

Table 8, "Information Literacy Item Frequencies by Gender and Class Rank," shows only slightly higher scores for females, but nothing substantial. Two interesting exceptions appear among seniors. The data for the third item, using Web-based resources, show that female seniors are 12 percent more likely to do this very often. The last item shows a higher percentage of female seniors (51 percent marked "very much," compared to 42 percent of males) reporting that their experience at their institution contributed to their knowledge, skills, and personal development in the ethical use of information sources in academic work.

Table 9, "Information Literacy Item Frequencies by Race/Ethnic Status and Class Rank," illustrates that African American and Latino students report higher levels of engagement on the first three items than White students. One statistic noted in Gonyea's observations is that 31 percent of African American seniors marked "often" or "very often" on the question about how often they asked a librarian for help (in person, e-mail, chat, etc.). Totaling the percentages for "often" and "very often" yields 31 percent for African Americans and 18 percent for Whites, a fairly sizeable difference.
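The group comparisons in Tables 5 through 9 all reduce to "top-two-box" percentages of this kind. As a minimal sketch, assuming raw item responses and hypothetical column names, the Table 9 comparison could be computed like this:

```python
# Hypothetical sketch: share of seniors in each racial/ethnic group
# answering "often" or "very often" to the asked-a-librarian item.
# File and column names are invented for illustration.
import pandas as pd

df = pd.read_csv("nsse_2006_items_raw.csv")  # hypothetical raw responses
seniors = df[df["class_rank"] == "senior"]

top_two = seniors["asked_librarian"].isin(["often", "very often"])
pct = top_two.groupby(seniors["race_ethnicity"]).mean() * 100
print(pct.round(1))  # e.g., roughly 31% vs. 18% in the 2006 data
```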
Next steps

What lies ahead for these information literacy and library-use related items? NSSE staff have not yet decided, but Gonyea states that they will certainly keep them on the table when looking at new versions of the NSSE or NSSE modules, and he encourages librarians to consider including these items in their local administration of the NSSE. He also explained that "due to constraints on the length of the NSSE instrument, it's unlikely that all the info lit items would be incorporated on the core NSSE instrument. However, every four years or so we plan to do a thorough review of the items . . ." and that "some of the information literacy items could be considered for inclusion on the core NSSE instrument."6

Another option that NSSE staff are discussing is "the feasibility and utility of a modular approach by which additional survey items, tested and robust, could be selected by institutions and/or consortia to be included with their NSSE administration. This is also a possibility for the information literacy items."7 Perhaps the final option is to do further testing of these items by editing them somewhat and running them again as experimental items in 2008, possibly including a regression analysis of several benchmark scales with the Active Learning in Information Literacy scale. There may even be an opportunity to work with institutional colleagues. As Gonyea mentioned, "I'm working now with the writing-across-the-curriculum (writing program administrators), who are interested in testing some experimental items in 2008. There may be a connection to your work."8

The project group is interested in hearing your comments. You may also want to request that these items be included in an upcoming NSSE survey at your institution. A subgroup of community college members from the project team is currently working to identify items for possible inclusion on the Community College Survey of Student Engagement.

Notes

1. Current project group members are Bonnie Gratch Lindauer, chair; Lisa Janicke Hinchliffe; Kwasi Sarkodie-Mensah; Polly Boruff-Jones; Margit Watts; Scottie Cochrane; Ann Roselle; Troy Swanson; Ellen Sutton; and MaryAnn Sheble.

2. "NSSE Facts," nsse.iub.edu/html/quick_facts.cfm. The NSSE Web site provides a wealth of information and reports, including "Accreditation Toolkits," which map NSSE items to specific regional accreditation standards.

3. From a February 8, 2007, e-mail with Robert Gonyea.

4. If you would like a copy of the nine tables, request them from Bonnie Gratch Lindauer, bgratch@ccsf.edu.

5. "ILT Summary," attachment to a January 17, 2007, e-mail from Gonyea. This article's "Findings" section is based almost entirely on this document of his comments and observations on the nine tables.

6. February 8, 2007, e-mail with Gonyea.

7. January 27, 2006, e-mail with Gonyea.

8. February 8, 2007, e-mail with Gonyea.