Long-Range Effectiveness of Library Use Instruction

John Cornell Selegean, Martha Lou Thomas, and Marie Louise Richman

John Cornell Selegean is administrative analyst in the Office of Information and Systems Management, University of California-Irvine; Martha Lou Thomas is coordinator of library instruction in the reference department of the University of California Library-Irvine; and Marie Louise Richman is also an administrative analyst in the Office of Information and Systems Management, University of California-Irvine.

This study was funded by the University Research Grants for Librarians Program at the University of California, an operation instituted, monitored, and administered by the Librarians Association of the University of California (LAUC). The association's persistence in obtaining research monies made possible the pursuit of this inquiry. Additionally, the advice and assistance of Dennis J. Galligani, special associate in the Office of the Vice Chancellor, Academic Affairs, was invaluable throughout the project and is gratefully acknowledged.

If recent literature reviews are an indication, interest in bibliographic instruction is on the rise.1 However, most studies have been undertaken without sufficient emphasis on evaluating study outcomes.2 Werking suggested that the costs in dollars and staff time involved in full-scale evaluations were the main reasons for their absence from most library instruction studies.3 Brewer and Hills point out that "the absence of any generally accepted criteria perhaps helps to explain the trend in library instruction to favour evaluation according to relative standards."4

Given the current constraints on the financial resources available to higher education, the need for thorough program evaluation and justification techniques is becoming more, not less, important. Library instructional research funds are becoming less available for specific, local-impact programs and are increasingly granted to studies of broad-impact efforts such as standardized instructional evaluation techniques.5

Among library instruction studies that include evaluations, the evaluation efforts fall into one or more of three categories: opinion surveys, knowledge testing, and observation of actual library use.6 Two good examples of the use of observation for instructional evaluation can be found in works by Adams7 and by Kramer and Kramer.8 It is worth noting that Kramer and Kramer used aggregate library circulation records in place of direct observation in their attempt to correlate library use and freshman persistence at their institution. Their study is important in that it used an objective measure of library use (book loan records) rather than relying on data provided by multiple observers, as in the Adams investigation.

Opinion surveys have probably seen the most use in library instruction evaluation efforts.9 Studies by Lubans, Frick, Olevnik, King, and Person are representative of the range of opinion survey efforts in library instruction evaluation.10 The major drawbacks of opinion surveys are that the questions tend to reflect the biases of the instrument's developers and that the data generated do not measure the effectiveness of the instruction.11

The pretest/posttest paradigm is becoming more popular in bibliographic education research because it uses easily quantifiable, objective data to evaluate instructional effectiveness. Hughes and Flandreau used this technique to determine bibliographic information acquisition and retention among students at Berea College in Berea, Kentucky.12 Similarly, Wiggins, Frick, and Olevnik used the pretest/posttest research design in library instruction evaluation.13 One problem with most pretest/posttest studies is that the variables measured have very specific local application and cannot be generalized to other settings.
That is, one institution may evaluate subsequent student performance on such nuts-and-bolts tasks as card catalog reading ability, while another school may evaluate students on general knowledge of how to research a term paper. Both colleges would be evaluating library use instruction, yet their results are not directly comparable.

Another problem with most pretest/posttest library instruction studies is that evaluation is usually limited to short-term information retention. Thus, long-term retention of library instruction training, which may be a more telling indicator of program effectiveness, is not usually examined.

One recent study used a panel research design and multiple regression techniques to evaluate long-term library skills retention in students who took a library skills course.14 The study found that students who actively used the learned skills after the course had the best long-term skills retention. However, the study found no significant relation between library skills retention and SAT scores or eventual grade point averages.

The long-term skills retention study represents a step forward in library instruction evaluation methodology.15 The use of a measure not directly associated with a library course may provide a generalizability of results not usually available in library instructional evaluation efforts.

Hardesty et al. hinted that their statistically insignificant SAT-score and grade-point-average results might have been related to an "ecological fallacy" (other extraneous, uncontrolled variables).16 For instance, prior intellectual abilities (measured by SAT scores) were not matched for the library skills and control groups. This could have resulted in an inappropriate comparison between figurative apples and oranges instead of equivalent student groups.

Another study on the long-range effects of library use instruction on subsequent academic performance was done by P. S. Breivik.17 In that study, term paper writing scores and long-range course completion rates were found to be significantly improved for students participating in a library orientation course.

The current study was conducted to evaluate the impact of the "Biblio Strategy" course on eventual student academic success, as measured by grade point average, student persistence, and graduation rate. The specific hypotheses tested were that students completing the library instruction course would have statistically higher grade point averages at graduation or upon leaving UCI than students who did not take the course, and that the "Biblio Strategy" students would also have significantly higher persistence and graduation rates. An additional goal of this study was to develop an evaluation tool that could be applied to a broad range of library use instruction courses.
Such a tool could make comparisons between programs at different institutions much easier than has previously been possible.

METHODOLOGY

Course

"Biblio Strategy," a two-unit course for credit, has been offered each quarter at the University of California-Irvine (UCI) since spring 1974. Lectures on the organization of knowledge, the research process, and information resources are reinforced by assignments within the library. Completion of the course is marked by each student's compilation of an annotated bibliography of thirty citations on a subject of choice. The course is particularly recommended for those simultaneously taking classes in which a research paper is required. Enrollment in a single section of "Biblio Strategy" ranges from twelve to thirty students per quarter.

Subjects

The initial population consisted of 512 undergraduates who completed the library use course between fall quarter 1975 and spring quarter 1979. Of the 512 "Biblio Strategy" students, 278 who had no recorded SAT scores were dropped from the analysis, leaving 234 students in the final study sample. A control sample of 234 students who did not take the library instruction course was randomly selected by means of the SPSS utility SAMPLE.18

Three variables were used as criteria for the pairwise matching of the "Biblio Strategy" students and the members of the control group: college major, class level, and combined SAT scores. All matchings were done with data from the fall quarter of the academic year in which the "Biblio Strategy" member of each pair took the library instruction course. Students were matched exactly on college major (e.g., history majors were paired with history majors) and exactly on class level (e.g., freshmen with freshmen). Finally, student pairs were matched on combined SAT mathematics and verbal scores to within one standard deviation of each other.

Outcome Variables

Outcome variables were grade point average (measured on a 4-point scale), student persistence (in quarters of attendance after the course), and graduation rate. Grade point averages were obtained as of the end of spring quarter 1982 or when a student left UCI, whichever came first. Persistence was defined as the number of quarters a student remained at the university after the library use course was taken. Graduation was treated as a bipolar variable, with students either graduating or not graduating by the end of the spring 1982 quarter.

Data Analysis

Grade point averages and student persistence rates were analyzed using Student's t-tests for paired data.19 Graduation rate was analyzed using the chi-square statistic.20
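The matching criteria and the paired significance tests described above can be illustrated with a short script. The following is only a minimal sketch in Python with SciPy, not the SPSS procedure the authors actually used; the record layout, the match_control helper, and the sample values are hypothetical stand-ins for the study's 234 matched pairs.

```python
# Minimal sketch of the matched-pairs design described above; field names and
# sample records are hypothetical, and the original analysis used SPSS.
import statistics
from scipy import stats

def match_control(student, controls, sat_sd):
    """Return a control matched exactly on major and class level,
    and on combined SAT score to within one standard deviation."""
    for c in controls:
        if (c["major"] == student["major"]
                and c["level"] == student["level"]
                and abs(c["sat"] - student["sat"]) <= sat_sd):
            controls.remove(c)  # each control may appear in only one pair
            return c
    return None

# Hypothetical records; the real study used 234 course enrollees and controls.
biblio = [
    {"major": "History", "level": "FR", "sat": 960,  "gpa": 2.9, "quarters": 15, "grad": 1},
    {"major": "Biology", "level": "SO", "sat": 1010, "gpa": 3.1, "quarters": 12, "grad": 1},
    {"major": "English", "level": "FR", "sat": 890,  "gpa": 2.6, "quarters": 10, "grad": 0},
]
controls = [
    {"major": "History", "level": "FR", "sat": 1000, "gpa": 2.7, "quarters": 11, "grad": 1},
    {"major": "Biology", "level": "SO", "sat": 970,  "gpa": 2.8, "quarters": 12, "grad": 0},
    {"major": "English", "level": "FR", "sat": 910,  "gpa": 2.5, "quarters": 9,  "grad": 1},
]

# One standard deviation of combined SAT scores defines the matching tolerance.
sat_sd = statistics.stdev([s["sat"] for s in biblio + controls])
pairs = [(s, match_control(s, controls, sat_sd)) for s in biblio]
pairs = [(s, c) for s, c in pairs if c is not None]

# Paired t-tests for grade point average and persistence (quarters enrolled).
for var in ("gpa", "quarters"):
    t, p = stats.ttest_rel([s[var] for s, _ in pairs], [c[var] for _, c in pairs])
    print(f"{var}: t = {t:.2f}, two-tailed p = {p:.3f}")

# Chi-square test of graduation rate: graduated vs. not graduated, by group.
table = [
    [sum(s["grad"] for s, _ in pairs), sum(1 - s["grad"] for s, _ in pairs)],
    [sum(c["grad"] for _, c in pairs), sum(1 - c["grad"] for _, c in pairs)],
]
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"graduation: chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```

The point of pairing each course enrollee with a single matched control before testing is that the paired t-test then compares within pairs, setting aside differences in major, class level, and entering ability that would otherwise inflate the error variance.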
RESULTS

Mean variable values for study and match students can be found in table 1. These data provide a comparison between students who took the "Biblio Strategy" course and the matched control group.

TABLE 1
MEAN VARIABLE VALUES

Variable               Biblio Students    Match Group
SAT scores                  948.3             964.1
Grade point average          2.85              2.70
Quarters enrolled            14.1              11.2
Graduation rate             40.3%             56.5%

The statistical significance of the results is shown in table 2. As shown, the SAT score analysis indicated no significant difference between study and control subjects. This was expected, since the control group was selected specifically to match the study group. No analyses were done on college major or class level, since the control group was selected to match the study group exactly on these variables.

TABLE 2
STATISTICAL ANALYSIS RESULTS

Variable               Test          n of Pairs    df    Statistic Value    2-Tail Probability
SAT scores             t-test            234      233         1.54             p > 0.05
Grade point average    t-test            234      233         3.22             p < 0.01
Persistence rate       t-test            234      233         2.21             p < 0.05
Graduation rate        chi-square                   1         3.09             p > 0.05

Statistical analysis of the results indicated significant differences between study and control groups for grade point average and persistence rate, but no significant difference was found between the groups for graduation rate. Students who completed the library use course were found to have grade point averages an average of 0.15 point higher, and an average of 2.9 more quarters of attendance, than the match group.

DISCUSSION

All of the library instruction evaluation studies cited in the introduction found some positive relationship between the library use course and student performance or perceptions. Most of the studies, though, viewed the library use course impact as ending at the door to the library. Only a few of the recent studies investigated the broader implications of library use skills acquisition for later student academic performance.

Kramer and Kramer found that student use of the library correlated significantly with grade point average.21 Hardesty et al. were unable to demonstrate a significant relationship between library skills acquisition and academic performance improvements, possibly because of extraneous variables.22 The fact that the current study found a statistically significant improvement in library instruction students' performance relative to that of the match sample provides confirmation of Kramer and Kramer's results.

Kramer and Kramer also determined that students who used the library tended to remain in school longer than those who did not use the library.23 Similarly, Breivik found higher course completion rates for library instruction course enrollees.24 The present study, in finding that "Biblio Strategy" students stayed at the university significantly longer than their matched counterparts, again confirmed Kramer and Kramer's, as well as Breivik's, work.

The current investigation attempted to expand the study of long-term library use skills retention by using graduation rate as an additional indicator of instructional effectiveness. However, no significant difference was found between the "Biblio Strategy" and match groups on this variable. It is possible that the "Biblio Strategy" students, in remaining at the university longer than the match students, had artificially lowered their group graduation rate. It is also possible that the "Biblio Strategy" course, while influencing students enough to keep them at the university, might not have been enough by itself to retain students through to graduation.

The second goal of this study was to demonstrate the usefulness of long-range academic performance as a measure of the effectiveness of library instruction programs. The study found that a matched-pairs analysis of long-range student performance data was an effective tool, one that compensated for the shortcomings of previous library instruction evaluation techniques.
It controlled for certain forms of variance (i.e., preexisting academic ability as measured by SAT scores, differing fields of study, and class level) that have not been accounted for in other investigations. Additionally, the use of long-range academic performance as an indicator of instructional success eliminates the instructor effect that often biases student opinion survey results.

Another advantage of this evaluation technique is that archival student performance data are usually available at colleges and universities. The information is not subject to the differing interpretations generally associated with opinion survey results or single term paper grades; rather, it presents an overall picture of later student performance after library use instruction is completed.

However, the reader should be aware that this evaluation tool is not flawless. The matching control variables used here may not be the only significant contributors to academic performance. Other variables, one example being student employment while attending school, could also affect subsequent academic performance.

Further, this methodology is not meant to stand alone as a library instruction evaluation tool. It does not have the inherent sensitivity to assess the effectiveness of individual course components. It cannot, for instance, tell how well a student who took the library course uses the card catalog relative to students who did not take the course. It does not even tell how much more effectively library use instruction students use the campus library. What the methodology does point out is the apparent degree to which library use instruction benefits overall student academic performance. To the extent that this methodology provides an objective measure of the value of library use instruction, one that can be applied at many institutions of higher education, it is a useful evaluation tool.

REFERENCES

1. H. B. Rader, "Library Orientation and Instruction-1977: An Annotated Review of the Literature," in C. A. Kirkendall, ed., Improving Library Instruction: How to Teach and How to Evaluate (Ann Arbor: Pierian Pr., 1979); N. E. Gwinn, "Academic Libraries and Undergraduate Education: The CLR Experience," College & Research Libraries 41:5-16 (Jan. 1980); R. H. Werking, "Evaluating Bibliographic Education: A Review and Critique," Library Trends 29:153-72 (1980).
2. Werking, "Evaluating Bibliographic Education"; R. R. Johnson, "Library Instruction: The Mythology of Evaluation," in R. J. Beeler, ed., Evaluating Library Use Instruction (Ann Arbor: Pierian Pr., 1975); J. Lubans, Jr., "Assessing Library Instruction," in C. A. Kirkendall, ed., Directions for the Decade: Library Instruction in the 1980s (Ann Arbor: Pierian Pr., 1981).
3. R. H. Werking, "The Place of Evaluation in Bibliographic Education," in C. Oberman-Soroka, ed., Proceedings from the Southeastern Conference on Approaches to Bibliographic Instruction (Charleston, S.C.: College of Charleston Continuing Education Office, 1978).
4. J. G. Brewer and P. J. Hills, "Evaluation of Reader Instruction," Libri 26:55-66 (1976).
5. Gwinn, "Academic Libraries."
6. Werking, "Place of Evaluation."
7. M. Adams, "Effects of Evaluation on Teaching Methods," in C. A. Kirkendall, ed., Improving Library Instruction: How to Teach and How to Evaluate (Ann Arbor: Pierian Pr., 1979).
8. L. A. Kramer and M. B. Kramer, "The College Library and the Dropout," College & Research Libraries 29:310-12 (1968).
Kramer, "The College Library and the Dropout," College & Research Li- braries 29:310-12 (1968). 9. Werking, "Evaluating Bibliographic Education." 10. J. Lubans, Jr., "Evaluating Attempts of Library Use Instruction Programs at the University of Col- orado Libraries," in R. J. Beeler, ed., Evaluating Library Use Instruction (Ann Arbor: Pierian Pr., 1975); E. Frick, "Evaluating Student Knowledge of Facilities at the University of Colorado, Colo- • rado Springs," in C. A. Kirkendall, ed., Improving Library Instruction: How to Teach and How to Eval- uate (Ann Arbor: Pierian Pr., 1979); P. P. Olevnik, "Evaluation as a Tool for Program Develop- ment," in C. A. Kirkendall, ed., Improving Library Instruction: How to Teach and How to Evaluate (Ann Arbor: Pierian Pr ., 1979); D. N. King and J. C. Ory, "Effects of Library Instruction on Stu- dent Research: A Case Study," College & Research Libraries 42:31-37 (Jan. 1981); R. Person, "Long- Term Evaluation of Bibliographic Instruction: Lasting Encouragement," College & Research Li- braries 42:19-25 (Jan . 1981). 11. Johnson, "Library Instruction." 12. P. Hughes and A. Flandreau, "Tutorial Library Instruction: The Freshman Program at Berea Col- lege," Journal of Academic Librarianship 6, no. 2:91-94 (1980). 13. M. E. Wiggins, ''Evaluation in the Instructional Psychology Model,'' in R. J. Beeler, ed., Evaluating Library Use Instruction (Ann Arbor: Pierian Pr., 1975); Frick, "Evaluating Student Knowledge"; Olevnik, "Evaluation as a Tool." 14. L. Hardesty; N. P. Lovrich, Jr.; and J. Mannon, "Library-Use Instruction: Assessment of the Long-Term Effects," College & Research Libraries 43:38-46 (Jan. 1982). 15 . Ibid. 16. Ibid. 17. P. S. Breivik, "Brooklyn College: A Test Case," in Open Admissions and the Academic Library (Chi- cago: American Library Assn ., 1977). 18. N. H. Nie and others, Statistical Package for the Social Sciences (2d ed.; New York: McGraw-Hill, 1975). 19. J. P. Guilford and B. Fruchter, Fundamental Statistics in Psychology and Education (6th ed.; New York: McGraw-Hill, 1978). · 20. Ibid . 21 . Kramer and Kramer, "The College Library." 22. Hardesty, Lovrich, Mannon, "Library Use Instruction." 23 . Kramer and Kramer,'' The College Library.'' 24. Breivik, "Brooklyn College." Once you concentrate your overseas subscriptions with Swets you are assured not only of fine service but also a wealth of subscription information: • customer subscription reports (CSR) • checklists • subject searches • quotations • price analysis reports • Swets info bulletin • microfiche catalog and much more. See the Information Package. What do we charge for it? NOTHING. Since Swets introduced these accurate features our modest handling charge (if any) has become lower than before. Ask for the Information Package and learn how reliable subscription. service and an excellent information support system resulted from close cooperation between library know-how people and computer experts at Swets. Swets North Ameri ca Inc. P. 0. Box 517 Berwyn Pa 19312 USA e Yes I d •=REE Swets tntarma=ka=di= call Swets I free 880-428 -1515 (in Pennsylvania 800-453 -1515) 0 Have a Swets representative contact me I I ~;;:~;:P~rs t I Library Address _ _______________ _ I City State _____ _ Zip-code Country j Swets Subscription Servi ce L.::.; P.O. Boxa3o Phone( -----2160 SZ Lisse The Nether lands Information • Is a resource Librarians play a vital role in paving the way toward a more well- informed society. 