College and Research Libraries

Evaluating the Effectiveness of a Concept-based Computer Tutorial for OPAC Users

Joan M. Cherry, Weijing Yuan, and Marshall Clinton

In an experiment to evaluate the effectiveness of a concept-based computer tutorial for training OPAC users, the authors found that University of Toronto students who viewed the tutorial performed significantly better on search tasks than those who received no instruction. This paper reports the results of a second experiment to investigate the effectiveness of the computer tutorial. Fifteen students viewed the computer tutorial. Another fifteen students served as a control group and did not receive any instruction. The results showed no significant differences in performance between the students who viewed the computer tutorial and those who did not receive any instruction. We discuss the differences between the results of the two experiments in terms of the characteristics of the participants and the characteristics of the OPACs. We also relate our findings to the findings of other studies on concept-based instruction and offer suggestions for future research.

Joan M. Cherry is Associate Professor in the Faculty of Library and Information Science, Weijing Yuan is a doctoral candidate at the Faculty of Library and Information Science, and Marshall Clinton is Director of Information Technology Services at the University of Toronto Library, Toronto, Ontario, M5S 1A1, Canada. The development of the instructional software was supported by Apple Canada and the Research Board of the University of Toronto. The experiments to test the effectiveness of the software as an educational vehicle were supported by a grant from the Council on Library Resources, Washington, D.C. The statements made and the views expressed in this paper are the responsibility of the authors. The following individuals worked on the development of the software: Don Gibson, John Bradley, Geoffrey Rockwell, Computing Services; Sophia Kaszuba, Science and Medicine Library; James Turner, Faculty of Library and Information Science. The authors thank Lari Langford, Head, Information Centre, Sigmund Samuel Library, for her help in recruiting participants for this study.

Several authors have recently advocated a move toward concept-based instruction and away from procedure-based instruction in bibliographic instruction. The basic characteristics of each type of instruction are shown in table 1.

TABLE 1
CHARACTERISTICS OF CONCEPT-BASED INSTRUCTION AND PROCEDURE-BASED INSTRUCTION

Concept-based
• Presents a conceptual model of the system.
• Focuses on how the system (and others of its type) works.
• Focuses on system-independent skills.

Procedure-based
• Presents procedures for doing tasks with the system at hand.
• Focuses on the mechanics of operating the system at hand.
• Focuses on system-dependent skills.

In the context of online public access catalogs (OPACs), concept-based instruction emphasizes the general organizing and searching principles in OPACs rather than specific procedures/commands/steps for doing searches on a particular OPAC. Katherine Branch, Joan K. Lippincott, and Linda Brew MacDonald et al. have discussed the concepts that might be included in this type of instruction for OPAC users.1,2,3 These include:
• Principles of database organization: what a database is; the structure of a bibliographic record; searchable fields; indexing; keywords; descriptors; controlled vocabulary; freetext searching; Boolean logic.
• Problem analysis: division of a topic into components to develop a search strategy.
• Evaluation of search output: precision, recall; limiting or broadening search.
These concepts are universal and apply to all OPAC systems.
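To make the first and last of these concepts concrete, the short sketch below shows how keyword indexing and Boolean AND/OR logic might operate over a handful of bibliographic records. It is an illustration only, not part of the tutorial described later in this paper; the records, field names, and functions are invented for the example, and the language (Python) is simply a convenient modern notation.

```python
# A minimal sketch of keyword indexing and Boolean searching over bibliographic
# records. The records and field names are invented for illustration; no real
# OPAC works exactly this way.

records = [
    {"title": "Solar energy fundamentals", "author": "Lee", "subjects": ["solar energy"]},
    {"title": "Wind and solar power systems", "author": "Patel", "subjects": ["renewable energy"]},
    {"title": "A history of the card catalog", "author": "Gordon", "subjects": ["library catalogs"]},
]

def keywords(record):
    """Index every word drawn from the searchable fields of a record."""
    text = " ".join([record["title"], record["author"]] + record["subjects"])
    return {word.lower().strip(",.") for word in text.split()}

def search(query_terms, operator="AND"):
    """Return titles matching all (AND) or any (OR) of the query terms."""
    terms = {t.lower() for t in query_terms}
    hits = []
    for record in records:
        kw = keywords(record)
        if (operator == "AND" and terms <= kw) or (operator == "OR" and terms & kw):
            hits.append(record["title"])
    return hits

print(search(["solar", "energy"]))          # AND: both terms required
print(search(["solar", "catalog"], "OR"))   # OR: either term suffices
```

A search on solar AND energy retrieves only records indexed under both terms, while solar OR catalog retrieves any record indexed under either; narrowing with AND trades recall for precision, which is the trade-off referred to above under evaluation of search output.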
RESEARCH ON CONCEPT-BASED/MODEL-BASED INSTRUCTION

In a study conducted by Frank G. Halasz and Thomas P. Moran, four participants were trained to use an electronic calculator with a conceptual model of the system, while three users were not provided with the model-based training.4 The participants performed three types of tasks: routine tasks, invention tasks, and combination tasks. On the routine tasks, the no-model users were 40 percent faster than the model participants. On the invention tasks, the model users performed considerably better than the no-model users, using commands more efficiently and requiring fewer attempts to arrive at solutions. On the combination tasks the no-model users were slightly faster.

Christine L. Borgman trained thirty-two undergraduate students to use an online catalog.5 Half received model-based training; half received procedure-based training. The students performed five simple tasks and five complex tasks. On the simple tasks there was no significant difference in performance between those who received the model-based and procedure-based training. On the complex tasks those who received the model-based training performed significantly better (p = .08). Similarly, there were no significant differences in usage patterns, as indicated in the transaction monitoring data, between the groups on the simple tasks. There were, however, significant differences between the two groups on the complex tasks. Borgman commented that the "results were not as strong as we had hoped." She noted that a less sophisticated sample might have benefited more from the conceptual models provided.6

Piraye Bayman and Richard E. Mayer conducted a study in which novices learned BASIC computer programming.7 Some received conceptual instruction; others received standard instruction. The study included high-ability and low-ability subjects, as measured by the Scholastic Aptitude Test (SAT). Results of a programming test indicated that the conceptual instruction enhanced problem-solving performance for low-ability subjects but not for high-ability subjects.

POTENTIAL BENEFITS OF CONCEPT-BASED TRAINING

Theresa L. Wesley provides a useful summary of the benefits of concept-based instruction which have been suggested by previous research.8 The benefits of concept-based instruction fall into three categories: user performance, learning transfer, and judgment.

Improved User Performance
Users will:
• perform better on advanced or nonroutine tasks;
• have less trouble extricating themselves from errors;
• be better able to make inferences and predictions;
• be better able to structure searches and interpret results.

Learning Transfer
Users will:
• be able to apply their knowledge to new situations;
• be better able to move from one vendor's system to another, or to deal with a new release of an existing system;
• be better able to use other related systems (e.g., users who receive concept-based OPAC training will be better able to use end-user online systems, CD-ROM products, etc.).

Judgment
• Users will understand the limitations of the system.
Elizabeth Frick and Mary M. Huston discuss the merits of concept-based instruction.9,10 Both refer to Christine Borgman's comment that while mastery of the mechanical aspects of searching may insure some results from the system, it is only when the conceptual aspects are understood that users can exploit the system fully.

Unfortunately, much of the OPAC instruction provided by libraries has not been concept-based. Frick refers to Nowakowski's mid-1980s survey of seventy-two Canadian academic and public libraries using OPACs.11 Nowakowski concluded that the use of different systems, each with its own idiosyncrasies, had resulted in libraries concentrating on teaching the user how to use their system rather than giving them skills which could translate to other systems. Giving users skills that can transfer to other systems seems even more important today since gateways such as the Internet have increased access to systems.

A CONCEPT-BASED COMPUTER TUTORIAL FOR OPAC USERS

The authors have developed a computer tutorial to deliver concept-based training in the use of OPACs. The tutorial runs on stand-alone microcomputers in the library. The software was developed by a team that included two of the authors of this paper, staff from the University of Toronto Computing Services, a computer graphics consultant, a professional librarian, and several graduate students in the Faculty of Library and Information Science. The software runs on a Macintosh SE and was produced with HyperCard.

The computer tutorial aims at the naive user. The system consists of modules for various aspects of OPAC searching. Each module is intended for use in linear fashion. However, users may view the modules in any sequence and may exit a module at any point. The tutorial incorporates graphics, sound, and animation, as well as digitized video clips and speech. Throughout the system these are usually used to provide additional information content, or to reinforce concepts being explained in the tutorial.

The tutorial presents concepts in such a way that they apply to any online catalog. The general organizing and searching principles in online catalogs are the focus of the content. Analogies used to convey the intellectual content include everyday items such as the telephone book and the traditional card catalog. The instructional design of the software is described in detail in an earlier paper.12 The design of the interface of the software is also discussed in another paper.13

In an earlier experiment to evaluate the effectiveness of the concept-based computer tutorial we found that students who viewed the computer tutorial performed significantly better on search tasks than those who received no instruction, and performed as well as students who received the standard classroom lecture provided by the University of Toronto Library.14 The search tasks were performed on the Utlas T/Series-50 OPAC at the University of Toronto. These results were encouraging because the ability to deliver instruction effectively through computer tutorials offers potential benefits to both libraries and their users.
For libraries, it offers the benefit of reduced staff time for classroom instruction; for users it offers an opportunity to learn independently, a learning style reported by many OPAC users.15

Concept-based computer tutorials offer two additional advantages for library users and libraries. First, the concepts learned from the tutorials should be transferable; for example, if the tutorial explains OPAC concepts, users should be more proficient with any OPAC. Second, once developed, the computer tutorials could be used in a variety of libraries, thus providing an opportunity for a library to benefit from development work undertaken at another institution. The experiment reported in this paper was designed to evaluate the effectiveness of our concept-based computer tutorial for training students to use a different OPAC.

METHODOLOGY

Design

Thirty students participated in the experiment, which consisted of a pretest and a task session, conducted with one participant at a time. In the pretest, participants completed a background questionnaire and a library skills assessment. In the task sessions, participants were randomly assigned to one of two conditions: to view the computer tutorial, or to receive no instruction. All participants used the Data Research Associates (DRA) OPAC at Tufts University to answer nine questions. We selected this OPAC because the University of Toronto Library had recently contracted to install the DRA system and we could access the OPAC at Tufts University through Telnet. Participants in the No Instruction condition were given an opportunity to use the tutorial at the end of the session. The study was conducted in January and February of 1992 in a research laboratory at the Faculty of Library and Information Science, University of Toronto.

Participants

One-third of the participants were undergraduate students and seventeen percent were graduate students; they were recruited through advertisements on the University of Toronto campus. Most of the others (43 percent) were students enrolled in the preuniversity program at the university. Their curriculum includes a component on use of library resources and the development of bibliographic skills. The head of the Information Centre at the Sigmund Samuel (undergraduate) Library recruited these participants while they were on a required tour of the library.

Participation in the experiment was voluntary. Students received $10 for their participation. As with all studies that use volunteers, the participants may differ in some ways from those who did not volunteer. For instance, the participants may have been more motivated than those who did not volunteer.

The Pretest

The pretest included two parts: a background questionnaire and a library skills assessment. (A copy of the pretest is available from the authors.) Participants were given twenty minutes to read a one-page description of the study and to complete the pretest.

The Task Sessions

The task sessions were held immediately after the pretest. The participants were randomly assigned to the No Instruction condition or the Computer Tutorial condition.

Computer Tutorial Condition. The students in the Computer Tutorial condition were given twenty minutes to view the tutorial. The tutorial included only conceptual information.
Since time for viewing the tutorial was limited, participants were told that the Author Searching, Title Searching, and Subject Searching modules of the tutorial were most relevant for the experiment, and the observer suggested that they focus on these sections. We limited the tutorial viewing time to twenty minutes so that we could compare the results of this experiment with those from the first experiment, where participants in the Classroom Lecture group received a twenty-minute lecture by a librarian, and participants in the Computer Tutorial group spent twenty minutes viewing the tutorial. Each student in the Computer Tutorial group was tested immediately following the twenty-minute instruction period.

No Instruction Condition. Students who were assigned to the No Instruction condition did not receive any instruction prior to doing the searches in the task session. They were given an opportunity to view the computer tutorial after completing the task session.

Search Tasks. The participants worked on nine search tasks that also were used in the first experiment. (The list of search tasks is available from the authors.) These tasks were modeled on search tasks used in an earlier study by Mary Ellen Larson and Dace Freivalds.16 Participants were given twenty minutes to do the search tasks. The search tasks were performed on the Tufts University OPAC, a Data Research Associates (DRA) system.

RESULTS

Background of Participants

The undergraduates, graduate students, and preuniversity program students were evenly distributed across the Computer Tutorial and No Instruction groups. We asked a number of questions about the participants' prior experience with library catalogs and instruction in their use. These data are shown in table 2.

TABLE 2
DISTRIBUTION OF PARTICIPANTS BY PRIOR CATALOGUE EXPERIENCE

                                                  % All    % Computer   % No
                                                  (N=30)   Tutorial     Instruction   chi-sq   df   p
                                                           (N=15)       (N=15)
Used University of Toronto OPAC before (N=30)      83.3      86.7         80.0         0.240    1   .624
Used other OPAC (N=30)                             63.3      66.7         60.0         0.144    1   .705
Received OPAC instruction (N=30)                   40.0      33.3         46.7         0.556    1   .456
Received card catalog instruction (N=30)           30.0      40.0         20.0         1.429    1   .232
Percentage represents the proportion of "yes" responses to the items. Missing values were excluded.

As can be seen from table 2, over 80 percent of the participants had used the University of Toronto OPAC before, over 60 percent had used another OPAC, and 40 percent had received OPAC instruction. Table 2 also shows that there were no significant differences between the two groups on these variables; thus the groups appeared to be comparable.

Participants were asked to indicate how many times they had used the University of Toronto OPAC: Never, 1-10 times, 11-20 times, 21-30 times, 31-40 times, or More than 40. Table 3 summarizes the participants' responses to this question.
TABLE 3
PREVIOUS USAGE OF UNIVERSITY OF TORONTO OPAC (UTLAS T/SERIES-50)

                        % All    % Computer   % No
                        (N=30)   Tutorial     Instruction
                                 (N=15)       (N=15)
Never                    16.7      13.3         20.0
1-10 times               56.7      60.0         53.3
11-20 times               6.7      13.3
21-30 times               3.3       6.7
31-40 times              10.0      20.0
More than 40 times        6.7       6.7          6.7

As can be seen from table 3, most of those who had used the University of Toronto OPAC had used it ten times or fewer. The groups did not differ significantly with respect to previous usage of the University of Toronto OPAC (chi-sq = 6.259, df = 5, p = .282).

We were also interested in the types of software the participants had used. Table 4 summarizes the participants' experience with computers. There were no significant differences between the groups on any of the six variables.

TABLE 4
COMPUTER EXPERIENCE OF PARTICIPANTS

                               % All    % Computer   % No
                               (N=30)   Tutorial     Instruction   chi-sq   df   p
                                        (N=15)       (N=15)
CD-ROM databases                30.0      33.3         26.7         0.159    1   .690
Database management systems     23.3      13.3         33.3         1.677    1   .195
Electronic spreadsheets         30.0      33.3         26.7         0.159    1   .690
Video games                     53.3      46.7         60.0         0.536    1   .464
Word processors                 73.3      80.0         66.7         0.682    1   .409
Own a computer                  43.3      33.3         53.3         1.222    1   .269
Percentage of participants who have used each type of software.

Library Skills Assessment

The maximum score possible on the Library Skills Assessment was 39. The mean score for the thirty participants was 28.6. Scores ranged from 12 to 39, with a standard deviation of 8.34. The means for the pretest scores are shown in table 5. A t-test indicated that the difference between the scores of the two groups was not significant (t = 0.828, p = 0.415).

TABLE 5
PRETEST SCORES

All (N=30)                  28.6
Computer Tutorial (N=15)    27.3
No Instruction (N=15)       29.9
Mean number of questions answered correctly.

The questions which were most frequently answered incorrectly are shown in table 6. The pattern is similar to that found in the first experiment. These data again suggest that students need instruction on sequencing Library of Congress call numbers and distinguishing between citations to books and journal articles.

TABLE 6
QUESTIONS THAT WERE ANSWERED INCORRECTLY MOST FREQUENTLY

Question                                                                                 %
A book with this call number (L8601.B89) would be placed on the shelf: (#21)             46.4
A summary of the contents of an article, book, or other material: (#5)                   43.3
A book with this call number (L1010.012) would be placed on the shelf: (#20)             39.3
In order to determine how thoroughly a topic is covered in a book, look the topic
  up in the: (#29)                                                                       36.0
Which of the entry numbers are for magazine or journal articles about Davies? (#22)
The part of the book that gives the name of the author, the name of the book, the
  publisher, and the date of publication is the: (#24)                                   29.6
Items in great demand that are available for limited loan periods in a special
  section of the library: (#6)                                                           27.6
How many articles are listed under the subject "solar energy"? (#30)                     21.1
Which of the following entry numbers are for books or parts of books about
  Davies? (#23)                                                                          20.0
(N = 30) Percentage represents the proportion of incorrect responses to the questions. Missing values were excluded.

Performance on the Search Tasks

The number of search tasks completed successfully was the measurement of performance. The maximum score possible was 9. The mean score for the thirty participants was 5.2. Scores ranged from 2 to 9, with a standard deviation of 2.46. The mean scores of the two groups are shown in table 7. Although those who received the computer tutorial scored slightly higher (5.3 versus 5.1), a t-test indicated that the difference between the scores of the two groups was not significant (t = 0.293, p = 0.772).

TABLE 7
POSTTEST SCORES: SEARCHES ON TUFTS UNIVERSITY OPAC (DRA)

All (N=30)                  5.2
Computer Tutorial (N=15)    5.3
No Instruction (N=15)       5.1
Mean number of questions answered correctly.
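The group comparisons reported above can be verified with standard statistical software. The sketch below is illustrative rather than the authors' analysis code: it recomputes the first chi-square value in table 2, with cell counts reconstructed from the reported percentages (86.7 percent of 15 participants is 13 "yes" responses in the Computer Tutorial group; 80.0 percent of 15 is 12 in the No Instruction group). The t-tests reported for the pretest and search-task scores require the raw scores, which are not reproduced in this paper.

```python
# Illustrative only: recompute the uncorrected Pearson chi-square for the first
# row of table 2 ("Used University of Toronto OPAC before") from counts
# reconstructed out of the reported percentages.
from scipy.stats import chi2_contingency

observed = [[13, 2],   # Computer Tutorial: 13 yes, 2 no (86.7% of 15)
            [12, 3]]   # No Instruction:    12 yes, 3 no (80.0% of 15)

# correction=False matches the uncorrected statistic reported in the table.
chi2, p, df, expected = chi2_contingency(observed, correction=False)
print(f"chi-sq = {chi2:.3f}, df = {df}, p = {p:.3f}")  # chi-sq = 0.240, df = 1, p = 0.624
```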
Evaluation Questionnaire

After doing the searches in the posttest, participants completed an evaluation questionnaire (available from the authors). It included a question which asked how confident they were that they would be able to use the OPAC in the future, and a question which asked how much the instruction received (the computer tutorial) had helped them to answer the search questions in the experimental session. The data for these questions are presented in table 8. In constructing this table we excluded observations where participants did not respond to the question.

TABLE 8
EVALUATION OF INSTRUCTION

                              % Strongly   %          % No       %        % Strongly
                              Disagree     Disagree   Opinion    Agree    Agree
I am confident that I can use the computerized catalog to identify and locate materials in the library.
  Computer tutorial (N=15)                 20.0       13.3       46.7     20.0
  No instruction (N=15)       13.3          6.7        6.7       66.7      6.7
The instructions I received in the use of the computerized catalog helped me to answer the questions in this exercise.
  Computer tutorial (N=15)                 20.0        6.7       66.7      6.7

Table 8 shows the percentage of participants in each group who felt confident that they could use the OPAC to identify and locate library materials, and the opinions of those who viewed the computer tutorial on whether the instruction they had received had helped them to answer the questions in the exercise. As we can see from table 8, in both groups a majority of participants were confident that they could use the OPAC (chi-sq = 4.86, df = 4, p = .302). Most reported that the instruction had helped them to answer the questions in the exercise, although their performance on the search tasks was not significantly better than that of those who had not received any instruction.
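As a companion to the sketch following table 7, the illustration below recomputes the chi-square value for the confidence ratings in table 8. Again the counts are reconstructed from the reported percentages (13.3 percent of 15 participants is 2, and so on); because the comparison is a 2 x 5 contingency table, the test has four degrees of freedom. The code is not the authors' own and is included only to make the calculation explicit.

```python
# Illustrative only: recompute the chi-square comparison of confidence ratings
# in table 8 from counts reconstructed out of the reported percentages.
# Columns: Strongly Disagree, Disagree, No Opinion, Agree, Strongly Agree.
from scipy.stats import chi2_contingency

confidence = [[0, 3, 2, 7, 3],    # Computer Tutorial (N = 15)
              [2, 1, 1, 10, 1]]   # No Instruction (N = 15)

chi2, p, df, expected = chi2_contingency(confidence)
print(f"chi-sq = {chi2:.2f}, df = {df}, p = {p:.3f}")  # chi-sq = 4.86, df = 4, p = 0.302
print(expected.min())  # several expected cell counts fall below five (see Transaction Logs)
```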
Transaction Logs

During the experimental sessions, we recorded the interaction between the participant and the OPAC, i.e., the entire search process, using communications software. Using the transaction log data, we compared the behavior of the two groups with respect to types of searches performed, search results, navigation and browsing, and problems encountered, as reflected by error messages. These data are shown in table 9. The differences between the two groups on these measures were not significant.

TABLE 9
COMPARISON OF DATA FROM TRANSACTION LOGS

                      Computer Tutorial   No Instruction
                      (N=15) Mean         (N=14) Mean
Author searches            4.3                 5.6
Title searches             6.2                 4.8
Subject searches           4.3                 3.6
Keyword searches           0.5                 0.4
Zero-hit searches          4.9                 4.8
Records retrieved         27.4                23.9
Browse/navigation         24.5                21.3
Help requests              0.6                 1.1
Error messages             4.1                 2.5

In some of the chi-square analyses, some cell expected frequencies are less than five. Traditionally, statisticians have recommended that expected frequencies should be equal to or greater than five; however, some statisticians now believe that this is not necessary.17,18

SUMMARY AND DISCUSSION

The results of the first experiment to test the concept-based computer tutorial showed that those who viewed the tutorial performed significantly better on search tasks than those who received no instruction. However, the results of the experiment reported in this paper showed that students who received no instruction performed as well as those who received the computer tutorial. In this section we discuss the differences between the results of the two experiments in terms of the characteristics of the OPACs and the participants. We also relate our findings to the findings of other studies on concept-based instruction.

The main difference between the two experiments was the OPAC used. In the first experiment the OPAC was the Utlas T/Series-50 system. In the second experiment the OPAC was the Data Research Associates (DRA) system. The participants completed more of the tasks on average in the second experiment than in the first (5.1 vs. 3.4). This may indicate that the DRA OPAC is easier to learn and use than the Utlas T/Series-50 OPAC, and that instruction (at least of the type provided in the computer tutorial) is not necessary. However, it may be because of differences in the task sessions. In the first experiment, the task sessions were conducted with groups, whereas in the second experiment, the sessions were conducted with one participant at a time. The participants in the second experiment may have been more motivated to perform the tasks.

The results of the first experiment also showed that a significantly greater percentage of those who viewed the tutorial were confident that they could use the OPAC than of those who received no instruction. In contrast, the results of the second experiment showed that as large a percentage of students who received no instruction were confident that they could use the OPAC. This may also be because DRA was easier for the participants to learn and use and it inspired greater confidence in them.

However, the differences in the results of the two experiments may also be due to differences in the characteristics of the participants. There were more experienced OPAC users in the second experiment than in the first. In addition, more participants in the second experiment had used CD-ROM databases, database management systems, and electronic spreadsheets.
Thus, participants overall in the second experiment had a higher level of computer literacy than those in the first experiment. This could explain why the participants in the second experiment completed more tasks than the participants in the first experiment. It may also account for the lack of effect of the concept-based instruction. It would suggest that those with higher levels of computer literacy (especially with search software) may benefit less (or not at all) from concept-based instruction, such as that provided in the computer tutorial. This would be consistent with the findings of previous studies that have shown that students with "lower ability" benefit more from concept-based training. For example, in a study of computer programming, Bayman and Mayer found that students of lower ability benefited more from concept-based instruction than those of higher ability (as measured by SAT scores).19 Similarly, in discussing a study of OPAC instruction conducted at Stanford University where the results were not as strong as she had hoped, Borgman commented that a less sophisticated sample might have benefited more from the conceptual models provided.20

FUTURE RESEARCH

Future research should include studies to further investigate the relationship between user experience and the impact of concept-based instruction. For example, this experiment could be repeated using a homogeneous group of less experienced university students, or the experiment could be repeated in a public library setting or a high school setting. In such studies, researchers should carefully control for experience so that they can explore the relationships between various types of computer knowledge and bibliographic skills and the impact of concept-based computer tutorials. Future studies could also compare the concept-based computer tutorial to other forms of instruction; for example, it could be compared with the online tutorial (UTLearn) implemented at the University of Toronto for the newly installed OPAC (Data Research Associates). In such a study the form of delivery (via computer) of the instruction would be the same, but the content of the instruction would differ in that UTLearn is not restricted to concept-based material. In any future studies we suggest that researchers include more difficult tasks to enable them to detect any benefits that might relate only to performance on complex tasks.

REFERENCES AND NOTES

1. Katherine Branch, "Developing a Conceptual Framework for Teaching End User Searching," Medical Reference Services Quarterly 5 (Spring 1986): 71-76.
2. Joan K. Lippincott, "End-User Instruction: Emphasis on Concepts," in Conceptual Frameworks for Bibliographic Education: Theory into Practice, ed. Mary Reichel and Mary Ann Ramey (Littleton, Colo.: Libraries Unlimited, 1987), 183-91.
3. Linda Brew MacDonald, Mara R. Saule, Margaret W. Gordon, and Craig A. Robertson, Teaching Technology in Libraries: A Practical Guide (Boston, Mass.: G.K. Hall, 1991), 125-26.
4. Frank G. Halasz and Thomas P. Moran, "Mental Models and Problem Solving in Using a Calculator," in CHI '83: Human Factors in Computing Systems, Boston, Dec. 12-15, 1983. Proceedings (New York: Association for Computing Machinery, 1984), 212-16.
5. Christine L. Borgman, "The User's Mental Model of an Information Retrieval System: An Experiment on a Prototype Online Catalog," International Journal of Man-Machine Studies 24 (1986): 47-64.
6. Ibid.
7. Piraye Bayman and Richard E. Mayer, "Using Conceptual Models to Teach BASIC Computer Programming," Journal of Educational Psychology 80 (1988): 291-98.
8. Theresa L. Wesley, "Instructional Program Design: A Re-examination in Light of New OPAC Demands," Technicalities 11 (Mar. 1991): 9-11.
9. Elizabeth Frick, "Theories of Learning and Their Impact on OPAC Instruction," Research Strategies 7 (Spring 1989): 67-78.
10. Mary M. Huston, "Toward Contextual Sensitivity: Approaches to End User Instruction in the USA," Electronic Library 7 (June 1989): 164-67.
11. Frick, "Theories of Learning and Their Impact on OPAC Instruction," 67-78.
12. Joan M. Cherry, James Turner, and Marshall Clinton, "Online Public Access Catalogues (OPACs): Design of Instructional Software for User Training," in 1990 ASIS Annual Meeting. Proceedings, 143-50.
13. Joan M. Cherry, Geoffrey M. Rockwell, and James M. Turner, "Designing for Diversity: The User Interface for a Hypermedia Information System on a University Campus," Behaviour and Information Technology 11 (1992): 1-12.
14. Joan M. Cherry and Marshall Clinton, "An Experimental Investigation of Two Types of Instruction for OPAC Users," Canadian Journal of Information Science 16 (Dec. 1991): 2-22.
15. Joan M. Cherry and Marshall Clinton, "A Profile of OPAC Users and Their Satisfaction with OPACs at Five Universities," Canadian Library Journal 49 (Apr. 1992): 123-33.
16. Mary Ellen Larson and Dace Freivalds, The Effect of an Instruction Program on Online Catalog Users: Final Report (Washington, D.C.: Association of Research Libraries, Office of Management Studies, Dec. 1984).
17. John T. Roscoe, Fundamental Research Statistics for the Behavioral Sciences, 2d ed. (New York: Holt, 1969).
18. Marija J. Norusis, SPSSX Introductory Statistics Guide (New York: McGraw-Hill, 1983).
19. Bayman and Mayer, "Using Conceptual Models to Teach BASIC Computer Programming," 291-98.
20. Borgman, "The User's Mental Model of an Information Retrieval System: An Experiment on a Prototype Online Catalog," 60.