Measuring Students' Information Literacy Skills through Abstracting: Case Study from a Library & Information Science Perspective

Maria Pinto, Andrés Fernández-Ramos, and Anne-Vinciane Doucet

New education models based essentially on competencies and skills are gradually displacing the old systems based on teacher instruction and passive, memory-based learning, as these new competencies allow students to learn actively and to perform at higher levels. We consider abstracting a far-reaching learning tool and analyze the basic role of information analysis and synthesis skills within learning processes and their relation to the abstracting process. Using an action-research methodology, we analyze the abstracting skill of students in the first and final courses of the Faculty of Library and Information Science at the University of Granada (Spain). Based on postulates from information literacy, analysis and synthesis competencies are studied through the students' modus operandi at the different abstracting stages. Similarities and differences between the two groups of students are identified and presented, with reference to the relation between the subjects studied and the levels of competence and skill. In the light of these results, meaningful patterns and recommendations for improving students' skill levels are proposed.

Maria Pinto is professor in the Faculty of Library and Information Science at the University of Granada; e-mail: mpinto@ugr.es. Andrés Fernández-Ramos is a librarian in the CSIC Humanities Center Library; e-mail: afernandezster@gmail.com. Anne-Vinciane Doucet is a scholarship holder in the Faculty of Library and Information Science at the University of Granada; e-mail: avdoucet@ugr.es.

To date, students have devoted a great deal of their mental effort to memorizing data. However, global-scale changes in communication processes, largely due to the development of information and communication technologies (ICT), have led to the emergence of new education models. Whereas education was previously based on teacher instruction and passive student learning, education models now focus much more on active learning by the student. This situation has forced a change in the roles of the actors involved in teaching-learning processes. Today's student can no longer be a mere passive subject who memorizes the material he or she is given; students must now have a series of skills and abilities that allow them to approach any information-based problem and tackle it coherently. This information literacy, based on a set of competencies and skills, some general and others specific to each discipline, is linked to the competencies students need to be able to learn by themselves in the best possible conditions. The European Higher Education Area (EHEA), the aim of which is to harmonize and create convergence among university studies in Europe, advocates a change in the philosophy of higher education to prioritize proficient management of learning tools over the mere accumulation of knowledge. The Tuning1 project was set up to achieve these aims, centered on educational structures and the content of studies. The project has identified a series of 30 competencies known as transversal or generic competencies.
The issue of education based on competencies and skills has been growing in importance over recent years in the field of Information Science,2 and has led to a research line known as information literacy, which focuses on information-use competencies (search, organization, processing, representation, and management). Although many definitions of information literacy3 have been put forward, one of the most cogent is that advanced by Webber and Johnston:4 "Information literacy is the adoption of appropriate information behaviour to identify, through whatever channel or medium, information well fitted to information needs, leading to wise and ethical use of information in society." A range of research studies have explored the measurement and assessment of information literacy skills.5

From an academic perspective, it must be recognized that very few competencies related to information literacy are explicitly taught in universities. However, the Spanish Library and Information Science degree includes two core curriculum courses (Document Abstracting and Indexing and Abstracting Techniques) that are directly related to two of the core information competencies in international information literacy standards: information analysis and synthesis. Technological advances have not reduced the need for abstracting; in fact, the opposite is true: the development of the Internet has created a growing need for a variety of ways to filter information, of which abstracting is the pièce de résistance.6 As a consequence, these courses have become true laboratory situations, where action-research methodology is used to examine aspects related to information literacy. The experience of teaching these subjects has allowed us to observe students' skills in these competencies and the processes involved in learning them. The main objective of this pioneering study is precisely to observe and measure, using action-research methodology, how skilled students are in these competencies by specifying the stages necessary in abstracting processes and observing the extent to which the curricular development of these subjects affects the students' skills.

Learning Information Analysis and Synthesis Skills: Literature Review

The OECD (Organization for Economic Co-operation and Development)7 defines a competency as the ability to meet individual or social demands or to perform an activity or task. The advantage of this external or functional approach, based on demand, is that it exposes the personal or social demands facing individuals. The generic competencies higher education students need have been dealt with by many education-related institutions8 and can be outlined according to the Alfin-EEES project:9
• Learn to learn.
• Learn to search for and evaluate information.
• Learn to analyze and systemize.
• Learn to generate knowledge.
• Learn to work together.
• Use technology to learn.
The competencies directly linked to our study objectives are those known as "information literacy competencies," which refer to the search for, organization, processing, representation, and legal and ethical use of information.
According to Andretta,10 the information-literate person recognizes the need for information and determines the nature and extent of the information needed; accesses needed information effectively and efficiently; evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system; uses information accurately and creatively; applies prior and new information to construct new concepts or create new understandings; contributes positively to the learning community and to society; and practices ethical behavior in regard to information and information technology.

Of all the competencies covered by the INFOLIT international standards (ACRL, AASL, AECT, SCONUL, CAUL, and ANZIIL),11 we focus on information analysis and synthesis, as they are the most closely associated with abstracting processes. Given that meaningful learning should consciously and intentionally integrate the individual's new and prior knowledge, the abstracting process favors this integration by involving not only the selection of relevant information but also the identification of the textual structure of the original document. In the subsequent representation process, the abstracter organizes the information obtained and generates new knowledge; but, because of the complexity of the abstracting process, it is extremely difficult to approach as a complete entity. The learning model we use in abstracting instruction is therefore broken down into subprocesses to analyze and synthesize the original information.

Information Analysis and Evaluation Skills

A superficial reading of a text can provide clues about its content, but a slightly greater effort is required to understand it. Metacognitive research has shown that the ability to identify and remember the main ideas is one of the bases of reading comprehension12 and one of the factors that differentiates "good" from "poor" readers.13 All reading comprehension processes eventually detect the text structure, its main subject matter and, particularly, the author's intention. The ACRL highlights that "the information literate student summarizes the main ideas to be extracted from the information gathered" and, to do this, he or she must be able to "read the text and select main ideas, restate textual concepts in his/her own words and select data accurately and identify verbatim material that can be then appropriately quoted."

Since abstracts are reduced, autonomous, and purposeful textual representations of original texts,14 a certain, varying amount of the original text's objective content is captured through the abstracting process, depending on the targets set. However, the abstract depends not only on the original document but even more so on the abstracter's base knowledge and on his or her learning targets. The abstract should result from the convergence of an objective reality, the original document, and a subjective reality, the abstracter, who has a certain level of knowledge and personal, nontransferable targets. There are two key moments in this process of learning through the technique of abstracting, in which we assume an acceptable level of comprehension of the original text: first, the selection of what is considered to be relevant content, and second, the structuring of this content for subsequent incorporation into the knowledge base of the recipient-abstracter. Selection is a process of purposeful elimination.
Through contraction, reduction, and condensation strategies, the aim of selection is to retain only the relevant information.15 In both the selection and the structuring of the original content, the only assistance that may be offered takes the form of suggestions and recommendations that help the task to be carried out more efficiently.

Once the information has been analyzed, the student is able to evaluate it and decide whether it fits his or her information needs. According to the ACRL standards, the information-literate student articulates and applies the following initial criteria for evaluating both the information and its sources: a) examining and comparing information from various sources to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias; b) analyzing the structure and logic of supporting arguments or methods; c) recognizing prejudice, deception, or manipulation; and d) recognizing the cultural, physical, or other context within which the information was created and understanding the impact of context on interpreting the information.

Information Incorporation, Synthesis and Use Skills

The synthesis subprocess allows the conceptual results derived from the previous stages to be represented. But representation is not an independent and self-fulfilling exercise. It is necessary to investigate the linguistic, communicative, and organizational aspects of representation from a multiplicity of sociocognitive perspectives and within the full range of discourse domains and knowledge communities.16 In any case, the abstract must meet conditions of relevance, consistency, accuracy, and completeness.17 Most university students already know how to select and structure the main ideas and include them in an abstract, but they tend to fall down when asked to present these subjects accurately and thoroughly in coherent sentences. Depending on the type of original information and the abstract's objectives, some forms of graphic representation may also be effective. The information-literate student synthesizes main ideas to construct new concepts in the following ways: a) recognizing interrelationships among concepts and combining them into potentially useful primary statements with supporting evidence; b) extending initial synthesis, when possible, at a higher level of abstraction to construct new hypotheses that may require additional information; and c) using computer and other technologies to study the interaction of ideas and other phenomena.

The competencies and skills necessary for abstracting are listed below:
• Efficient reading of both written and graphic texts.
• Awareness of the various types of abstracts.
• Knowledge of how to select the type of abstract for each text, project, or context.
• Knowledge of how to apply abstracting techniques to different types of documents.
• Assessment and use of computer applications for automated abstracts.
• Understanding of the potential and limitations of automated abstracts.
• Learning to recognize and retrieve the appropriate information from a text.
• Knowledge of the textual grammar and style of different abstract types.
• Learning to classify and synthesize information in a text.
• Learning to assess abstracts.
• Rigor and accuracy, consistency and constancy.
• Clarity in setting out proposals and arguments.
Analysis of the Skills and Competencies in Library and Information Science Students: A Case Study

The scientific literature has analyzed the problems and errors involved in writing scientific abstracts.18 However, the analysis of the various stages that go into the production of an abstract, and of how they relate to the skills and abilities the process requires (the aim of the present research), is a new area of study. This study, carried out within the context of abstracting training, analyzed how abstracts were produced in accordance with the stages followed throughout the process, as well as the skills and abilities related to each stage. We were thus able to discover the students' skill levels in a set of abilities related to document abstracting and to identify possible weak points in their training. To this end, we carried out a trial with library science students in which they were asked to write an abstract of a scientific text and to specify how they carried out each of the stages involved.

The analysis and assessment were made by experts in the field, and consequently a certain element of subjectivity must be taken into account. This factor is not easily avoided when assessing such relatively intangible aspects as those we deal with in this paper.

Material and Method

The study was based on the premises of action-research methodology, which advocates the use of the classroom as a laboratory. At the same time, it enables problems or weak points to be detected and rectified, thus contributing valuable information to the scientific community. If it is well designed and implemented, action research offers the possibility of generating data to support theorizing, to develop understanding, and to create new knowledge.19 Taking this extra step demands a rigorous, critical, and systematic approach and makes heavy demands on participant researchers.20

Action-research methodology is defined by Elliot21 as "the study of a social situation with a view to improving the quality of action within it" and by Howard22 as follows: "Action research is the process of reflective problem solving conducted at the school level. This process allows us to identify an issue we want to study to determine if we can change our process or procedures to improve our program. Action research leads to program improvement and increased academic achievement for our students [and] offers possibilities for practical work that is also a form of learning for those involved."23 "AR differs from case study research in that the action researcher is directly involved in planned organisational change."24 "One distinguishing feature of AR is, therefore, the active and deliberate self-involvement of the researcher in the context of his/her investigation."25

The present study therefore falls within the frame of action-research methodology and, specifically, is an experimental study with an explanatory purpose: a trial was proposed to a set of students, consisting of the detailed written specification of the stages and processes involved in document abstracting; the experimental data were gathered in the classroom and analyzed with the aim of detecting the students' weaknesses and strong points in the skills related to scientific information abstracting. This information guided us in focusing the learning targets for this type of skill and information literacy activity.
Data Source

We examined the international scientific literature26 to determine the most appropriate procedure for presenting the abstracting stages to the students. The stages we focused on, simplified and adapted to our study, are as follows:
• Reading.
• Identification of the text structure, the main subject matter, and the author's intention.
• Selection of the most important sentences.
• Generalization of the selected sentences.
• Content schema.
• Graphic representation.
• Writing up.

Based on the skills outlined by ANECA (Spanish Agency for Quality Assessment and Accreditation)27 and in the e-COMS,28 Alfin-EEES,29 and Cyberabstracts30 portals, we drew up a list of skills related to the stages in the abstract creation process. The skills selected were as follows:
• Comprehension: detected in the identification of the text structure, the main subject matter, and the author's intention, and in the selection of important sentences and keywords.
• Analysis: detected in the identification of markers, in the text structure, and in the selection of keywords and important sentences.
• Synthesis: detected in the generalization stage and in the writing up of the abstract.
• Organization and structuring of the information: detected in the schema, sentence grouping, and visual organization.
• Expression: analyzed from the way the abstract is written.
Some skills are associated with more than one of the stages because of their transversal character and are present in numerous processes at different levels.

Template

The template designed by the research team was structured into four sections:
• Student details: included any other university qualifications or professional activities, where appropriate.
• Evaluation of the text to be abstracted: we asked students how familiar they were with the text subject and terminology. Their answers were graded on the following scale: very familiar, quite familiar, moderately familiar, slightly familiar, unfamiliar.
• Procedure used to prepare the abstract: a brief description of the procedure (stages) used to prepare the abstract.
• Preparation of the abstract: this section covered the way students carried out the various abstracting stages.
— Mark unknown words. This revealed the types of terms (specific or general) the students did not understand.
— Identify the subject of the text and the author's intention. This stage was only required of fifth-course students, who had more experience in information analysis.
— Identify the general structure of the text. When abstracting, we must detect the structure of the original document. [...] the learners must recognize what types of documents they are dealing with, since this will help them greatly in subsequent selection, organizational, and construction tasks.31
— Underline text markers. Interest in what are known as "markers" stems from their potential to help detect text structure, as they signal sentences of particular relevance and sections of the text. They can be classified into three types: additional information (also, in addition, moreover); contrast of idea or clarification (however, nonetheless, although, yet); and conclusion or summary (therefore, hence, as a result of, in summary). Only final-course students performed this stage.
— Select the most important sentences. This stage allowed us to see how students separated the important from the superficial, thus reducing the text that needs to be worked on.
— Generalize selected sentences. The aim of this stage was to see how the students rewrote the most important sentences chosen in the previous stage, making them more coherent and meaningful for the abstracter. This section was only required of final-course students.
— Group the selected and generalized sentences. The purpose of this stage was to reveal the students' ability to find associations between the sentences they had selected and generalized by putting them into smaller groups. This stage was only required of final-course students.
— Preparation of a graphic schema. Through this stage, we observed the type of schema the students used, together with their ability to structure the information.
— Extract keywords and organize them graphically in a conceptual map. The representation of a text through keywords reveals the abstracter's comprehension and analysis capabilities and, to a certain extent, those of synthesis and expression. We opted for free choice rather than controlled language. Final-course students were also asked to provide a conceptual map of the associations between the keywords. We were therefore able to observe the type of visual organization and the relation between the keywords selected (regardless of whether the choice of words was correct or not).
— Writing up the abstract. This concluding stage was essentially studied from the point of view of expression and synthesis abilities.

The test was carried out with two groups of students: the first made up of students beginning the Library and Information Science (Bachelor) degree, and the second of final-course (Master) students. As there were substantial differences in the training and capacities of the two groups, additional sections, related to content taught on the Indexing and Abstracting Techniques course in the final year of study, were included in the trial template completed by the Master students. Table 1 presents the stages the two groups were asked to follow in the trial.

TABLE 1. Stages of the process for each course
Stage | Bachelor | Master
I. General reading of text | Mark unknown words; identify the general structure of the text | Mark unknown words; identify the subject dealt with and the author's intention; identify the general structure of the text
II. Second reading | Select the most important sentences | Underline text markers; select the most important sentences
III. Information structuring | Graphic/visual schema of the text | Generalize the important sentences; group important sentences together; graphic/visual schema of the text
IV. Information representation | Identify keywords | Identify keywords and organize them (their relation to each other) in a map or similar figure
V. Expression | Write up the abstract | Write up the abstract

The text to be abstracted was handed out with the template. As Spanish is the official language of instruction on these courses, and the students' level of English was not sufficient to tackle a text in that language, we looked for texts, preferably scientific, in Spanish, with an abstract, keywords, and references. The texts had to be short enough for the students to be able to write the abstract, following the template provided, in the available time (three hours). The subject matter of the article was related to the Library and Information Science degree, in part because the subject was familiar to the students, but also because it would provide them with an idea of the type of research carried out in the field. The text we finally chose was the Spanish version of "Transforming Document Delivery in the E-content Environment" by Lucie Molgat ("Los cambios en el suministro de documentos en un entorno de contenidos electrónicos," IFLA, 2005), available at www.ifla.org/IV/ifla71/papers/098s_trans-Molgat.pdf (consulted: 15/05/06).

Together with the trial instructions, the students were given the article, from which the abstract, the keywords, and the references had been removed to avoid any influence on the students' results. The final version was 4 pages long and had a total of 1,835 words.

Test Sample and Conditions

Two groups of students were chosen: the first made up of 40 students on the core-curriculum subject "Document Abstracting" from the Library and Information Science Diploma (Bachelor), and the other of 38 students on the core-curriculum subject "Indexing and Abstracting Techniques for Scientific Documents," from the final course of the Documentation Degree (Master), both taught at the University of Granada. In this way we were able to appreciate the different levels of skill acquired by students, from those just beginning their degree to the more experienced.

It should be mentioned that, although the two subjects both deal with abstract preparation, their approaches are not the same. The first-year course in document abstracting predominantly centers on text analysis and comprehension: once students begin university, they have to become familiar with the various document typologies, particularly with scientific texts, and it is important for them to learn to understand and structure these documents. In the final-year course, "Indexing and Abstracting Techniques," the student is assumed to be more familiar with scientific texts and more accustomed to text comprehension; as a result, information representation and the use of conceptual maps are dealt with in greater depth. Although both courses concern the subject of abstracting, they approach it from different angles: the first-year course centers more on the abstract as a product, with the study of the various abstract production procedures and their stages and an in-depth exploration of textual structure detection; the final-year course focuses more on information representation and its links with new technologies.

A weekday at the end of the final semester was chosen to carry out the trial. Participation was voluntary and took place on 23 and 24 May 2006, with a total of 19 Master and 18 Bachelor students (somewhat less than half the potential participants). The sessions lasted three hours. Students were able to ask the expert tutors present in the classroom for clarification on how the schema should be done or on the type of visual representation they were required to prepare.

Data Collection and Processing

Once the templates had been collected, all the data provided by the students were entered into an Excel spreadsheet. This information was standardized and codified for ease of handling. We then prepared a model solution: each research team member abstracted the text according to the template, following all the stages outlined.
The three authors of the study then held brainstorming sessions to agree on the best solution for each of the stages or sections of the template, bearing in mind that the assessment of some of the stages had to be based on the students' responses in previous stages; for example, to assess the way the sentences had been grouped, the sentences chosen in the previous stage had to be considered. This allowed us to spot any bias or doubts that could arise when assessing the results.

The next step was the data analysis, which followed each of the sections of the template:
• Descriptive data: We totaled the number of students holding other university qualifications or carrying out a professional activity compatible with the academic course. We also observed the students' level of familiarity with the subject and the difficulties they had encountered with the terminology.
• Procedure: The stages proposed by the students were gathered and standardized for subsequent tabulation.
• Unknown words: The words the students had not understood were listed and tabulated, differentiating between LIS terminology and common words.
• Subject and author's intention: This assessment was somewhat subjective, as each student answered this question in his or her own words, and as such it was not standardized. To codify the answers, we assessed the two aspects (subject and author's intention) on a scale of 1 to 3, where 1 indicated no identification; 2, approximate identification; and 3, identification of the respective aspect.
• Text structure: We noted the text structure elements that had been identified, verified whether or not they were correct, and assessed them on a scale of 1 to 5, with 1 representing the lowest and 5 the highest score. We took into account the way each part was named and the delimitation of the sections of text.
• Markers: We observed which markers had been selected, whether they were correct or not, and the frequency with which they had been chosen. We were thus able to see whether the students were aware of what markers were and of the importance of being able to locate them in order to understand the text.
• Sentence selection: We counted and standardized the number of sentences chosen and totaled the number of words in those sentences to calculate the percentage of the whole text they represented, bearing in mind that the formal selection is usually 50 percent. We also calculated the percentage of correctly selected sentences and the number of key sentences that should have been selected but were omitted. It is not an easy task to determine the exact number of important sentences in a text, as this is essentially a subjective exercise; consequently, we analyzed the sentences the students had underlined and only considered incorrect those that were obviously superficial or repetitive. For the same reason, only the failure to mention a reference sentence for any one of the sections in the text structure (the five basic sections) was considered an error of omission.
• Generalization: We calculated percentages of reduction with respect to the previous stage and to the original text to see whether the number of sentences or words increased or decreased when the students expressed themselves in their own words.
• Grouping: We first counted the groups of sentences and then assessed how appropriate the groupings were in relation to the selected sentences on a scale of 1 to 3.
• Schema: We looked at the type of schema the students had produced and, on a scale of 1 to 5, assessed how they structured the information and represented it in a schema (compared with the "grouping" data in the case of the Master students, and with "structure" in the case of first-year students). Finally, we assessed the appropriateness of the schema to the original text, also on a scale of 1 to 5.
• Keywords: Three steps were followed in this analysis:
— The keywords suggested by each student were analyzed and synonyms were removed, as the students were not provided with a controlled vocabulary and many of them used similar terms, such as electronic information / digital information. Hence, for data-handling purposes, the number of keywords identified by each student was reduced in some cases.
— The percentage of correct keywords among all those proposed (accuracy) was calculated, together with the percentage of correct keywords out of the total number of keywords that should have been identified (thoroughness). Up to five keywords were accepted as valid for these measurements.
— Finally, to obtain an overall picture, we examined the general frequency with which the proposed keywords appeared.
• Visual organization: We focused on three aspects to assess this section: the ability to organize keywords coherently, the ability to represent the text content, and the choice of graphic used. The first two capacities were assessed on a scale of 1 to 5, where 5 represented the highest score.
• Abstract: The following aspects of the abstract were analyzed:
— Number of words. This gave us the percentage of reduction with respect to the text.
— Writing style. This allowed us to check whether the abstract was correctly written, with no spelling mistakes, repetition, literal copying from the text, examples, and so on (scale of 1 to 5).
— Representativeness of the text content. This allowed us to see whether the abstract represented the content of the text (scale of 1 to 5).
— Proportion. The proportionality of the abstract was analyzed; that is, whether each section of the text was reflected in the abstract in its true measure (scale of 1 to 5).
• General comments: We introduced a section in the Excel spreadsheet to note our general impression of each student's skills and outline the most relevant aspects observed.

FIGURE 1. Terminology and familiarity with the subject area: percentage of BLS and MLIS students by degree of familiarity (very familiar to unfamiliar) and by perceived difficulty of the terminology (very complicated to easy).

Results

Of the 37 completed templates collected, 19 were from final-course students (11 women and 8 men) and 18 were from first-course students (12 women and 6 men). This sample had the following characteristics:
• None of the first-course students combined a professional activity compatible with their studies or held any other university qualification.
• Of the final-course students, 21 percent had a university qualification other than the Library Science Diploma, and 31.5 percent combined their studies with a professional activity.
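Before turning to the stage-by-stage results, the following minimal sketch illustrates the kind of codification described in the Data Collection and Processing section. The 1-to-3 and 1-to-5 expert scales are those reported above; the numeric codes assigned to the familiarity labels, and the example record, are illustrative assumptions only.

    # Illustrative coding of template responses prior to tabulation in a spreadsheet.
    # The five familiarity labels come from the template; the numeric codes 1-5
    # assigned to them here are an assumption for illustration only.
    FAMILIARITY_CODES = {
        "unfamiliar": 1,
        "slightly familiar": 2,
        "moderately familiar": 3,
        "quite familiar": 4,
        "very familiar": 5,
    }

    def code_subject_identification(level: str) -> int:
        """Subject and author's intention: 1 = not identified,
        2 = approximately identified, 3 = correctly identified."""
        return {"not identified": 1,
                "approximately identified": 2,
                "correctly identified": 3}[level]

    # Hypothetical student record, as it might be entered into the spreadsheet.
    record = {
        "familiarity": FAMILIARITY_CODES["quite familiar"],                # -> 4
        "subject": code_subject_identification("correctly identified"),    # -> 3
        "structure_score": 4,                                              # expert rating, 1-5 scale
    }
    print(record)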
Text Assessment

When the students were asked how familiar they were with the text subject area and how complicated they found the terminology, the following opinions were obtained (figure 1).

Familiarity with the subject area was higher among final-course students, 50 percent of whom claimed to be quite familiar with the subject and 15.79 percent very familiar, whereas over half the first-course students were only slightly familiar with the subject and 11.11 percent were not at all familiar. This result was only to be expected, given that the subject was related to library science and the final-course students logically had much greater knowledge of it.

Generally speaking, neither of the two groups found the terminology especially complicated, particularly the final-course students, who had a broader education and richer vocabulary. Most of the first-course group considered the terminology not very complicated (66.67%), a few found it quite complicated (11.11%), and 22.22 percent found it moderately complicated. Final-course students had even fewer problems with the terminology: 26.32 percent found it easy or not at all complicated, and nearly 70 percent stated that it was moderately or not very complicated.

Procedure Used in Abstract Preparation

The mean number of stages identified by first-course students was 3.88, while for final-course participants it was 4.29. The latter group may have identified more stages because they followed a more thorough, complex process than the first-year students, who were at the start of their degree.

The stages the students identified when approaching the abstracting of the text are detailed below (table 2). This does not mean that the students rigorously followed these stages or did not unconsciously use others; rather, these were the ones they identified because they considered them more logical and because they provided a basis for writing the abstract. There is reasonable consensus on several stages: rapid reading, detailed reading, underlining, extracting the main ideas, and writing the abstract, all of which are highly logical and, to a large extent, coincide with the relevant doctrine.

The final-course students reported paying more attention to the structure (47.06% of the sample) and, because of their experience, identified more stages. These tended to be stages that identified "extra-textual" elements such as the title, the structure, the identification of keywords (which allow the information to be better understood and structured), and the typography, which helps the structure of the text to be better understood.
Some students also used other stages, such as looking at the subject (only 5.88%), which enabled them to understand the text as a whole before focusing on what would be its most important elements. Some (5.88% of the final-course sample) analyzed the introduction, since it should reflect the content of the text and thus allow them to focus on its essential aspects. Very few (5.88% in each of the two samples) used a schema to write the abstract, although final-course students did use schemas to represent the information, which proved to be a fairly effective system.

TABLE 2. Stages identified by the students
Procedure | First course: frequency | First course: percentage | Final course: frequency | Final course: percentage
1) Rapid reading | 12 | 70.59% | 10 | 58.82%
2) Detailed reading | 11 | 64.71% | 10 | 58.82%
3) Underlining | 10 | 58.82% | 11 | 64.71%
4) Looking at the title | 2 | 11.76% | 2 | 11.76%
5) Looking at the structure | 3 | 17.65% | 8 | 47.06%
6) Extracting main ideas | 8 | 47.06% | 6 | 35.29%
7) Ordering of ideas / schema | 1 | 5.88% | 1 | 5.88%
8) Writing | 17 | 100.00% | 17 | 100.00%
9) Identification of keywords | 0 | 0.00% | 3 | 17.65%
10) Elimination of superficial content | 0 | 0.00% | 1 | 5.88%
11) Looking at typography | 0 | 0.00% | 2 | 11.76%
12) Looking at the subject | 0 | 0.00% | 1 | 5.88%
13) Synthesis | 2 | 11.76% | 0 | 0.00%
15) Analysis of the introduction | 0 | 0.00% | 1 | 5.88%

Unknown Words

There were no real difficulties with the terminology, and any unknown words were on the whole English expressions. The students were able to find the meaning of practically all the words using the Internet (the trial was carried out on computers with an Internet connection). The term "STM" (Scientific, Technical, and Medical) caused the greatest difficulties both for first-course (44.44% of the sample) and final-course students (31.58%); the term metadatos did so for first-course students (38.89%), which may be because it is a technical term on the degree they have just begun, and "peer review" for final-course students (21.05%).

TABLE 3. Unknown terms
Word | Frequency in first course | Frequency in final course
Publicaciones STM | 8 | 6
Endeavor | 2 | 0
Metadatos | 7 | 0
Peer review | 1 | 4
Pay per view | 1 | 0
Secure Desktop Delivery | 0 | 1

Identification of the Main Subject Matter

In general, the students had no major difficulty in identifying the main subject matter; only 15.79 percent were unable to do so and referred to more secondary questions. The remaining 85 percent identified the topic either approximately (2) or correctly (3). We can therefore say that they were able to identify and analyze at a general level.

FIGURE 2. Subject matter: not identified 16%; approximately identified 47%; correctly identified 37%.

Identification of Author's Intention

In general, the students accurately identified the author's intention; most of them (63.16%) realized that the author's intention was to report on the situation of document delivery in Canada and to detail the initiatives undertaken and planned in this field. Only 10.53 percent failed to identify this intention. This element is linked to the subject matter but requires a deeper analysis of the text.

FIGURE 3. Author's intention: not identified 11%; approximately identified 26%; correctly identified 63%.

Structure

The mean score for first-course students was 4.35 and, for final-course students, 3.1 out of a possible total of 5 points. The first-course students therefore performed better, which may appear surprising, since a priori the final-course students should have a greater ability to identify text structure. However, further analysis of the responses indicated that this may be due to a "bad habit" picked up in the final year: a relatively high number of students suggested an OMRC (objectives, methodology, results, and conclusions) structure because it is the one most commonly found in scientific articles. The chosen text was not structured in this way; rather, it was the presentation of a specific situation in a specific place and took the following structure:
• Introduction / contextualization of the problem.
• Specific presentation of the situation and initiatives in Canada.
• Conclusions / future expectations.
What was surprising was how high the first-course students' scores were. This is due to the emphasis on structure identification in the first-year course content and the fact that they had carried out numerous practical exercises on both scientific and general texts in class.

Markers

A mean of 4.3 markers was identified per student, but many were not found or were not given sufficient importance. The students identified a total of 17 different markers, all of which could be classified as reasonable choices, although perhaps they were not the most important ones. Clearly, not all of them were equally important, as some indicated the beginning of a section, others emphasized a relevant sentence, and others linked the ideas in the text. The most commonly detected markers were "however" (82.35%), "clearly" (70.59%), "unfortunately" (58.82%), and "although" (52.94%). "In addition" (47.06%), "finally" (35.29%), and "therefore" (29.41%) were not so frequently identified. The remaining markers, more than half the total of 17 identified, were each noted by only one student, which represents 5.88 percent per marker. In general, neither of the two groups used this technique to full advantage, or at least not consciously.

Selection of Most Important Sentences

The students' mean values in this stage were as follows (table 4):

TABLE 4. Selection of most important sentences
Group | Selected words | % reduction | Selected sentences | Correct selected sentences | % correct sentences (accuracy) | Basic aspects not mentioned
First course | 410.17 | 22.35% | 17.42 | 12.67 | 72.20% | 0.83
Final course | 325.22 | 17.72% | 13.44 | 8.17 | 60.88% | 1.33

We can observe that the final-course students made greater reductions than the first-course group in this initial information synthesis, with a mean of 17.42 sentences selected by the first course and 13.44 by the final course, representing 410 and 325 words respectively.

However, further analysis of the selected sentences revealed better results from the first-course than from the final-year students. A slightly higher percentage of correct sentences was selected by first-course students: 72.2 percent, compared with 60.88 percent in the final-year group. The final-course students were also less successful than those on the first course in dealing with all aspects of the text. On average, the first-course group omitted 0.83 of the basic aspects of the text, while this mean rose to 1.33 in the final-course group. This aspect is fairly closely linked to the percentage of reduction made, and the poorer results of the final-course group were to be expected, as they had selected fewer sentences.
Generalization of Most Important Sentences

As shown in table 5, the final-course students (the only group to perform this stage) made a slight reduction in the number of both sentences and words compared with the previous stage. Not all the values were lower, however; in fact, some were higher, showing that not all the students had properly understood the purpose of this stage.

TABLE 5. Generalization of most important sentences
Words in the selection | Words in the generalization | % reduction in words in relation to the selection | % reduction in words in relation to the total | Selected sentences | Generalized sentences | % reduction of sentences
325.22 | 274.17 | 84.39% | 14.94% | 13.44 | 11.33 | 83.88%

Sentence Grouping

Only final-course students were asked to complete this stage, of whom only 17 responded. The results are shown in table 6. A high reduction percentage can be observed: the mean value of 11.47 sentences in the previous stage dropped to 6.18 groups of sentences (61.04%). Moreover, the sentence grouping obtained a fairly high score (2.41 out of 3). It should be noted that the scores given for sentence grouping were based on the work of previous stages, and we were therefore assessing the ability to logically group the previously selected sentences.

TABLE 6. Sentence grouping
Generalized sentences | Groups of sentences | Percentage reduction | Score (1 to 3)
11 | 7 | 63.64% | 3
3 | 3 | 100.00% | 3
18 | 3 | 16.67% | 3
19 | 3 | 15.79% | 2
14 | 3 | 21.43% | 2
12 | 6 | 50.00% | 3
16 | 10 | 62.50% | 2
15 | 6 | 40.00% | 2
12 | 6 | 50.00% | 3
12 | 11 | 91.67% | 2
7 | 4 | 57.14% | 2
11 | 6 | 54.55% | 2
5 | 4 | 80.00% | 2
11 | 5 | 45.45% | 3
9 | 8 | 88.89% | 2
5 | 5 | 100.00% | 3
15 | 15 | 100.00% | 2
Mean values: 11.47 | 6.18 | 61.04% | 2.41

Schema

The type of schema used was clearly different in the two groups: the final-course students showed a preference for graphic schemas (67%), whereas 95 percent of the first-course group opted for linear schemas. This is explained by the final-course group's greater knowledge of the different information representation techniques, which they studied in "Indexing and Abstracting Techniques" on the final course.

Two aspects were assessed in this section: the ability to structure the information extracted in previous stages and the schema's appropriateness to the text content. Both groups showed good ability in structuring the information they had gathered in the previous stages, although the final-course students obtained slightly higher scores (means of 4.28 in the first and 4.51 in the final-course group). However, the schemas prepared by the first-course students better reflected the text content (4.33 in the first as opposed to 3.94 in the final course), perhaps because, in the "document schema" section, they showed greater skill in identifying the text structure.

Keywords

Both groups identified a total of 26 different keywords, 15 of which were proposed by both groups. As can be seen in figure 6, the mean number of keywords put forward by the first-course students was 4 per student, which, once synonyms had been removed, fell to 3.94 per student. The final-course group identified a mean of 6.29 keywords per student, falling to 5 when synonyms were eliminated. The mean number of correct keywords was slightly higher in the second group, although both groups came close to 3 of the 6 possible options.
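The accuracy and thoroughness values discussed next can be made concrete with a minimal sketch. It assumes, as the reported means suggest, that accuracy was computed against each student's synonym-free keyword list and thoroughness against the six reference keywords considered appropriate; the figures used are the group means reported above.

    # Keyword accuracy and thoroughness, illustrated with the reported group means.
    REFERENCE_KEYWORDS = 6          # keywords the assessors considered appropriate

    def keyword_measures(correct: float, proposed_without_synonyms: float):
        accuracy = correct / proposed_without_synonyms   # share of proposed keywords that are correct
        thoroughness = correct / REFERENCE_KEYWORDS      # share of the reference set that was found
        return round(accuracy, 2), round(thoroughness, 2)

    # First-course (BLS) means: 2.72 correct keywords out of 3.94 proposed.
    print(keyword_measures(2.72, 3.94))   # -> (0.69, 0.45)
    # Final-course (MLIS) means: 3.05 correct keywords out of 5 proposed.
    print(keyword_measures(3.05, 5.0))    # -> (0.61, 0.51)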
As the mean number of correct keywords identified by the two groups was similar, the accuracy and thoroughness values obtained were conditioned by the number of keywords selected. Hence, the first-course students, who had selected fewer keywords than the final-course group, obtained higher accuracy values but lower thoroughness values. Three of the six keywords we considered appropriate were detected by most of the students in both groups, and two, perhaps the most debatable of the six, were only selected by a small percentage of the students. One keyword, slightly more difficult to identify but still important, was identified more successfully by the final-course students, perhaps because of their greater ability to generalize and their familiarity with the subject matter. However, on the whole, both groups performed well in this task.

FIGURE 4. Type of schema: percentage of linear versus graphic schemas (BLS vs. MLIS).
FIGURE 5. Schema assessment: ability to structure information (BLS 4.28, MLIS 4.51) and appropriateness of the schema (BLS 4.33, MLIS 3.94).
FIGURE 6. Keywords: mean number of keywords (BLS 4, MLIS 6.29), mean number without synonyms (BLS 3.94, MLIS 5), and mean number of correct keywords (BLS 2.72, MLIS 3.05).
FIGURE 7. Accuracy and thoroughness of keyword selection: BLS 0.69 and 0.45; MLIS 0.61 and 0.51.

Visual Organization of the Information

Only the final-course students were involved in this stage, as they were more familiar with the graphic representation of information and the various techniques and methods associated with it. The overwhelming majority of this group opted to use diagrams in their representation (88%), and more than half of these used arrows to indicate the relationships between the diagrams (conceptual maps), unquestionably one of the clearest and most suitable approaches. Only two students used a different type of graph, a hierarchy graph and a pie chart, the latter being particularly unsuitable in this case.

The results we obtained from the assessment of graphic representation quality were poorer than those from the schema preparation stage, as it was more difficult to relate just a few concepts than whole sentences. Our assessment of the students' ability to relate the selected keywords on a scale of 1 to 5 resulted in a mean score of 3.65, but when we analyzed the value of the graph as a representation of the text, the mean fell to 2.94. In most cases, errors were due to an incorrect relation between concepts. The choice of keywords also conditioned the way they were associated; if a correct selection had not been made, it was clearly more difficult to establish the right relationships among them.

Abstract

The results are shown in tables 8 and 9. The abstracts produced by the final-course students were more concise, with a mean of 142.74 words (7.70% of the text total); the first-course students reduced the text to 11.59 percent of the total, using a mean of 212.61 words in their abstracts. Although this represents a fairly significant difference, it did not substantially affect the quality of the abstracts.

Our analysis of the abstracts' quality showed that the two groups obtained similar scores for the representativeness of the text content (3.89 and 3.94) and the proportionality of the abstract (3.83 and 3.56).
However, the final-year students presented a better writing style (4.33, compared with 3.11 for the first-course group), evidenced in better sentence linking, better use of punctuation, and less direct copying from the text, among other things. To a great extent, this is the factor that determined the slightly higher overall assessment of the quality of the final-course students' abstracts. In general, the final abstracts are of a relatively good standard, and the main shortcomings, aside from writing style, are due to a failure to identify the structure of the text and to errors in text schematization. As a result, the structure of a number of the abstracts suffered from a lack of proportion.

TABLE 7. Keywords selected
Keyword | First course: frequency | First course: percentage | Final course: frequency | Final course: percentage
Document delivery | 15 | 83.33% | 12 | 70.59%
CISTI | 12 | 66.67% | 9 | 52.94%
Digital documents | 12 | 66.67% | 17 | 100.00%
Canada | 1 | 5.56% | 3 | 17.65%
Csi | 3 | 16.67% | 3 | 17.65%
Information access | 3 | 16.67% | 8 | 47.06%

FIGURE 8. Type of graphic representation: hierarchical 6%; diagrams 41%; arrow graphs (conceptual maps) 47%; pie chart 6%.

TABLE 8. First-course students' abstracts (assessments on a scale of 1 to 5)
Number of words | Reduction | Writing style | Representativeness of text content | Proportion | Mean assessment of abstract
126 | 6.87% | 3 | 3 | 3 | 3.0
265 | 14.44% | 3 | 4 | 4 | 3.7
241 | 13.13% | 3 | 4 | 4 | 3.7
210 | 11.44% | 3 | 5 | 5 | 4.3
258 | 14.06% | 2 | 4 | 5 | 3.7
246 | 13.41% | 4 | 4 | 4 | 4.0
214 | 11.66% | 2 | 3 | 4 | 3.0
118 | 6.43% | 4 | 3 | 3 | 3.3
266 | 14.50% | 2 | 3 | 4 | 3.0
153 | 8.34% | 3 | 4 | 4 | 3.7
290 | 15.80% | 4 | 5 | 5 | 4.7
180 | 9.81% | 4 | 4 | 4 | 4.0
247 | 13.46% | 3 | 5 | 5 | 4.3
67 | 3.65% | 3 | 3 | 3 | 3.0
234 | 12.75% | 3 | 4 | 3 | 3.3
225 | 12.26% | 3 | 4 | 3 | 3.3
221 | 12.04% | 3 | 4 | 3 | 3.3
266 | 14.50% | 4 | 4 | 3 | 3.7
Mean: 212.61 | 11.59% | 3.11 | 3.89 | 3.83 | 3.61

TABLE 9. Final-course students' abstracts (assessments on a scale of 1 to 5)
Number of words | Reduction | Writing style | Representativeness of text content | Proportion | Mean assessment of abstract
87 | 4.74% | 5 | 4 | 3 | 4.0
94 | 5.12% | 4 | 3 | 3 | 3.3
201 | 10.95% | 4 | 4 | 4 | 4.0
206 | 11.23% | 3 | 4 | 3 | 3.3
50 | 2.72% | 4 | 4 | 4 | 4.0
186 | 10.14% | 4 | 4 | 4 | 4.0
96 | 5.23% | 5 | 4 | 3 | 4.0
156 | 8.50% | 5 | 5 | 5 | 5.0
181 | 9.86% | 4 | 4 | 4 | 4.0
194 | 10.57% | 5 | 5 | 4 | 4.7
117 | 6.38% | 5 | 3 | 3 | 3.7
155 | 8.45% | 4 | 4 | 5 | 4.3
74 | 4.03% | 4 | 3 | 3 | 3.3
62 | 3.38% | 5 | 3 | 2 | 3.3
140 | 7.63% | 5 | 3 | 3 | 3.7
230 | 12.53% | 4 | 5 | 4 | 4.3
121 | 6.59% | 4 | 4 | 3 | 3.7
193 | 10.52% | 4 | 5 | 4 | 4.3
169 | 9.21% | 4 | 4 | 4 | 4.0
Mean: 142.74 | 7.70% | 4.33 | 3.94 | 3.56 | 3.94

FIGURE 9. Assessment of the abstracts: writing style, representativeness, proportion, and mean score (BLS vs. MLIS).

Discussion

Our analysis of the results revealed a series of relevant differences between the first- and final-course students, the most significant of which are the following:
• The final-course students were much more concise than the first-course group, both when selecting important sentences and in writing up the abstract. However, they used more keywords, possibly because of their greater knowledge of indexing techniques.
• The first-course students were more successful at identifying the text structure, preparing schemas that corresponded more appropriately to the text, and selecting the most important sentences.
• The final-course students had a better writing style and showed greater skill in structuring the information.
As can be seen, there is a set of skills directly related to the stages in document abstracting that can be discovered through an analysis of these stages.
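Before examining each skill in turn, note that the derived columns in tables 8 and 9 follow directly from the raw figures. The following minimal sketch reproduces the first row of table 9; it assumes, as the reported rows consistently indicate, that the reduction column is the abstract length divided by the 1,835-word source text and that the mean assessment is the simple average of the three 1-to-5 scores.

    # Derived columns of tables 8 and 9, illustrated with the first row of table 9.
    TEXT_WORDS = 1835

    abstract_words = 87
    writing_style, representativeness, proportion = 5, 4, 3   # expert scores (1-5)

    reduction = abstract_words / TEXT_WORDS                                  # 0.0474 -> "4.74%"
    mean_assessment = (writing_style + representativeness + proportion) / 3  # 4.0

    print(f"reduction: {reduction:.2%}, mean assessment: {mean_assessment:.1f}")
    # -> reduction: 4.74%, mean assessment: 4.0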
Comprehension and Analytical Skills

In the stages associated with these skills (identification of the text structure, selection of keywords, and selection of the most important sentences), we observed a highly developed ability for comprehension and analysis in the first-course students. Particular emphasis is placed on these skills in the "Document Abstracting" course curriculum, in which intensive practice of text reading and structure analysis takes place, the object being to enable students to understand the text correctly and thoroughly grasp its meaning. Final-year students do not tackle these aspects in such depth on the "Indexing and Abstracting Techniques" course, essentially because they are assumed to be proficient in this skill, although clearly, if they do not systematically and regularly use what they have learned, they gradually lose this proficiency.

TABLE 10. Summary of common stages (first- and final-course students)
Measure | First course | Final course
Structure identification | 4.35 | 3.1
Selection of most important sentences:
  Number of words in the sentences selected | 410.17 | 325.22
  Reduction of words from the original text | 22.35% | 17.72%
  Number of sentences selected | 17.42 | 13.44
  Number of sentences selected correctly | 12.67 | 8.17
  Percentage of correct sentences selected | 72.20% | 60.88%
  Number of basic aspects omitted | 0.83 | 1.33
Preparation of the schema:
  Assessment of structuring ability | 4.28 | 4.51
  Coincidence with the text | 4.33 | 3.94
Keywords:
  Mean number of keywords chosen | 4 | 6.29
  Mean number of keywords chosen without synonyms | 3.94 | 5
  Mean number of correct keywords | 2.72 | 3.05
  Percentage of correct keywords chosen | 69% | 61%
  Percentage of correct keywords from total possible | 45% | 51%
Abstract:
  Number of words in the abstract | 212.61 | 142.74
  Reduction of words from the original text | 11.59% | 7.70%
  Assessment of writing style | 3.11 | 4.33
  Assessment of appropriateness to the text | 3.89 | 3.94
  Assessment of abstract proportionality | 3.83 | 3.56
  Mean assessment of abstract | 3.61 | 3.94

The final-course students did not score badly, but simply lower than the first-course group. In general, they identified the text subject matter and the author's intention, and their selection of keywords was relatively good. In other words, they understood the text, but in the more thorough analysis (structure identification or the selection of the most important sentences) they were left slightly behind. How can this situation be improved? By attempting to improve text analysis competencies through exercises similar to those done by the first-course students. Identification of text markers was not a strong point in either group, and it may be appropriate to perfect this technique to strengthen the students' text analysis skills.

Synthesis Skills

This skill was best seen in the writing up of the abstract, although the selection of the most important sentences also provided significant clues. A clear tendency to emerge in various stages was that the final-course students were much more likely to synthesize; they said what they wanted to say more succinctly. Although their scores were lower in the selection of important sentences stage, this weakness was due to a failure in analysis, not in their ability to synthesize, which was, in fact, quite high. In the abstract, the final-course students obtained an overall score similar to that of the first-course group, but using far fewer words.
Synthesis Skills
This skill was best seen in the writing up of the abstract, although the selection of the most important sentences also provided significant clues. A clear tendency to emerge in various stages was that the final-course students were much more likely to synthesize; they said what they wanted to say more succinctly. Although their scores were lower in the selection of important sentences stage, this weakness was due to a failure in analysis, not in their ability to synthesize, which was, in fact, quite high. In the abstract, the final-course students obtained a similar overall score to the first-course group, but used far fewer words. As in the sentence selection stage, the problems arising in the abstract (proportionality) derived from previous stages and were not due to a lack of synthesis skills. This may be explained by the fact that the instruction the students received at university has fine-tuned their ability to synthesize, an important transversal skill in many subjects. Moreover, on the course studied they have learned the value of being concise through practice in schematization and the preparation of conceptual maps. The first-course students have not yet developed this skill through practice and training.

Information Organization and Structuring Skills
The final-course students' skills in these areas were adequately developed. The way they prepared schemas, grouped sentences, and represented the information graphically evidenced a great ability to structure information, which was to be expected, since the "Indexing and abstracting techniques" subject focuses heavily on these skills. As in the previous stages, the shortcomings we detected were due to problems in the detailed text analysis, but the information they selected was adequately organized and structured. The first-course students have also acquired these skills, and their results were fully satisfactory; certain aspects related to schema typology could be improved, but, on the whole, they showed a good grasp of these skills. The results they obtained in these stages were positively influenced by the high quality of their text analyses, which greatly contributed to their performance in subsequent stages. We can therefore conclude that students develop information organization and structuring skills on both courses, although these skills are more firmly consolidated in the final-course students.

Expression Skill
Expression skill, evidenced in the writing up of the abstract, is of great importance, as it can be clearly seen in the final product of any communicative activity. It was more highly developed in the final-course students as a result of much more training in this aspect, ranging from project-based work to exams. It may therefore be expected that the first-course students will also develop this skill during their time at university. However, to ensure that this happens, the maximum number of exercises involving text writing should be included in their instruction program.

Conclusions
The transversal nature of information skills means they can be appreciated in any learning process and, in general, in any aspect of life. Knowledge about how capable students are in these skills is essential if they are to be improved, if weaknesses and strong points are to be detected, and if corrective measures are to be adopted. The discipline of document abstracting is studied on the Library and Information Science degree and, on the whole, centers on the product: the abstract. However, proper abstract preparation requires working through a series of stages, each calling on a particular set of skills and abilities. If we focus only on the abstract as the final result, ignoring the stages necessary in its creation, we cannot see which competencies and abilities need to be strengthened. Through this study we have analyzed these stages by observing how two groups of students produced an abstract, thereby detecting the strengths and weaknesses of each group.
Likewise, by comparing the curricula for the document-abstracting subjects taught at the University of Granada, we were able to identify the causes of these strengths and weaknesses and ways to improve skills in these competencies. Our case showed that the instruction received by first-course students in information analysis skills was appropriate, whereas the corresponding course taken by final-course students focused more on the correct structuring and graphic representation of information.
To improve competencies related to document abstracting in the field of information analysis and synthesis skills training, greater emphasis must be placed on the learning process, which can be advanced through practical exercises in the corresponding subjects. The following will be of interest to the case in hand:
• Exercises to improve reading speed, attention, and comprehension (individual and feedback/sharing ideas).
• Exercises in extracting thematic concepts from original documents, following the models of Lasswell32 and Ranganathan33, to improve skills in identifying theme and rheme structures in documents.
• Learning activities dealing with the visual representation of text concepts using the concept map technique (individual and feedback/sharing ideas).
• Exercises to assess abstract structure and its correlation with the original document (individual and feedback/sharing ideas).
The use of action-research methodology is particularly appropriate for this type of study, as its purpose is to get to know the student and thereby improve his or her training. The use of standardized templates that reveal the students' skills, adapted to the students' level and the characteristics of the desired information, provides this knowledge; if such templates are used on a regular basis (at the beginning and end of a course, every year, or at another appropriate time), additional valuable information can also be obtained, including the extent to which the student improves throughout the course, the validity of a certain teaching method, or even the work of the teaching staff.

Notes
1. Tuning, Educational Structures in Europe. Available online at www.unideusto.org/tuning/. [Accessed January 30, 2008].
2. Shirley J. Behrens, "A Conceptual Analysis and Historical Overview of Information Literacy," College and Research Libraries 55, no. 4 (1994): 309–22; Christine S. Bruce, The Seven Faces of Information Literacy (Adelaide, Australia: AUSLIB Press, 1997); Christine S. Bruce, Information Literacy as a Catalyst for Educational Change: A Background Paper. 2002. White Paper prepared for UNESCO, the U.S. National Commission on Libraries and Information Science, and the National Forum on Information Literacy, for use at the Information Literacy Meeting of Experts, Prague, The Czech Republic. Available online at www.nclis.gov/libinter/infolitconf&meet/papers/bruce-fullpaper.pdf. [Accessed January 30, 2008]; Alistair Mutch, "Information Literacy: An Exploration," International Journal of Information Management 17, no. 5 (1997): 377–86; David Bawden, "Information and Digital Literacies: A Review of Concepts," Journal of Documentation 57, no. 2 (2001): 218–59; Mark Hepworth, "Approaches to Information Literacy Training in Higher Education: Challenges for Librarians," New Review of Academic Librarianship 6 (2000): 21–34; Michael B. Eisenberg, Carrie A. Lowe, and Kathleen L. Spitzer, Information Literacy: Essential Skills for the Information Age, 2nd ed.
(Westport, Conn.: Libraries Unlimited, 2004); James Elmborg, "Critical Information Literacy: Implications for Instructional Practice," Journal of Academic Librarianship 32, no. 2 (2006): 192–99.
3. Sandy Campbell, "Defining Information Literacy in the 21st Century." Paper presented at the World Library and Information Congress: 70th IFLA General Conference and Council, August 22–27, 2004, Buenos Aires. Available online at www.ifla.org/IV/ifla70/papers/059e-Campbell.pdf. [Accessed January 30, 2008].
4. Sheila Webber and Bill Johnston, Information Literacy: Definitions and Models. 2003. Available online at http://dis.shef.ac.uk/literacy/definitions.htm. [Accessed September 7, 2006].
5. Patricia D. Maughan, "Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience," College and Research Libraries 62, no. 1 (2001): 71–85; Kathleen Dunn, "Assessing Information Literacy Skills in the California State University: A Progress Report," Journal of Academic Librarianship 28, no. 1/2 (2002): 26–35; Lorrie A. Knight, "Using Rubrics to Assess Information Literacy," Reference Services Review 34, no. 1 (2006): 43–55; Anita Ondrusek et al., "A Longitudinal Study of the Development and Evaluation of an Information Literacy Test," Reference Services Review 33, no. 4 (2005): 388–417; Lisa G. O'Connor, Carolyn J. Radcliff, and Julie A. Gedeon, "Applying Systems Design and Item Response Theory to the Problem of Measuring Information Literacy Skills," College and Research Libraries 62, no. 6 (2002): 528–43.
6. Maria Pinto, "Abstracting/Abstract Adaptation to Digital Environments: Research Trends," Journal of Documentation 59, no. 5 (2003): 581–608.
7. Organisation de Coopération et de Développement Economiques, Définition et Sélection des Compétences (DeSeCo): Fondements Théoriques et Conceptuels. 2002. Available online at www.portal-stat.admin.ch/deseco/deseco_doc_strategique.pdf. [Accessed July 19, 2006].
8. For example, Tuning (www.unideusto.org/tuning/), ANECA (www.aneca.es/), and WASC (www.wascweb.org/).
9. María Pinto, Alfin-EEES. 2006. Available online at www.mariapinto.es/alfineees. [Accessed January 30, 2008].
10. Susie Andretta, Information Literacy: A Practitioner's Guide (Oxford: Chandos, 2005).
11. ACRL, Information Literacy Competency Standards for Higher Education, 2000. Available online at www.ala.org/ala/acrl/acrlstandards/standards.pdf; AASL and AECT, Information Literacy Standards for Student Learning, 1998. Available online at www.ala.org/ala/aasl/aaslproftools/informationpower/InformationLiteracyStandards_final.pdf; SCONUL, The Seven Pillars of Information Literacy, 2004. Available online at www.sconul.ac.uk/activities/inf_lit/seven_pillars.html; CAUL and ANZIIL, Australian and New Zealand Information Literacy Framework: Principles, Standards and Practice, 2004. Available online at www.caul.edu.au/info-literacy/InfoLiteracyFramework.pdf. [Accessed January 30, 2008].
12. Mark W. Aulls, Development and Remedial Reading in the Middle Grades (Boston: Allyn & Bacon, 1978); Albert J. Harris and Edward R. Sipay, How to Increase Reading Ability: A Guide to Developmental and Remedial Methods (New York: Longman, 1980).
13. Sandra S. Smiley et al., "Recall of Thematically Relevant Material by Adolescent Good and Poor Readers as a Function of Written Versus Oral Presentation," Journal of Educational Psychology 69 (1977): 381–87; Peter N. Winograd, "Strategic Difficulties in Summarizing Texts," Reading Research Quarterly 19 (1984): 404–25.
14. Pinto, Alfin-EEES.
15. Maria Pinto, "Documentary Abstracting: Toward a Methodological Model," Journal of the American Society for Information Science 46, no. 3 (1995): 225–34.
16. E.K. Jacob and D. Shaw, "Sociocognitive Perspectives on Representation," Annual Review of Information Science and Technology 33 (1998): 131–85.
17. Maria Pinto, "A Grounded Theory on Abstract Quality: Weighting Variables and Attributes," Scientometrics 69, no. 2 (2006): 213–26.
18. Menno de Jong and Peter J. Schellens, "Toward a Document Evaluation Methodology: What Does Research Tell Us about the Validity and Reliability of Evaluation Methods," IEEE Transactions on Professional Communication 43, no. 3 (2000): 242–60; María Pinto, Cyberabstracts: Abstracting Quality Resource Directory. Available online at www.mariapinto.es/ciberabstracts. [Accessed January 30, 2008].
19. John Elliott, Action Research for Educational Change (Buckingham: Open University Press, 1991).
20. Sharon Markless and David Streatfield, "Gathering and Applying Evidence of the Impact of UK University Libraries on Student Learning and Research: A Facilitated Action Research Approach," International Journal of Information Management 26 (2006): 3–15.
21. Elliott, Action Research for Educational Change.
22. Jody K. Howard and Su A. Eckhardt, "Why Action Research? The Leadership Role of the Library Media Specialist," Library Media Connection (October 2005): 32–34.
23. Markless and Streatfield, "Gathering and Applying Evidence," 4.
24. D. Avison, R. Baskerville, and M. Myers, "Controlling Action Research Projects," Information Technology & People 14, no. 1 (2001): 28–45.
25. Judy McKay and Peter Marshall, "The Dual Imperatives of Action Research," Information Technology & People 14, no. 1 (2001): 45–59.
26. F. Wilfrid Lancaster, Indexing and Abstracting in Theory and Practice, 2nd ed. (London: Library Association, 1998); Donald Cleveland and Ana Cleveland, Introduction to Indexing and Abstracting, 2nd ed. (Littleton, Colo.: Libraries Unlimited, 1990); François-Pierre Gingras, Le Résumé, Cybermétho, 2005. Available online at http://aix1.uottawa.ca/~fgingras/cybermetho/modules/resume.html. [Accessed January 30, 2008]; Pinto, "Documentary Abstracting"; Pinto, Alfin-EEES.
27. ANECA, "Libros Blancos." Available online at www.aneca.es/activin/activin_conver_LLBB.asp. [Accessed January 30, 2008].
28. Maria Pinto, Portal e-COMS. 2004. Available online at www.mariapinto.es/e-coms. [Accessed January 30, 2008].
29. Pinto, Alfin-EEES.
30. María Pinto, Cyberabstracts: Abstracting Quality Resource Directory. Available online at www.mariapinto.es/ciberabstracts. [Accessed January 30, 2008].
31. María Pinto, Aprendiendo a Resumir: Prontuario y Resolución de Casos (Gijón: Trea, 2005).
32. H.D. Lasswell, "The Structure and Function of Communication in Society," in The Communication of Ideas, ed. Lyman Bryson (New York: Harper & Row, 1948).
33. S.R. Ranganathan, Prolegomena to Library Classification (New York: Asia Publishing House, 1967).