Exploring LIS Students' Beliefs in Importance and Self-Efficacy of Core Information Literacy Competencies

Maria Pinto and Rosaura Fernandez Pascual

Maria Pinto is at the Facultad de Comunicación y Documentación, and Rosaura Fernandez Pascual is at the Faculty of Economics and Business, Department of Quantitative Methods for Economics and Enterprise, both at the University of Granada, Spain; e-mail: mpinto@ugr.es, rpascual@ugr.es. © 2016 Maria Pinto and Rosaura Fernandez Pascual, Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc/3.0/) CC BY-NC. doi:10.5860/crl.77.6.703

This study pursues an understanding of Library and Information Science (LIS) students' perceptions along two dimensions: belief in the importance (BIM) of a set of core information competencies, and self-efficacy (SE). Factor analysis reveals a clear distinction between BIM and SE results and points to two sets of competencies: principal competencies, which reflected the most positive insights from students, and secondary ones, which were the most weakly perceived and consequently the most in need of encouragement. This is one of the few studies on the subjective profile of LIS students, and it includes improvement initiatives relating to the weakest competencies.

There is no doubt that all developed countries need to raise their citizens' awareness of the vital importance of Information Literacy (IL). In the case of a special kind of citizen—the students (undergraduates and graduates) of Library and Information Science (LIS)—this awareness of IL is even more important, as IL skills and capabilities are key requirements for their future professional practice.

Closely related to LIS students' awareness, two kinds of perceptions are key: belief in the importance of such competencies, and self-efficacy. The significance of these perceptions is evident, as they influence students' attitudes and behaviors. The belief in importance (BIM) concept, which some educators identify with the idea of motivation, refers to students' rating of the importance of certain competencies. Self-Efficacy (SE) is a more sophisticated idea, often defined as people's beliefs about their capabilities to produce designated levels of performance. SE determines how people feel, think, motivate themselves, and behave. Such beliefs produce these diverse effects through four major processes: cognitive, motivational, affective, and selection.1 Information literacy self-efficacy and academic motivation, one could argue, both play important roles in student academic development.2

There is a third dimension, of a qualitative nature, which also affects students' awareness: the student's preferred choice of learning sources (LS). Four sources of learning have been examined in this study: classroom, library, specific courses, and self-learning. Although students actually resort to all of these sources, we hope to ascertain to what extent each of them is used, as this will help to form a profile of the students' IL behavior.

This work aims to address the status of these perceptual dimensions (BIM and SE) among a large population of LIS students concerning a previously defined set of core competencies.
This is a self-assessment study that makes use of a previously validated tool, the IL-HUMASS questionnaire. Limited to the context of higher education, the main goal of the IL-HUMASS questionnaire is to better understand students' appreciation of IL. These data may contribute to improving models of strategic planning and instruction based on competencies.

A clear—and, above all, comprehensive—understanding of perceptions among LIS students, both undergraduates and graduates, concerning IL competencies should include the four categories of searching, evaluation, processing, and communication-dissemination of information. Therefore, the following objectives are pursued:
• To examine the possible influence of gender, age, academic year, and program type (undergraduate or graduate) on the perceptions of a sample of students in relation to a set of previously defined basic competencies, grouped into the four aforementioned categories.
• To analyze mean scores, as well as data relating to statistical similarities and differences in IL-HUMASS responses among the selected students, regarding their subjective levels of perception of BIM and SE, as these play a key role in the learning process.
• To explore students' favorite learning sources (LS) and their distribution among the competencies and categories.
• To explore, from a deeper perspective, the factors that underlie the set of previously defined responses on both the BIM and SE dimensions, using factor analysis techniques.

In this way, two closely related aims may be achieved. First, it may be possible to notice patterns of response between the constructs of the questionnaire and those obtained through factor analysis. Second, the main objective of understanding the strengths and weaknesses of the perceptions of LIS students with regard to the set of basic information competencies should become clear. One may thereby focus on the lowest-rated competencies, because these are home to the greatest weaknesses of students and, therefore, the greatest opportunities for improvement.

Literature Review

The mastering of information competencies constitutes a key issue in higher education. Particularly in the case of LIS students, these capabilities should form the base of their learning program and the main condition for any future professional career. Statements on competence and educational policy of LIS schools should take this assertion into consideration. In the opinion of Lester and Van Fleet: "The use by schools of library and information studies of these competencies documents [sic] is an indication of the strength of the ties between education and practice, as these documents express the perspective of practitioners as formulated at the national level."3 Foster is one of the few authors who describe how the rationale of LIS courses should "allow future professional developments to be innovative and adaptive and meet the needs of the profession as it evolves within the information society."4 Within this LIS environment, Varlejs raises several questions concerning the skills and knowledge now needed by information professionals, especially what LIS education was offering that was relevant to preparing graduates for careers in the special library sector.5

Moreover, IL instruction must be accompanied by an assessment that measures students' learning outcomes in information competencies. There are numerous assessment strategies, methods, and tools currently available.
Lindauer conceptualizes the critical arenas from which data and documentation can be collected for assessment purposes: learning environment, IL program components, and students' learning outcomes.6 Detlor et al. offer a comprehensive model of assessment based on interviews, including a standardized IL testing instrument involving all factors affecting IL instruction among librarians, library administrators, faculty members, and students at business schools.7 One frequently employed tool for analysis is the self-assessment test, sometimes used as the main method,8 or combined with other data of a more objective nature.9 A large number of works throughout the field make use of self-assessment as a diagnostic method that provides information about students' perceptions and needs, with the aim of improving the training provided both in libraries and through IL programs.10 Streatfield and Markless suggest a facilitated action research approach, based on their impact evaluation model.11

IL in the Health Sciences has been assessed using a number of tools, for example within Biology Studies at Macquarie University.12 The study by Elder et al. stands out in this area; its goal was to "critically examine one widely used health literacy test, drawing insights not only from health science but also from applied linguistics and language testing."13 According to Fetter, students want "fair access to informatics and technology-rich clinical settings."14 Wilkinson et al. analyze the "psychometric properties of instruments used in healthcare education settings, measuring experience and attitudes of healthcare students."15 The review by Colthart et al. "highlights the need to consider the potential for combining qualitative and quantitative data to further our understanding of how self-assessment can improve learning and professional clinical practice."16 Sieber presents online assessments delivered during induction workshops.17 Likewise, the Research Readiness Self-Assessment (RRSA) was used to measure the health information competencies of students.18

Within the areas of Social Sciences and Humanities, there are a considerable number of available IL studies. In the specific field of Translation and Interpreting, existing research provides evidence of information behavior and the degree of acquisition of information competencies among a group of students, teachers, and professionals, thus allowing a better instructional design.19 Gross and Latham describe freshmen (first-year) students' views on the acquisition of information skills and their self-perception of their own knowledge and skills, and subsequently compare these views with the scores obtained in a standardized test.20 The results show that students cared more about the final result than the search process and that they preferred to acquire information skills on their own, emphasizing their personal interests as key to finding information. Pinto's work uncovers history students' subjective perception of their own IL status.21

Another self-assessment experience is reported by Singh, with the purpose of assessing faculty members' perceptions of their students' information literacy competencies in journalism and mass communication programs.22 Thaxton's work explores changes in the nature of information dissemination within psychology since 1985, with particular emphasis on the impact of such changes on library instruction and information literacy.
Interviews with a limited sample of faculty members suggest that students' abilities to identify and evaluate information critically may be overestimated.23 McKinney et al. published a study in which a number of evaluation instruments (questionnaire, information literacy competency test, focus group, and student reflective work) were used to examine staff and student perceptions.24

However, contrary to what is observed in the area of social sciences, literature relating to IL self-assessment in the LIS domain provides fewer studies. Al-Daihani used a student ranking of resources and facilities to explore perceptions and views regarding Information and Communication Technologies (ICT) education within two LIS departments in Kuwait.25 The topic of IL self-assessment in LIS education has also been addressed by Malliari et al., demonstrating the utility of SE in understanding the behavior of students with regard to information technology.26 Pinto and Fernandez-Pascual analyze the perceptions and outcomes of a group of LIS students on their competencies, using a mixed assessment model (subjective-objective) centered on perceptions and evidence.27

Studies about the evaluation of competencies, instruction programs, and students' learning outcomes are more frequent. A case study from the University of Illinois at Urbana-Champaign reports on a simple, cyclical process to assess IL instruction in a hybrid distance education context. The challenges most noted included "the information-intensive nature of the graduate LIS curriculum; the scarcity of specialized research-level LIS collections; the enduring value of the physical library as a framework for understanding and accessing library resources and services; and the importance of practising librarians as professional role models."28 Hebrang Grgic and Špiranec explored "the transferability of IL competencies to the overall research experience of LIS students and the application of IL competencies in fulfilling course assignments."29 Krakowska offers a study aimed at gaining knowledge about some core components of information-seeking behavior.30 Todorova and Peteva suggest building intellectual property competency as part of the information literacy of LIS students.31 Rudzioniene revealed the current situation of LIS students' information literacy using the PIL survey.32 Blumer et al. implemented a case study on the evaluation of competencies in the field of LIS: "The biggest problems remain within the information need and information organization standards."33 Boustany evaluates the IL competencies of students at Paris Descartes University Institute of Technology, suggesting that they seem to adopt an "easy way out" attitude: "They rely heavily on search engines instead of databases. They prefer Wikipedia to the classical encyclopedia, and they choose the resource they know or have heard about."34 According to Matteson, students' cognitive, emotional, and social characteristics have to be considered for successful learning.35

Few of the aforementioned publications, however, offer a diagnosis of students' perceptions of their information competence status. Oakleaf and Kaske suggest that this can be accomplished through the application of self-assessment tests and measures.
Even more interesting is the combination of objective and subjective tools, as this provides a look at the relationship between standardized measurements and estimates of students' IL skills.36 Despite the scarcity of experiments that simultaneously apply self-assessment perceptions and objective evaluation, studies of this type should be taken into account.37

Methodology

Here we describe in detail our research tool, the population base of the study, and the different types of analysis used. Descriptive and inferential statistics, revealing various layers of results, have been combined.

The Tool

The Information Literacy–Humanities and Social Sciences (IL-HUMASS) questionnaire is designed on the basis of a large body of literature within the field of IL. It was developed employing both general and normative methods,38 as well as specific methods, from both user and evaluation perspectives.39 The study has been described by its creators as "a comprehensive and user-friendly survey of self-assessment containing an exhaustive set of variables (grouped into categories) related to IL and to the specific target population of higher education in the humanities and social sciences of various Spanish and Portuguese universities."40 The survey responses cover the three internal pillars of IL (motivation, self-efficacy, and preferred source of learning) and offer basic information on the IL perceptions of students.41 The implementation of the survey allows us to learn the students' opinions of their own BIM and SE with regard to each of the twenty-six competencies. It is also intended to identify the students' favorite LS for each competence. Accordingly, the tool provides information on quantitative (BIM and SE) and qualitative (favorite LS) dimensions.

The questionnaire gathers data from twenty-six questions on four interrelated constructs, or categories: searching, evaluation, processing, and communication-dissemination of information. A Likert scale ranging from 1 (low) to 9 (excellent) was used for the quantitative results: less than 5 (not important/scarce), between 5 and 6 (moderately important/moderate), between 6 and 7 (important/normal), between 7 and 8 (very important/high), more than 8 (excellent). The twenty-six questions in the survey were grouped as follows:
• Searching: 1) using printed sources of information, 2) entering and using automated catalogues, 3) consulting and using electronic sources of primary information, 4) using electronic sources of secondary information, 5) knowing the terminology of your subject, 6) searching for and retrieving Internet information, 7) using informal electronic sources of information, 8) knowing information search strategies.
• Evaluation: 9) assessing the quality of information resources, 10) recognizing the author's ideas within the text, 11) knowing the typology of scientific information sources, 12) determining whether an information resource is updated, 13) knowing the most relevant authors and institutions within your subject area.
• Processing: 14) systematizing and abstracting information, 15) recognizing text structure, 16) using database managers, 17) using bibliographic reference managers, 18) handling statistical programs and spreadsheets, 19) installing computer programs.
• Communication-Dissemination: 20) communicating in public, 21) communicating in other languages, 22) writing a document, 23) knowing the code of ethics in your academic/professional field, 24) knowing the laws on the use of information and intellectual property, 25) creating academic presentations, 26) disseminating information on the Internet.

We want to stress that twelve of the twenty-six competencies (46.15%) are related to ICT. The questionnaire has been widely validated in previous studies,42 and the scale appears highly consistent and reliable (Cronbach's alpha coefficient, 0.831).

The Sample

Both undergraduate and graduate LIS students from the University of Granada, Spain, were surveyed. Data collection began during the second semester of the academic year 2007–2008 and was completed at the end of the 2013–2014 academic year. The data collected offer a complete picture of the changes undergone by students' IL perceptions in Spain during the last few years. The richness of the sample derives from the participation of students from four university programs. Specifically, the data were obtained from students in the following programs:
• Bachelor Degree in Librarianship and Documentation, B-LibDoc (2007–2010). This course lasted for three years.
• Bachelor Degree in Documentation, B-Doc (2007–2010). This course lasted for two years and was available to students who had finished a previous three-year degree. Participants from this source are therefore considered in this study to be fourth- or fifth-year students.
• Bachelor Degree in Information Studies and Documentation, B-InfDoc (2010– ). This course resulted from the adaptation to the European Higher Education Area programme in 2010, and it is currently four years long.
• Master's Degree in Information Studies and Communication, MA-InfCom.

The division of respondents by graduate and undergraduate status is displayed in table 1. The ratio of undergraduate participants to the total number of students in each respective program is given by academic year. Graduate participants are given in total. The surveyed LIS students range between the ages of 18 and 54. The percentages of undergraduates from the second and fourth year were similar (9%), while the percentages in the third and fifth year were slightly higher (13%). The highest participation was from first-year students (34.7%). Graduates accounted for 18.6 percent.

Statistical Analysis

The two types of statistics used, descriptive and inferential, should allow both an improved understanding of LIS students' perceptions of competencies and a better knowledge of the structure and classification that lie behind these perceptions. Capturing this structure and classification would make it possible to refine existing IL instruction through improvement strategies. Descriptive statistics, including mean scores and standard deviations, were tabulated using SPSS v20 software. Since the instrument evaluates ordinal variables and normality is not fulfilled, nonparametric data analysis was required. The Mann-Whitney U and Kruskal-Wallis tests allowed comparison of distributions among groups of students with regard to age, gender, academic year, and level of studies.
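The analyses above were run in SPSS. Purely as a minimal illustrative sketch, the reliability and group-comparison checks just described could be reproduced in Python along the following lines; the file name, item column names, and grouping columns are hypothetical and not part of the study.

import pandas as pd
from scipy.stats import mannwhitneyu, kruskal

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical layout: one row per respondent; twenty-six BIM item columns scored 1-9,
# plus "gender" and "age_group" columns.
df = pd.read_csv("il_humass_responses.csv")
bim = df[[f"bim_{i}" for i in range(1, 27)]]
print("Cronbach's alpha:", round(cronbach_alpha(bim), 3))

# Mann-Whitney U test compares two independent groups (here, gender).
u, p = mannwhitneyu(bim[df["gender"] == "F"].sum(axis=1),
                    bim[df["gender"] == "M"].sum(axis=1),
                    alternative="two-sided")
print(f"Gender: U = {u:.1f}, P = {p:.3f}")  # P > 0.05 would indicate no significant difference

# Kruskal-Wallis test compares more than two independent groups (here, age bands).
h, p = kruskal(*[g.sum(axis=1) for _, g in bim.groupby(df["age_group"])])
print(f"Age: H = {h:.1f}, P = {p:.3f}")

Applying the same calls per category, rather than to the total score, would mirror the category-level comparisons reported in the Findings section.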
With the goal of testing the appropriateness of the constructs, as well as obtaining a categorization and selection of the competencies, a factor analysis was used to investigate the adequacy of the four factors, or competency categories, proposed by the questionnaire. We sought thereby to reduce the number of elements to be studied and to observe how they were interlinked. Our interest in factor analysis was chiefly focused on the categorization of the competencies it provides, as well as its selective nature, which discards the less relevant ones. Factor analysis was carried out on both quantitative dimensions (BIM and SE). Such factor analysis has been employed by Chanlin, among others, to determine some undergraduate students' perceptions of their capabilities;43 by Mackey and Ho, to identify dimensions of IL and information technologies;44 and by Thompson, to present a large-scale international study involving faculty and graduate students' perceptions of academic libraries.45

TABLE 1
Sample Distribution by Academic Year and Degree
                  Undergraduate                            Graduate
Academic Year     B-LibDoc    B-Doc      B-InfDoc   Total  MA-InfCom
2007/08           81/181      67/194     —          148    24
2008/09           38/173      28/135     —          66     23
2009/10           38/162      39/140     —          77     22
2010/11           —           —          38/69      38     20
2011/12           —           —          69/145     69     18
2012/13           —           —          79/228     79     15
2013/14           —           —          47/273     47     11
Total             157         134        233        524    133

Findings

The responses obtained from the survey are given below. We present these data with our own analysis and interpretation of the results, by category.

BIM and SE Levels

The resulting mean scores and standard deviations (in brackets) for each category of competencies, considering both BIM and SE dimensions, are displayed in figure 1. The BIM dimension offers scores ranging between 7.00 and 8.00 (very important) in all categories. The SE dimension scores fall between 6.00 and 7.00 (normal) in all categories. Within each dimension, the difference in scores across categories may be considered insignificant.

FIGURE 1
Overall Results of BIM and SE by Competency Categories (mean scores, standard deviations in brackets)
               BIM           SE
Searching      7.90 (0.78)   6.65 (1.23)
Evaluation     7.85 (0.89)   6.54 (1.28)
Processing     7.66 (0.98)   6.17 (1.32)
Communication  7.89 (0.99)   6.25 (1.34)

Reviewing Learning Sources

The qualitative dimension of favorite source of learning (LS) is explored from a descriptive point of view. LS are an essential factor in understanding the subjective characteristics of LIS students in relation to their information competencies. Head (2008) has studied types of LS by means of the Project Information Literacy (PIL). Considered globally, LIS students expressed a preference for the classroom as their favorite LS. The next most chosen options were self-learning and specific instruction courses. "Library" is conceived here as any physical and/or digital facility from which one may access various sources of information, preferably also as the provider of accurate and effective librarian advice. Surprisingly, this source of learning was selected by only 4.69 percent of students (see table 2).

TABLE 2
Favorite Learning Sources among Undergraduates and Graduates
Favorite Learning Source   Undergraduates (%)   Graduates (%)   Total (%)
Class                      58.37                34.40           53.51
Self-learning              29.48                49.69           33.58
Library                    4.60                 5.05            4.69
Courses                    7.55                 10.86           8.22

As for the distribution of LS by category, the library received its lowest scores in the processing category, among both undergraduates and graduates (see figure 2).

FIGURE 2
Favorite Sources of Learning by Competency Categories (%)
                              Class   Self-Learning   Library   Courses
Searching      Undergraduate  62.20   22.10           9.87      5.83
               Graduate       31.86   48.12           12.41     7.61
Evaluation     Undergraduate  65.84   23.24           4.89      6.03
               Graduate       42.71   47.07           3.01      7.22
Processing     Undergraduate  58.89   30.07           1.44      9.61
               Graduate       30.83   53.51           1.88      13.78
Communication  Undergraduate  46.56   42.51           2.19      8.73
               Graduate       32.22   50.05           2.90      14.82
Examining Gender, Age, Academic Year, and Program

The survey data were analyzed to investigate differences regarding gender, age, academic year, and program of study. Nonparametric methods were employed. Specifically, the Mann-Whitney U test was used to compare differences between male and female responses, while the Kruskal-Wallis test was used to determine whether there were statistically significant differences across more than two groups in the BIM and SE variables, which were not normally distributed.

In relation to gender, the statistics indicate that there are no significant differences between female and male students' perceptions of competencies. These results were checked by means of a Mann-Whitney U test (P > 0.05).

On the basis of age, the analysis showed significant differences among age groups with regard to the students' perceptions of competencies in the categories of search, evaluation, and processing (Kruskal-Wallis test, P < 0.05). However, no significant differences were found in the category of communication-dissemination (P > 0.05). As can be seen, this category demonstrates the most consistent student valuation by age, in both BIM and SE (see figure 3). The BIM lines, whose values are always higher than those found for SE, are more consistent across the different age groups, and very close among the four informational categories. Meanwhile, the SE lines are more varied, with a significant increase in the scores provided by students aged 19, and a considerable decrease among students older than 36 years. For those ranging in age from 19 to 35 years, the rise in all lines is very slight, with greater fluctuations in SE. Among students older than 36 years, BIM increases significantly while SE decreases considerably.

With respect to the academic year, no significant differences were found in the overall mean importance given to the four constructs (Kruskal-Wallis, P > 0.05). However, significant differences were found in the global SE scores for all the competence constructs (P < 0.05).

Regarding program type, graduate and undergraduate responses were analyzed separately. The overall mean scores and standard deviations (in brackets) for BIM and SE in the four competency constructs are shown in figure 4. The undergraduate results reveal lower levels of BIM and SE than the graduate ones in the categories of searching, evaluation, and communication of information. In addition, the variability of results is greater in the undergraduate responses. Consequently, students' perceptions of BIM and SE are more consistent at graduate level than at undergraduate level.
Even so, no significant differences were found between undergraduates and graduates in the overall mean importance and SE given to the processing competence (Mann-Whitney U, P > 0.05). By contrast, the three remaining competence constructs did demonstrate significant differences in relative scores (P < 0.05).

FIGURE 3
Mean Scores of BIM and SE on Categories of Competencies, by Age
[Line chart; the underlying values are not reproducible from the figure.]

FIGURE 4
Comparisons of BIM and SE Mean Scores and Standard Deviations between Undergraduates and Graduates
               Undergraduate                 Graduate
               BIM           SE              BIM           SE
Searching      7.86 (0.79)   6.57 (1.24)     8.10 (0.71)   7.04 (1.21)
Evaluation     7.76 (0.90)   6.37 (1.28)     8.23 (0.76)   7.31 (0.92)
Processing     7.65 (0.95)   6.14 (1.32)     7.67 (0.96)   6.30 (1.28)
Communication  7.84 (1.00)   6.14 (1.33)     8.10 (0.94)   6.71 (1.30)

Figures 5 and 6, respectively, show the mean levels of BIM and SE for all twenty-six competencies. As can be observed, the highest levels of BIM are seen in items 6 (searching for and retrieving Internet information), 14 (systematizing and abstracting information), and 22 (writing a document). The mean scores for BIM are more homogeneous than those seen for SE, which present a clearly different behavior. Graduate students gave a higher SE score in all cases except for items 16 (using database managers), 17 (using bibliographic reference managers), and 18 (handling statistical programs and spreadsheets), which are related to ICT.

FIGURE 5
BIM Mean Scores in the Twenty-six IL Competencies
[Chart; per-competency values are not reproducible from the figure.]

FIGURE 6
SE Mean Scores in the Twenty-six IL Competencies
[Chart; per-competency values are not reproducible from the figure.]

Factors Relating to Belief of Importance

One of the goals of our research was to determine whether the IL-HUMASS constructs—namely, its four categories—reflect the underlying competencies of the surveyed population. To this end, two separate exploratory factor analyses were conducted, one on the BIM dimension and one on the SE dimension.

Factor analysis operates on the notion that measurable and observable variables can be reduced to fewer latent variables that share a common variance and are unobservable, which is known as reducing dimensionality. However, factor analysis involves not only the removal of some variables (by reduction), but also some regrouping (classification) of variables as a process of abstraction, which thereby allows one to uncover the underlying connections. Therefore, in our study certain IL competencies of secondary relevance were reduced or regrouped in this manner, and we believe that this kind of factor analysis is important in understanding the subjective weak points in student responses. Two types of factor analysis should be distinguished here: exploratory and confirmatory. In exploratory analysis, the factors that result from the analysis are not known in advance, while in confirmatory analysis the intention is to check whether a hypothesized relationship between observed variables and their underlying latent constructs exists. We believe that, in the case of the BIM dimension, the use of exploratory factor analysis has validated the relevance of the four categories, or factors, initially proposed by the questionnaire. Competencies for each of the four factors, as well as their respective loadings and accumulated variances, are displayed in table 3.
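Before turning to those results, and purely as an illustrative sketch rather than the authors' actual procedure, an exploratory factor analysis of this kind could be reproduced in Python with the third-party factor_analyzer package. The file and column names below are hypothetical, and the retention and discrimination thresholds are the ones described in the next paragraph.

import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

# Hypothetical layout: one row per respondent; twenty-six BIM item columns scored 1-9.
bim = pd.read_csv("il_humass_responses.csv")[[f"bim_{i}" for i in range(1, 27)]]

# Kaiser's criterion: retain one factor per eigenvalue of the correlation matrix above 1.
probe = FactorAnalyzer(rotation=None, method="principal")
probe.fit(bim)
eigenvalues, _ = probe.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

# Principal-component extraction with Promax (oblique) rotation, as in tables 3 and 5.
fa = FactorAnalyzer(n_factors=n_factors, rotation="promax", method="principal")
fa.fit(bim)
loadings = pd.DataFrame(fa.loadings_, index=bim.columns,
                        columns=[f"factor_{j + 1}" for j in range(n_factors)])

# Discrimination rule: keep an item that loads above 0.6 on exactly one factor
# and below 0.5 on all other factors; unselected items are treated as "secondary".
principal = [item for item, row in loadings.abs().iterrows()
             if (row > 0.6).sum() == 1 and (row < 0.5).sum() == n_factors - 1]
print(sorted(principal))

Because Promax is an oblique rotation, the retained factors are allowed to correlate; the factor correlation matrices reported in tables 4 and 6 are a product of this choice.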
Following Kaiser's criterion, only factors whose eigenvalue is greater than 1 are included.46 For discrimination purposes, only variables with a factor loading higher than 0.6 on one factor and lower than 0.5 on the other factors were included in this study. This method categorizes factors according to the variance explained: BIM–Communication (39.40%), BIM–Evaluation (9.95%), BIM–Searching (8.10%), and BIM–Processing (4.41%). The difference between the most important category (BIM–Communication) and the least (BIM–Processing) is highly significant.

As stated before, our factor analysis method has reduced the number of basic competencies under consideration to seventeen. These selected competencies are principal, while the remaining nine are considered secondary. Of these seventeen competencies, ten (58.82%) are related to ICT, thus increasing the weight of technological competence relative to the questionnaire as a whole, even though eight of the ten ICT competencies rank at the bottom of this classification and are, therefore, less important to students.

TABLE 3
Factor Analysis of Students' Belief in Importance of Competencies
(Extraction method: principal component analysis. Rotation method: Promax with Kaiser normalization.)

Belief in Importance (BIM) Factors                                                       Loading   Accumulated Variance (%)
Factor 1: BIM-Communication                                                                        39.40
  BIM-20 Communicating in public                                                          0.792
  BIM-21 Communicating in other languages                                                 0.745
  BIM-22 Writing a document (such as a report or academic work)                           0.780
  BIM-25 Creating academic presentations (using PowerPoint, for example)                  0.706
  BIM-26 Disseminating information on the Internet (through webs, blogs, and other social platforms)   0.694
Factor 2: BIM-Evaluation                                                                           49.35
  BIM-10 Recognizing the author's ideas within the text                                   0.784
  BIM-11 Knowing the typology of scientific information sources (thesis, proceedings, and so on)   0.696
  BIM-12 Determining whether an information resource is updated                           0.719
  BIM-13 Knowing the most relevant authors and institutions within your subject area      0.784
Factor 3: BIM-Searching                                                                            57.45
  BIM-2 Entering and using automated catalogues                                           0.757
  BIM-3 Consulting and using electronic sources of primary information (such as journals)   0.742
  BIM-4 Using electronic sources of secondary information (like databases)                0.764
  BIM-6 Searching for and retrieving Internet information (such as advanced searches, directories, portals)   0.711
  BIM-8 Knowing information search strategies (descriptors, Boolean operators, and such)   0.642
Factor 4: BIM-Processing                                                                           61.86
  BIM-16 Using database managers (such as Access, MySQL)                                  0.828
  BIM-17 Using bibliographic reference managers (Endnote, Reference Manager, and so on)   0.796
  BIM-18 Handling statistical programs and spreadsheets (for instance, SPSS, Excel)       0.790

With regard to LIS students' BIM responses for the basic competencies, the BIM–Communication factor is by far the most important. This incorporates five basic competencies: 1) communicating in public; 2) communicating in other languages; 3) writing a document; 4) creating academic presentations; and 5) disseminating information on the Internet.
The BIM–Evaluation factor is second in order of importance, and comprises four basic competencies: 1) recognizing the author's ideas within the text; 2) knowing the typology of scientific information sources; 3) determining whether an information resource is updated; and 4) knowing the most relevant authors and institutions within your subject area. The third factor is BIM–Searching, consisting of five competencies: 1) entering and using automated catalogues; 2) consulting and using electronic sources of primary information; 3) using electronic sources of secondary information; 4) searching for and retrieving Internet information; and 5) knowing information search strategies. The last and least important factor is BIM–Processing, with three items: 1) using database managers; 2) using bibliographic reference managers; and 3) handling statistical programs and spreadsheets.

Correlations among factors are provided in table 4. BIM–Evaluation strongly correlates with BIM–Communication and BIM–Searching. In the rest of the cases, correlations among factor pairs are moderate.

TABLE 4
Correlation Matrix of BIM Factors
Factor   1       2       3       4
1        1       0.505   0.424   0.399
2                1       0.579   0.359
3                        1       0.441
4                                1

Factors of Self-Efficacy

With reference to SE responses in the study, five factors have been identified. Of the twenty-six competencies, fifteen (57.69%) were considered to be principal, while the remaining eleven are secondary. The exploratory nature of our analysis has revealed a new factor of a technological kind, as the two competencies creating academic presentations and disseminating information on the Internet load in a significant and independent way. Therefore, the IL model concerning SE consists of five constructs that explain an accumulated variance of 61.10% (table 5). As in the preceding case, our factor analysis categorizes the factors according to their explained variance, which yields the following sequence in order of importance: SE–Evaluation (37.20%), SE–Searching (8.81%), SE–Technology (6.38%), SE–Communication (4.69%), and SE–Processing (4.02%). The difference between the most important category (SE–Evaluation) and the least important (SE–Processing) is highly significant. In this case too, the exploratory factor analysis has reduced the total number of competencies under consideration, leaving only fifteen of the twenty-six. Eight of the selected competencies (53.33%) are technological, increasing the ICT share with respect to the total.

The SE–Evaluation factor is the most important by far, even though it is represented by only three basic skills: 1) recognizing the author's ideas within the text; 2) knowing the typology of scientific information sources; and 3) knowing the most relevant authors and institutions within your subject area. Substantially less important, the SE–Searching factor also consists of three core competencies: 1) entering and using automated catalogues; 2) using electronic sources of secondary information; and 3) knowing information search strategies.
TABLE 5
Factor Analysis of Students' Self-Efficacy on Competencies
(Extraction method: principal component analysis. Rotation method: Promax with Kaiser normalization.)

Self-Efficacy (SE) Factors                                                               Loading   Accumulated Variance (%)
Factor 1: SE-Evaluation                                                                            37.20
  SE-10 Recognizing the author's ideas within the text                                    0.781
  SE-11 Knowing the typology of scientific information sources (thesis, proceedings, and so on)   0.695
  SE-13 Knowing the most relevant authors and institutions within your subject area       0.698
Factor 2: SE-Searching                                                                             46.01
  SE-2 Entering and using automated catalogues                                            0.833
  SE-4 Using electronic sources of secondary information (like databases)                 0.812
  SE-8 Knowing information search strategies (descriptors, Boolean operators, and such)   0.742
Factor 3: SE-Technology                                                                            52.39
  SE-25 Creating academic presentations (using PowerPoint, for example)                   0.795
  SE-26 Disseminating information on the Internet (through webs, blogs, and other social platforms)   0.762
Factor 4: SE-Communication                                                                         57.08
  SE-21 Communicating in other languages                                                  0.851
  SE-22 Writing a document (such as a report or academic work)                            0.784
  SE-24 Knowing the laws on the use of information and intellectual property              0.638
Factor 5: SE-Processing                                                                            61.10
  SE-14 Systematizing and abstracting information                                         0.643
  SE-16 Using database managers (such as Access, MySQL)                                   0.818
  SE-17 Using bibliographic reference managers (Endnote, Reference Manager, and so on)    0.822
  SE-18 Handling statistical programs and spreadsheets (for instance, SPSS, Excel)        0.600

SE–Technology is an emerging factor that encompasses two basic competencies: 1) creating academic presentations and 2) disseminating information on the Internet. SE–Communication includes three basic IL skills: 1) communicating in other languages; 2) writing a document; and 3) knowing the laws on the use of information and intellectual property. SE–Processing is the least significant factor, although it is represented by the largest number of competencies: 1) systematizing and abstracting information; 2) using database managers; 3) using bibliographic reference managers; and 4) handling statistical programs and spreadsheets.

Correlations among these five underlying SE factors are displayed in table 6. At one end we find SE–Evaluation, which correlates strongly with SE–Searching. At the other is SE–Processing, which correlates weakly with SE–Technology and SE–Communication. In all remaining cases, the correlations between pairs achieve a moderate relevance.

TABLE 6
Correlation Matrix of SE Factors
Factor   1       2       3       4       5
1        1       0.592   0.410   0.498   0.311
2                1       0.363   0.302   0.444
3                        1       0.448   0.223
4                                1       0.188
5                                        1

Discussion

When it comes to interpreting the results and drawing comparisons with other similar work, some specific characteristics ought to be stressed. One such characteristic relates to the population being studied—LIS students—to which surprisingly little research has been devoted from a self-assessment perspective. Another such observation may be seen in the obtained scores when evaluating these according to course seniority (graduate versus undergraduate). The key role of students' perceptions with regard to the learning process is roundly confirmed.47 One of the most significant findings is that graduates show mean scores in both BIM and SE that are significantly higher than those of undergraduates.
This observation reveals that students with past IL instruction demonstrate higher scores in the categories of search, evaluation, and communication–dissemination of information.48 Students in the master's program, and those who have completed a bachelor's degree in Documentation (B-Doc), have had, by definition, past experience with IL instruction. One would thereby expect to see consistent progress in students' information learning as they go through the various degree courses. The evidence obtained from our data confirms these expectations.

The two dimensions of self-evaluation—BIM and SE—that are the subject of this research exhibit similarities to other results that have been obtained using the IL-HUMASS questionnaire. With regard to students' attitudes, we have found high levels of awareness about the importance of a number of basic competencies, and these levels are highly concentrated. Scores for SE are rather normal, showing greater dispersion than those for BIM. These results are similar to findings in earlier studies.49 In both cases, the evolution of BIM and SE is positive, although barely noticeable for BIM and very slight in the case of SE. Teachers and instructors can, and should, aspire to greater progress in developing the self-esteem of their students along their educational journey. Concerning the third dimension used, LS, the phenomenon of the limited use of libraries by students has come to our attention once again. This is a worrying reality, as both undergraduate and graduate LIS students will be our future library professionals.

Regarding the assessment of competence categories, the results reveal differences not only between the BIM and SE dimensions within the same category but also among the different categories themselves. We have seen how communication and searching stand out in one of the two dimensions. Evaluation of information occupies an intermediate position, as far as students are concerned. The category of processing is the lowest rated.

Any factor analysis selects a number of competencies from a previously established set. Furthermore, the analysis also arranges and categorizes them as factors. Considering BIM and SE in factor analysis, it may be observed that the factor of processing is the least relevant in both cases (see tables 3 and 5). Students' perceptions are also differentiated between the BIM and SE dimensions, as the communication-dissemination factor is found in first position under BIM, while evaluation is first under SE. Factor analysis on the BIM dimension selects a number of competencies, maintains the four categories of the questionnaire, and provides acceptable levels of internal correlation among these categories. Meanwhile, factor analysis from the SE perspective has selected a smaller number of competencies and increased the number of factors to five, yet offers lower levels of internal correlation between these categories.

Both factor analyses have yielded appropriate selections of competencies from the questionnaire. These selected sets include those perceived by students as being the most relevant. However, the focus of this research is on the competencies that have not been selected by one or both of the factor analyses, and therefore those that are regarded as less relevant by students.
Thus, we hope to achieve a better understanding of the subjective condition of students, focusing on their deficiencies in attitude with respect to the competencies perceived as less relevant. In this way, improvement initiatives concerning IL instruction may arise.

We note that thirteen of the twenty-six competencies were selected in both factor analyses. This selection provides the set of the most relevant, or principal, competencies. Regarding the remaining thirteen competencies, which are not regarded as principal, we suggest that further examination is required to enhance our understanding of the IL competencies and dimensions (BIM or SE) most in need of greater awareness and/or instruction for LIS students.

Of these, two competencies were not selected in the BIM factor analysis, although they did feature in the SE factor analysis (see table 7). This reflects the fact that students do not sufficiently appreciate the importance of these two core competencies, although they do report sufficient self-efficacy for each. Surprisingly, even though LIS students did not sufficiently appreciate the importance of these two competencies, they still considered themselves proficient in their practice. This slight contradiction would probably be better understood through an objective test that could reveal their actual levels of competency.

TABLE 7
Unselected Competencies from BIM Factor Analysis
Information Competence
14. Systematizing and abstracting information
24. Knowing the laws on the use of information and intellectual property

Second, we find competencies that were not selected by the SE factor analysis but were selected by the BIM factor analysis. These four competencies are considered important by the students, although they do not feel sufficiently skilled in regard to them (see table 8). Compared with the two competencies in table 7, LIS students' perceptions of these four competencies seem more reasonable. Not surprisingly, we find notable levels of awareness regarding the importance of these competencies, even when recognizing students' lack of expertise in their implementation.

Finally, we find seven competencies that were not selected in either of the two factor analyses. This means that these competencies are not considered important in the opinion of students, nor do students feel sufficiently skilled to develop them (see table 9). Looking at the responses for the set of competencies in table 9, the need to improve the levels of BIM and SE of LIS students seems to be a priority.

We should no doubt focus our attention on these seven competencies, as they require more development, not only from the perspective of the students' awareness of their importance, but also from the point of view of their practical implementation. The most immediate planning for instruction should focus especially on the seven competencies that are undervalued by the students and for whose implementation they feel less proficient. Most of these competencies are cognitive in nature, not being informed by the use of specific technologies. Considering the future professional activity of LIS students, deficiencies in these competencies will adversely affect their ability to complete daily tasks and therefore cannot be ignored.
University faculty and librarians should place special emphasis on a better understanding of LIS students' attitudes related to this set of competencies, as well as their actual capabilities, since these factors are closely related. We should also emphasize that only three of the thirteen secondary competencies, which are less valued by the students, can be regarded as ICT-related. This confirms our view that the greatest difficulty for students with respect to mastering these competencies and skills lies mostly in the cognitive domain.

TABLE 8
Unselected Competencies from SE Factor Analysis
Information Competence
3. Consulting and using electronic sources of primary information
6. Searching for and retrieving Internet information
12. Determining whether an information resource is updated
20. Communicating in public

TABLE 9
Unselected Competencies from Both BIM and SE Factor Analyses
Information Competence
1. Using printed sources of information
5. Knowing the terminology of your subject
7. Using informal electronic sources of information
9. Assessing the quality of information resources
15. Recognizing text structure
19. Installing computer programs
23. Knowing the code of ethics in your academic/professional field

Conclusions

Some recent study plans for Spanish university degrees in the Social Sciences (such as Education, Psychology, Journalism, and Information Science, among others) have incorporated an optional program component dealing with Information Literacy. Also, in recent years, some research projects on the assessment of Information Literacy in Spanish universities have been funded. The results of these studies will be of interest to the LIS faculties that have incorporated IL subjects into their curricula.

Taking into account the breadth of representation in the sample, we feel that the results can be viewed as a reliable picture of LIS students' attitudes (belief in importance, self-efficacy, and favorite source of learning) with regard to this set of core information competencies. Although broadly similar, these reported attitudes show clear differences among the four IL categories. Broadly speaking, LIS students are well aware of the importance of information competencies, although their self-esteem is not so high when tackling these competencies in practice. They report the greatest BIM levels with regard to the competencies included in the categories of communication and evaluation, which are strongly correlated (by analysis). On the other hand, they feel most skilled (SE) in the activities of evaluation and searching, two categories that are also highly correlated. IL instruction programs should bear in mind these subjective observations, all the more so when it comes to future LIS professionals. Likewise, significant differences were routinely found between undergraduates and graduates concerning their levels of belief in importance and self-efficacy, except in the processing category. No significant differences in results, however, were found on the basis of gender, age, or academic year.

One of the most striking questions to emerge from this work is: why do LIS students use the library so infrequently? Finding a possible answer to this conundrum is hard, especially when one considers that no drastic change in information provision has been observed. Finding a means by which to encourage the use of the library by these future LIS professionals is not an easy task.
Universities, and especially their faculty members, should raise awareness among students about the critical importance of the use of libraries—not only as a physical location from which to obtain knowledge, but also as a place to receive expert guidance from the library staff.

The two separate factor analyses—on BIM and on SE—of students' responses with respect to these information competencies have confirmed the suitability of the questionnaire used, as well as revealing an emerging factor related to self-efficacy. We feel that these results highlight the previously identified selection of thirteen competencies—which we regard as the principal competencies—separately from the others. These principal competencies are unique in that students are not only aware of their importance but also feel qualified to implement them. It is also noteworthy that most of these competencies are technological in nature, as they relate directly to ICT skills. By contrast, in the remaining thirteen competencies, students show reduced levels of belief in importance and self-efficacy. These competencies are regarded here as secondary, being rather cognitive in nature. Students feel that they are less stimulated and/or less skilled when dealing with this other half of the information competencies. Thus, on the basis of these results, one may state that students display more positive attitudes toward the ICT-related competencies and less positive attitudes toward those of a cognitive nature.

In any event, the seven competencies showing the greatest deficiency in students' BIM and SE responses have been identified. These competencies require the biggest effort on the part of stakeholders. Improvement initiatives regarding awareness of their importance, as well as their more effective use, should be planned. Grouped by category, they are found in: searching (using printed sources of information, knowing the terminology of your subject, and using informal electronic sources of information); evaluation (assessing the quality of information resources); processing (recognizing text structure and installing computer programs); and communication–dissemination (knowing the code of ethics in your academic/professional field). Enhancing the awareness and instruction of these seven deficient competencies, which are mostly of a cognitive nature, should be a priority.

The results obtained in this study confirm that a major difficulty for many of the students surveyed lies in their inability to appraise and practically apply certain cognitive activities by means of critical thinking. This leads us to conclude that students are probably more comfortable performing certain technological tasks directly related to ICT skills, as they believe these to be of greater value than critical thinking. However, information literacy cannot be deprived of its cognitive and critical dimension if it is to be effective. For this reason, universities and instructors must put special emphasis on the mastery of the cognitive IL competencies in which LIS students believe they are deficient. We hope that the understanding gained from this study may be of assistance and encourage future research on this topic.
Acknowledgments

The authors are grateful to the Spanish Ministry of Science and Innovation, which has funded the research project EVAL-CI (Assessment of Information Competencies of Social Science Students in Higher Education: Designing Web Tools and an E-Learning 2.0 Training Proposal) (EDU2011-29290) for the period 2011–2014. We are also indebted to those who contributed to our research at various stages throughout its completion.

ANNEX

The IL-HUMASS questionnaire. For each competence, respondents rate their Belief in Importance (from 1, low, to 9, high) and their Self-efficacy (from 1, low, to 9, high), and indicate their preferred Source of Learning (Cl = Class, Co = Courses, L = Library, S = Self-learning, O = Others).

Searching
1. Using printed sources of information (books, papers, and so on)
2. Entering and using automated catalogues
3. Consulting and using electronic sources of primary information (such as journals)
4. Using electronic sources of secondary information (like databases)
5. Knowing the terminology of your subject
6. Searching for and retrieving Internet information (such as advanced searches, directories, portals)
7. Using informal electronic sources of information (blogs, discussion lists, and the like)
8. Knowing information search strategies (descriptors, Boolean operators, and such)

Evaluation
9. Assessing the quality of information resources
10. Recognizing the author's ideas within the text
11. Knowing the typology of scientific information sources (thesis, proceedings, and so on)
12. Determining whether an information resource is updated
13. Knowing the most relevant authors and institutions within your subject area

Processing
14. Systematizing and abstracting information
15. Recognizing text structure
16. Using database managers (such as Access, MySQL)
17. Using bibliographic reference managers (Endnote, Reference Manager, and so on)
18. Handling statistical programs and spreadsheets (for instance, SPSS, Excel)
19. Installing computer programs

Communication-Dissemination
20. Communicating in public
21. Communicating in other languages
22. Writing a document (such as a report, academic work)
23. Knowing the code of ethics in your academic/professional field
24. Knowing the laws on the use of information and intellectual property
25. Creating academic presentations (using PowerPoint, for example)
26. Disseminating information on the Internet (through webs, blogs, and other social platforms)

Notes

1. Albert Bandura, "Self-Efficacy," in Encyclopedia of Human Behavior, vol. 4, ed. V.S. Ramachaudran (New York: Academic Press, 1994), 71–81.
2. M. Ross, H. Perkins, and K. Bodey, "Information Literacy Self-Efficacy: The Effect of Juggling Work and Study," Library & Information Science Research 35 (2013): 279–87.
3. June Lester and Connie Van Fleet, "Use of Professional Competencies and Standards Documents for Curriculum Planning in Schools of Library and Information Studies Education," Journal of Education for Library and Information Science 49, no. 1 (2008): 43–69.
4. Allen Edward Foster, "Information Literacy for the Information Profession: Experiences from Aberystwyth," Aslib Proceedings 58, no. 6 (2006): 488–501.
Notes
1. Albert Bandura, “Self-Efficacy,” in Encyclopedia of Human Behavior, vol. 4, ed. V.S. Ramachaudran (New York: Academic Press, 1994), 71–81.
2. M. Ross, H. Perkins, and K. Bodey, “Information Literacy Self-Efficacy: The Effect of Juggling Work and Study,” Library & Information Science Research 35, no. 4 (2013): 279–87.
3. June Lester and Connie Van Fleet, “Use of Professional Competencies and Standards Documents for Curriculum Planning in Schools of Library and Information Studies Education,” Journal of Education for Library and Information Science 49, no. 1 (2008): 43–69.
4. Allen Edward Foster, “Information Literacy for the Information Profession: Experiences from Aberystwyth,” Aslib Proceedings 58, no. 6 (2006): 488–501.
5. Jana Varlejs, “Professional Competencies for the Digital Age: What Library Schools Are Doing to Prepare Special Librarians,” Education Libraries 26, no. 1 (2003): 16–18.
6. Bonnie Gratch Lindauer, “The Three Arenas of Information Literacy Assessment,” Reference & User Services Quarterly 44, no. 2 (2004): 122–29.
7. Brian Detlor et al., “Learning Outcomes of Information Literacy Instruction,” Journal of the American Society for Information Science and Technology 62, no. 3 (2011): 572–85.
8. Andrew Walsh, “Information Literacy Assessment: Where Do We Start?” Journal of Librarianship and Information Science 41, no. 1 (2009): 19–28.
9. Avril Patterson, “A Needs Analysis for Information Literacy Provision for Research: A Case Study in University College Dublin,” Journal of Information Literacy 3, no. 1 (2009): 5–18.
10. M. Pinto, “Design of the IL-HUMASS Survey on Information Literacy in Higher Education: A Self-Assessment Approach,” Journal of Information Science 36, no. 1 (2010): 86–103; M. Pinto, Diagnóstico del Proceso de Aprendizaje de las Competencias Informacionales en los Estudiantes de la Universidad de Granada: Un Estudio Transversal [Diagnosis of the Learning Process of Information Competencies among Students of the University of Granada: A Cross-Sectional Study], 2011; M. Pinto et al., “Information Competence of Doctoral Students in Information Science in Spain and Latin America: A Self-Assessment,” Journal of Academic Librarianship 39, no. 2 (Sept. 2012): 144–54; M. Pinto and Dora Sales, “INFOLITRANS: A Model for the Development of Information Competence for Translators,” Journal of Documentation 64, no. 3 (2008): 413–37; Rosemary Green and Peter Macauley, “Doctoral Students’ Engagement with Information: An American-Australian Perspective,” portal: Libraries and the Academy 7, no. 3 (2007): 317–32; M. Gross and Don Latham, “Attaining Information Literacy: An Investigation of the Relationship between Skill Level, Self-Estimates of Skill, and Library Anxiety,” Library & Information Science Research 29 (2007): 332–53; Iain Colthart et al., “The Effectiveness of Self-Assessment on the Identification of Learner Needs, Learner Activity, and Impact on Clinical Practice: BEME Guide No. 10,” Medical Teacher 30, no. 2 (Jan. 2008): 124–45; Stella Korobili, Aphrodite Malliari, and George N. Christodoulou, “Assessing Information Literacy Skills in the Technological Education Institute of Thessaloniki, Greece,” Reference Services Review 37, no. 3 (2009): 340–54.
11. David Streatfield and Sharon Markless, “Evaluating the Impact of Information Literacy in Higher Education: Progress and Prospects,” Libri 58, no. 2 (2008): 102–09.
12. Susan Vickery and Heather Cooper, “Confidence or Competence? Auditing Information Literacy Skills of Biology Undergraduate Students,” in Celebrating Teaching at Macquarie (Sydney: Macquarie University Research Online, 2002), 1–7.
13. Catherine Elder et al., “Assessing Health Literacy: A New Domain for Collaboration Between Language Testers and Health Professionals,” Language Assessment Quarterly 9, no. 3 (July 2012): 205–24.
14. Marilyn S. Fetter, “Graduating Nurses’ Self-Evaluation of Information Technology Competencies,” Journal of Nursing Education 48, no. 2 (Feb. 2009): 86–90.
15. Ann Wilkinson, Alison E. While, and Julia Roberts, “Measurement of Information and Communication Technology Experience and Attitudes to E-Learning of Students in the Healthcare Professions: Integrative Review,” Journal of Advanced Nursing 65, no. 4 (Apr. 2009): 755–72.
16. Colthart et al., “The Effectiveness of Self-Assessment.”
17. Vivien Sieber, “Diagnostic Online Assessment of Basic IT Skills in 1st-Year Undergraduates in the Medical Sciences Division, University of Oxford,” British Journal of Educational Technology 40, no. 2 (2009): 215–26.
18. Lana Ivanitskaya, Irene O’Boyle, and Anne Marie Casey, “Health Information Literacy and Competencies of Information Age Students: Results from the Interactive Online Research Readiness Self-Assessment (RRSA),” Journal of Medical Internet Research 8, no. 2 (2006).
19. Pinto and Sales, “INFOLITRANS: A Model for the Development of Information Competence for Translators.”
20. M. Gross and Don Latham, “Self-Views of Information-Seeking Skills: Undergraduates’ Understanding of What It Means to Be Information Literate,” 2007 OCLC/ALISE research grant report published electronically by OCLC Research, available online at http://worldcat.org/oclc/317880111/viewonline [accessed 21 May 2016].
21. M. Pinto, “Information Literacy Perceptions and Behaviour among History Students,” Aslib Proceedings 64, no. 3 (2012): 304–27.
22. Annmarie B. Singh, “A Report on Faculty Perceptions of Students’ Information Literacy Competencies in Journalism and Mass Communication Programs: The ACEJMC Survey,” College & Research Libraries 66, no. 4 (2005): 294–311.
23. Lyn Thaxton, “Information Dissemination and Library Instruction in Psychology Revisited,” Behavioral & Social Sciences Librarian 22, no. 1 (2002): 1–14.
24. Pamela McKinney, Myles Jones, and Sandra Turkington, “Information Literacy through Inquiry: A Level One Psychology Module at the University of Sheffield,” Aslib Proceedings 63, no. 2/3 (2011): 221–40.
25. Sultan M. Al-Daihani, “ICT Education in Library and Information Science Programs,” Library Review 60, no. 9 (2011): 773–88.
26. Afrodite Malliari, Stella Korobili, and Aspasia Togia, “IT Self-Efficacy and Computer Competence of LIS Students,” The Electronic Library 30, no. 5 (2012): 608–22.
27. M. Pinto and Rosaura Fernández-Pascual, “Information Literacy Competencies among Social Sciences Undergraduates: A Case Study Using Structural Equation Model,” in European Conference on Information Literacy (Dubrovnik, Croatia, October 20–23, 2014): 370–78.
28. Susan E. Searing, “Integrating Assessment into Recurring Information Literacy Instruction: A Case Study from LIS Education,” Public Services Quarterly 3, no. 1–2 (2007): 191–220.
29. Ivana Hebrang Grgic and Sonja Špiranec, “Information Literacy of LIS Students at the University of Zagreb: Pros or Just Average Millennials,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 581–87.
30. Monika Krakowska, “Information Literacy Skills Assessment of LIS Students: A Case Study at the Jagiellonian University,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 617–24.
31. Tania Todorova and Irena Peteva, “Information Literacy Competency of LIS Students in SULSIT with a Special Focus on Intellectual Property,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 610–16.
32. Jurgita Rudžioniene, “Information Behavior of University Students: From Today’s Information and Communication Student towards Tomorrow’s Excellent Information Specialist,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 603–09.
33. Eliane Blumer et al., “Information Literacy Competences of LIS Students in Switzerland: A Case Study,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 597–603.
34. Joumana Boustany, “Information Literacy Skills of Students at Paris Descartes University,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 589–95.
35. Miriam L. Matteson, “The Whole Student: Cognition, Emotion, and Information Literacy,” College & Research Libraries 75, no. 6 (2014): 862–77.
36. Megan Oakleaf and Neal Kaske, “Guiding Questions for Assessing Information Literacy in Higher Education,” portal: Libraries and the Academy 9, no. 2 (2009): 273–86.
37. M. Gross and Don Latham, “Undergraduate Perceptions of Information Literacy: Defining, Attaining, and Self-Assessing Skills,” College & Research Libraries 70, no. 4 (July 2009): 336–50; M. Gross and Don Latham, “What’s Skill Got to Do With It? Information Literacy Skills and Self-Views of Ability Among First-Year College Students,” Journal of the American Society for Information Science and Technology 63, no. 3 (2012): 574–83; M. Pinto et al., “Designing and Implementing Web-Based Tools to Assess Information Competences of Social Science Students at Spanish Universities,” in Worldwide Commonalities and Challenges in Information Literacy Research and Practice, eds. Serap Kurbanoglu et al. (Istanbul: Springer, 2013), 443–49.
38. ALA/ACRL, “Information Literacy Competency Standards for Higher Education,” 2000, available online at www.ala.org/acrl/standards/informationliteracycompetency [accessed 20 July 2015]; Christine Bruce, The Seven Faces of Information Literacy (Adelaide: Auslib Press, 1997); S. Corrall, “Benchmarking Strategic Engagement with Information Literacy in Higher Education: Towards a Working Model,” Information Research 12, no. 4 (2007); SCONUL, “Information Skills in Higher Education: A SCONUL Position Paper,” 1999, available online at www.sconul.ac.uk/groups/information_literacy/papers/Seven_pillars.html [accessed 20 July 2015]; Sheila Webber and Bill Johnston, “Working towards the Information Literate University,” in Information Literacy: Recognising the Need, eds. G. Walton and A. Pope (Oxford: Chandos, 2006), 47–58.
39. Pinto, “Design of the IL-HUMASS Survey on Information Literacy in Higher Education”; M. Pinto, “An Approach to the Internal Facet of Information Literacy Using the IL-HUMASS Survey,” Journal of Academic Librarianship 37, no. 2 (2011): 145–54; L. Limberg, “Informing Information Literacy Education through Empirical Research,” in The Information Literate School Community 2: Issues of Leadership, eds. J. Henri and M. Asselin (Wagga Wagga, NSW: Centre for Information Studies, Charles Sturt University, 2005), 39–50; Clarence Maybee, “Undergraduate Perceptions of Information Use: The Basis for Creating User-Centered Student Information Literacy Instruction,” Journal of Academic Librarianship 32, no. 1 (Jan. 2006): 79–85; Kimmo Tuominen, Reijo Savolainen, and Sanna Talja, “Information Literacy as a Sociotechnical Practice,” Library Quarterly 75, no. 3 (2005): 329–45.
40. Pinto, “Design of the IL-HUMASS Survey on Information Literacy in Higher Education,” 90.
41. Pinto, “An Approach to the Internal Facet of Information Literacy Using the IL-HUMASS Survey.”
42. Pinto, “Design of the IL-HUMASS Survey on Information Literacy in Higher Education”; M. Pinto and Dora Sales, “Uncovering Information Literacy’s Disciplinary Differences through Students’ Attitudes: An Empirical Study,” Journal of Librarianship and Information Science 47, no. 3 (2015): 204–15.
43. Lih-juan Chanlin, “Development of a Competency Questionnaire for LIS Undergraduates at Fu-Jen Catholic University,” Journal of Educational Media and Library Sciences 47, no. 1 (2009): 5–17.
44. Thomas P. Mackey and Jinwon Ho, “Implementing a Convergent Model for Information Literacy: Combining Research and Web Literacy,” Journal of Information Science 31, no. 6 (Dec. 2005): 541–55.
45. Bruce Thompson, Exploratory and Confirmatory Factor Analysis: Understanding Concepts and Applications (Washington, D.C.: American Psychological Association, 2004).
46. Andy Field, Discovering Statistics Using SPSS for Windows, 4th ed. (London: SAGE Publications, 2013); J. Hair et al., Multivariate Data Analysis: A Global Perspective, 7th ed. (Upper Saddle River, N.J.: Pearson Prentice Hall, 2010).
47. A.J. Fairchild et al., “Evaluating Existing and New Validity Evidence for the Academic Motivation Scale,” Contemporary Educational Psychology 30, no. 3 (2005): 331–58; Mitchell Ross, Helen Perkins, and Kelli Bodey, “Information Literacy Self-Efficacy: The Effect of Juggling Work and Study,” Library & Information Science Research 35, no. 4 (Oct. 2013): 279–87; C.O. Walker, B.A. Greene, and R.A. Mansell, “Identification with Academics, Intrinsic/Extrinsic Motivation, and Self-Efficacy as Predictors of Cognitive Engagement,” Learning and Individual Differences 16, no. 1 (2006): 1–12.
48. Chanlin, “Development of a Competency Questionnaire for LIS Undergraduates at Fu-Jen Catholic University,” 5–17.
49. Pinto and Fernández-Pascual, “Information Literacy Competencies among Social Sciences Undergraduates,” Communications in Computer and Information Science 492 (2014): 370–78.