Assessing the Information Literacy Skills of First-Generation College Students

Sarah LeMire, Zhihong Xu, Valerie Balester, Leroy G. Dorsey, and Douglas Hahn*

Academic libraries, together with their colleges and universities, are increasingly identifying first-generation college students as an underserved population that is likely to experience barriers to library access and usage. Less is known, however, about the information literacy skills of first-generation students, particularly in comparison with their continuing-generation counterparts. This study assessed the information literacy skills of first-generation college students in general education courses at Texas A&M University to inform information literacy instructional efforts and to inform advocacy efforts for developing substantial and sustained information literacy support for first-generation students at that campus. Study results indicate that first-generation students experience significant information literacy gaps in comparison with continuing-generation students at the same institution and in the same courses.

* Sarah LeMire is Coordinator of First-Year Programs, Zhihong Xu is Data Management Librarian, and Douglas Hahn is Director of Library Applications and Integration at the Texas A&M University Libraries; emails: slemire@tamu.edu, xuzhihong@tamu.edu, and dhahn@tamu.edu. Valerie Balester is Assistant Provost, Undergraduate Studies, Executive Director, University Writing Center & Academic Success Center, and Professor of English at Texas A&M University; email: v-balester@tamu.edu. Leroy G. Dorsey is Associate Dean for Inclusive Excellence and Strategic Initiatives for the Texas A&M University College of Liberal Arts; email: l-dorsey@tamu.edu. © 2021 Sarah LeMire, Zhihong Xu, Valerie Balester, Leroy G. Dorsey, and Douglas Hahn, Attribution-NonCommercial (https://creativecommons.org/licenses/by-nc/4.0/) CC BY-NC.

Introduction

As colleges and universities strive to increase retention and graduation rates on their campuses, increased attention has been paid to underserved student populations such as first-generation college students. First-generation students, defined for the purposes of this study as students whose parents did not graduate from a four-year college, are more likely to experience barriers and are less likely to graduate than their continuing-generation peers.1 As campuses develop learning communities, courses, and other programs for first-generation students, librarians have begun to get involved. While some librarians have fully embedded in first-generation programs on their campuses,2 others are still working to develop collaborative relationships with program leaders or are limited to resource awareness-focused one-shots. Though embedding information literacy instruction is more impactful for first-generation students,3 advocating for time and resources to provide sustained information literacy instruction can meet with resistance. Library administrators and first-generation program coordinators may want evidence to support the need for these increased resources.
To date, the library literature has provided limited evidence of the unique information literacy needs of first-generation college students.4 In this study, researchers from several campus departments partnered to assess the information literacy skills of first-generation college students on their campus. This assessment was intended to help librarians and their collaborators across campus better understand the information literacy skills and needs of first-generation college students. The goal of the study was to inform future information literacy collaborations and programming aimed at serving this important but underserved population. Research questions included:

1. Are there differences in the overall information literacy test scores between first-generation students and continuing-generation students?
2. Are there differences in information literacy outcomes between first-generation students and continuing-generation students?
3. Are there differences in information literacy dispositions between first-generation students and continuing-generation students?
4. Are there differences in information literacy performance indicators between first-generation students and continuing-generation students?

Literature Review

Assessment represents an integral part of information literacy program design, advocacy, and instruction. Oakleaf and Kaske identify three reasons for assessment: to "increase student learning," "respond to calls for accountability," and "improve library instruction programs."5 Assessment results can help librarians design appropriate and tailored information literacy instruction and can also be used to help librarians advocate for instructional time, administrative support, or other resources to increase or sustain information literacy in a target area.

Librarians have long recognized that information literacy skills may differ from one group to another. Accordingly, librarians have conducted assessments to better understand the specific information literacy skills of a variety of unique groups. For example, studies have explored how information literacy skills may vary based on class year, studying first-year college students,6 graduating seniors,7 and incoming graduate students.8 Other studies have investigated information literacy skills of students in specific majors, such as teacher education students,9 international graduate business students,10 and graduate education students.11 Librarians have also explored student information literacy skills based on admission type, such as transfer students,12 or participation in a specific program, such as an educational opportunity program.13

First-generation college students constitute another specific group that has become increasingly of interest in academic libraries. One common thread in the literature focuses on identifying and reducing barriers to library services for first-generation students. Researchers have found that the labyrinthine nature of academic library buildings and services is a problem for this group of students. Brinkman et al. found that first-generation students can experience the library as "a confusing world."14 Parker found that "many described the physical library in terms such as 'intimidating,' 'hard to navigate,' and 'scary.'"15 Accordingly, researchers have recommended alterations to library services to better meet the needs of first-generation students.
Tyckoson recommended changes such as adjustments to service hours to better accommodate student schedules and other life responsibilities.16 Arch and Gilman echo many of these recommendations, including peer mentoring and personal librarian programs, and add suggestions for co-locating academic success services and lending programs for textbooks and technology.17

In addition to reducing barriers to library access and services, the literature also explores the information literacy skills of first-generation students. Ilett noted that the research shows "first-generation students improve their information literacy knowledge and skills over the course of their college careers and come to understand the role of the library and librarians in their academic success."18 Indeed, Logan and Pickard interviewed first-year, first-generation college students and found that, contrary to the expectations of librarians and faculty members, all had experience with research papers from high school. They noted, "These students clearly knew to look for quality information."19 A follow-up study by Pickard and Logan that compared first-generation seniors to first-year, first-generation students found that first-generation seniors demonstrated growth in information literacy knowledge and skills during the course of their undergraduate education.20

However, the literature reveals little research that would help librarians answer a question commonly posed to them by administrators. When librarians advocate for library involvement in programming aimed at first-generation students, they may be asked how first-generation students' information literacy skills differ from those of their continuing-generation peers. Initial research by Graves et al. suggested that first-generation students may demonstrate different information literacy skills related to selecting and documenting sources.21 This study contributes toward filling a gap in the literature by further exploring if, and to what extent, first-generation students demonstrate differences in their information literacy skills in comparison with their continuing-generation counterparts.

Methodology

The primary goal of this study involved establishing a baseline understanding of undergraduate information literacy skills to inform the University Libraries' instruction program. To do so, the researchers planned to collect a large dataset that they could use to explore a number of research questions. The researchers applied for campus funding to implement a standardized information literacy test. There are benefits and disadvantages to using an information literacy test to measure information literacy skills, as Oakleaf describes.22 Tests have limited utility in measuring student behavior or execution of information literacy skills; instead, they measure students' ability to recognize information. On the plus side, tests can have high accuracy, particularly if they are extended in length, and they scale well. Finally, Oakleaf notes that a benefit to fixed-choice tests is that "people believe in them."23 For this reason, fixed-choice tests can be an effective advocacy tool in conversations with campus stakeholders.
The researchers were familiar with a previous campus project24 using Carrick Enterprises' Project SAILS (Standardized Assessment of Information Literacy Skills) test, which is based on the Association of College and Research Libraries' (ACRL) Information Literacy Standards for Higher Education25 and which has been sunset. The researchers wanted to build upon this previous work using Carrick Enterprises' newly validated test, TATIL (Threshold Achievement Test of Information Literacy). Unlike Project SAILS, TATIL was designed in response to the ACRL Framework for Information Literacy for Higher Education.26

Threshold Achievement Test of Information Literacy (TATIL)

To gain the broadest assessment of student information literacy skills, researchers opted to implement all four modules of the TATIL test. Table 1 includes module names and descriptions from Carrick Enterprises.27

Each module of the TATIL test can take approximately 30–50 minutes to complete, according to Carrick Enterprises, so the researchers opted not to assign all four modules to each student. Instead, students would be randomly assigned to a single information literacy module. Though not every student who was assigned a module completed it, this method ensured that a similar number of students completed each module of the TATIL test.

Each module of the TATIL test is composed of four components: outcomes, dispositions, performance indicators, and an overall score.28 Outcomes are the information literacy skill categories that TATIL is testing; for example, Outcome 1.1 is "Apply knowledge of source creation processes and context to evaluate the authority of a source."29 Each outcome is measured through completion of several individual questions, which are called performance indicators. Scores on individual performance indicators are mapped to a larger outcome score, and all of the outcome scores for a single module comprise the overall test score for that module. In the TATIL test, dispositions are a series of questions that are intended to measure strategies, attitudes, or behavior rather than knowledge. Disposition scores are kept separate from outcome scores and performance indicator scores and do not factor into overall scores. In the TATIL test, students can score highly on a particular performance indicator or outcome, indicating knowledge of a specific information literacy skill, but score much lower on a related disposition if the strategies they choose do not reflect their knowledge. For example, a student may be able to identify a scholarly source from a list, but their search strategies may not yield any scholarly sources to use in their paper. Finally, each student who completes a TATIL module is given an overall score for that specific module. The overall score is a composite of that module's outcome and performance indicator (but not disposition) scores. This study evaluates students' results in all four aspects of the TATIL modules: outcomes, performance indicators, dispositions, and overall scores.

TABLE 1
TATIL Modules and Descriptions

Module 1, Evaluating Process & Authority (EP&A): This module combines concepts from two of the ACRL information literacy frames, Authority Is Constructed and Contextual and Information Creation as a Process. It focuses on the process of information creation and the constructed and contextual nature of source authority.

Module 2, Strategic Searching (SS): This module relates to the Searching as Strategic Exploration frame. It focuses on the process of planning, evaluating, and revising searches during strategic exploration.

Module 3, Research & Scholarship (R&S): This module combines elements from the Research as Inquiry and Scholarship as a Conversation frames. It focuses on the knowledge-building process and how scholars build knowledge.

Module 4, Value of Information (VoI): This module is inspired by the Information Has Value frame. It focuses on the norms of academic information creation and the factors that affect access to information.
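To make the relationships among these score components concrete, the sketch below models them with invented numbers and simple averaging. It is purely illustrative and does not reproduce Carrick Enterprises' actual scoring algorithm, weights, or score scales.

```python
# Illustrative sketch only: invented values and simple averaging, not TATIL's
# actual scoring algorithm or scales.
from statistics import mean

# One module: each outcome is measured by several performance indicator scores.
performance_indicators = {
    "Outcome 1.1": [520, 480, 610],
    "Outcome 1.2": [450, 530, 490],
}

# Dispositions are scored separately and never enter the overall module score.
dispositions = {"Disposition 1.1": 55, "Disposition 1.2": 58, "Disposition 1.3": 66}

# Performance indicator scores roll up into outcome scores, and outcome scores
# roll up into the module's overall score.
outcome_scores = {name: mean(scores) for name, scores in performance_indicators.items()}
overall_score = mean(outcome_scores.values())

print(outcome_scores)            # per-outcome summaries of their indicators
print(round(overall_score, 2))   # composite of the outcome scores only
print(dispositions)              # reported alongside, not folded into the overall score
```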
Recruitment

After receiving institutional review board approval, the researchers contacted instructors of Texas A&M University core curriculum, or general education, courses to request their help in recruiting participants. During the four semesters of this study, 52 unique instructors volunteered to share the study with the students in their general education classes. Instructors had the option to directly share the study invitation to participate with their students, or they could invite the researchers into the classroom to share the study directly with the students. Students who opted to complete the survey could follow the link embedded in the invitation to participate, which prompted them to use their Single Sign On (SSO) authentication credentials to log into a demographic questionnaire, after which students were randomly assigned one of the four modules. To incentivize participation, instructors were permitted to offer extra credit for participation, though not all chose to do so; all participants were entered into a drawing for gift cards at the end of the semester.

Demographics

This study included testing of all four TATIL modules, spread over four academic semesters beginning in fall 2018 and ending in fall 2019. To get a valid dataset, the researchers had to eliminate a number of participant responses. In particular, the researchers were concerned about the amount of time participants spent completing the modules, which ranged from 20 to 30 questions. Though the TATIL system requires participants to spend a minimal amount of time on each question page, the dataset included responses from participants who spent less than three minutes on an entire module. In contrast, the median amount of time spent on the modules ranged from 19 to 32 minutes. To ensure that the dataset was focused on participants who had substantively engaged with the module, the researchers dropped all 134 participants who spent less than 10 minutes completing a module.
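This time-based screening is straightforward to script. The following minimal data-cleaning sketch uses Python and pandas; the file name and the module, generation_status, and minutes_spent columns are hypothetical stand-ins rather than the actual TATIL export format.

```python
# Minimal data-cleaning sketch; the file and column names are hypothetical,
# not the actual TATIL export format.
import pandas as pd

MIN_MINUTES = 10  # responses below this threshold are excluded from analysis

responses = pd.read_csv("tatil_responses.csv")

# Flag and drop participants who spent less than 10 minutes on their module.
too_fast = responses["minutes_spent"] < MIN_MINUTES
analyzed = responses[~too_fast].copy()

# Summarize completions, drops, and retained cases per module (cf. table 2).
summary = pd.DataFrame({
    "completed": responses.groupby("module").size(),
    "dropped": responses.loc[too_fast].groupby("module").size(),
    "analyzed": analyzed.groupby("module").size(),
}).fillna(0).astype(int)
print(summary)

# Retained cases per module, split by generation status.
print(analyzed.groupby(["module", "generation_status"]).size())
```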
A total of 631 participants finished Module 1 (EP&A). Of these, 12 participants' data were dropped because their total participation time was less than 10 minutes, so 619 participants' data were analyzed for this module: 466 continuing-generation students and 153 first-generation students. Similarly, 653 participants completed Module 2 (SS), and 53 participants' data were dropped due to participation time (<10 minutes); of the remaining 600, the data of 436 continuing-generation students and 164 first-generation students were analyzed for Module 2. A total of 611 participants completed the third module, and 27 participants' data were dropped because of insufficient participation time (<10 minutes). Thus, 584 participants' data were analyzed for Module 3 (R&S): 433 continuing-generation students and 151 first-generation students. For Module 4 (VoI), 634 participants finished the survey, and the completion time of 42 participants was less than 10 minutes. Therefore, 421 continuing-generation students and 171 first-generation students were included in the analysis for Module 4, with 592 in total. Detailed information about participants is included in table 2.

TABLE 2
Participant Completion

Module            Completed the Survey   Dropped (participation time)   Included in the Analysis
Module 1 (EP&A)   631                    12                             619
Module 2 (SS)     653                    53                             600
Module 3 (R&S)    611                    27                             584
Module 4 (VoI)    634                    42                             592
Total             2,529                  134                            2,395

Data Analysis

To analyze the data, we imported the dataset into STATA and ran two types of analyses: t-tests and multivariate multiple regressions. For both types of analysis, a threshold of p < 0.05 was used to determine statistical significance.

For research question 1, we ran four t-tests to investigate the differences in overall information literacy test scores between first-generation students and continuing-generation students across the four modules.

Research question 2 examined the differences in information literacy outcomes between first-generation students and continuing-generation students across the four modules. For this research question, four multivariate multiple regressions were analyzed in STATA with the outcome scores as the dependent variables, the group condition (first-generation or continuing-generation students) as the independent variable, and library experience as a covariate. We chose multivariate multiple regression because the outcome scores are correlated. Least-squares estimation was used as the parameter estimation method. We used weighted scores instead of raw scores because difficulty level is an important component of TATIL scoring.30

To answer the third research question, four multivariate multiple regressions were employed to investigate the differences in information literacy dispositions between first-generation students and continuing-generation students. The group condition and library experience were used as independent variables, while the disposition scores were used as dependent variables. Least-squares estimation was used as the parameter estimation method.

Four multivariate multiple regressions were conducted to investigate the differences in information literacy performance indicators between first-generation students and continuing-generation students (research question 4). We used the group condition and library experience as independent variables, while the performance indicator scores were used as the dependent variables. Least-squares estimation was used as the parameter estimation method.

For all of the multivariate multiple regression analyses, we first included library experience as a control variable. Because library experience showed no statistically significant difference between the groups, we dropped this covariate from the final models.
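The analyses themselves were run in STATA. As a rough Python stand-in, the sketch below pairs a two-sample t-test on overall scores (research question 1) with per-outcome least-squares fits and a joint multivariate test across a module's two correlated outcome scores (research question 2). The file and column names (overall, O11, O12, firstgen) are invented for illustration, and the sketch approximates the approach rather than reproducing the authors' code.

```python
# Hypothetical Python stand-in for the STATA workflow; file and column names
# are invented for illustration.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.multivariate.manova import MANOVA

module1 = pd.read_csv("module1_scores.csv")  # one row per participant

# RQ1: two-sample t-test comparing overall scores by generation status.
firstgen_overall = module1.loc[module1["firstgen"] == 1, "overall"]
continuing_overall = module1.loc[module1["firstgen"] == 0, "overall"]
t_stat, p_value = stats.ttest_ind(firstgen_overall, continuing_overall)
print(f"Overall scores: t = {t_stat:.2f}, p = {p_value:.4f}")

# RQ2: the module's two outcome scores are correlated, so they are modeled
# together. Per-outcome least-squares fits give the coefficient estimates
# (as in table 6); MANOVA supplies the joint multivariate test.
for outcome in ["O11", "O12"]:
    fit = smf.ols(f"{outcome} ~ firstgen", data=module1).fit()
    print(outcome, fit.params.to_dict(), fit.pvalues.to_dict())

joint_test = MANOVA.from_formula("O11 + O12 ~ firstgen", data=module1)
print(joint_test.mv_test())
```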
Results

Overall Scores

To gain an overall understanding of the TATIL test performance of first-generation and continuing-generation students, we first calculated how frequently each group reached each of TATIL's performance levels. TATIL uses three performance levels (conditionally ready, college ready, and research ready) to describe student achievement on the knowledge section of the test. Each test module uses the same three performance levels, though the description of the knowledge and skills associated with each performance level varies from one module to another. Conditionally ready is the lowest level, college ready is the intermediate level, and research ready is the highest level. TATIL has established for each module specific cutoff scores between each performance level. Table 3 represents participants' knowledge levels in the overall score category for all four modules.

TABLE 3
Knowledge Performance Levels for Overall Scores, n (%)

Group            Module 1 (EP&A)   Module 2 (SS)   Module 3 (R&S)   Module 4 (VoI)
Firstgen CdR     33 (21.57)        24 (14.63)      10 (6.62)        6 (3.51)
Firstgen CR      120 (78.43)       133 (81.10)     118 (78.15)      153 (89.47)
Firstgen RR      0 (0)             7 (4.27)        23 (15.23)       12 (7.02)
Continuing CdR   33 (7.08)         46 (10.55)      24 (5.54)        9 (2.14)
Continuing CR    433 (92.92)       370 (84.86)     283 (65.36)      358 (85.03)
Continuing RR    0 (0)             20 (4.59)       126 (29.10)      54 (12.83)
NOTE: CdR = conditionally ready; CR = college ready; RR = research ready.

As a whole, results in all four modules revealed that the vast majority of students scored at the college ready level or higher, regardless of first-generation status. Students in both groups performed most highly on modules 3 (R&S) and 4 (VoI). Only six (3.51%) first-generation students and nine (2.14%) continuing-generation students scored in the lowest category, conditionally ready, on module 4 (VoI). Similarly, 10 (6.62%) first-generation students and 24 (5.54%) continuing-generation students were identified as conditionally ready in module 3 (R&S). Though the differences were small, first-generation students were more likely to be placed in the conditionally ready category. First-generation students were also less likely to be placed in the highest category. Only 23 (15.23%) first-generation students were research ready in module 3, while 126 (29.10%) continuing-generation students were research ready. Similarly, 12 (7.02%) first-generation students were categorized as research ready for module 4, while 54 (12.83%) continuing-generation students were placed in that category.

Overall, students scored lower on modules 1 (EP&A) and 2 (SS), regardless of first-generation status. First-generation students were again more likely to be placed in the conditionally ready category, with 33 (21.57%) first-generation students marked as conditionally ready for module 1 (EP&A), compared to 33 (7.08%) continuing-generation students. Similarly, 24 (14.63%) first-generation students were marked as conditionally ready for module 2 (SS), compared to 46 (10.55%) continuing-generation students. At the other end of the spectrum, neither first-generation students nor continuing-generation students were likely to be categorized as research ready. In module 2, only seven (4.27%) first-generation students were categorized as research ready, and a similar percentage (4.59%, or 20) of continuing-generation students were research ready for that module. Interestingly, neither group had a single student score in the research ready category for module 1 (EP&A).
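As a small illustration of this tabulation step, the sketch below classifies hypothetical overall scores into the three performance levels and cross-tabulates them by generation status, as in table 3. The cutoff values and column names are invented; TATIL sets its own module-specific cutoffs.

```python
# Illustrative tabulation of performance levels by group; cutoffs and column
# names are invented, not TATIL's module-specific values.
import pandas as pd

scores = pd.read_csv("module1_scores.csv")  # hypothetical export with overall scores

def performance_level(score, college_ready=400, research_ready=650):
    if score >= research_ready:
        return "research ready"
    if score >= college_ready:
        return "college ready"
    return "conditionally ready"

scores["level"] = scores["overall"].apply(performance_level)

# Counts and within-group percentages, as reported in table 3.
counts = pd.crosstab(scores["generation_status"], scores["level"])
percentages = pd.crosstab(scores["generation_status"], scores["level"],
                          normalize="index") * 100
print(counts)
print(percentages.round(2))
```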
To determine whether there were statistically significant differences between first-generation students and continuing-generation students across modules in the overall scores, we ran four t-tests. We found that there were significant differences between the two groups in each module. In module 1 (EP&A), first-generation students' overall scores (M = 441.45, SD = 137.30) were statistically lower than continuing-generation students' scores (M = 510.34, SD = 132.74) (p < 0.001). In module 2 (SS), first-generation students' overall scores (M = 479.39, SD = 138.75) were statistically lower than continuing-generation students' scores (M = 518.09, SD = 142.79) (p < 0.01). In module 3 (R&S), first-generation students' overall scores (M = 459.97, SD = 149.11) were statistically lower than continuing-generation students' scores (M = 520.79, SD = 155.80) (p < 0.001). In module 4 (VoI), first-generation students' overall scores (M = 452.85, SD = 134.64) were again statistically lower than continuing-generation students' scores (M = 491.41, SD = 138.48) (p < 0.01).

Outcomes

In addition to exploring differences in overall TATIL module scores, we analyzed the data for potential differences between first-generation students and continuing-generation students on TATIL test outcomes. Each of the four TATIL modules (EP&A, SS, R&S, VoI) contains two outcomes; the outcomes, their Carrick Enterprises descriptions, and the code associated with each outcome for analysis purposes are listed in appendix A.31 TATIL also breaks down outcome scores according to knowledge performance levels, using the same categories as with the overall scores: conditionally ready, college ready, and research ready. Knowledge performance level results for outcomes closely aligned with those of overall scores, which is unsurprising as outcome scores contribute to overall scores. For all eight outcomes, first-generation students were more likely to be placed in the conditionally ready category.

Students performed most highly on modules 3 (R&S) and 4 (VoI), regardless of first-generation status. For module 4 (VoI), only eight (4.68%) first-generation students and 14 (3.33%) continuing-generation students scored as conditionally ready for Outcome 4.1 (O41), while 16 (9.36%) first-generation students and 25 (5.94%) continuing-generation students scored as conditionally ready for Outcome 4.2 (O42). For module 3 (R&S), nine (5.96%) first-generation students and 17 (3.93%) continuing-generation students were conditionally ready for Outcome 3.2 (O32), while 29 (19.21%) first-generation students and 45 (10.39%) continuing-generation students were conditionally ready for Outcome 3.1 (O31).

Similar to the overall scores, the outcome scores for modules 1 (EP&A) and 2 (SS) suggested all students had more difficulty in these areas. For module 1 (EP&A), 23 (15.03%) first-generation students were conditionally ready for Outcome 1.1 (O11), as opposed to 29 (6.22%) continuing-generation students. For Outcome 1.2 (O12), 50 (32.68%) first-generation students were conditionally ready, compared with 78 (16.74%) continuing-generation students. For module 2 (SS), 40 (24.39%) first-generation students and 77 (17.66%) continuing-generation students were conditionally ready on Outcome 2.1 (O21). For Outcome 2.2 (O22), 34 (20.73%) first-generation students and 73 (16.74%) continuing-generation students were conditionally ready.

While first-generation students were more likely to perform as conditionally ready across the board, the results for the research ready performance level were more mixed. Continuing-generation students were more likely to have scores at this level for four of the eight outcomes.
However, first-generation students were more likely to score as research ready for the other four, though the differences are small. For Outcome 1.1 (O11), two (1.31%) first-generation students scored as research ready compared to three (0.64%) continuing-generation students. Similarly, Outcome 1.2 (O12) had one (0.65%) first-generation student score as research ready, while no continuing-generation students scored in that category. For Outcome 2.2 (O22), 19 (11.59%) first-generation students and 46 (10.55%) continuing-generation students were research ready. Finally, for Outcome 4.2 (O42), 51 (29.82%) first-generation students were research ready, compared to 123 (29.21%) continuing-generation students. Table 4 provides detailed information about the percentage of students at each knowledge performance level for each outcome.

Multivariate multiple regression analysis showed that there were statistically significant differences between first-generation students and continuing-generation students in all eight outcomes. In module 1 (EP&A), first-generation students had lower scores (M = 424.35, SD = 146.99) than their continuing-generation peers (M = 496.86, SD = 144.67) for outcome O11 [t (618) = –5.36, p < 0.001]. Similarly, first-generation students had statistically lower O12 scores (M = 462.99, SD = 168.69) [t (618) = –4.24, p < 0.001] than the continuing-generation students (M = 527.21, SD = 160.26) in module 1. In module 2 (SS), first-generation students' O21 scores (M = 477.36, SD = 152.91) and O22 scores (M = 483.09, SD = 204.49) were significantly lower [O21 t (599) = –2.74, p < 0.01; O22 t (599) = –2.01, p < 0.05, respectively] than the continuing-generation students' O21 scores (M = 516.35, SD = 155.88) and O22 scores (M = 520.93, SD = 206.38). There were also statistically significant differences between the first-generation students and continuing-generation students in the outcome scores O31 [t (583) = –4.39, p < 0.001] and O32 [t (583) = –3.16, p < 0.01] in module 3 (R&S), and O41 [t (591) = –2.02, p < 0.05] and O42 [t (591) = –2.96, p < 0.01] in module 4 (VoI). Detailed descriptive statistics for the information literacy outcome scores and results from the multivariate multiple regression analysis are included in tables 5 and 6.
TABLE 4
Knowledge Performance Levels for Outcome Scores, n (%)

Group            O11           O12           O21           O22           O31           O32           O41           O42
Firstgen CdR     23 (15.03)    50 (32.68)    40 (24.39)    34 (20.73)    29 (19.21)    9 (5.96)      8 (4.68)      16 (9.36)
Firstgen CR      128 (83.66)   102 (66.67)   116 (70.73)   111 (67.68)   118 (78.15)   78 (51.66)    116 (67.84)   104 (60.82)
Firstgen RR      2 (1.31)      1 (0.65)      8 (4.88)      19 (11.59)    4 (2.64)      64 (42.38)    47 (27.48)    51 (29.82)
Continuing CdR   29 (6.22)     78 (16.74)    77 (17.66)    73 (16.74)    45 (10.39)    17 (3.93)     14 (3.33)     25 (5.94)
Continuing CR    434 (93.14)   388 (83.26)   320 (73.40)   317 (72.71)   348 (80.37)   170 (39.26)   270 (64.13)   273 (64.85)
Continuing RR    3 (0.64)      0 (0)         39 (8.94)     46 (10.55)    40 (9.24)     246 (56.81)   137 (32.54)   123 (29.21)
NOTE: CdR = conditionally ready; CR = college ready; RR = research ready. O11 and O12 belong to module 1 (EP&A), O21 and O22 to module 2 (SS), O31 and O32 to module 3 (R&S), and O41 and O42 to module 4 (VoI).

TABLE 5
Outcome Scores for First-Generation and Continuing-Generation Students (M/SD)

Module 1 (EP&A)   n     O11             O12
Firstgen          153   424.35/146.99   462.99/168.69
Continuing        466   496.86/144.67   527.21/160.26

Module 2 (SS)     n     O21             O22
Firstgen          164   477.36/152.91   483.09/204.49
Continuing        436   516.35/155.88   520.93/206.38

Module 3 (R&S)    n     O31             O32
Firstgen          151   451.32/165.15   467.52/166.85
Continuing        433   523.77/177.91   518.22/170.93

Module 4 (VoI)    n     O41             O42
Firstgen          171   389.65/181.51   493.82/151.17
Continuing        421   423.52/186.48   535.33/155.97

TABLE 6
Multivariate Regression Analysis Results for Information Literacy Outcomes

                              Estimate   Standard Error   t-value   p-value
Module 1 (EP&A)
O11   Intercept               496.86     6.73             73.85     0.000
      Firstgen/Continuing     –72.51     13.53            –5.36     0.000***
O12   Intercept               527.21     7.52             70.09     0.000
      Firstgen/Continuing     –64.21     15.13            –4.24     0.000***
Module 2 (SS)
O21   Intercept               516.35     7.43             69.53     0.000
      Firstgen/Continuing     –38.99     14.21            –2.74     0.006**
O22   Intercept               520.93     9.86             52.84     0.000
      Firstgen/Continuing     –37.84     18.86            –2.01     0.045*
Module 3 (R&S)
O31   Intercept               523.77     8.40             63.47     0.000
      Firstgen/Continuing     –72.45     16.51            –4.39     0.000***
O32   Intercept               518.23     8.16             63.47     0.000
      Firstgen/Continuing     –50.70     16.06            –3.16     0.002**
Module 4 (VoI)
O41   Intercept               423.52     9.02             46.96     0.000
      Firstgen/Continuing     –33.87     16.78            –2.02     0.044*
O42   Intercept               535.33     7.53             71.05     0.000
      Firstgen/Continuing     –41.51     14.02            –2.96     0.003**
NOTE: * p<0.05; ** p<0.01; *** p<0.001.

Dispositions

The TATIL test also measures students' information literacy dispositions. Disposition scores are, according to Carrick Enterprises, "based on a student's judgments regarding strategies. Students earn high scores on these items if they judge behaviors associated with the disposition to be useful and behaviors not associated with the disposition to be not useful."32 Each module of the TATIL test has one to three dispositions associated with it. A list of dispositions, along with their Carrick Enterprises descriptions, is available in appendix B.33

Results from multivariate multiple regression analysis showed that there were statistically significant differences between the first-generation students and continuing-generation students in two (22%) of the dispositions. Specifically, there were differences between the two groups in the disposition scores D21 (Productive persistence) [t (599) = –2.39, p < 0.05] and D33 (Responsibility to community) [t (583) = –3.15, p < 0.01].
The first-generation students had lower D21 scores (M = 63.16, SD = 9.68) and D33 scores (M = 50.45, SD = 9.39) than the continuing-generation students (D21: M = 65.19, SD = 9.13; D33: M = 53.29, SD = 9.58). The descriptive statistics and detailed results from the multivariate multiple regression analysis are reported in tables 7 and 8.

TABLE 7
Disposition Scores for First-Generation Students and Continuing-Generation Students (M/SD)

Module 1 (EP&A)   n     D11           D12           D13
Firstgen          153   53.34/10.82   56.32/12.89   64.29/13.04
Continuing        466   54.91/10.24   57.46/11.74   66.56/13.64

Module 2 (SS)     n     D21
Firstgen          164   63.16/9.68
Continuing        436   65.19/9.13

Module 3 (R&S)    n     D31           D32           D33
Firstgen          151   54.40/11.01   73.82/14.11   50.45/9.39
Continuing        433   55.19/10.18   73.92/12.30   53.29/9.58

Module 4 (VoI)    n     D41           D42
Firstgen          171   65.75/11.97   71.33/8.58
Continuing        421   66.63/11.89   72.19/8.57

TABLE 8
Multivariate Regression Analysis Results for Information Literacy Dispositions

                              Estimate   Standard Error   t-value   p-value
Module 1 (EP&A)
D11   Intercept               54.91      0.48             114.13    0.000
      Firstgen/Continuing     –1.57      0.97             –1.62     0.106
D12   Intercept               57.46      0.56             103.05    0.000
      Firstgen/Continuing     –1.14      1.12             –1.02     0.310
D13   Intercept               66.56      0.63             106.46    0.000
      Firstgen/Continuing     –2.27      1.26             –1.80     0.072
Module 2 (SS)
D21   Intercept               65.19      0.44             146.63    0.000
      Firstgen/Continuing     –2.04      0.85             –2.39     0.017*
Module 3 (R&S)
D31   Intercept               55.19      0.50             110.45    0.000
      Firstgen/Continuing     –0.79      0.98             –0.81     0.419
D32   Intercept               73.93      0.61             120.25    0.000
      Firstgen/Continuing     –0.10      1.21             –0.09     0.931
D33   Intercept               53.29      0.46             116.33    0.000
      Firstgen/Continuing     –2.84      0.90             –3.15     0.002**
Module 4 (VoI)
D41   Intercept               66.63      0.58             114.77    0.000
      Firstgen/Continuing     –0.88      1.08             –0.82     0.414
D42   Intercept               72.19      0.42             172.75    0.000
      Firstgen/Continuing     –0.86      0.78             –1.11     0.269
NOTE: * p<0.05; ** p<0.01; *** p<0.001.

Performance Indicators

Performance indicators represent the most granular scores on the TATIL test. They measure student proficiency in specific skill areas and are, therefore, valuable because they can indicate specific problem areas that need to be addressed. There are 80 performance indicators across the four TATIL modules: 24 in module 1 (EP&A), 17 in module 2 (SS), 24 in module 3 (R&S), and 15 in module 4 (VoI). We also included in this category the individual disposition questions, as they are similar in granularity to the performance indicators. There are 21 individual disposition questions: six in module 1 (EP&A), three in module 2 (SS), six in module 3 (R&S), and six in module 4 (VoI). Altogether, there are 101 items in this category.

First-generation students scored lower than continuing-generation students on 90 (89%) of the performance indicators and individual disposition questions. There were statistically significant differences between first-generation and continuing-generation students in 36 (36%) of the performance indicators and individual disposition questions. Though there were no statistically significant performance indicators in which first-generation students outperformed their continuing-generation peers, first-generation students did score higher on seven performance indicators and four individual disposition questions.
For the sake of brevity, only those performance indicators and individual disposition questions with significant results are included in appendix C.34

Four multivariate multiple regressions were employed to identify whether there were differences between first-generation and continuing-generation students on performance indicators and individual disposition questions across the four modules. In module 1 (EP&A), statistically significant differences between first-generation students and continuing-generation students were found in performance indicators p1211 [t(618) = –3.82, p < 0.001], p116 [t(618) = –2.97, p < 0.01], p126 [t(618) = –3.71, p < 0.001], p119 [t(618) = –3.06, p < 0.01], p129 [t(618) = –2.37, p < 0.05], p113 [t(618) = –3.06, p < 0.01], p1111 [t(618) = –3.46, p < 0.001], p118 [t(618) = –2.55, p < 0.05], p117 [t(618) = –2.11, p < 0.05], p127 [t(618) = –4.00, p < 0.001], p124 [t(618) = –2.20, p < 0.05], D11a [t(618) = –2.22, p < 0.05], D13b [t(618) = –2.23, p < 0.05], and D12b [t(618) = –2.56, p < 0.05]. Results showed that there were statistically significant differences in performance indicators p216 [t(599) = –2.67, p < 0.01], p212 [t(599) = –3.42, p < 0.001], p2111 [t(599) = –2.49, p < 0.05], p222 [t(599) = –2.19, p < 0.05], p221 [t(599) = –2.21, p < 0.05], and D21c [t(599) = –2.38, p < 0.05] in module 2 (SS). In module 3 (R&S), performance indicators p312 [t(583) = –2.56, p < 0.05], p314 [t(583) = –2.18, p < 0.05], p3113 [t(583) = –2.29, p < 0.05], p325 [t(583) = –3.51, p < 0.001], p3112 [t(583) = –2.80, p < 0.01], p326 [t(583) = –2.37, p < 0.05], p315 [t(583) = –2.49, p < 0.05], p3212 [t(583) = –3.24, p < 0.001], p318 [t(583) = –2.66, p < 0.01], p319 [t(583) = –3.14, p < 0.01], p3114 [t(583) = –2.06, p < 0.05], p316 [t(583) = –2.89, p < 0.01], D31b [t(583) = –2.10, p < 0.05], and D33a [t(583) = –2.70, p < 0.01] were found to be statistically different between groups. There were statistically significant differences in performance indicators p425 [t(591) = –2.29, p < 0.05] and p416 [t(591) = –3.13, p < 0.01] in module 4 (VoI). Among all of the statistically significant performance indicators and individual disposition questions, first-generation students had lower scores than continuing-generation students. Due to the number of indicators and disposition questions, detailed information about performance indicator and individual disposition question scores and the associated multivariate multiple regression analysis is provided in appendix D.

Discussion

Analysis of the multivariate multiple regression and t-test results reveals a few consistent themes related to the information literacy knowledge, skills, and behaviors of first-generation students, as well as of all students regardless of first-generation status. These results both support and extend the conclusions and recommendations from earlier studies.

Shared Strengths and Weaknesses

One reason that the researchers opted to include all four modules of the TATIL test in this study was to gain a clearer understanding of the specific information literacy strengths and weaknesses of students at Texas A&M University. The overall test scores reveal that the majority of Texas A&M University students score at the college ready level or above, which suggests that most students come to college with at least a moderate level of information literacy preparedness, regardless of first-generation or continuing-generation status.
This finding suggests that librarians should not assume that students in either group are information literacy novices; instead, they should consider preassessing student information literacy skills ahead of instruction sessions to determine the specific information literacy learning needs of their students.

Although most students scored in the college ready category or above, reviewing the overall and outcome-specific knowledge performance scores reveals that students' performance was not consistent across all four modules. There were some differences in the percentages of students scoring in the lower, conditionally ready, category and the higher, research ready, category. Students scored most highly in module 3 (R&S), which Carrick Enterprises notes "combines elements from the Research as Inquiry and Scholarship as a Conversation frames," and module 4 (VoI), which focuses on the Information Has Value frame.35 This result was somewhat surprising, as these modules include concepts students might have been expected to struggle with, such as identifying gaps in the literature, developing research questions, and plagiarism.

At the other end of the results are module 1 (EP&A), which focuses on the frames Authority Is Constructed and Contextual and Information Creation as a Process, and module 2 (SS), which focuses on Searching as Strategic Exploration.36 Notably, no first-generation or continuing-generation students breached the research ready threshold in the overall scores on module 1 (EP&A). Both the first-generation and continuing-generation groups had larger numbers of students who scored as conditionally ready in these two modules, although both search strategies and source evaluation are frequently taught as part of the Texas A&M University Libraries' information literacy program. These findings support librarians' continued efforts to develop and reinforce information literacy knowledge and skills related to search strategies and source evaluation for all students, regardless of first-generation status, as part of information literacy instructional efforts. Though conducting preassessments with classes is a good way to inform instructional efforts in general, librarians may particularly wish to conduct preassessments related to Information Has Value, Research as Inquiry, and Scholarship as Conversation to ensure that instructional time devoted to those frames is carefully focused on the aspects with which students are less familiar.

Tailored Information Literacy Instruction

Analysis of TATIL test outcomes and performance indicators suggests that there is indeed a gap between the information literacy knowledge and skills of first-generation and continuing-generation students. Indeed, study results suggest that the gap between first-generation and continuing-generation students is wider than anticipated. Study results found statistically significant gaps in all four test modules and across all eight outcomes. They also revealed that first-generation students scored lower than their continuing-generation counterparts on 89 percent of the performance indicators and individual disposition questions, with statistically significant gaps on 36 (36%) of them. The statistically significant gaps were most concentrated in modules 1 (EP&A) and 3 (R&S).

For TATIL test outcomes, the gap between first-generation students and continuing-generation students was largest for three outcomes:
1. O11: Apply knowledge of source creation processes and context to evaluate the authority of a source.
2. O31: Understand the processes of scholarly communication and knowledge building.
3. O12: Apply knowledge of authority to analyze others' claims and to support one's own claims.

At the performance indicator level, large gaps between first-generation students and continuing-generation students were seen in all four modules. The five largest gaps were seen in the following performance indicators:

1. P1211: Determine the reason why a quote is used in a given passage (examples: show significance, give authoritative support, provide context, emphasize, summarize).
2. P126: Identify relevant questions to ask about the suitability of a source when considering it as support for a claim.
3. P416: Given a list, select the purposes of citation.
4. P113: Match the elements of a source record to what they reveal about the process used to create the source (for example, publisher name, authors' names, date, subject terms, source type).
5. P3212: Classify descriptions of specific actions taken during the research process by the stage in the research process when they are most likely to happen.

These findings suggest that offering information literacy instruction specifically for first-generation students is appropriate given consistent information literacy knowledge and skill gaps between first-generation students and their continuing-generation counterparts. Many colleges and universities, including Texas A&M University, offer learning communities and other courses specifically aimed at supporting first-generation college students in their transition to college. Arch and Gilman recommend "integrating information literacy instruction into existing first-generation student programs, including orientations and summer bridge programs."37 The findings from this study support this recommendation.

Knowledge vs. Dispositions

Though the largest number of statistically significant gaps between first-generation and continuing-generation students is found in the areas of outcomes and performance indicators, there were differences in the disposition category of the TATIL test. The TATIL test handles dispositions differently from the other question categories; unlike performance indicators, outcomes, and overall scores, which are interrelated, disposition scores are kept separate. Dispositions, according to the ACRL Framework, are "ways in which to address the affective, attitudinal, or valuing dimension of learning."38 Students' ability to recognize information and their attitude toward or the value they ascribe to that information may be different. For example, students may be able to identify the elements of a book citation from a list of possibilities; but, if they do not value citation, they may be less likely to implement their knowledge when creating a bibliography.

There was a statistically significant difference between first-generation students and continuing-generation students in only two (22%) dispositions. The small number of differences in the disposition category may be due to differences in scoring, as there are fewer points associated with dispositions on the TATIL tests. However, it could suggest that, while there is a difference between first-generation and continuing-generation students in terms of information literacy knowledge and skills, there is a smaller difference in attitudes and behaviors.
Additional research is needed to better understand if and how information literacy gaps between first-generation and continuing-generation students exist outside a testing environment. Though the reason for the comparatively small number of differences between continuing-generation and first-generation students remains unclear, it is clear that those differences are located in two different modules: module 2 (SS) and module 3 (R&S). It is noteworthy that none of the three dispositions in module 1 (EP&A) had statistically significant differences even though that module showed differences in both the outcomes and performance indicators categories. The two statistically significant dispositions, along with their descriptions from Carrick Enterprises,39 are listed below:

• D21 (Productive persistence): "Learners who are disposed to demonstrate productive persistence during their searches for information approach searching as iterative and not linear by employing alternative strategies and learning from mistakes."
• D33 (Responsibility to community): "Learners who are disposed to demonstrate a sense of responsibility to the scholarly community recognize and conform to academic norms of knowledge building."

The differences in these specific dispositions may suggest that first-generation students are less likely to identify as part of the scholarly community and to recognize some of the norms related to scholarly searching. Librarians should strive to explicitly welcome first-generation students into the scholarly research community and to make that community's norms more transparent and inclusive for first-generation students.

Limitations

There are inherent limitations in a standardized testing approach to measuring information literacy skills. Participant behavior may not correlate with their responses on a multiple-choice test. In addition, because tests were administered out of class and for extra credit, the validity of participant responses is uncertain. Tests were administered during multiple semesters and aggregated into a single dataset. It is possible that responses may have differed from one semester to another due to factors external to the study. It is also possible that there are other factors that contributed to differences between the two study populations that were not controlled for in the analysis. Finally, this is a study from a single university, and findings cannot be generalized to other universities.

Conclusion

It is commonly understood by librarians that information literacy skills are critical to academic success for all students. It is also well established that some student populations are underserved, may experience barriers, and may have different information literacy skills. Both continuing-generation and first-generation students demonstrate moderate to high information literacy skills as a whole, and librarians should take care not to assume that either group consists of information literacy novices. However, this study supports earlier research suggesting that first-generation students' information literacy skills differ from those of continuing-generation students and that targeted and sustained information literacy support for first-generation students is therefore appropriate.
Identifying these differences in information literacy skills is an important first step toward advocating for the increased and dedicated resources necessary to develop targeted and sustained information literacy instructional support for first-generation college students. By establishing the need, librarians can open up new avenues for collaboration with first-generation programs and classes on their campus. In this way, librarians can help first-generation students feel welcome and supported by the library and also help first-generation students develop the information literacy skills to excel in higher education and beyond.

APPENDIX A. TATIL Outcomes

TABLE 9
TATIL Outcomes

Code   Module     TATIL Outcome   TATIL Outcome Description40
O11    1 (EP&A)   Outcome 1.1     Apply knowledge of source creation processes and context to evaluate the authority of a source.
O12    1 (EP&A)   Outcome 1.2     Apply knowledge of authority to analyze others' claims and to support one's own claims.
O21    2 (SS)     Outcome 2.1     Plan, conduct, evaluate, and revise searches to achieve relevant results.
O22    2 (SS)     Outcome 2.2     Compare and contrast a range of search tools.
O31    3 (R&S)    Outcome 3.1     Understand the processes of scholarly communication and knowledge building.
O32    3 (R&S)    Outcome 3.2     Understand stages of the research process.
O41    4 (VoI)    Outcome 4.1     Recognize the rights and responsibilities of information creation.
O42    4 (VoI)    Outcome 4.2     Recognize social, legal, and economic factors affecting access to information.

APPENDIX B. TATIL Dispositions

TABLE 10
TATIL Dispositions

Code   Module     TATIL Disposition   TATIL Disposition Description41
D11    1 (EP&A)   Disposition 1.1     Mindful self-reflection
D12    1 (EP&A)   Disposition 1.2     Toleration of ambiguity
D13    1 (EP&A)   Disposition 1.3     Responsibility to community
D21    2 (SS)     Disposition 2.1     Productive persistence
D31    3 (R&S)    Disposition 3.1     Productive persistence
D32    3 (R&S)    Disposition 3.2     Mindful self-reflection
D33    3 (R&S)    Disposition 3.3     Responsibility to community
D41    4 (VoI)    Disposition 4.1     Mindful self-reflection
D42    4 (VoI)    Disposition 4.2     Responsibility to community

APPENDIX C. TATIL Performance Indicators and Individual Disposition Descriptions

TABLE 11
TATIL Performance Indicators and Dispositions (code; module; TATIL performance indicator/individual disposition; description42)

p113, 1 (EP&A), Performance Indicator 1.1.3: Match the elements of a source record to what they reveal about the process used to create the source (such as publisher name, authors' names, date, subject terms, source type).
p116, 1 (EP&A), Performance Indicator 1.1.6: Match an information need to the most authoritative source types (like news agency, government website, scholarly article) for fulfilling that need.
p117, 1 (EP&A), Performance Indicator 1.1.7: Identify the audience for whom a source was created.
p118, 1 (EP&A), Performance Indicator 1.1.8: Identify types of scholarly products and communication modes that fall outside the typical publication processes but are still worthy of use (examples: conference presentations, contributed papers, discussions on association websites).
p119, 1 (EP&A), Performance Indicator 1.1.9: Identify relevant questions to ask about sources' origins and context when considering them as support for a claim.
p1111, 1 (EP&A), Performance Indicator 1.1.11: Match descriptions of popular, polemic, and primary documents to scenarios where it would be appropriate to use them.
D11a, 1 (EP&A), Disposition 1.1: Mindful self-reflection.
D12b, 1 (EP&A), Disposition 1.2: Toleration of ambiguity.
D13b, 1 (EP&A), Disposition 1.3: Responsibility to community.
p124, 1 (EP&A), Performance Indicator 1.2.4: Recognize that polished, visually appealing presentation of web content does not equate to authoritative, high-quality content.
p126, 1 (EP&A), Performance Indicator 1.2.6: Identify relevant questions to ask about the suitability of a source when considering it as support for a claim.
p127, 1 (EP&A), Performance Indicator 1.2.7: Identify information directly relevant to an argument.
p129, 1 (EP&A), Performance Indicator 1.2.9: Recognize when a quote from a well-known author or recognized expert is being used by an author to gain authority.
p1211, 1 (EP&A), Performance Indicator 1.2.11: Determine the reason why a quote is used in a given passage (for instance: show significance, give authoritative support, provide context, emphasize, summarize).
p212, 2 (SS), Performance Indicator 2.1.2: Identify keyword searching as an appropriate basic search strategy when beginning research.
p216, 2 (SS), Performance Indicator 2.1.6: Scan search results for synonyms to use for additional searches.
p2111, 2 (SS), Performance Indicator 2.1.11: Apply nested logic structures, Boolean operators, and truncation to successfully construct an advanced search.
p221, 2 (SS), Performance Indicator 2.2.1: Identify differences among search tools such as those on the open web, in a database, and in a library catalog.
p222, 2 (SS), Performance Indicator 2.2.2: Understand when it is appropriate to use a web search engine to find information.
D21c, 2 (SS), Disposition 2.1: Productive persistence.
p312, 3 (R&S), Performance Indicator 3.1.2: Given a literature review, identify the gap that the authors have identified in the existing research.
p314, 3 (R&S), Performance Indicator 3.1.3: Recognize that scholars bring their own perspectives to the study of a research topic.
p315, 3 (R&S), Performance Indicator 3.1.5: Categorize common types of sources by whether the authors are expected to list their cited sources.
p316, 3 (R&S), Performance Indicator 3.1.6: Identify social consequences of scientific falsification.
p318, 3 (R&S), Performance Indicator 3.1.8: Identify reasons why scholars track down influential works.
p319, 3 (R&S), Performance Indicator 3.1.9: Identify venues for scholarly communication, such as books, journals, conventions, blogs.
p3112, 3 (R&S), Performance Indicator 3.1.12: Evaluate an emerging scholar's likelihood of being accepted into the scholarly conversation.
p3113, 3 (R&S), Performance Indicator 3.1.13: Given a description of scholarly disagreement, select the interpretation that acknowledges the value of disagreement for moving knowledge forward.
p3114, 3 (R&S), Performance Indicator 3.1.14: Given a set of research needs, match them to appropriate research methods.
D31b, 3 (R&S), Disposition 3.1: Productive persistence.
p325, 3 (R&S), Performance Indicator 3.2.5: Order the stages of the research process when writing a research paper.
p326 | 3 (R&S) | Performance Indicator 3.2.6 | Explain why research inquiry can be appropriate for personal information needs in addition to academic needs.
p3212 | 3 (R&S) | Performance Indicator 3.2.12 | Classify descriptions of specific actions taken during the research process by the stage in the research process when they are most likely to happen.
D33a | 3 (R&S) | Disposition 3.3 | Responsibility to community.
p416 | 4 (VoI) | Performance Indicator 4.1.6 | Given a list, select the purposes of citation.
p425 | 4 (VoI) | Performance Indicator 4.2.5 | Identify the meaning and scope of the concept of intellectual property.

APPENDIX D. Performance Indicator and Individual Disposition Question Scores

TABLE 12. Performance Indicator Scores for First-Generation and Continuing-Generation Students
Code | Firstgen (M/SD) | Continuing (M/SD)
Module 1 (EP&A)
p1211 | 378.65/262.10 | 469.32/252.23
p116 | 291.66/250.77 | 366.52/276.59
p126 | 339.73/248.97 | 428.35/258.83
p119 | 335.05/266.42 | 411.53/269.02
p129 | 202.00/145.42 | 231.80/131.29
p113 | 333.13/257.80 | 413.39/288.60
p1111 | 256.82/171.14 | 310.18/163.48
p118 | 347.42/287.90 | 414.88/282.82
p117 | 264.90/181.16 | 298.63/168.85
p127 | 184.46/128.94 | 225.79/104.45
p124 | 198.01/261.46 | 253.09/270.23
D11a | 15.93/4.69 | 16.85/4.36
D13b | 9.29/3.32 | 9.92/2.98
D12b | 12.65/3.51 | 13.52/3.67
Module 2 (SS)
p216 | 514.48/266.16 | 581.09/274.38
p212 | 69.50/29.56 | 76.73/20.12
p2111 | 198.59/233.73 | 252.24/235.71
p222 | 207.44/263.46 | 261.33/270.17
p221 | 214.56/231.59 | 261.23/229.85
D21c | 14.85/2.71 | 15.44/2.65
Module 3 (R&S)
p312 | 219.36/267.33 | 284.67/271.49
p314 | 162.60/258.55 | 218.91/277.95
p3113 | 136.29/100.56 | 156.65/91.52
p325 | 316.25/199.53 | 378.33/183.72
p3112 | 379.48/292.88 | 457.88/297.04
p326 | 162.11/132.07 | 190.10/122.61
p315 | 286.33/262.51 | 352.76/288.98
p3212 | 398.37/260.28 | 478.10/260.50
p318 | 276.19/148.55 | 318.44/174.19
p319 | 299.01/189.63 | 355.02/188.40
p3114 | 283.34/181.68 | 318.00/176.36
p316 | 189.70/215.99 | 248.57/214.94
D31b | 10.19/2.26 | 10.64/2.27
D33a | 9.33/2.83 | 10.09/3.02
Module 4 (VoI)
p425 | 497.54/296.10 | 559.48/298.83
p416 | 419.15/279.03 | 499.49/284.79

TABLE 13. Multivariate Regression Analysis Results for Performance Indicators
Code | Term | Estimate | Standard Error | t-value | p-value
Module 1 (EP&A)
p1211 | Intercept | 469.32 | 11.80 | 39.78 | 0.000
p1211 | Firstgen/Continuing | –90.67 | 23.73 | –3.82 | 0.000***
p116 | Intercept | 366.52 | 12.53 | 29.25 | 0.000
p116 | Firstgen/Continuing | –74.86 | 25.20 | –2.97 | 0.003**
p126 | Intercept | 428.35 | 11.88 | 36.06 | 0.000
p126 | Firstgen/Continuing | –88.62 | 23.89 | –3.71 | 0.000***
p119 | Intercept | 411.53 | 12.43 | 33.10 | 0.000
p119 | Firstgen/Continuing | –76.47 | 25.01 | –3.06 | 0.002**
p129 | Intercept | 231.80 | 6.25 | 37.09 | 0.000
p129 | Firstgen/Continuing | –29.80 | 12.57 | –2.37 | 0.018*
p113 | Intercept | 413.39 | 13.03 | 31.72 | 0.000
p113 | Firstgen/Continuing | –80.26 | 26.21 | –3.06 | 0.002**
p1111 | Intercept | 310.18 | 7.66 | 40.48 | 0.000
p1111 | Firstgen/Continuing | –53.37 | 15.41 | –3.46 | 0.001***
p118 | Intercept | 414.88 | 13.16 | 31.53 | 0.000
p118 | Firstgen/Continuing | –67.46 | 26.47 | –2.55 | 0.011*
p117 | Intercept | 298.63 | 7.97 | 37.49 | 0.000
p117 | Firstgen/Continuing | –33.73 | 16.02 | –2.11 | 0.036*
p127 | Intercept | 225.79 | 5.14 | 43.92 | 0.000
p127 | Firstgen/Continuing | –41.33 | 10.34 | –4.00 | 0.000***
p124 | Intercept | 253.09 | 12.42 | 20.38 | 0.000
p124 | Firstgen/Continuing | –55.07 | 24.98 | –2.20 | 0.028*
D11a | Intercept | 16.85 | 0.21 | 81.85 | 0.000
D11a | Firstgen/Continuing | –0.92 | 0.41 | –2.22 | 0.027*
D13b | Intercept | 9.92 | 0.14 | 69.92 | 0.000
D13b | Firstgen/Continuing | –0.64 | 0.29 | –2.23 | 0.026*
D12b | Intercept | 13.52 | 0.17 | 80.33 | 0.000
D12b | Firstgen/Continuing | –0.87 | 0.34 | –2.56 | 0.011*
Module 2 (SS)
p216 | Intercept | 581.09 | 13.03 | 44.58 | 0.000
p216 | Firstgen/Continuing | –66.61 | 24.93 | –2.67 | 0.008**
p212 | Intercept | 76.73 | 1.11 | 69.41 | 0.000
p212 | Firstgen/Continuing | –7.23 | 2.11 | –3.42 | 0.001***
p2111 | Intercept | 252.24 | 11.26 | 22.40 | 0.000
p2111 | Firstgen/Continuing | –53.65 | 21.54 | –2.49 | 0.013*
p222 | Intercept | 261.33 | 12.85 | 20.33 | 0.000
p222 | Firstgen/Continuing | –53.89 | 24.58 | –2.19 | 0.029*
p221 | Intercept | 261.23 | 11.03 | 23.68 | 0.000
p221 | Firstgen/Continuing | –46.67 | 21.10 | –2.21 | 0.027*
D21c | Intercept | 15.44 | 0.13 | 120.88 | 0.000
D21c | Firstgen/Continuing | –0.58 | 0.24 | –2.38 | 0.017*
Module 3 (R&S)
p312 | Intercept | 284.67 | 13.00 | 21.90 | 0.000
p312 | Firstgen/Continuing | –65.31 | 25.56 | –2.56 | 0.011*
p314 | Intercept | 218.91 | 13.12 | 16.68 | 0.000
p314 | Firstgen/Continuing | –56.30 | 25.81 | –2.18 | 0.030*
p3113 | Intercept | 156.65 | 4.51 | 34.70 | 0.000
p3113 | Firstgen/Continuing | –20.36 | 8.88 | –2.29 | 0.022*
p325 | Intercept | 378.33 | 9.00 | 42.05 | 0.000
p325 | Firstgen/Continuing | –62.08 | 17.69 | –3.51 | 0.000***
p3112 | Intercept | 457.88 | 14.22 | 32.19 | 0.000
p3112 | Firstgen/Continuing | –78.40 | 27.97 | –2.80 | 0.005**
p326 | Intercept | 190.10 | 6.01 | 31.62 | 0.000
p326 | Firstgen/Continuing | –27.99 | 11.82 | –2.37 | 0.018*
p315 | Intercept | 352.76 | 13.57 | 25.99 | 0.000
p315 | Firstgen/Continuing | –66.43 | 26.69 | –2.49 | 0.013*
p3212 | Intercept | 478.10 | 12.52 | 38.20 | 0.000
p3212 | Firstgen/Continuing | –79.73 | 24.61 | –3.24 | 0.001***
p318 | Intercept | 318.44 | 8.07 | 39.45 | 0.000
p318 | Firstgen/Continuing | –42.26 | 15.87 | –2.66 | 0.008**
p319 | Intercept | 355.02 | 9.07 | 39.15 | 0.000
p319 | Firstgen/Continuing | –56.01 | 17.84 | –3.14 | 0.002**
p3114 | Intercept | 318.00 | 8.54 | 37.23 | 0.000
p3114 | Firstgen/Continuing | –34.66 | 16.80 | –2.06 | 0.040*
p316 | Intercept | 248.57 | 10.34 | 24.03 | 0.000
p316 | Firstgen/Continuing | –58.88 | 20.34 | –2.89 | 0.004**
D31b | Intercept | 10.64 | 0.11 | 97.65 | 0.000
D31b | Firstgen/Continuing | –0.45 | 0.21 | –2.10 | 0.036*
D33a | Intercept | 10.09 | 0.14 | 70.67 | 0.000
D33a | Firstgen/Continuing | –0.76 | 0.28 | –2.70 | 0.007**
Module 4 (VoI)
p425 | Intercept | 559.48 | 14.53 | 38.52 | 0.000
p425 | Firstgen/Continuing | –61.94 | 27.03 | –2.29 | 0.022*
p416 | Intercept | 499.49 | 13.80 | 36.20 | 0.000
p416 | Firstgen/Continuing | –80.35 | 25.68 | –3.13 | 0.002**
*p < 0.05; **p < 0.01; ***p < 0.001.
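Reading note. For each indicator in Table 13, the intercept matches the continuing-generation mean reported in Table 12 and the Firstgen/Continuing coefficient matches the first-generation gap; for p1211, for example, 469.32 − 90.67 = 378.65, the first-generation mean in Table 12. The sketch below is a minimal illustration of how a per-indicator regression of this form could be run; it is not the authors' analysis code, and the data frame, column names (score, firstgen), and use of the statsmodels library are illustrative assumptions only.

```python
# Minimal sketch (assumed setup, not the study's actual code): regressing one
# indicator's scores on a first-generation dummy yields the structure shown in
# Table 13, where the intercept is the continuing-generation mean and the
# coefficient is the first-generation gap.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student for a single performance indicator.
# firstgen = 1 for first-generation students, 0 for continuing-generation.
df = pd.DataFrame({
    "score":    [410, 455, 520, 300, 365, 415, 350, 480],
    "firstgen": [1,   0,   0,   1,   1,   0,   0,   1],
})

model = smf.ols("score ~ firstgen", data=df).fit()

print(model.params)    # corresponds to the Estimate column
print(model.bse)       # corresponds to the Standard Error column
print(model.tvalues)   # corresponds to the t-value column
print(model.pvalues)   # corresponds to the p-value column
```

Under this parameterization, a negative Firstgen/Continuing estimate indicates that first-generation students scored lower than their continuing-generation peers on that indicator, which is the pattern in every row of Table 13.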
Notes
1. Xan Arch and Isaac Gilman, “First Principles: Designing Services for First-Generation Students,” College & Research Libraries 80, no. 7 (2019): 996–1012; Emily Forrest Cataldi, Christopher T. Bennett, and Xianglei Chen, “First-Generation Students: College Access, Persistence, and Postbachelor’s Outcomes,” NCES 2018-421, Statistics in Brief (Washington, DC: US Department of Education, 2018).
2. Adriana Parker, “Academic Libraries and Vulnerable Student Populations: A New Model of Embedded Librarianship for First-Generation University Students,” Political Librarian 3, no. 1 (2017): 26–31.
3. Parker, “Academic Libraries and Vulnerable Student Populations.”
4. Darren Ilett, “A Critical Review of LIS Literature on First-Generation Students,” portal: Libraries and the Academy 19, no. 1 (2019): 177–96; Firouzeh Logan and Elizabeth Pickard, “First-Generation College Students: A Sketch of Their Research Process,” in College Libraries and Student Culture: What We Now Know, eds. Lynda M. Duke and Andrew D. Asher (Chicago, IL: American Library Association, 2012), 109–25; Elizabeth Pickard and Firouzeh Logan, “The Research Process and the Library: First-Generation College Seniors vs. Freshmen,” College & Research Libraries 74, no. 4 (2013): 399–415.
5. Megan Oakleaf and Neal Kaske, “Guiding Questions for Assessing Information Literacy in Higher Education,” portal: Libraries and the Academy 9, no. 2 (2009): 273.
6. Melissa Gross and Don Latham, “What’s Skill Got to Do with It? Information Literacy Skills and Self-Views of Ability among First-Year College Students,” Journal of the American Society for Information Science and Technology 63, no. 3 (2012): 574–83.
7. Patricia Davitt Maughan, “Assessing Information Literacy among Undergraduates: A Discussion of the Literature and the University of California-Berkeley Assessment Experience,” College & Research Libraries 62, no. 1 (2001): 71–85; Brandy Whitlock and Nassim Ebrahimi, “Beyond the Library: Using Multiple, Mixed Measures Simultaneously in a College-Wide Assessment of Information Literacy,” College & Research Libraries 77, no. 2 (2016): 236–62.
8. Amalia Monroe-Gulick and Julie Petr, “Incoming Graduate Students in the Social Sciences: How Much Do They Really Know about Library Research?” portal: Libraries and the Academy 12, no. 3 (2012): 315–35.
9. Samantha Godbey and Alexandra Dema, “Assessment and Perception of Information Literacy Skills among Teacher Education Students,” Behavioral & Social Sciences Librarian 36, no. 1 (2017): 1–15.
10. Russell Michalak and Monica D.T. Rysavy, “Information Literacy in 2015: International Graduate Business Students’ Perceptions of Information Literacy Skills Compared to Test-Assessed Skills,” Journal of Business & Finance Librarianship 21, no. 2 (2016): 152–74.
11. Amy Jo Catalano, “Using ACRL Standards to Assess the Information Literacy of Graduate Students in an Education Program,” Evidence Based Library & Information Practice 5, no. 4 (2010): 21–38.
12. Min Tong and Carrie Moran, “Are Transfer Students Lagging behind in Information Literacy?” Reference Services Review 45, no. 2 (2017): 286–97.
13. Ma Lei Hsieh, Susan McManimon, and Sharon Yang, “Faculty-Librarian Collaboration in Improving Information Literacy of Educational Opportunity Program Students,” Reference Services Review 41, no. 2 (2013): 313–35.
14. Stacy Brinkman, Katie Gibson, and Jenny Presnell, “When the Helicopters Are Silent: The Information Seeking Strategies of First-Generation College Students,” in Imagine, Innovate, Inspire: Proceedings of the ACRL 2013 Conference, ed. Dawn M. Mueller (Chicago, IL: Association of College and Research Libraries, 2013), 647.
15. Parker, “Academic Libraries and Vulnerable Student Populations,” 29.
16. David A. Tyckoson, “Library Service for the First-Generation College Student,” in Teaching the New Library to Today’s Users: Reaching International, Minority, Senior Citizens, Gay/Lesbian, First-Generation, At-Risk, Graduate and Returning Students, and Distance Learners, eds. Trudi E. Jacobson and Helene C. Williams (New York, NY: Neal-Schuman, 2000), 100–04.
17. Arch and Gilman, “First Principles,” 1002–08.
18. Ilett, “A Critical Review of LIS Literature on First-Generation Students,” 118.
19. Logan and Pickard, “First-Generation College Students,” 113.
20. Pickard and Logan, “The Research Process and the Library,” 411.
21. Stephanie J. Graves, Sarah LeMire, and Kathy Christie Anders, “Uncovering the Information Literacy Skills of First-Generation and Provisionally Admitted Students,” Journal of Academic Librarianship 47, no. 1 (2021): 102260.
22. Megan Oakleaf, “Dangers and Opportunities: A Conceptual Map of Information Literacy Assessment Approaches,” portal: Libraries and the Academy 8, no. 3 (2008): 234–40.
23. Oakleaf, “Dangers and Opportunities,” 235.
24. Graves, LeMire, and Anders, “Uncovering the Information Literacy Skills of First-Generation and Provisionally Admitted Students.”
25. “The Tests,” Project SAILS, accessed June 5, 2020, https://www.projectsails.org/site/background/.
26. “How the Test Was Developed,” Threshold Achievement Test for Information Literacy (TATIL), accessed June 5, 2020, https://thresholdachievement.com/the-test/background.
27. “Test Modules,” Threshold Achievement Test for Information Literacy (TATIL), accessed June 5, 2020, https://thresholdachievement.com/the-test/test-modules.
28. “Test Modules,” TATIL.
29. “Module Descriptions,” Threshold Achievement Test for Information Literacy (TATIL), accessed June 5, 2020, https://thresholdachievement.com/files/Module-Descriptions.pdf.
30. “Module Descriptions,” TATIL.
31. “How the Test Was Developed,” Threshold Achievement Test for Information Literacy (TATIL).
32. “How the Test Was Developed,” TATIL.
33. “Module Descriptions,” TATIL.
34. “Module Descriptions,” TATIL.
35. “Module Descriptions,” TATIL.
36. “Module Descriptions,” TATIL.
37. Arch and Gilman, “First Principles,” 1002.
38. “Framework for Information Literacy for Higher Education,” Association of College and Research Libraries, accessed June 5, 2020, www.ala.org/acrl/standards/ilframework.
39. “Module Descriptions,” TATIL.
40. “Module Descriptions,” TATIL.
41. “Module Descriptions,” TATIL.
42. “Module Descriptions,” TATIL.