Take the Template and Run: Austin Community College’s Student Library and Technology Use Study
In the Library with the Lead Pipe welcomes guest author Adrian Whatley. Adrian Whatley is the e-Resources Librarian at Austin Community College. She views her job as connecting students to the information they need in the easiest, most time-efficient manner possible.
by Adrian Whatley and Ellie Collier
Part One: Setting the Stage
Austin Community College Library Services (ACCLS), like many academic library systems, is coming to terms with an increasingly tech-savvy student population that will ultimately transform the nature of the work we do, the services we provide, and the content we offer. Instead of relying on assumptions and overheated trendcasting, we wanted to hear from the students themselves: which devices they own, which social networks they frequent, and what types of tech-oriented services they would like to see from us. Our goal was to ascertain the facts that would lead to the oft-lauded data-driven decisions that all libraries seek. To that end, we conducted a college-wide student library and technology use environmental scan.
The idea for the survey germinated in the Summer of 2009, while we were collaborating on a webpage that would feature a collection of links to online searching, citing, and organizing tools. After musing about what students might find most useful, we had a “Why don’t we just ask them?” moment. Ellie had recently interviewed Char Booth about her research report, Informing Innovation: Tracking Student Interest in Emerging Library Technologies at Ohio University (A Research Report), which she felt was an especially good example of hard data about student technology use. Moreover, Adrian had attended Booth’s presentation at the 14th ACRL National Conference in Seattle and had been very impressed by the insightful and, in some cases, counterintuitive results. We batted around the idea of replicating the survey at our own institution. Would the social network usage of ACC students mirror OU’s? [1] How would community college students differ in their rate of technology adoption? And most importantly, would they friend us on Facebook? [2]
Part Two: Getting to Yes
Every institution has its own structure for approving projects. ACCLS functions primarily through a team structure, but at the time was also creating a project proposal form that would allow individuals to propose projects for themselves or for a specific team to tackle. We were very excited to take on this project ourselves, so we brainstormed approaches and settled on filling out the application for an ACC Innovation Grant as a way to organize ourselves before presenting our proposal to library management. The application process for the grant required a scope and purpose statement, a planning report, a budget request, and a project timeline. The documentation process proved invaluable as we pitched the project and conceptualized its deployment at ACC. Because we had done the initial legwork of nailing down timelines and formulating grand strategies, it was easier to focus on important details, such as honing the phrasing of the survey instrument itself.
Although we did not win the Innovation Grant, the project received strong support from our library dean, who facilitated our goal of offering a cash prize rather than an electronic gadget (a move that created additional administrative hoops). She also provided extra adjunct librarian hours to afford us time off the reference desk to focus on the project.
Part Three: Nuts and Bolts
Austin Community College consists of eight campuses throughout Central Texas serving roughly 44,000 students, nearly three quarters of whom attend part time. Each campus has its own library, and in 2008, all eight libraries combined had a door count of 1,124,027 and 1,014,250 hits on our homepage. [3] The fact that our virtual and physical visits are nearly equivalent highlighted the need for a serious and measured evaluation of technology use among our student population in order to further develop digital services. We spent the Summer and Fall of 2009 determining our scope and purpose, defining the research questions, and developing the survey instrument with feedback from our colleagues. We also evaluated several types of survey software and decided that SurveyMonkey provided the best delivery and analysis options.
Our goal was to survey the entire ACC student body. The survey itself consisted of 7 demographic questions, 21 multiple-choice questions about library and technology use, and 3 open-ended items about students’ general experiences with ACC Libraries. While drafting the instrument, we quickly found that we wished to ask many more questions than our respondents likely had the patience to answer, so some narrowing of focus was necessary.
We put a lot of effort into making the survey as short as possible and ended up tabling many questions that we would have liked to ask (though we did save them for future use). Our primary criterion was, “How would we use the results from this question?” With that as a guideline we decided to cut common survey questions like race and gender, which might have been interesting but would not affect whether we created a mobile website or offered text messaging reference. We kept demographic questions about campuses and majors so that we could gauge how representative our sample was and provide bibliographers with custom profiles. With our editors’ red pens in hand we also cut questions about information literacy, search strategies, favorite resources, and library website usage, deciding that cell phone usage and general comfort with technology would be our main foci. We probably should have cut even more, both to increase the response rate and to reduce the amount of work needed to process and interpret the results.
Promotion and distribution of the instrument took place in February. The survey was open to all ACC students and available from March 22 until April 16. We collected a total of 1,097 completed responses (an 87.8% completion rate). Promotional efforts included a link on the library homepage, a splash on the ACC homepage, posts to the ACC faculty listserv, a presence on the library’s Blackboard tab, bookmarks, posters, table tents, links on the desktops of computer lab machines, links on the Student Life website and Facebook pages, and an article in the student newspaper, The Accent. Our colleagues also greatly assisted in marketing the survey by directing students to it at the conclusion of library instruction sessions and reference interviews.
Part Four: Death to All Open-Ended Questions!
Ellie was able to pull together the main responses almost immediately, thanks to the simplicity of reporting in SurveyMonkey. Coding the responses to the open-ended questions was, however, another story entirely. We decided to put our graduate school research methods course to use: we reviewed the responses independently to come up with categories, met to finalize agreed-upon categories, coded the questions independently, and met again to agree upon codes. In hindsight, this was overkill. We could, theoretically, tell you our intercoder reliability now, but for our purposes it really doesn’t matter, and the rigor added an enormous amount of time and energy to the whole process. We were able to create and demo a slideshow of graphs of the results to the multiple-choice questions at several library meetings in May; it took us until November to complete the coding and data crunching for the open-ended questions. It would have been preferable to have more of the results available sooner, to capitalize on the initial excitement. We would recommend cutting several steps, perhaps just reading the responses separately to brainstorm categories and attach initial codes, then meeting to come to consensus.
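If you do want to report intercoder reliability, an agreement statistic such as Cohen’s kappa is easy to compute once both coders’ labels are lined up. Here is a minimal sketch in Python; the category names and example codes below are hypothetical, not our actual coding scheme.

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' labels on the same set of responses."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: fraction of responses both coders labeled the same.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected agreement if each coder assigned labels independently
    # at their observed marginal rates.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical codes for five open-ended responses:
coder1 = ["staff", "quiet", "computers", "staff", "hours"]
coder2 = ["staff", "quiet", "staff", "staff", "hours"]
print(cohens_kappa(coder1, coder2))  # about 0.71; 1.0 = perfect, 0 = chance
```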
The open-ended responses did provide an interesting look into many of our and our students’ assumptions. For example, a number of the responses to our question about texting services made it clear that students did not know they could renew books online. A separate question asked students which services they were and weren’t familiar with, and 32% responded that they didn’t know they could renew books online; the open-ended responses, however, made much more of an impact on us. So, to be clear, we’re not saying don’t use open-ended questions, just don’t worry about being quite as thorough as we were.
Part Five: Results!
For a full summary of the results, or to follow along with graphic representations of the data while you read, open the slideshow.
Before detailing responses, we would like to caution that this is a convenience sample rather than a probability sample. Nearly half (47%) of respondents learned of the survey through the library website, so results are skewed towards students who use the library website. [slide 2] However, we were pleased to see a large number of responses (1,250 started the survey; 1,097 completed it) and a wide variety of majors represented, with the largest groups being Nursing (151, 12.4%) and General Education Transfer Credits (98, 8.1%). [slide 4]
Overall, we received generally positive feedback; in particular, the library is valued as a quiet physical space to work and study. Students reported physically coming to the library (63% 2-3 times/week or more) more often than using the library website (45% 2-3 times/week or more). [slide 12] This mirrored the Informing Innovation results, which also showed greater use of the library building than the library website (p. 67).
We saw from Booth’s report that the study had a side effect of increasing student awareness of library services. With that in mind, one of our questions was a checklist of every service we could think of. Of the items with low awareness, the biggest takeaway in our eyes is the need to let students know about their library accounts and how to access them online. [slide 14]
Students showed general confidence in their searching and technology abilities, but not in web design or fixing computer problems. While they tended to feel that their research skills were at least adequate, their confidence declined sharply from using the Internet for personal purposes, to using it for school, to researching with library resources. [slide 17] These results were nearly identical to those from Informing Innovation (p. 63).
About half (47%) of our respondents owned smartphones. On the assumption that we have a disproportionately tech-savvy sample, we can hypothesize that some smaller percentage of the whole student body owns smartphones. Slightly fewer respondents (44%) owned VHS players, and only 53 students (5%) owned e-book readers. [slide 20] It is interesting to compare these results to the 2010 ECAR Study of Undergraduate Students and Information Technology, which found 3.1% e-book reader ownership and 62% smartphone ownership among its respondents (p. 42).
We made an error in formatting the computer and laptop ownership question [4], but generally we found that most respondents did own a computer. Seventy students (6%) reported having no internet connection at home, and another 21 (2%) used dial-up. Roughly equal numbers of students (about 60% each) reported having wireless and broadband. A quick glance at the data showed that many students selected wireless without selecting where their wireless was coming from, so broadband/cable/DSL adoption is likely closer to 90%. [slide 22] Ellie was particularly interested in this section of the survey: she has had several students refuse e-books as potential sources, saying they didn’t have computers and/or internet at home. If we combine the students with dial-up and those without internet at home (91 of 1,097, roughly 8%), nearly a tenth of our students don’t have the tools to use our online resources. If we are correct in the assumption that this sample is a particularly tech-savvy subset, the number could be larger.
We asked students how likely they were to embrace new technologies. The responses trend slightly above average, reinforcing our assumption that our sample is slightly skewed towards the tech savvy. [slide 25] The ECAR results (p. 39) were similar, however, so the trend could also simply reflect people’s preference for rating themselves average or above.
Another portion of our survey listed a wide selection of social networking and other online activities and asked respondents how often they participated in each. Texting and Facebook were the only two answers with high “every day” response rates. Watching videos on YouTube and reading wiki articles drew the highest “2-3 times a week” responses. Most other options scored highest in the “never” category. [slide 26]
Most (96%) respondents own cell phones. [slide 29] Offered a selection of cell phone activities, text messaging scored highest in the “every day” category, with “never” being the most common response for most other choices (including posting to Twitter, reading or contributing to blogs, downloading music, watching videos, and reading e-books). [slide 30] In terms of interacting with the library, students would most like to use text messaging to receive renewal or overdue notices and to renew materials. [slide 32]
Two thirds (66%) of respondents’ cell phones can access the internet [slide 33], and there was generally high interest in using the library website on their phones, with the most positive responses for finding hours, locations, and phone numbers; checking their account/renewing books; and searching for books. [slide 34]
Facebook was the top social networking site both in terms of reported use (45% use it all the time) and in willingness to friend the library (25%). All other choices (including Twitter, MySpace, Second Life, delicious, and more) scored highest in either “never use it” or “never heard of it.” [slide 37] Facebook was also the overwhelmingly preferred social network of ECAR respondents, with MySpace a distant second (p. 11). Of the 484 students who responded to the open-ended question, “Would you like the ACC Libraries to have accounts on any of these sites? Why or why not? What type of information would you want the library to share?”, the top categories of replies were:
- Yes, Facebook (97)
- Content – Events (49)
- Negative feedback about social networking sites in general (32)
- Content – Contact Information (31)
- Negative feedback about the library being in a social space – these are personal/private spaces and/or non-academic (25)
Nearly half of respondents chose Firefox as their preferred web browser, though we believe this question may have been inadvertently skewed by offering “Google Chrome” as a choice rather than just “Chrome.” We believe the high rate of Chrome responses (21%) can at least partially be attributed to students selecting the search engine Google rather than the browser, a belief based on additional write-in answers choosing Yahoo! as their preferred browser. [slide 39] (To the best of our knowledge, Yahoo! does not currently offer a browser, though they may have in the past, so we cannot entirely rule it out.) More than half (59%) of respondents customized their browsers [slide 40], and 56% indicated interest in a library-related plugin. [slide 41] (Luckily, ACC already offered the LibX toolbar.)
We ended with three open-ended questions:
What do you like MOST about the ACC libraries? (988 replies)
Top responses:
- Staff (268)
- Ease of use (213)
- Quiet (164)
- Computers (146)
- Collection (133)
- Facilities (82)
- Atmosphere (64)
- Hours (49)
- Resources (48)
- Printing (42)
What would you CHANGE about the ACC libraries? (930 replies)
Top responses:
- Nothing (260)
- Provide more computers (117)
- Provide more space (83)
- Stay open longer hours (82)
- Provide more books (76)
- Specific/one of a kind complaints (46)
- Enforce quiet (39)
- Provide more study rooms (27)
- Provide more seating (24)
- Printing (23)
- Outlets (22)
The last question asked whether they had any other comments. The top two responses, far outweighing any others, were “no” and generally positive feedback.
Part Six: Missteps and Mental Models
We had hoped to create bibliographer reports like Char Booth did, breaking down responses by subject area; however, only a handful of categories had enough responses to make this worthwhile. We have also offered to run any requested analyses, but have had very few requests.
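For anyone attempting similar breakdowns, the mechanics are simple to script. Below is a minimal sketch using Python and pandas, assuming a hypothetical CSV export of raw responses; the file name and the major and owns_smartphone column names are illustrative, not SurveyMonkey’s actual export format.

```python
import pandas as pd

# Hypothetical CSV export of raw survey responses; column names are assumed.
responses = pd.read_csv("survey_export.csv")  # columns: major, owns_smartphone, ...

# Percentage of smartphone owners per major, alongside the response count.
by_major = responses.groupby("major")["owns_smartphone"].agg(["mean", "count"])
by_major["pct_smartphone"] = (by_major["mean"] * 100).round(1)

# Keep only majors with enough responses for the breakdown to be meaningful.
report = by_major[by_major["count"] >= 30].sort_values("count", ascending=False)
print(report[["count", "pct_smartphone"]])
```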
We already mentioned our formatting mistake on the computer ownership question and the issue with the browser choices. We also had a few questions that asked, “If the library offered X…” and received write-in replies along the lines of, “I didn’t know you did that,” telling us that either the student didn’t read the question carefully enough or we didn’t word it clearly enough.
Other responses revealed more about our students’ mental models. Some showed student assumptions about how the library and technology interrelate; others revealed assumed relationships between physical areas or services that are actually run by different departments. Here are a few examples of ACC students’ misconceptions that we encountered while reading open-ended responses:
- All traditional library materials (books, articles, videos, etc.) are available electronically.
- Lack of distinction between a phone application and the internet/web.
- Lack of distinction between a browser and the web.
- Lack of distinction between copier and printer.
- Lack of distinction between computer labs and library.
- Lack of distinction between other ACC service providers and library.
- Assumption that if the library offered text a librarian we would be giving out librarians’ personal cell phone numbers.
Some of these are just interesting without being particularly problematic. Ellie noticed an increasing number of students standing in front of the copier looking for their print jobs, but was much more understanding of the confusion after seeing the giant all-in-one copier/scanner/fax/printer at her doctor’s office. Some signage may be helpful, but the issue doesn’t have much impact on student success or on how students view the library.
Complaints about computer labs and wireless service, however, do affect how students view the library. They are also, both fortunately and unfortunately, handled by a different department. The confusion is particularly understandable since in many buildings the computer lab is physically inside the library. We hope to address a number of these issues in a public response to the survey. We will also be sharing a final report with the college and highlighting items of interest for specific departments.
Finally, there are those more esoteric issues that are much harder to tackle. We can create a tutorial or link to an explanation of electronic publishing or a definition of a browser, but how do we get students to watch it and how do we know if they understood? And how much does it really matter? If they can get online, check their email, register for classes and get the research they need to finish their paper, is it hurting them if they “clicked the blue e” without knowing what a browser is or how it’s different from an app? As technology lines blur, how important is it to fully understand all the distinctions? We’d love to hear your take.
Part Seven: Still to Come
We are writing this article and our final report to the college at the same time. Some of our preliminary recommendations include:
- Publicize library account features and how to access accounts online (e.g., place a library account bookmark in each checked-out book for a week).
- Create a “Did you know?” page on our library website. Review open-ended answers to “What would you change?” and list requested items that we already provide. Also provide brief answers to items that are out of our control (e.g., computer labs are run by a different department; the library does not control the wireless).
- Run a “You asked, we answered” promotion (e.g., promote Library Anywhere and the LibX toolbar as a response to survey results).
- Create a short, simple explanation of textbooks: why the library doesn’t purchase them, that Student Services decides how many and which textbooks to purchase, and tips on improving your chances of getting a textbook (clean record, completed form, etc.).
- Address mental models by explaining how books end up online (and that not every book is available as an e-book).
- Investigate checking due dates and providing renewals through text messaging (see the sketch after this list for what a due-date reminder might look like).
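To make that last item concrete, here is a minimal sketch of what a text-message due-date reminder might look like, assuming a hypothetical send_sms gateway function and checkout records pulled from the ILS; none of these names reflect an actual ACC or vendor API.

```python
from datetime import date, timedelta

def send_sms(phone_number: str, body: str) -> None:
    """Hypothetical SMS gateway call; a real service integration would go here."""
    print(f"To {phone_number}: {body}")

def remind_upcoming_due(checkouts, days_ahead=2):
    """Text each patron whose loan is due within `days_ahead` days."""
    cutoff = date.today() + timedelta(days=days_ahead)
    for loan in checkouts:
        if loan["due_date"] <= cutoff:
            send_sms(
                loan["phone"],
                f"ACC Library: '{loan['title']}' is due {loan['due_date']}. "
                "Renew online or at any campus library.",
            )

# Hypothetical checkout records as they might come from the ILS:
remind_upcoming_due([
    {"phone": "512-555-0100", "title": "Intro to Nursing",
     "due_date": date.today() + timedelta(days=1)},
])
```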
Some of our results fit the standard pronouncements about student social media and technology use (everyone is on Facebook; text messaging is the current hot thing), but some of our library-oriented results in particular surprised us (students particularly value the library as place and want the librarians to enforce the rules, especially keeping the facilities quiet). We hope that sharing our experience has encouraged you to adapt Booth’s template for your own institution.
Thanks to Char Booth for her inspiration, hard work, and feedback on this article, and to Emily Ford and Eric Frierson for their feedback.
Additional Resources
- Full survey instrument
- ECAR Study
- Informing Innovation
- The Physical and the Virtual: The Relationship between Library as Place and Electronic Collections
Footnotes

1. Ellie in particular was interested to see whether danah boyd’s findings about class and social network preference would be visible in our results. We certainly did not see the animosity towards MySpace that Booth saw in her results, but we did still see Facebook as the primary player in social networking sites for our students.
2. Not so much.
3. Data from the ACC Student Factbook.
4. We asked how long they had owned their laptop and desktop computers. We misconfigured the question such that respondents could not select the same answer for both.