Book Reviews

Altmetrics: A Practical Guide for Librarians, Researchers and Academics. Andy Tattersall, ed. London: Facet Publishing, 2016. 224p. $95.00 (ISBN 978-1-78330-010-5).

Many of us working in information services, administration, or research in the higher education sector grapple daily with the measurement of that most nebulous of concepts: "impact." Researchers are required to demonstrate it, administrators are required to measure it, and funding bodies demand evidence of it—but a definition remains elusive, and recent years have seen an explosion of metrics claiming to capture various aspects of it. This pursuit can often feel like trying to nail fog to a wall; and, as someone who has previously worked as a researcher and now works on the administrative side of things, I have been intrigued by, and curious about, the growth of various alternatives to traditional citation-based metrics: so-called "altmetrics."

The term "altmetrics"—a portmanteau of "alternative metrics"—was coined in 2010 when a group of academics, dissatisfied with the traditional measurement of research impact via the rather blunt instrument of citation analysis, published the altmetrics manifesto (Priem et al. 2010). In this manifesto, they argued that the explosion in academic publishing brought about by the information age was causing the failure of the traditional methods by which academics discover and assimilate new research. They proposed a family of metrics based on the new forms of data available online, reflecting the accumulated online activity around a piece of research. These alternative metrics are indicators of online interest attached to a particular published piece of work, and many have been proposed. For example, an altmetric can be the number of researchers who have saved an article in their Mendeley, CiteULike, or Zotero library, the number of tweets about an article, the number of mentions in academic blogs, the number of mentions in mainstream news outlets, citations on Wikipedia, downloads, shares, likes…. These are the ripples generated in the intellectual waters as knowledge-hungry fish swarm toward a tasty tidbit.

Riding the wave of excitement, interest, and confusion about altmetrics is the new book Altmetrics: A Practical Guide for Librarians, Researchers and Academics, edited and largely authored by Andy Tattersall, an Information Specialist at the University of Sheffield. The publication of this book is unquestionably timely, in that many will be curious and confused about the importance of altmetrics to their careers and to the reputation of the institutions for which they work. The book's opening chapters include a general introduction and a history of traditional metrics, through to the development of the Web 2.0 technologies that made it possible to measure altmetrics. Contributions from Andrew Booth, a Reader in evidence-based information practice at the University of Sheffield, and Ben Showers, who supports the British government in aspects of digitalization, provide important context and critical discussion of traditional impact metrics. Chapters 5 and 6 are written by Altmetric.com founder and CEO Euan Adie and William Gunn of Mendeley; they provide some interesting insights into how altmetric data is collected and used, and they are perhaps predictably upbeat about the prospects for altmetrics.
Chapters 7 through 10 provide an extensive list of online resources and tools, and these chapters also consider some of the challenges faced by academics, and by the librarians and administrators who support them, when implementing these new technologies in the workplace. The book is sold as a practical guide, and there is much practical information here regarding the various online services for measuring, increasing, and demonstrating the impact of your research. There were many descriptions of websites or apps with which I was only fleetingly familiar.

In this middle portion of the book, there was extensive focus on, and much practical advice aimed at, enabling academics to engage with social media and online tools. This is a laudable aim, and the many and varied reasons for engaging with social media have been argued elsewhere (such as Farbrot 2015), but there was not enough discussion of altmetrics themselves. I would have liked to see more critical consideration of the altmetrics program. What exactly is being measured here? Are altmetrics measuring quality, or hype? Are altmetrics encouraging the cult of the celebrity academic by rewarding media attention rather than quality research? What is the relationship between altmetrics and traditional metrics in the form of citation counts and impact factors? Many of these questions are raised by Tattersall in chapter 9, only to be left hanging, unanswered, and a short critical aside by Booth in chapter 3 would have provided important balance had it been expanded upon.

Many of the chapters encourage academics to jump on a whole variety of bandwagons, and librarians and administrators are urged to facilitate this endeavor, but there is insufficient discussion of whether these are worthy bandwagons on which to jump. As such, the book often seemed more like a how-to manual for gaming the system, which is a criticism that the proponents of altmetrics have repeatedly leveled at traditional metrics of research impact. This exposes an inherent flaw in the altmetrics program: it may be just as open to gaming as other metrics—perhaps even more so.

There is little or no discussion of the validity or reliability of altmetrics or the data underlying them. Many altmetric scores, such as those provided by Altmetric.com, Plum Analytics, and Impactstory, appear to be heavily influenced by tweets. Tweets are cheap and easy to produce and all too easy to retweet uncritically; the character limit almost ensures inaccuracy in the reporting of complicated research findings; and tweets originating from the Twitter accounts of scientific publishers, journals, and academic institutions represent an obvious conflict of interest. A recent meta-analysis has shown a weak to moderate correlation between various altmetrics and citation counts, but an extremely weak correlation for Twitter (Erdt et al., in press). The altmetric community is seemingly still undecided on whether a correlation with traditional metrics is a strength or a weakness.

There is also too little discussion of the technical aspects of the calculation of altmetric scores, or of the weighting of different data in this calculation. A problem faced by altmetrics is that the types of attention most reflective of wider societal impact—TV and radio appearances—are not handled particularly well by the mainly text-based bots and algorithms of altmetrics collators.
Unless a media outlet has provided a written transcript of a TV or radio interview with a researcher, and helpfully included a link to the original article, a DOI, or a PubMed ID (and exactly how many busy journalists are going to prioritize that?), then a mention in the mainstream broadcast media—arguably the holy grail of academic impact—is unlikely to be reflected in an altmetrics score. The book does not address details like these, some of which would seem to undermine the altmetrics argument.

Tattersall raises an important question (147) that I would like a book of this sort to answer, but it too remains unanswered: is any organization in a position of influence actually monitoring or using altmetrics? To reiterate my earlier point about whether this is a worthwhile bandwagon on which to jump, is there any evidence that funding bodies, accreditation agencies, scholarly societies, or tenure committees are interested in altmetrics? Does anyone care? Without this sort of information, altmetrics will be a tough sell to already overburdened academics.

Toward the end of the book, there is a chapter on open peer review, which, although interesting, has only a tenuous connection with altmetrics and seems rather out of place. This contribution is seemingly in favor of open peer review—a process about which many researchers, myself included, are still not entirely persuaded. The traditional single-blind peer-review process is unquestionably open to abuse, but other potential alternatives to this system, such as double-blind peer review or empowering and incentivizing editors to rein in unruly reviewers, are mentioned only in passing or not at all.

If this book had been called something more general, such as "The Online Academic" or "Social Media for Researchers," it would have fulfilled its brief nicely. Unfortunately, when a book is called "Altmetrics," one expects to read about altmetrics. I would have liked to see more critical discussion of altmetrics and less about social media. I learned a lot about various social media tools of which I had not previously heard, and I will undoubtedly be using some of what I have learned in my own work supporting researchers, but I wanted to learn more about altmetrics and whether they are a viable alternative to traditional metrics.—Craig Aaen-Stockdale, BI Norwegian Business School

References

Erdt, Mojisola, Aarthy Nagarajan, Sei-Ching Joanna Sin, and Yin-Leng Theng. In press. "Altmetrics: An Analysis of the State-of-the-Art in Measuring Research Impact on Social Media." Scientometrics. doi:10.1007/s11192-016-2077-0.

Farbrot, A. 2015. Sosiale medier for forskere, kommunikasjonsrådgivere og fageksperter [Social Media for Researchers, Communication Advisers and Subject Experts]. Oslo: Cappelen Damm.

Priem, J., D. Taraborelli, P. Groth, and C. Neylon. 2010. "Altmetrics: A Manifesto." Last modified 28 September 2011. http://altmetrics.org/manifesto.

Barbara Allan. Emerging Strategies for Supporting Student Learning: A Practical Guide for Librarians and Educators. London: Facet Publishing, 2016. 178p. Paper, $75.00 (ISBN 978-1-78330-070-9).

On the opening page of her latest work, Dr. Barbara Allan, whose professional career has included working as a librarian, an instructor in higher education, and Dean of the Business School at the University of Westminster (U.K.), promises to deliver "an introduction to the current landscape in higher education" (1). To that end, Dr.
Allan, whose previous publications include well-reviewed volumes such as The No-nonsense Guide to Training in Libraries (2013), Supporting Research Students (2009), and Blended Learning (2007), has chosen her topics well, balancing between those that are more theoretical in nature and others that are more practical.

The overall structure of the book is highly effective: the introduction includes abstracts for each chapter, and each chapter includes a brief introduction to the topic at hand. Following that are brief treatments (typically 3–5 paragraphs) of its subtopics, as well as examples and case studies intended to clarify key points. Further aiding readers is the inclusion of a works-cited page at the conclusion of each chapter, and the final eight pages of the book are dedicated to a comprehensive index. All of these factors create a text that piques professional interest and lends itself to easy, targeted exploration.

Much of the content is what one would expect of an introductory text; however, in what must be an effort to be thorough, familiar concepts such as "information literacy," "flipped classroom," and the software PowerPoint are briefly explained. Further, in a chapter on effective instruction, the common-sense admonition to "provide students with sufficient time to work on" active learning exercises seems unnecessary (84). Similarly, explaining that student learning can be assessed through the use of assignments is hardly an unknown concept, even to the least experienced higher education