Faculty of 1000 and VIVO: Invisible Colleges and Team Science

Issues in Science and Technology Librarianship
Spring 2011
DOI: 10.5062/F4F769GT

John Carey
Head Librarian
Health Professions Library
Hunter College, City University of New York
New York, New York
john.carey@hunter.cuny.edu

Copyright 2011, John Carey. Used with permission.

Abstract

Within the traditional model of scholarly communications, "invisible colleges" facilitate a process of social diffusion that fuels the growth of scientific specialties. This diffusion of ideas operates not through published journal articles but rather through informal communications between researchers. In recent years, researchers have availed themselves of collaborative Web 2.0 forums such as blogs, wikis, and social networking sites to meet their need for increasingly sophisticated vehicles of informal communication. Examinations of the database Faculty of 1000 and the semantic web application VIVO help to illustrate how invisible colleges have migrated to a networked environment where they can play an even stronger role within scholarly communications. This paper will argue that the enhanced social diffusion enabled by such collaborative online venues offers an opportunity for researchers globally to accelerate the pace of innovation in an increasingly open era of team science.

The Formation of Invisible Colleges

Observers have long noted the importance of "invisible colleges" in transmitting knowledge within disciplines. Famously defined by Diana Crane in her book of the same name, invisible colleges constitute "groups of collaborators linked together informally" so that "[l]arge groups of collaborators and centrally located leaders play important roles in communicating knowledge and diffusing innovations in a research area" (Hagstrom 1973). Crane was herself borrowing a term first coined in 1646 by the British scientist Robert Boyle regarding the inception of the London Royal Society (London Royal Society 2004) and -- as she acknowledges -- was building on Derek J. de Solla Price's work in the 1960s on citation networks between scientific papers (Crane 1972, p. ix).

The model of the invisible college has at its core an interdisciplinary philosophy -- no area of scientific research is seen as completely isolated from other areas, and scientists themselves describe their disciplines as having fluid boundaries (Crane 1972, p. 13). From this Crane argues that any set of scientists in a particular research area is best described as a "social circle" (p. 13), its boundaries difficult to delimit and its total membership difficult to locate. Each member will know of some but not all the others; some members will meet face-to-face periodically, others never at all. Many will be influenced by the work of colleagues they have never met. Thus, Crane writes, "[i]ndirect action, action mediated by intervening parties, is an important aspect....There is no formal leadership in a social circle although there are usually central figures" (pp. 13-14).
The scientific communities thus formed -- distinct entities yet rich with complex interconnections -- are uniquely situated to exploit innovations from within their own or a related research area.

Invisible Colleges and Collaborative Technologies

At the time of Crane's writing, the output of scientific literature was estimated to be doubling every ten years (Crane 1972, p. 12). Formal communications between scientists consisted primarily of papers published in scientific journals, along with published discussion or letters to the editor about those papers. Informal communication among colleagues took the form of personal communications or interactions around the workplace or at meetings of professional associations. As Schaffner (1994) has noted, this system changed remarkably little over centuries because the traditional scholarly publishing model continued to fulfill certain fundamental needs of researchers, such as communicating information, validating quality, and helping to build scientific communities. In the Web 2.0 environment, however, researchers have many more channels open to them for expanding such communities.

Within the traditional scholarly communications model, critics typically held that the importance of informal communication "varies widely from discipline to discipline" and "is restricted to a relatively small group" of the most professionally active researchers (Schaffner 1994). Crane (1972) herself noted the existence of differences in culture between research areas, especially in fields "in which a high degree of consensus regarding theory and methodology exists" so that "the exchange of information among leaders may be [more] formalized" (p. 53). Recent commentators, however, contend that the information-seeking behaviors of scientists have evolved, with informal communications rising in importance as research becomes more interdisciplinary and more data intensive, and as colleagues become more widely distributed geographically (Luce and Di Giacomo 2003).

Increasingly, these informal communications take place via Web 2.0 technologies. Suber (2007) describes how blogs, podcasts, RSS feeds, and peer-to-peer networks have all found scholarly applications (p. 70). Jean-Claude Bradley, creator of the UsefulChem wiki and the ChemSpider blog, has argued for the importance of such tools to the advancement of a vital new "open notebook science" in which datasets and details of experiments are made freely available on the Internet (Poynder 2010).

The social element driving the move toward open science should not be overlooked. As Wagner (2008) notes in her insightful book on contemporary invisible colleges, "networks evolve based on the needs of network members and the incentives offered...to join" (p. 118). The Nature Publishing Group would not have offered a free, web-based bibliographic management tool such as Connotea if it had not detected a need from within the scientific community to "organize, share, and discover," and the University of Pennsylvania (to take just one example) would not have offered its faculty the citation bookmarking tool PennTags if it had not heard a similar demand from its own user community. Other networking sites for scientists abound, from Mendeley to LabRoots.
Whether institution-specific or public, the added value these tools have in common is the ability for users not just to store and manage citations, but also to see what resources colleagues are bookmarking and what tags colleagues are using -- linking to a vast social circle from any computer or device with an Internet connection.

Faculty of 1000: Peer Review 2.0

The database Faculty of 1000 (F1000) illustrates how the invisible college has migrated to an online environment, even while perpetuating certain trappings of formal academia. Launched in 2002 by the Science Navigation Group, F1000 collects and evaluates articles in 43 disciplines within medicine and biology. Each subject area has its own "faculty," experts in that discipline who have been nominated by their peers. Despite the database's name, at the time of this writing there were some 10,000 scientists and clinicians serving on this virtual faculty. An International Advisory Board guides overall policy and selects Heads of Faculty to oversee each subject area. These Heads of Faculty in turn divide their discipline into its major Sections and nominate Section Heads, who then find appropriate reviewers for articles within their area of expertise. Thus, while F1000 takes advantage of collaborative technologies to bring together a global network of experts, its administrative structure retains hierarchies familiar from traditional institutions or professional societies.

Operating a system of "post-publication peer review," the Faculty Members "evaluate and comment on the most interesting research articles they read each month from any source." In this task, individual Faculty Members also rely on Associate Faculty Members, whom they appoint to help review the literature in an effort to cover all relevant journals. F1000 currently selects about 1,500 articles per month, intended to represent "the top 2% of all published articles in biology and the medical sciences." Members not only nominate articles for inclusion in the database, but also assign classifications such as "Recommended," "Must Read," or "Exceptional," which then translate into numerical values within a proprietary system of bibliometrics, resulting in an F1000 Article Factor, or "FFa." The FFa is based on the highest rating from any Faculty Member, with extra points granted incrementally for each rating from other Members; the higher the FFa, the more highly recommended the article (a simplified, hypothetical sketch of this kind of scoring rule appears below). The FFa is displayed next to each article when users browse or search the database. Multiple evaluations may be posted for a single article -- so that users can compare colleagues' opinions -- and dissenting comments are also invited, as are responses from the author. Because all postings are signed, users can see exactly who recommended an article, who disagreed with that choice, and why.

Thus, F1000 blends new online capabilities with traditional scholarly standards. Schaffner (1994) points out how scientific journals from their early days relied on peer review to vet articles and prevent fraud. Even outside of the traditional scholarly communications model, open access publishers such as BioMed Central have been careful to retain this practice while moving toward a system of "open" peer review in which reviewer comments, responses from the author, and revised manuscripts are all accessible to the user.
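The exact weights behind the FFa are proprietary, but the rule described above -- take the highest rating an article receives and add an increment for each additional evaluation -- can be illustrated with a brief, purely hypothetical sketch in Python. The point values and the size of the increment below are invented for demonstration and are not F1000's actual figures.

    # Hypothetical illustration of a "highest rating plus increments" scoring rule.
    # The real F1000 Article Factor formula is proprietary; these values are invented.
    RATING_POINTS = {"Recommended": 6, "Must Read": 8, "Exceptional": 10}
    INCREMENT = 1  # assumed bonus for each additional Faculty Member evaluation

    def article_factor(ratings):
        """Score an article from the list of ratings it has received."""
        if not ratings:
            return 0
        base = max(RATING_POINTS[r] for r in ratings)   # highest single rating
        bonus = INCREMENT * (len(ratings) - 1)          # extra points per additional rating
        return base + bonus

    print(article_factor(["Exceptional"]))                             # 10
    print(article_factor(["Exceptional", "Recommended", "Must Read"])) # 12

Under these assumed values, an article rated "Exceptional" by one Faculty Member and also evaluated by two others scores higher than an article rated "Exceptional" alone, mirroring the incremental bonus described above.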
However, by continuing this process of open commentary even after a paper's publication, Faculty of 1000 has expanded these once informal communications into a powerful new venue for open science.

VIVO and the Science of Team Science

If social diffusion fuels the growth of scientific specialties, then the pace of this growth should accelerate with VIVO (not an acronym), a project first implemented at Cornell University in 2004 and at the time of this writing shared among seven partner institutions. VIVO's mission is to facilitate networking among researchers, but not merely as a "Facebook" for scientists. According to Medha Devare of the VIVO outreach team, "[t]he idea of VIVO was to transcend administrative divisions and create a single point of access for scholarly interaction" (Brooks, Case, Corson-Rikert, et al. 2010). VIVO enables this interaction through semantic web technologies. Institutions are free to participate by installing the open-source application that supports VIVO or by supplying semantic web-compliant data to the VIVO network. Researchers themselves do not create or even update their own profiles; VIVO automatically populates users' profiles with information on publications, teaching interests, research grants, and professional honors and awards, thus creating "a semantic cloud of information that can be searched and browsed." VIVO ingests this data from verified sources -- human resources databases, grants databases, PubMed or Scopus records, and so on. VIVO uses a controlled vocabulary to standardize and retrieve information, putting all this data at the fingertips of a wide range of users -- researchers, administrators, funding agencies, students looking for an academic program, or the general public.

Those who stand to benefit the most from VIVO, however, are scientists searching for the right members to form a team. In the face of increasingly complex social, environmental, and technological problems, long-term cross-disciplinary collaboration between groups of researchers is "increasing across virtually all fields of science and social science" (Falk-Krzesinski et al. 2010). Furthermore, evidence suggests that within the past decade, team-authored work has surpassed solo research in terms of producing exceptional, high-impact results, so that "teams now dominate the top of the citation distribution" not just in science and the social sciences but also in such domains as the arts, humanities, and patents (Wuchty, Jones, & Uzzi 2007). This rise in team-based research work has in turn spurred the emerging field of "the science of team science," aimed at "understanding and managing circumstances that facilitate or hinder the effectiveness of collaborative cross-disciplinary science" (Falk-Krzesinski et al. 2010). Thus, if more work is being done in teams, and if that work is of greater impact, then surely locating the right members for any team is more important than ever.

In her analysis of informal scientific collaboration, Wagner (2008) notes that a process of "preferential attachment -- who is most likely to help solve a specific problem or reach a particular goal -- shapes the growth of the new invisible college" (p. 118). According to Michael Conlon of the VIVO project, facilitating this type of attachment is exactly where a platform such as VIVO can be of the most value (Falk-Krzesinski et al. 2010). For each researcher profiled, VIVO creates a graphic, interactive Co-Author Network.
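What such a Co-Author Network contains can be sketched in a few lines of Python using the networkx library. The researcher names and collaboration counts below are hypothetical placeholders; VIVO itself derives these links automatically from verified publication data rather than from hand-entered values.

    import networkx as nx

    # Hypothetical co-author graph: nodes are researchers, edge weights stand in
    # for the number of papers a pair has written together. All data invented.
    G = nx.Graph()
    G.add_edge("Researcher A", "Researcher B", weight=4)
    G.add_edge("Researcher A", "Researcher C", weight=2)
    G.add_edge("Researcher B", "Researcher C", weight=1)
    G.add_edge("Researcher B", "Researcher D", weight=3)

    def shared_coauthors(graph, one, other):
        """Co-authors that two researchers have in common."""
        return (set(graph.neighbors(one)) & set(graph.neighbors(other))) - {one, other}

    # How many times have A and B collaborated, and whom do they both know?
    print(G["Researcher A"]["Researcher B"]["weight"])          # 4
    print(shared_coauthors(G, "Researcher A", "Researcher B"))  # {'Researcher C'}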
In this view, users can see not only whom that researcher has collaborated with -- and how many times -- but also any other co-authors the two have in common (Holmes 2010). Users can click on interactive nodes representing each co-author to view that researcher's details.

Nor is VIVO the only tool emerging to facilitate the practice of team science. The National Cancer Institute has developed an online "Team Science Toolkit," a repository of resources to support both the practice and study of team science; the University of California at Irvine offers researchers advice through its "Collaboration Success Wizard"; and a team at Northwestern University is working on online modules meant to introduce the concepts of team science to a wider audience (Falk-Krzesinski et al. 2010). If current trends hold, many more such efforts should follow.

Looking Ahead

2010 proved to be a formative year for team science and those who study it. The VIVO project held its first annual conference, on the theme of "Enabling National Networking of Scientists," in New York City this past summer. However, several months before that -- in April of 2010 -- Northwestern University's Clinical and Translational Sciences Institute helped produce the first annual international conference on the science of team science. This conference brought together "[m]ore than 200 team science leaders/practitioners" from such disciplines as translational research, communications, social and behavioral sciences, and management (Falk-Krzesinski et al. 2010).

This flourishing of cross-disciplinary discussion, coupled with the surge in social networking sites for scientists, illustrates the momentum now building for team science. However, as promising a platform as VIVO is, what will it take for it to become more than just a grants and publications database? Wagner (2008) reminds us that "researchers...collaborate not because they are told to but because they want to" and "work together not because they share a laboratory or even a discipline but because they can offer each other complementary insight, knowledge, or skills" (p. 2). Researchers can share many of these insights with colleagues in forums such as Faculty of 1000. Ultimately, however, social networking sites for scientists will have to link to "the most important thing that scientists share -- data" (Gewin 2010). Michael Conlon notes that while VIVO is still developing this functionality, linking datasets to users' profiles is what will allow researchers to fully see what potential collaborators might have to offer (Gewin 2010). When that day arrives, scientists will finally share an invisible college that is fully compatible with the era of open notebook science.

References

Brooks, E., Case, C., Corson-Rikert, J., et al. 2010. National VIVO network: Implementation plan. [Internet]. [Cited 2011 Mar 11]. Available from: http://www.vivoweb.org/files/ImplementationPlan_8_6.pdf

Crane, D. 1972. Invisible colleges: Diffusion of knowledge in scientific communities. Chicago: University of Chicago Press.

Falk-Krzesinski, H.J., Börner, K., Contractor, N., et al. 2010. Advancing the science of team science. Clinical and Translational Science 3: 263-266. [Internet]. [Cited 2011 Mar 11]. Available from: http://onlinelibrary.wiley.com/doi/10.1111/j.1752-8062.2010.00223.x/abstract

Gewin, V. 2010. Collaboration: Social networking seeks critical mass. Nature 468: 993-994.

Hagstrom, W.O. 1973. Book review: Invisible colleges: Diffusion of knowledge in scientific communities. Contemporary Sociology 2(4): 382-383.

Holmes, K. 2010. VIVO: Enabling national networking of scientists. Institute of Clinical and Translational Sciences News 3(3): 1-2.

London Royal Society. 2004. The Royal Society. [Internet]. [Cited 2011 Mar 11]. Available from: http://www-history.mcs.st-and.ac.uk/Societies/RS.html

Luce, R. & Di Giacomo, M. 2003. Personalized and collaborative digital library capabilities: Responding to the changing nature of scientific research. Science & Technology Libraries 24(1/2): 135-152.

Poynder, R. 2010. The impact of open notebook science: Interview with Jean-Claude Bradley. Information Today 27(8): 1, 50-51.

Schaffner, A.C. 1994. The future of scientific journals: Lessons from the past. Information Technology and Libraries 13(4): 239-248.

Suber, P. 2007. Trends favoring open access. CT Watch Quarterly 3(3): 67-74.

Wagner, C. 2008. The new invisible college: Science for development. Washington, D.C.: Brookings Institution.

Wuchty, S., Jones, B.F., & Uzzi, B. 2007. The increasing dominance of teams in production of knowledge. Science 316(18 May): 1036-1039.