"A Comparison of Citations, Downloads And Readership Data for an Information Systems Journal"

Christian Schlögl et al. have published "A Comparison of Citations, Downloads And Readership Data for an Information Systems Journal" in a special issue on altmetrics of Research Trends.

Here's an excerpt:

In our analysis we identified a high (though not a perfect) correlation between citations and downloads, which was slightly lower between downloads and readership frequencies and again between citations and readership counts. This is mainly due to the fact that the data (sources) used are related either to research or at least to teaching in higher education institutions.

Digital Scholarship | Digital Scholarship Publications Overview | Sitemap

"Teaching an Old University Press Publisher New Tricks: Living in the Present and Preparing for the Future of Scholarly Communications"

Patrick H. Alexander has published "Teaching an Old University Press Publisher New Tricks: Living in the Present and Preparing for the Future of Scholarly Communications" in the Journal of Electronic Publishing.

Here's an excerpt:

University presses currently exist in the dual worlds of print and digital publishing. Current staffing needs require that they hire personnel with skills and experience that mirror that present duality. Training and maintaining a skilled workforce requires a commitment to flexibility and an openness to the ever-changing nature of scholarly communication. As the scholarly publishing ecosystem continues to evolve, university presses will need to look to a future workforce that has additional training, knowledge, and experience beyond the traditional skills associated with academic publishing, one that fully embraces the realities of a digital world, the habits of new generations of researchers, and the increasing role of technology in scholarly communication. This article looks at what the future might look like, what skills might be required, and how one might prepare for that future.

"Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search Dead?"

Enrique Orduna-Malea et al. have self-archived "Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search Dead?"

Here's an excerpt:

The goal of this working paper is to summarize the main empirical evidence provided by the scientific community as regards the comparison between the two main citation-based academic search engines: Google Scholar and Microsoft Academic Search, paying special attention to the following issues: coverage, correlations between journal rankings, and usage of these academic search engines. Additionally, self-elaborated data are offered, which are intended to provide current evidence about the popularity of these tools on the Web, by measuring the number of rich files (PDF, PPT, and DOC) in which these tools are mentioned, the amount of external links that both products receive, and the search query frequency from Google Trends. The poor results obtained by MAS led us to an unexpected and unnoticed discovery: Microsoft Academic Search has been outdated since 2013.

"How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of ‘Alternative Metrics’ in Scientific Publications"

Zohreh Zahedi, Rodrigo Costas, and Paul Wouters have self-archived "How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of 'Alternative Metrics' in Scientific Publications."

Here's an excerpt:

In this paper an analysis of the presence and possibilities of altmetrics for bibliometric and performance analysis is carried out. Using the web-based tool ImpactStory, we collected metrics for 20,000 random publications from the Web of Science. We studied both the presence and distribution of altmetrics in the set of publications, across fields, document types and over publication years, as well as the extent to which altmetrics correlate with citation indicators. The main result of the study is that the altmetrics source that provides the most metrics is Mendeley, with metrics on readerships for 62.6% of all the publications studied; other sources provide only marginal information. In terms of relation with citations, a moderate Spearman correlation (r=0.49) has been found between Mendeley readership counts and citation indicators.
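
The Spearman correlation reported in the excerpt is simply a Pearson correlation computed on ranks. As a minimal illustrative sketch (all data values below are invented; the study's actual datasets are not reproduced here), it can be computed with the Python standard library alone:

```python
def rankdata(values):
    """Assign average 1-based ranks, splitting ties evenly."""
    sorted_pairs = sorted(enumerate(values), key=lambda p: p[1])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(sorted_pairs):
        # Find the run of tied values starting at position i.
        j = i
        while j + 1 < len(sorted_pairs) and sorted_pairs[j + 1][1] == sorted_pairs[i][1]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[sorted_pairs[k][0]] = avg_rank
        i = j + 1
    return ranks


def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)


# Hypothetical example: readership counts vs. citation counts per paper.
readers = [12, 0, 45, 3, 7]
citations = [5, 1, 20, 2, 4]
rho = spearman(readers, citations)
```

In practice `scipy.stats.spearmanr` does the same job with tie handling and a significance test built in; the hand-rolled version above is only meant to show what a value like r=0.49 is measuring.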

"Do Altmetrics Correlate with Citations? Extensive Comparison of Altmetric Indicators with Citations from a Multidisciplinary Perspective"

Rodrigo Costas, Zohreh Zahedi, and Paul Wouters have self-archived "Do Altmetrics Correlate with Citations? Extensive Comparison of Altmetric Indicators with Citations from a Multidisciplinary Perspective" in arXiv.org.

Here's an excerpt:

An extensive analysis of the presence of different altmetric indicators provided by Altmetric.com across scientific fields is presented, particularly focusing on their relationship with citations. Our results confirm that the presence and density of social media altmetric counts are still very low and not very frequent among scientific publications, with 15%-24% of the publications presenting some altmetric activity and concentrating in the most recent publications, although their presence is increasing over time. Publications from the social sciences, humanities and the medical and life sciences show the highest presence of altmetrics, indicating their potential value and interest for these fields. The analysis of the relationships between altmetrics and citations confirms previous claims of positive correlations but relatively weak, thus supporting the idea that altmetrics do not reflect the same concept of impact as citations. Also, altmetric counts do not always present a better filtering of highly cited publications than journal citation scores. Altmetrics scores (particularly mentions in blogs) are able to identify highly cited publications with higher levels of precision than journal citation scores (JCS), but they have a lower level of recall. The value of altmetrics as a complementary tool of citation analysis is highlighted, although more research is suggested to disentangle the potential meaning and value of altmetric indicators for research evaluation.
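
The precision/recall comparison the excerpt describes — using an altmetric signal such as blog mentions as a filter for highly cited publications — boils down to standard set arithmetic. A minimal sketch, with invented paper IDs purely for illustration:

```python
def precision_recall(flagged, highly_cited):
    """Precision and recall of a filter (e.g. papers with blog mentions)
    against a target set (e.g. the top-cited papers in a field)."""
    flagged, highly_cited = set(flagged), set(highly_cited)
    true_pos = flagged & highly_cited
    precision = len(true_pos) / len(flagged) if flagged else 0.0
    recall = len(true_pos) / len(highly_cited) if highly_cited else 0.0
    return precision, recall


# Hypothetical data: papers 2 and 7 were mentioned in blogs;
# papers 2, 3, 4, and 7 are the highly cited ones.
p, r = precision_recall(flagged={2, 7}, highly_cited={2, 3, 4, 7})
```

High precision with low recall, as the study reports for blog mentions, means the flagged papers are usually highly cited, but most highly cited papers are never flagged.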

Keeping Up With… Altmetrics

ACRL has released Keeping Up With… Altmetrics.

Here's an excerpt:

Into this setting, enter altmetrics. Altmetrics is an emerging category of impact measurement premised upon the value of "alternative metrics," or metrics based distinctly on the opportunities offered by the 21st century digital environment. Originally defined in contrast to the more established field of bibliometrics, altmetrics is fast becoming a fluid area of research and practice, in which various alternative and traditional measures of personal and scholarly impact can be explored and compared simultaneously.

In this Keeping Up With… edition, we look at key points in the rapid development of altmetrics, from its 2010 origins to its more recent relevance to librarians and administrators.

If you are interested in altmetrics, also check out the Altmetrics Bibliography.

Altmetrics in Context

The Canadian Association of Research Libraries (CARL) has released Altmetrics in Context.

Here's an excerpt from the announcement:

As scholarly communication takes on new forms and moves increasingly to digital and open access venues, the value of new types of metrics is increasingly important for the research community. It is causing discussion and, in some camps, heated debate.

Altmetrics report the impact of a wide range of research outputs, including data sets, articles and code. This document, available on the CARL Website, provides a quick introduction to this new field of research impact assessment and encourages researchers to use altmetrics in their work.

Want more information on altmetrics? Try the Altmetrics Bibliography.

"Identifying the Effect of Open Access on Citations Using a Panel of Science Journals"

Mark J. McCabe and Christopher M. Snyder have self-archived "Identifying the Effect of Open Access on Citations Using a Panel of Science Journals" in SSRN.

Here's an excerpt:

An open-access journal allows free online access to its articles, obtaining revenue from fees charged to submitting authors or from institutional support. Using panel data on science journals, we are able to circumvent problems plaguing previous studies of the impact of open access on citations. In contrast to the huge effects found in these previous studies, we find a more modest effect: moving from paid to open access increases cites by 8% on average in our sample. The benefit is concentrated among top-ranked journals. In fact, open access causes a statistically significant reduction in cites to the bottom-ranked journals in our sample, leading us to conjecture that open access may intensify competition among articles for readers' attention, generating losers as well as winners.

Altmetrics Bibliography

Digital Scholarship has released the Altmetrics Bibliography, which includes over 50 selected English-language articles and technical reports that are useful in understanding altmetrics.

The "altmetrics" concept is still evolving. In "The Altmetrics Collection," Jason Priem, Paul Groth, and Dario Taraborelli define altmetrics as follows:

Altmetrics is the study and use of scholarly impact measures based on activity in online tools and environments. The term has also been used to describe the metrics themselves—one could propose in plural a "set of new altmetrics." Altmetrics is in most cases a subset of both scientometrics and webometrics; it is a subset of the latter in that it focuses more narrowly on scholarly influence as measured in online tools and environments, rather than on the Web more generally.

Sources have been published from January 2001 through September 2013.

The bibliography includes links to freely available versions of included works. If such versions are unavailable, italicized links to the publishers' descriptions are provided.

It is available under a Creative Commons Attribution-Noncommercial 3.0 United States License.

"A Look at Altmetrics and Its Growing Significance to Research Libraries"

Emily Puckett Rodgers and Sarah Barbrow have self-archived "A Look at Altmetrics and Its Growing Significance to Research Libraries" in Deep Blue.

Here's an excerpt:

This document serves as an informational review of the emerging field and practices of alternative metrics or altmetrics. It is intended to be used by librarians and faculty members in research libraries and universities to better understand the trends and challenges associated with altmetrics in higher education. It is also intended to be used by research libraries to offer guidance on how to participate in shaping this emerging field.

"Open Access and the Changing Landscape of Research Impact Indicators: New Roles for Repositories"

Isabel Bernal has published "Open Access and the Changing Landscape of Research Impact Indicators: New Roles for Repositories" in Publications.

Here's an excerpt:

The debate about the need to revise metrics that evaluate research excellence has been ongoing for years, and a number of studies have identified important issues that have yet to be addressed. Internet and other technological developments have enabled the collection of richer data and new approaches to research assessment exercises. Open access strongly advocates for maximizing research impact by enhancing seamless accessibility. In addition, new tools and strategies have been used by open access journals and repositories to showcase how science can benefit from free online dissemination. Latest players in the debate include initiatives based on alt-metrics, which enrich the landscape with promising indicators. To start with, the article gives a brief overview of the debate and the role of open access in advancing a new frame to assess science. Next, the work focuses on the strategy that the Spanish National Research Council's repository DIGITAL.CSIC is implementing to collect a rich set of statistics and other metrics that are useful for repository administrators, researchers and the institution alike. A preliminary analysis of data hints at correlations between free dissemination of research through DIGITAL.CSIC and enhanced impact, reusability and sharing of CSIC science on the web.

"Correlation between Article Download and Citation Figures for Highly Accessed Articles from Five Open Access Oncology Journals"

Carsten Nieder, Astrid Dalhaug, and Gro Aandahl have published "Correlation between Article Download and Citation Figures for Highly Accessed Articles from Five Open Access Oncology Journals" in SpringerPlus.

Here's an excerpt:

Different approaches can be chosen to quantify the impact and merits of scientific oncology publications. These include source of publication (including journal reputation and impact factor), whether or not articles are cited by others, and access/download figures. When relying on citation counts, one needs to obtain access to citation databases and has to consider that results differ from one database to another. Accumulation of citations takes time and their dynamics might differ from journal to journal and topic to topic. Therefore, we wanted to evaluate the correlation between citation and download figures, hypothesising that articles with fewer downloads also accumulate fewer citations. Typically, publishers provide download figures together with the article. We extracted and analysed the 50 most viewed articles from 5 different open access oncology journals. For each of the 5 journals and also all journals combined, correlation between number of accesses and citations was limited (r=0.01-0.30). Considerable variations were also observed when analyses were restricted to specific article types such as reviews only (r=0.21) or case reports only (r=0.53). Even if year of publication was taken into account, high correlation coefficients were the exception from the rule. In conclusion, downloads are not a universal surrogate for citation figures.

"Riding the Crest of the Altmetrics Wave: How Librarians Can Help Prepare Faculty for the Next Generation of Research Impact Metrics"

Scott Lapinski, Heather Piwowar, and Jason Priem have published "Riding the Crest of the Altmetrics Wave: How Librarians Can Help Prepare Faculty for the Next Generation of Research Impact Metrics" in the latest issue of College & Research Libraries News.

Here's an excerpt:

University faculty, administration, librarians, and publishers alike are beginning to discuss how and where altmetrics can be useful towards evaluating a researcher's academic contribution. As interest grows, libraries are in a unique position to help facilitate an informed dialogue with the various constituencies that will intersect with altmetrics on campus, including both researchers (students and faculty) and the academic administrative office (faculty affairs, research and grants, promotion and tenure committees, and so on).

"The Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on Cites to Science Journals Across the Quality Spectrum"

Mark J. McCabe and Christopher M. Snyder have self-archived "The Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on Cites to Science Journals Across the Quality Spectrum" in SSRN.

Here's an excerpt:

An open-access journal allows free online access to its articles, obtaining revenue from fees charged to submitting authors. Using panel data on science journals, we are able to circumvent some problems plaguing previous studies of the impact of open access on citations. We find that moving from paid to open access increases cites by 8% on average in our sample, but the effect varies across the quality of content. Open access increases cites to the best content (top-ranked journals or articles in upper quintiles of citations within a volume) but reduces cites to lower-quality content. We construct a model to explain these findings in which being placed on a broad open-access platform can increase the competition among articles for readers' attention. We can find structural parameters allowing the model to fit the quintile results quite closely.

Article-Level Metrics—A SPARC Primer

SPARC has released Article-Level Metrics—A SPARC Primer.

Here's an excerpt:

Article-Level Metrics (ALMs) are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. ALMs can be employed in conjunction with existing metrics, which have traditionally focused on the long-term impact of a collection of articles (i.e., a journal) based on the number of citations generated. This primer is designed to give campus leaders and other interested parties an overview of what ALMs are, why they matter, how they complement established utilities, and how they can be used in the tenure and promotion process.

"New Opportunities for Repositories in the Age of Altmetrics"

Stacy Konkiel and Dave Scherer have published "New Opportunities for Repositories in the Age of Altmetrics" in the latest issue of the Bulletin of the Association for Information Science and Technology.

Here's an excerpt:

By reporting altmetrics (alternative metrics based on online activity) for their content, institutional repositories can add value to existing metrics—and prove their relevance and importance in an age of growing cutbacks to library services. This article will discuss the metrics that repositories currently deliver and how altmetrics can supplement existing usage statistics to provide a broader interpretation of research-output impact for the benefit of authors, library-based publishers and repository managers, and university administrators alike.

"F1000 Recommendations as a New Data Source for Research Evaluation: A Comparison with Citations"

Ludo Waltman and Rodrigo Costas have self-archived "F1000 Recommendations as a New Data Source for Research Evaluation: A Comparison with Citations" in arXiv.org.

Here's an excerpt:

F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective F1000 could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications on average receive 1.30 recommendations, and over 90% of the recommendations are given within half a year after a publication has appeared. There turns out to be a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak, at least weaker than the correlation between journal impact and citations.

"Manipulating Google Scholar Citations and Google Scholar Metrics: Simple, Easy and Tempting"

Emilio Delgado López-Cózar, Nicolás Robinson-García, and Daniel Torres-Salinas have self-archived "Manipulating Google Scholar Citations and Google Scholar Metrics: Simple, Easy and Tempting" in arXiv.org.

Here's an excerpt:

The launch of Google Scholar Citations and Google Scholar Metrics may provoke a revolution in the research evaluation field as it places within every researcher's reach tools that allow bibliometric measuring. In order to alert the research community over how easily one can manipulate the data and bibliometric indicators offered by Google's products, we present an experiment in which we manipulate the Google Scholar Citations profiles of a research group through the creation of false documents that cite their documents, and consequently, the journals in which they have published, modifying their h-index. . . . We analyse the malicious effect this type of practice can cause to Google Scholar Citations and Google Scholar Metrics. Finally, we conclude with several deliberations over the effects these malpractices may have and the lack of control these tools offer.

"On the Impact of Gold Open Access Journals"

Christian Gumpenberger, María-Antonia Ovalle-Perandones, and Juan Gorraiz have self-archived "On the Impact of Gold Open Access Journals" in U: Scholar.

Here's an excerpt:

This study identified the current set of Gold Open Access journals featuring a Journal Impact Factor (JIF) by means of Ulrichsweb, Directory of Open Access Journals and Journal Citation Reports (JCR). The results were analyzed regarding disciplines, countries, quartiles of the JIF distribution in JCR and publishers. Furthermore the temporal impact evolution was studied for a Top 50 titles list (according to JIF) by means of Journal Impact Factor, SJR and SNIP in the time interval 2000-2010. The identified top Gold Open Access journals proved to be well-established and their impact is generally increasing for all the analyzed indicators. The majority of JCR-indexed OA journals can be assigned to Life Sciences and Medicine. The success-rate for JCR inclusion differs from country to country and is often inversely proportional to the number of national OA journal titles.

Thomson Reuters Launches Data Citation Index

Thomson Reuters has launched the Data Citation Index within the Web of Knowledge.

Here's an excerpt from the press release:

This new research resource from Thomson Reuters creates a single source of discovery for scientific, social sciences and arts and humanities information. It provides a single access point to discover foundational research within data repositories around the world in the broader context of peer-reviewed literature in journals, books, and conference proceedings already indexed in the Web of Knowledge. . . .

The Thomson Reuters Data Citation Index makes research within the digital universe discoverable, citable and viewable within the context of the output the data has informed. Thomson Reuters partnered with numerous data repositories worldwide to capture bibliographic records and cited references for digital research, facilitating visibility, author attribution, and ultimately the measurement of impact of this growing body of scholarship.

"Open Access versus Subscription Journals: A Comparison of Scientific Impact"

Bo-Christer Björk and David Solomon have published "Open Access versus Subscription Journals: A Comparison of Scientific Impact" in BMC Medicine.

Here's an excerpt:

Overall, average citation rates, both unweighted and weighted for the number of articles per journal, were about 30% higher for subscription journals. However, after controlling for discipline (medicine and health versus other), age of the journal (three time periods) and the location of the publisher (four largest publishing countries versus other countries) the differences largely disappeared in most subcategories except for journals that had been launched prior to 1996. OA journals that fund publishing with article processing charges (APCs) are on average cited more than other OA journals. In medicine and health, OA journals founded in the last 10 years are receiving about as many citations as subscription journals launched during the same period.

"Open Metrics for Open Repositories"

Brian Kelly, Nick Sheppard, Jenny Delasalle, Mark Dewey, Owen Stephens, Gareth J Johnson, and Stephanie Taylor have self-archived "Open Metrics for Open Repositories" in University of Bath Research.

Here's an excerpt:

Increasingly there is a need for quantitative evidence in order to help demonstrate the value of online services. Such evidence can also help to detect emerging patterns of usage and identify associated operational best practice. This paper seeks to initiate a discussion on approaches to metrics for institutional repositories by providing a high-level overview of the benefits of metrics for a variety of stakeholders. The paper outlines the potential benefits which can be gained from providing richer statistics related to the use of institutional repositories and also reviews related work in this area. The authors describe a JISC-funded project which harvested a large number of repositories in order to identify patterns of use of metadata attributes and summarise the key findings. The paper provides a case study which reviews plans to provide a richer set of statistics within one institutional repository as well as requirements from the researcher community. An example of how third-party aggregation services may provide metrics on behalf of the repository community is given. The authors conclude with a call for repository managers, developers and policy makers to be pro-active in providing open access to metrics for open repositories.

"Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact"

Jason Priem, Heather A. Piwowar, and Bradley M. Hemminger have self-archived "Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact" in arXiv.org.

Here's an excerpt:

In growing numbers, scholars are integrating social media tools like blogs, Twitter, and Mendeley into their professional communications. The online, public nature of these tools exposes and reifies scholarly processes once hidden and ephemeral. Metrics based on this activity could inform broader, faster measures of impact, complementing traditional citation metrics. This study explores the properties of these social media-based metrics or "altmetrics," sampling 24,331 articles published by the Public Library of Science. We find that different indicators vary greatly in activity. Around 5% of sampled articles are cited in Wikipedia, while close to 80% have been included in at least one Mendeley library. There is, however, an encouraging diversity; a quarter of articles have nonzero data from five or more different sources. Correlation and factor analysis suggest citation and altmetrics indicators track related but distinct impacts, with neither able to describe the complete picture of scholarly use alone. There are moderate correlations between Mendeley and Web of Science citation, but many altmetric indicators seem to measure impact mostly orthogonal to citation. Articles cluster in ways that suggest five different impact "flavors," capturing impacts of different types on different audiences; for instance, some articles may be heavily read and saved by scholars but seldom cited. Together, these findings encourage more research into altmetrics as complements to traditional citation measures.

"How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations"

Xin Shuai, Alberto Pepe, and Johan Bollen have self-archived "How the Scientific Community Reacts to Newly Submitted Preprints: Article Downloads, Twitter Mentions, and Citations" in arXiv.org.

Here's an excerpt:

We analyze the online response of the scientific community to the preprint publication of scholarly articles. We employ a cohort of 4,606 scientific articles submitted to the preprint database arXiv.org between October 2010 and April 2011. We study three forms of reactions to these preprints: how they are downloaded on the arXiv.org site, how they are mentioned on the social media site Twitter, and how they are cited in the scholarly record. We perform two analyses. First, we analyze the delay and time span of article downloads and Twitter mentions following submission, to understand the temporal configuration of these reactions and whether significant differences exist between them. Second, we run correlation tests to investigate the relationship between Twitter mentions and both article downloads and article citations. We find that Twitter mentions follow rapidly after article submission and that they are correlated with later article downloads and later article citations, indicating that social media may be an important factor in determining the scientific impact of an article.

"Evaluating Repository Annual Metrics for SCONUL"

Gareth James Johnson has self-archived "Evaluating Repository Annual Metrics for SCONUL" in the Leicester Research Archive.

Here's an excerpt:

This report is a summarisation of the responses to a recent survey of the UKCoRR membership concerning the use of full-text downloads as a repository performance metric within the SCONUL annual statistical survey. It hopes to present a representative snapshot of the current opinions in this area from repository managers.
