"Does Google Scholar Contain All Highly Cited Documents (1950-2013)?"

Alberto Martín-Martín et al. have self-archived "Does Google Scholar Contain All Highly Cited Documents (1950-2013)?"

Here's an excerpt:

The study of highly cited documents on Google Scholar (GS) has never been addressed in a comprehensive manner. The objective of this work is to identify the set of highly cited documents in Google Scholar and define their core characteristics: their languages, their file formats, and how many of them can be accessed free of charge. We will also try to answer some additional questions that will hopefully shed some light on the use of GS as a tool for assessing scientific impact through citations. The decalogue of research questions is shown below:

1. What are the most cited documents in GS?
2. What are the most cited document types in GS?
3. In which languages are the most cited documents in GS written?
4. How many highly cited documents are freely accessible?
4.1 What file types are most commonly used to store these highly cited documents?
4.2 Which are the main providers of these documents?
5. How many of the highly cited documents indexed by GS are also indexed by WoS?
6. Is there a correlation between the number of citations that these highly cited documents have received in GS and the number of citations they have received in WoS?
7. How many versions of these highly cited documents has GS detected?
8. Is there a correlation between the number of versions GS has detected for these documents, and the number of citations they have received?
9. Is there a correlation between the number of versions GS has detected for these documents, and their position in the search engine result pages?
10. Is there some relation between the positions these documents occupy in the search engine result pages, and the number of citations they have received?

Digital Scholarship | "A Quarter-Century as an Open Access Publisher"

"Tweets as Impact Indicators: Examining the Implications of Automated Bot Accounts on Twitter"

Stefanie Haustein et al. have self-archived "Tweets as Impact Indicators: Examining the Implications of Automated Bot Accounts on Twitter."

Here's an excerpt:

This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific papers deposited on the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. We present preliminary findings that automated Twitter accounts create a considerable number of tweets linking to scientific papers and that they behave differently than common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose differentiating between different levels of engagement, from tweeting only bibliographic information to discussing or commenting on the content of a paper.

"The Imperative for Open Altmetrics"

Stacy Konkiel, Heather Piwowar, and Jason Priem have published "The Imperative for Open Altmetrics" in The Journal of Electronic Publishing.

Here's an excerpt:

If scholarly communication is broken, how will we fix it? At Impactstory—a non-profit devoted to helping scholars gather and share evidence of their research impact by tracking online usage of scholarship via blogs, Wikipedia, Mendeley, and more—we believe that incentivizing web-native research via altmetrics is the place to start. In this article, we describe the current state of the art in altmetrics and its effects on publishing, we share Impactstory's plan to build an open infrastructure for altmetrics, and describe our company's ethos and actions.

"Which Kind of Papers Has Higher or Lower Altmetric Counts? A Study Using Article-Level Metrics from PLOS and F1000Prime"

Lutz Bornmann has self-archived "Which Kind of Papers Has Higher or Lower Altmetric Counts? A Study Using Article-Level Metrics from PLOS and F1000Prime."

Here's an excerpt:

The present study investigates the usefulness of altmetrics for measuring the broader impact of research. Methods: This study is essentially based on a dataset of papers obtained from F1000. This dataset was augmented with altmetrics (such as Twitter counts), which were downloaded from the homepage of PLOS (the Public Library of Science). This study covers a total of 1,082 papers. Findings: The results from regression models indicate that Facebook and Twitter, but not Figshare or Mendeley, can provide indications of papers which are of interest to a broader circle of readers (and not only to the peers in a specialist area), and therefore seem to be useful for societal impact measurement.

"Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics"

Lutz Bornmann has self-archived "Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics."

Here's an excerpt:

Today, it is not clear how the impact of research on areas of society other than science should be measured. While peer review and bibliometrics have become standard methods for measuring the impact of research in science, there is not yet an accepted framework within which to measure societal impact. Alternative metrics (called altmetrics to distinguish them from bibliometrics) are considered an interesting option for assessing the societal impact of research, as they offer new ways to measure (public) engagement with research output. Altmetrics is a term to describe web-based metrics for the impact of publications and other scholarly material by using data from social media platforms (e.g. Twitter or Mendeley). This overview of studies explores the potential of altmetrics for measuring societal impact. It deals with the definition and classification of altmetrics. Furthermore, their benefits and disadvantages for measuring impact are discussed.

"A Review of the Characteristics of 108 Author-Level Bibliometric Indicators"

Lorna Wildgaard, Jesper W. Schneider, and Birger Larsen have self-archived "A Review of the Characteristics of 108 Author-Level Bibliometric Indicators."

Here's an excerpt:

An increasing demand for bibliometric assessment of individuals has led to a growth of new bibliometric indicators as well as new variants or combinations of established ones. The aim of this review is to contribute with objective facts about the usefulness of bibliometric indicators of the effects of publication activity at the individual level. This paper reviews 108 indicators that can potentially be used to measure performance on the individual author level, and examines the complexity of their calculations in relation to what they are supposed to reflect and ease of end-user application.

"Measuring the Broader Impact of Research: The Potential of Altmetrics"

Lutz Bornmann has self-archived "Measuring the Broader Impact of Research: The Potential of Altmetrics."

Here's an excerpt:

Today, it is not clear how the impact of research on areas of society other than science should be measured. While peer review and bibliometrics have become standard methods for measuring the impact of research in science, there is not yet an accepted framework within which to measure societal impact. Alternative metrics (called altmetrics to distinguish them from bibliometrics) are considered an interesting option for assessing the societal impact of research, as they offer new ways to measure (public) engagement with research output. Altmetrics is a term to describe web-based metrics for the impact of publications and other scholarly material by using data from social media platforms (e.g. Twitter or Mendeley). This overview of studies explores the potential of altmetrics for measuring societal impact. It deals with the definition and classification of altmetrics. Furthermore, their benefits and disadvantages for measuring impact are discussed.

Digital Scholarship | Digital Scholarship Publications Overview | Sitemap

"The Multidimensional Assessment of Scholarly Research Impact"

Henk F. Moed and Gali Halevi have self-archived "The Multidimensional Assessment of Scholarly Research Impact."

Here's an excerpt:

This article introduces the Multidimensional Research Assessment Matrix of scientific output. Its base notion holds that the choice of metrics to be applied in a research assessment process depends upon the unit of assessment, the research dimension to be assessed, and the purposes and policy context of the assessment. An indicator may be highly useful within one assessment process, but less so in another. For instance, publication counts are useful tools to help discriminate between those staff members who are research active and those who are not, but are of little value if active scientists are to be compared with one another according to their research performance. This paper gives a systematic account of the potential usefulness and limitations of a set of 10 important metrics including altmetrics, applied at the level of individual articles, individual researchers, research groups and institutions. It presents a typology of research impact dimensions, and indicates which metrics are the most appropriate to measure each dimension. It introduces the concept of a meta-analysis of the units under assessment in which metrics are not used as tools to evaluate individual units, but to reach policy inferences regarding the objectives and general setup of an assessment process.

"A Comparison of Citations, Downloads And Readership Data for an Information Systems Journal"

Christian Schlögl et al. have published "A Comparison of Citations, Downloads And Readership Data for an Information Systems Journal" in a special issue on altmetrics of Research Trends.

Here's an excerpt:

In our analysis we identified a high (though not perfect) correlation between citations and downloads; the correlation was slightly lower between downloads and readership frequencies, and lower again between citations and readership counts. This is mainly due to the fact that the data (sources) used are related either to research or at least to teaching in higher education institutions.

"Teaching an Old University Press Publisher New Tricks: Living in the Present and Preparing for the Future of Scholarly Communications"

Patrick H. Alexander has published "Teaching an Old University Press Publisher New Tricks: Living in the Present and Preparing for the Future of Scholarly Communications" in the Journal of Electronic Publishing.

Here's an excerpt:

University presses currently exist in the dual worlds of print and digital publishing. Current staffing needs require that they hire personnel with skills and experience that mirror that present duality. Training and maintaining a skilled workforce requires a commitment to flexibility and an openness to the ever-changing nature of scholarly communication. As the scholarly publishing ecosystem continues to evolve, university presses will need to look to a future workforce that has additional training, knowledge, and experience beyond the traditional skills associated with academic publishing, one that fully embraces the realities of a digital world, the habits of new generations of researchers, and the increasing role of technology in scholarly communication. This article looks at what the future might look like, what skills might be required, and how one might prepare for that future.

"Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search Dead?"

Enrique Orduna-Malea et al. have self-archived "Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search Dead?"

Here's an excerpt:

The goal of this working paper is to summarize the main empirical evidence provided by the scientific community regarding the comparison of the two main citation-based academic search engines: Google Scholar and Microsoft Academic Search, paying special attention to the following issues: coverage, correlations between journal rankings, and usage of these academic search engines. Additionally, self-elaborated data are offered, which are intended to provide current evidence about the popularity of these tools on the Web, by measuring the number of rich files (PDF, PPT, and DOC) in which these tools are mentioned, the amount of external links that both products receive, and the search query frequency from Google Trends. The poor results obtained by MAS led us to an unexpected and unnoticed discovery: Microsoft Academic Search has not been updated since 2013.

"How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of ‘Alternative Metrics’ in Scientific Publications"

Zohreh Zahedi, Rodrigo Costas, and Paul Wouters have self-archived "How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of 'Alternative Metrics' in Scientific Publications."

Here's an excerpt:

In this paper an analysis of the presence and possibilities of altmetrics for bibliometric and performance analysis is carried out. Using the web-based tool Impact Story, we collected metrics for 20,000 random publications from the Web of Science. We studied both the presence and distribution of altmetrics in the set of publications, across fields, document types, and over publication years, as well as the extent to which altmetrics correlate with citation indicators. The main result of the study is that the altmetrics source that provides the most metrics is Mendeley, with readership metrics for 62.6% of all the publications studied; other sources provide only marginal information. In terms of the relation with citations, a moderate Spearman correlation (r=0.49) has been found between Mendeley readership counts and citation indicators.
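A Spearman correlation like the r=0.49 reported above is simply a Pearson correlation computed on the ranks of the values rather than the values themselves. The sketch below is a minimal, self-contained illustration of that calculation; the readership and citation counts are made-up values for illustration, not data from the study.

```python
def ranks(values):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        # Find the run of tied values starting at position i.
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical readership and citation counts for five papers
# (illustrative values only, not from the study's dataset).
readers = [12, 5, 40, 0, 22]
citations = [8, 30, 55, 1, 14]
print(round(spearman(readers, citations), 2))  # → 0.7
```

A positive but imperfect value like this is what "moderate correlation" means in practice: papers with more readers tend to have more citations, but the two rankings do not coincide.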

"Do Altmetrics Correlate with Citations? Extensive Comparison of Altmetric Indicators with Citations from a Multidisciplinary Perspective"

Rodrigo Costas, Zohreh Zahedi, and Paul Wouters have self-archived "Do Altmetrics Correlate with Citations? Extensive Comparison of Altmetric Indicators with Citations from a Multidisciplinary Perspective" on arXiv.org.

Here's an excerpt:

An extensive analysis of the presence of different altmetric indicators provided by Altmetric.com across scientific fields is presented, particularly focusing on their relationship with citations. Our results confirm that the presence and density of social media altmetric counts are still very low and not very frequent among scientific publications, with 15%-24% of the publications presenting some altmetric activity, concentrated among the most recent publications, although their presence is increasing over time. Publications from the social sciences, humanities, and the medical and life sciences show the highest presence of altmetrics, indicating their potential value and interest for these fields. The analysis of the relationships between altmetrics and citations confirms previous claims of positive but relatively weak correlations, thus supporting the idea that altmetrics do not reflect the same concept of impact as citations. Also, altmetric counts do not always present a better filtering of highly cited publications than journal citation scores. Altmetric scores (particularly mentions in blogs) are able to identify highly cited publications with higher levels of precision than journal citation scores (JCS), but they have a lower level of recall. The value of altmetrics as a complementary tool for citation analysis is highlighted, although more research is suggested to disentangle the potential meaning and value of altmetric indicators for research evaluation.

Keeping Up With… Altmetrics

ACRL has released Keeping Up With… Altmetrics.

Here's an excerpt:

Into this setting, enter altmetrics. Altmetrics is an emerging category of impact measurement premised upon the value of "alternative metrics," or metrics based distinctly on the opportunities offered by the 21st century digital environment. Originally defined in contrast to the more established field of bibliometrics, altmetrics is fast becoming a fluid area of research and practice, in which various alternative and traditional measures of personal and scholarly impact can be explored and compared simultaneously.

In this Keeping Up With… edition, we look at key points in the rapid development of altmetrics, from its 2010 origins to its more recent relevance to librarians and administrators.

If you are interested in altmetrics, also check out the Altmetrics Bibliography.

Altmetrics in Context

The Canadian Association of Research Libraries (CARL) has released Altmetrics in Context.

Here's an excerpt from the announcement:

As scholarly communication takes on new forms and moves increasingly to digital and open access venues, the value of new types of metrics is increasingly important for the research community. It is causing discussion and, in some camps, heated debate.

Altmetrics report the impact of a wide range of research outputs, including data sets, articles and code. This document, available on the CARL Website, provides a quick introduction to this new field of research impact assessment and encourages researchers to use altmetrics in their work.

Want more information on altmetrics? Try the Altmetrics Bibliography.

"Identifying the Effect of Open Access on Citations Using a Panel of Science Journals"

Mark J. McCabe and Christopher M. Snyder have self-archived "Identifying the Effect of Open Access on Citations Using a Panel of Science Journals" in SSRN.

Here's an excerpt:

An open-access journal allows free online access to its articles, obtaining revenue from fees charged to submitting authors or from institutional support. Using panel data on science journals, we are able to circumvent problems plaguing previous studies of the impact of open access on citations. In contrast to the huge effects found in these previous studies, we find a more modest effect: moving from paid to open access increases cites by 8% on average in our sample. The benefit is concentrated among top-ranked journals. In fact, open access causes a statistically significant reduction in cites to the bottom-ranked journals in our sample, leading us to conjecture that open access may intensify competition among articles for readers' attention, generating losers as well as winners.

Altmetrics Bibliography

Digital Scholarship has released the Altmetrics Bibliography, which includes over 50 selected English-language articles and technical reports that are useful in understanding altmetrics.

The "altmetrics" concept is still evolving. In "The Altmetrics Collection," Jason Priem, Paul Groth, and Dario Taraborelli define altmetrics as follows:

Altmetrics is the study and use of scholarly impact measures based on activity in online tools and environments. The term has also been used to describe the metrics themselves—one could propose in plural a "set of new altmetrics." Altmetrics is in most cases a subset of both scientometrics and webometrics; it is a subset of the latter in that it focuses more narrowly on scholarly influence as measured in online tools and environments, rather than on the Web more generally.

Sources have been published from January 2001 through September 2013.

The bibliography includes links to freely available versions of included works. If such versions are unavailable, italicized links to the publishers' descriptions are provided.

It is available under a Creative Commons Attribution-Noncommercial 3.0 United States License.

"A Look at Altmetrics and Its Growing Significance to Research Libraries"

Emily Puckett Rodgers and Sarah Barbrow have self-archived "A Look at Altmetrics and Its Growing Significance to Research Libraries" in Deep Blue.

Here's an excerpt:

This document serves as an informational review of the emerging field and practices of alternative metrics or altmetrics. It is intended to be used by librarians and faculty members in research libraries and universities to better understand the trends and challenges associated with altmetrics in higher education. It is also intended to be used by research libraries to offer guidance on how to participate in shaping this emerging field.

"Open Access and the Changing Landscape of Research Impact Indicators: New Roles for Repositories"

Isabel Bernal has published "Open Access and the Changing Landscape of Research Impact Indicators: New Roles for Repositories" in Publications.

Here's an excerpt:

The debate about the need to revise metrics that evaluate research excellence has been ongoing for years, and a number of studies have identified important issues that have yet to be addressed. Internet and other technological developments have enabled the collection of richer data and new approaches to research assessment exercises. Open access strongly advocates for maximizing research impact by enhancing seamless accessibility. In addition, new tools and strategies have been used by open access journals and repositories to showcase how science can benefit from free online dissemination. Latest players in the debate include initiatives based on alt-metrics, which enrich the landscape with promising indicators. To start with, the article gives a brief overview of the debate and the role of open access in advancing a new frame to assess science. Next, the work focuses on the strategy that the Spanish National Research Council's repository DIGITAL.CSIC is implementing to collect a rich set of statistics and other metrics that are useful for repository administrators, researchers and the institution alike. A preliminary analysis of data hints at correlations between free dissemination of research through DIGITAL.CSIC and enhanced impact, reusability and sharing of CSIC science on the web.

"Correlation between Article Download and Citation Figures for Highly Accessed Articles from Five Open Access Oncology Journals"

Carsten Nieder, Astrid Dalhaug, and Gro Aandahl have published "Correlation between Article Download and Citation Figures for Highly Accessed Articles from Five Open Access Oncology Journals" in SpringerPlus.

Here's an excerpt:

Different approaches can be chosen to quantify the impact and merits of scientific oncology publications. These include source of publication (including journal reputation and impact factor), whether or not articles are cited by others, and access/download figures. When relying on citation counts, one needs to obtain access to citation databases and has to consider that results differ from one database to another. Accumulation of citations takes time and their dynamics might differ from journal to journal and topic to topic. Therefore, we wanted to evaluate the correlation between citation and download figures, hypothesising that articles with fewer downloads also accumulate fewer citations. Typically, publishers provide download figures together with the article. We extracted and analysed the 50 most viewed articles from 5 different open access oncology journals. For each of the 5 journals and also all journals combined, correlation between the number of accesses and citations was limited (r=0.01-0.30). Considerable variations were also observed when analyses were restricted to specific article types such as reviews only (r=0.21) or case reports only (r=0.53). Even if year of publication was taken into account, high correlation coefficients were the exception rather than the rule. In conclusion, downloads are not a universal surrogate for citation figures.

"Riding the Crest of the Altmetrics Wave: How Librarians Can Help Prepare Faculty for the Next Generation of Research Impact Metrics"

Scott Lapinski, Heather Piwowar, and Jason Priem have published "Riding the Crest of the Altmetrics Wave: How Librarians Can Help Prepare Faculty for the Next Generation of Research Impact Metrics" in the latest issue of College & Research Libraries News.

Here's an excerpt:

University faculty, administration, librarians, and publishers alike are beginning to discuss how and where altmetrics can be useful towards evaluating a researcher's academic contribution. As interest grows, libraries are in a unique position to help facilitate an informed dialogue with the various constituencies that will intersect with altmetrics on campus, including both researchers (students and faculty) and the academic administrative office (faculty affairs, research and grants, promotion and tenure committees, and so on).

"The Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on Cites to Science Journals Across the Quality Spectrum"

Mark J. McCabe and Christopher M. Snyder have self-archived "The Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on Cites to Science Journals Across the Quality Spectrum" in SSRN.

Here's an excerpt:

An open-access journal allows free online access to its articles, obtaining revenue from fees charged to submitting authors. Using panel data on science journals, we are able to circumvent some problems plaguing previous studies of the impact of open access on citations. We find that moving from paid to open access increases cites by 8% on average in our sample, but the effect varies across the quality of content. Open access increases cites to the best content (top-ranked journals or articles in upper quintiles of citations within a volume) but reduces cites to lower-quality content. We construct a model to explain these findings in which being placed on a broad open-access platform can increase the competition among articles for readers' attention. We can find structural parameters allowing the model to fit the quintile results quite closely.

Article-Level Metrics—A SPARC Primer

SPARC has released Article-Level Metrics—A SPARC Primer.

Here's an excerpt:

Article-Level Metrics (ALMs) are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. ALMs can be employed in conjunction with existing metrics, which have traditionally focused on the long-term impact of a collection of articles (i.e., a journal) based on the number of citations generated. This primer is designed to give campus leaders and other interested parties an overview of what ALMs are, why they matter, how they complement established utilities, and how they can be used in the tenure and promotion process.

"New Opportunities for Repositories in the Age of Altmetrics"

Stacy Konkiel and Dave Scherer have published "New Opportunities for Repositories in the Age of Altmetrics" in the latest issue of the Bulletin of the Association for Information Science and Technology.

Here's an excerpt:

By reporting altmetrics (alternative metrics based on online activity) for their content, institutional repositories can add value to existing metrics—and prove their relevance and importance in an age of growing cutbacks to library services. This article will discuss the metrics that repositories currently deliver and how altmetrics can supplement existing usage statistics to provide a broader interpretation of research-output impact for the benefit of authors, library-based publishers and repository managers, and university administrators alike.

"F1000 Recommendations as a New Data Source for Research Evaluation: A Comparison with Citations"

Ludo Waltman and Rodrigo Costas have self-archived "F1000 Recommendations as a New Data Source for Research Evaluation: A Comparison with Citations" on arXiv.org.

Here's an excerpt:

F1000 is a post-publication peer review service for biological and medical research. F1000 aims to recommend important publications in the biomedical literature, and from this perspective F1000 could be an interesting tool for research evaluation. By linking the complete database of F1000 recommendations to the Web of Science bibliographic database, we are able to make a comprehensive comparison between F1000 recommendations and citations. We find that about 2% of the publications in the biomedical literature receive at least one F1000 recommendation. Recommended publications on average receive 1.30 recommendations, and over 90% of the recommendations are given within half a year after a publication has appeared. There turns out to be a clear correlation between F1000 recommendations and citations. However, the correlation is relatively weak, at least weaker than the correlation between journal impact and citations.
