"Scholarly Use of Social Media and Altmetrics: A Review of the Literature"

Cassidy R. Sugimoto et al. have self-archived "Scholarly Use of Social Media and Altmetrics: A Review of the Literature."

Here's an excerpt:

This review provides an extensive account of the state of the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, the various functions these platforms serve in the scholarly communication process, and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences across platforms. The review ends with a critical discussion of the implications of this transformation of the scholarly communication system.

Digital Curation and Digital Preservation Works | Open Access Works | Digital Scholarship | Digital Scholarship Sitemap

"2016 Scholar Metrics Released"

Google has published "2016 Scholar Metrics Released."

Here's an excerpt:

Scholar Metrics provide an easy way for authors to quickly gauge the visibility and influence of recent articles in scholarly publications. Today, we are releasing the 2016 version of Scholar Metrics. This release covers articles published in 2011-2015 and includes citations from all articles that were indexed in Google Scholar as of June 2016.

The top 100 publications include e-print servers and open access journals, such as arXiv Cosmology and Extragalactic Astrophysics (astro-ph.CO), arXiv High Energy Physics – Experiment (hep-ex), PLoS ONE, and PLoS Genetics.

"A Two-Sided Academic Landscape: Portrait of Highly-Cited Documents in Google Scholar (1950-2013)"

Alberto Martin-Martin et al. have self-archived "A Two-Sided Academic Landscape: Portrait of Highly-Cited Documents in Google Scholar (1950-2013)."

Here's an excerpt:

Since the existence of a full-text link does not guarantee the availability of the full text (some links actually refer to publishers' abstracts), the results (40% of the documents had a free full-text link) might be somewhat overestimated. In any case, these values are consistent with those published by Archambault et al. (2013), who found that over 40% of the articles from their sample were freely accessible; higher than those of Khabsa and Giles (2014) and Björk et al. (2010), who found only 24% and 20.4% of open access documents respectively; and much lower than those of Jamali and Nabavi (2015) and Pitol and De Groote (2014), who found 61.1% and 70% respectively.

The different nature of the samples makes it difficult to draw comparisons among these studies. Nonetheless, the sample used in this study (64,000 documents) is the largest used to date.

"A Simple Proposal for the Publication of Journal Citation Distributions"

Vincent Lariviere et al. have self-archived "A Simple Proposal for the Publication of Journal Citation Distributions."

Here's an excerpt:

Although the Journal Impact Factor (JIF) is widely acknowledged to be a poor indicator of the quality of individual papers, it is used routinely to evaluate research and researchers. Here, we present a simple method for generating the citation distributions that underlie JIFs. Application of this straightforward protocol reveals the full extent of the skew of distributions and variation in citations received by published papers that is characteristic of all scientific journals. Although there are differences among journals across the spectrum of JIFs, the citation distributions overlap extensively, demonstrating that the citation performance of individual papers cannot be inferred from the JIF. We propose that this methodology be adopted by all journals as a move to greater transparency, one that should help to refocus attention on individual pieces of work and counter the inappropriate usage of JIFs during the process of research assessment.
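The protocol proposed here amounts to publishing the full distribution of per-article citation counts rather than the single mean that the JIF summarizes. A minimal sketch of how the two summaries diverge, using invented citation counts (not data from the paper):

```python
from collections import Counter

# Hypothetical citation counts for the articles a journal published in the
# two-year JIF window (illustrative values only).
citations = [0, 0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 5, 6, 8, 12, 15, 40, 95]

# The JIF is (roughly) the mean of this distribution.
jif = sum(citations) / len(citations)

# The distribution itself: how many articles received each citation count.
distribution = Counter(citations)

# The skew: the median tells a very different story from the mean.
median = sorted(citations)[len(citations) // 2]

print(f"JIF-style mean: {jif:.1f}")   # pulled up by a few highly cited papers
print(f"Median citations: {median}")  # what a typical paper actually receives
print(f"Papers cited less than the mean: "
      f"{sum(c < jif for c in citations) / len(citations):.0%}")
```

Even in this toy distribution, most papers sit well below the journal-level mean, which is precisely the point of the authors' transparency proposal.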

"Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles"

Kathleen Reed, Dana McFarland, and Rosie Croft have published "Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles" in Evidence Based Library and Information Practice.

Here's an excerpt:

While all participants had Googled themselves, few were strategic about their online scholarly identity. Participants affirmed the perception that altmetrics can be of value in helping to craft a story of the value of their research and its diverse outputs. When participants had prior knowledge of altmetrics tools, it tended to be very narrow and deep, and perhaps field-specific. Participants identified time as the major barrier to use of scholarly profile and altmetrics tools.

"Open Access Publishing Trend Analysis: Statistics Beyond the Perception"

Elisabetta Poltronieri et al. have published "Open Access Publishing Trend Analysis: Statistics Beyond the Perception" in Information Research.

Here's an excerpt:

The purpose of this analysis was twofold: to track the number of open access journals acquiring impact factor, and to investigate the distribution of subject categories pertaining to these journals. As a case study, journals in which the researchers of the National Institute of Health (Istituto Superiore di Sanità) in Italy have published were surveyed.

"Back to the Past: On the Shoulders of an Academic Search Engine Giant"

Alberto Martin-Martin et al. have self-archived "Back to the Past: On the Shoulders of an Academic Search Engine Giant."

Here's an excerpt:

A study released by the Google Scholar team found an apparently increasing fraction of citations to old articles in studies published in the last 24 years (1990-2013). To verify this finding we conducted a complementary study using a different data source (Journal Citation Reports), metric (aggregate cited half-life), time span (2003-2013), and set of categories (53 Social Science subject categories and 167 Science subject categories). Although the results obtained confirm and reinforce the previous findings, the possible causes of this phenomenon remain unclear. We finally hypothesize that the first-page results syndrome, in conjunction with the fact that Google Scholar favours the most cited documents, suggests that the growing trend of citing old documents is partly caused by Google Scholar.

Digital Scholarship | Digital Scholarship Sitemap

"Grand Challenges in Measuring and Characterizing Scholarly Impact"

Chaomei Chen has self-archived "Grand Challenges in Measuring and Characterizing Scholarly Impact."

Here's an excerpt:

The constantly growing body of scholarly knowledge in science, technology, and the humanities is an asset of mankind. While new discoveries expand existing knowledge, they may simultaneously render some of it obsolete. It is crucial for scientists and other stakeholders to keep their knowledge up to date. Policy makers, decision makers, and the general public also need efficient communication of scientific knowledge. Several grand challenges concerning the creation, adaptation, and diffusion of scholarly knowledge, as well as advances in quantitative and qualitative approaches to its study, are identified.

"Bibliometric and Benchmark Analysis of Gold Open Access in Spain: Big Output and Little Impact"

Daniel Torres-Salinas et al. have published "Bibliometric and Benchmark Analysis of Gold Open Access in Spain: Big Output and Little Impact" in El Profesional de la Información.

Here's an excerpt:

This bibliometric study analyzes the research output produced by Spain during the 2005-2014 time period in Open Access (OA) journals indexed in Web of Science. . . . Spain is the second highest ranking European country in gold OA publication output and the fourth highest in Open Access output (9%). . . . Spain's normalized citation impact in Open Access (0.72) is lower than the world average and that of the main European countries.
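For context, a normalized citation impact of 0.72 means the papers were cited 28% below the world average for comparable publications. A schematic of how such a field-normalized score is computed, using invented per-article values (not the study's data):

```python
# Hypothetical articles: (citations received, world-average citations for
# papers of the same field, year, and document type). Illustrative only.
articles = [
    (3, 10.0),
    (8, 10.0),
    (5, 6.0),
    (2, 4.0),
    (6, 8.0),
]

# Mean normalized citation score: the average of per-article ratios to the
# field/year world baseline. A value of 1.0 equals the world average.
mncs = sum(cites / baseline for cites, baseline in articles) / len(articles)
print(f"Normalized citation impact: {mncs:.2f}")  # below 1.0 = below world average
```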

"Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles"

Sandra L. De Groote et al. have published "Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles" in PLoS One.

Here's an excerpt:

Purpose

To examine whether National Institutes of Health (NIH) funded articles that were archived in PubMed Central (PMC) after the release of the 2008 NIH Public Access Policy show greater scholarly impact than comparable articles not archived in PMC. . . .

Results

A total of 45,716 articles were examined, including 7,960 with NIH funding. An analysis of the number of times these articles were cited found that NIH-funded 2006 articles in PMC were not cited significantly more than NIH-funded non-PMC articles. However, 2009 NIH-funded articles in PMC were cited 26% more than 2009 NIH-funded articles not in PMC, 5 years after publication. This result is highly significant even after controlling for journal (as a proxy of article quality and topic).
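The 26% figure is a relative difference in citation counts five years after publication. A toy calculation showing how such a percentage is derived, with hypothetical mean citation counts (not the study's data):

```python
# Hypothetical mean 5-year citation counts (illustrative values only).
mean_pmc = 18.9      # NIH-funded 2009 articles deposited in PMC
mean_non_pmc = 15.0  # NIH-funded 2009 articles not deposited in PMC

# Relative citation advantage of the PMC group, expressed as a percentage.
advantage = (mean_pmc - mean_non_pmc) / mean_non_pmc
print(f"PMC citation advantage: {advantage:.0%}")
```

In the study itself this comparison is made within journals, so that journal (as a proxy for article quality and topic) is controlled for.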

"Crowdsourcing Metrics of Digital Collections"

LIBER Quarterly has released "Crowdsourcing Metrics of Digital Collections" by Tuula Pääkkönen.

Here's an excerpt:

In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with a main focus on crowdsourcing features, e.g., giving end users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories, and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted.

"A Review of the Literature on Citation Impact Indicators"

Ludo Waltman has self-archived "A Review of the Literature on Citation Impact Indicators."

Here's an excerpt:

This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar). Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences. Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.

| New: Research Data Curation Bibliography, Version 5 | Digital Scholarship | Digital Scholarship Sitemap

The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management

The Higher Education Funding Council for England has released The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management.

Here's an excerpt:

Our report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the 'gaming' of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy.

"Scholarly Social Media Profiles and Libraries: A Review"

Judit Ward et al. have published "Scholarly Social Media Profiles and Libraries: A Review" in LIBER Quarterly.

Here's an excerpt:

This article aims to point out emerging roles and responsibilities for academic librarians with the potential to better integrate the library into the research process. In order to find out how to enhance the online reputation and discoverability of individual faculty members as well as their affiliated institutions, the authors worked side-by-side with researchers in the United States and Europe to explore, create, revise, and disambiguate scholarly profiles in various software applications. In an attempt to understand and organize scholarly social media, including new, alternative metrics, the authors reviewed and classified the major academic profile platforms, highlighting the overlapping elements, benefits, and drawbacks inherent in each. The consensus is that it would be time-consuming to keep one's profile current and accurate on all of these platforms, given the plethora of underlying problems, also discussed in detail in the article. However, it came as a startling discovery that reluctance to engage with scholarly social media may cause a misrepresentation of a researcher's academic achievements and may come with unforeseen consequences. The authors claim that current skills and competencies can secure an essential role for academic librarians in the research workflow by means of monitoring and navigating researcher profiles in scholarly social media in order to best represent the scholarship of their host institutions.

Altmetric Mentions and the Communication of Medical Research

Digital Science has released Altmetric Mentions and the Communication of Medical Research.

Here's an excerpt:

Social and mainstream media mentions of research publications appear much more rapidly than conventional academic citations and are generated by a wider range of users. They therefore offer the potential for early and complementary indicators of research impact. Such indicators could also identify new kinds of economic and social impact.

In this report we explore the relevance of such new indicators to research in medical and health sciences.

"A Review of Theory and Practice in Scientometrics"

John Mingers and Loet Leydesdorff have self-archived "A Review of Theory and Practice in Scientometrics."

Here's an excerpt:

Scientometrics is the study of the quantitative aspects of the process of science as a communication system. It is centrally, but not only, concerned with the analysis of citations in the academic literature. In recent years it has come to play a major role in the measurement and evaluation of research performance. In this review we consider: the historical development of scientometrics, sources of citation data, citation metrics and the "laws" of scientometrics, normalisation, journal impact factors and other journal metrics, visualising and mapping science, evaluation and policy, and future developments.

"Does Google Scholar Contain All Highly Cited Documents (1950-2013)?"

Alberto Martín-Martín et al. have self-archived "Does Google Scholar Contain All Highly Cited Documents (1950-2013)?"

Here's an excerpt:

The study of highly cited documents on Google Scholar (GS) has never been addressed to date in a comprehensive manner. The objective of this work is to identify the set of highly cited documents in Google Scholar and define their core characteristics: their languages, their file formats, and how many of them can be accessed free of charge. We will also try to answer some additional questions that hopefully shed some light on the use of GS as a tool for assessing scientific impact through citations. The decalogue of research questions is shown below:

1. Which are the most cited documents in GS?
2. Which are the most cited document types in GS?
3. In what languages are the most cited documents in GS written?
4. How many highly cited documents are freely accessible?
4.1 What file types are the most commonly used to store these highly cited documents?
4.2 Which are the main providers of these documents?
5. How many of the highly cited documents indexed by GS are also indexed by WoS?
6. Is there a correlation between the number of citations that these highly cited documents have received in GS and the number of citations they have received in WoS?
7. How many versions of these highly cited documents has GS detected?
8. Is there a correlation between the number of versions GS has detected for these documents and the number of citations they have received?
9. Is there a correlation between the number of versions GS has detected for these documents, and their position in the search engine result pages?
10. Is there some relation between the positions these documents occupy in the search engine result pages, and the number of citations they have received?
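Questions 6, 8, 9, and 10 are all rank-correlation questions. A self-contained sketch of the kind of computation involved, using Spearman's rank correlation on hypothetical GS and WoS citation counts (not the paper's data):

```python
def ranks(values):
    """Average 1-based ranks, with tied values sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical citation counts for ten highly cited documents.
gs_citations  = [5200, 4100, 3900, 3500, 3100, 2800, 2600, 2400, 2200, 2000]
wos_citations = [4800, 3600, 4000, 2900, 2700, 2500, 2300, 2600, 1900, 1800]

print(f"Spearman rho: {spearman(gs_citations, wos_citations):.2f}")
```

Spearman (rather than Pearson) correlation is the usual choice for such questions because citation counts are heavily skewed, and only the ordering of the documents matters.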

Digital Scholarship | "A Quarter-Century as an Open Access Publisher"

"Tweets as Impact Indicators: Examining the Implications of Automated Bot Accounts on Twitter"

Stefanie Haustein et al. have self-archived "Tweets as Impact Indicators: Examining the Implications of Automated Bot Accounts on Twitter."

Here's an excerpt:

This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific papers deposited in the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. We present preliminary findings that automated Twitter accounts create a considerable number of tweets linking to scientific papers and that they behave differently from common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose differentiating between levels of engagement, from tweeting only bibliographic information to discussing or commenting on the content of a paper.

"The Imperative for Open Altmetrics"

Stacy Konkiel, Heather Piwowar, and Jason Priem have published "The Imperative for Open Altmetrics" in The Journal of Electronic Publishing.

Here's an excerpt:

If scholarly communication is broken, how will we fix it? At Impactstory—a non-profit devoted to helping scholars gather and share evidence of their research impact by tracking online usage of scholarship via blogs, Wikipedia, Mendeley, and more—we believe that incentivizing web-native research via altmetrics is the place to start. In this article, we describe the current state of the art in altmetrics and its effects on publishing, we share Impactstory's plan to build an open infrastructure for altmetrics, and describe our company's ethos and actions.

"Which Kind of Papers Has Higher or Lower Altmetric Counts? A Study Using Article-Level Metrics from PLOS and F1000Prime"

Lutz Bornmann has self-archived "Which Kind of Papers Has Higher or Lower Altmetric Counts? A Study Using Article-Level Metrics from PLOS and F1000Prime."

Here's an excerpt:

The present study investigates the usefulness of altmetrics for measuring the broader impact of research. Methods: This study is essentially based on a dataset of papers obtained from F1000. This dataset was augmented with altmetrics (such as Twitter counts) downloaded from the homepage of PLOS (the Public Library of Science). The study covers a total of 1,082 papers. Findings: The results from regression models indicate that Facebook and Twitter, but not Figshare or Mendeley, can provide indications of papers which are of interest to a broader circle of readers (and not only to peers in a specialist area), and therefore seem to be useful for societal impact measurement.

"Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics"

Lutz Bornmann has self-archived "Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics."

Here's an excerpt:

Today, it is not clear how the impact of research on other areas of society than science should be measured. While peer review and bibliometrics have become standard methods for measuring the impact of research in science, there is not yet an accepted framework within which to measure societal impact. Alternative metrics (called altmetrics to distinguish them from bibliometrics) are considered an interesting option for assessing the societal impact of research, as they offer new ways to measure (public) engagement with research output. Altmetrics is a term to describe web-based metrics for the impact of publications and other scholarly material by using data from social media platforms (e.g. Twitter or Mendeley). This overview of studies explores the potential of altmetrics for measuring societal impact. It deals with the definition and classification of altmetrics. Furthermore, their benefits and disadvantages for measuring impact are discussed.

"A Review of the Characteristics of 108 Author-Level Bibliometric Indicators"

Lorna Wildgaard, Jesper W. Schneider, and Birger Larsen have self-archived "A Review of the Characteristics of 108 Author-Level Bibliometric Indicators."

Here's an excerpt:

An increasing demand for bibliometric assessment of individuals has led to a growth of new bibliometric indicators as well as new variants or combinations of established ones. The aim of this review is to contribute objective facts about the usefulness of bibliometric indicators of the effects of publication activity at the individual level. This paper reviews 108 indicators that can potentially be used to measure performance at the individual author level, and examines the complexity of their calculations in relation to what they are supposed to reflect and ease of end-user application.

"Measuring the Broader Impact of Research: The Potential of Altmetrics"

Lutz Bornmann has self-archived "Measuring the Broader Impact of Research: The Potential of Altmetrics."

Here's an excerpt:

Today, it is not clear how the impact of research on other areas of society than science should be measured. While peer review and bibliometrics have become standard methods for measuring the impact of research in science, there is not yet an accepted framework within which to measure societal impact. Alternative metrics (called altmetrics to distinguish them from bibliometrics) are considered an interesting option for assessing the societal impact of research, as they offer new ways to measure (public) engagement with research output. Altmetrics is a term to describe web-based metrics for the impact of publications and other scholarly material by using data from social media platforms (e.g. Twitter or Mendeley). This overview of studies explores the potential of altmetrics for measuring societal impact. It deals with the definition and classification of altmetrics. Furthermore, their benefits and disadvantages for measuring impact are discussed.

Digital Scholarship | Digital Scholarship Publications Overview | Sitemap

"The Multidimensional Assessment of Scholarly Research Impact"

Henk F. Moed and Gali Halevi have self-archived "The Multidimensional Assessment of Scholarly Research Impact."

Here's an excerpt:

This article introduces the Multidimensional Research Assessment Matrix of scientific output. Its basic notion is that the choice of metrics to be applied in a research assessment process depends upon the unit of assessment, the research dimension to be assessed, and the purposes and policy context of the assessment. An indicator may be highly useful within one assessment process, but less so in another. For instance, publication counts are useful tools to help discriminate between those staff members who are research active and those who are not, but are of little value if active scientists are to be compared with one another according to their research performance. This paper gives a systematic account of the potential usefulness and limitations of a set of 10 important metrics, including altmetrics, applied at the level of individual articles, individual researchers, research groups, and institutions. It presents a typology of research impact dimensions and indicates which metrics are the most appropriate to measure each dimension. It introduces the concept of a meta-analysis of the units under assessment, in which metrics are not used as tools to evaluate individual units, but to reach policy inferences regarding the objectives and general setup of an assessment process.

"A Comparison of Citations, Downloads, and Readership Data for an Information Systems Journal"

Christian Schlögl et al. have published "A Comparison of Citations, Downloads, and Readership Data for an Information Systems Journal" in a special issue on altmetrics of Research Trends.

Here's an excerpt:

In our analysis we identified a high (though not perfect) correlation between citations and downloads, which was slightly lower between downloads and readership frequencies, and again between citations and readership counts. This is mainly due to the fact that the data sources used are related either to research or at least to teaching in higher education institutions.
