"How Can Bibliometric and Altmetric Suppliers Improve? Messages from the End-User Community"

Elizabeth Gadd and Ian Rowlands have published "How Can Bibliometric and Altmetric Suppliers Improve? Messages from the End-User Community" in Insights.

Here's an excerpt:

This article reports on a 2018 survey of bibliometric and altmetric practitioners—'Three things you want your metrics supplier to know'—that was undertaken to better understand the practitioners' usage of existing tools and services and to invite them to suggest ways in which they would like to see these improve. In total, 149 suggestions were made by 42 respondents, mainly UK librarians. Responses could be categorized into four main themes: A) Improve and share your data; B) Be more responsible; C) Improve your tools; D) Improve your indicators. The findings of the survey are discussed and sample comments shared. Based on these findings, and expanding on the four themes, the article makes a number of practical recommendations to metrics suppliers for ways in which their services could better serve the needs of the community for robust and responsible bibliometric and altmetric evaluation.

"Over-Optimization of Academic Publishing Metrics: Observing Goodhart’s Law in Action"

Michael Fire and Carlos Guestrin have self-archived "Over-Optimization of Academic Publishing Metrics: Observing Goodhart's Law in Action."

Here's an excerpt:

In this study, we analyzed over 120 million papers to examine how the academic publishing world has evolved over the last century. Our study shows that the validity of citation-based measures is being compromised and their usefulness is lessening. In particular, the number of publications has ceased to be a good metric as a result of longer author lists, shorter papers, and surging publication numbers. Citation-based metrics, such as citation number and h-index, are likewise affected by the flood of papers, self-citations, and lengthy reference lists. Measures such as a journal's impact factor have also ceased to be good metrics due to the soaring numbers of papers that are published in top journals, particularly from the same pool of authors.
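
The h-index the authors mention is simple to state: it is the largest h such that an author has h papers with at least h citations each. A minimal sketch of that standard definition, using invented citation counts:

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author with five papers: ranks 1-3 all have >= rank citations.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```

Every input to this calculation can be pushed upward by self-citation and longer reference lists, which is precisely the Goodhart-style over-optimization the paper documents.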

"The Open Access Citation Advantage: Does It Exist and What Does It Mean for Libraries?"

Colby Lil Lewis has published "The Open Access Citation Advantage: Does It Exist and What Does It Mean for Libraries?" in Information Technology and Libraries.

Here's an excerpt:

The last literature review of research on the existence of an Open Access Citation Advantage (OACA) was published in 2011 by Philip M. Davis and William H. Walters. This paper reexamines the conclusions reached by Davis and Walters by providing a critical review of OACA literature that has been published since 2011, and explores how increases in OA publication trends could serve as a leveraging tool for libraries against the high costs of journal subscriptions.

"Will Open Access Increase Journal CiteScores? An Empirical Investigation over Multiple Disciplines"

Yang Li et al. have published "Will Open Access Increase Journal CiteScores? An Empirical Investigation over Multiple Disciplines" in PLoS ONE.

Here's an excerpt:

This paper empirically studies the effect of Open Access on journal CiteScores. We have found that the general effect is positive but not uniform across different types of journals. In particular, we investigate two types of heterogeneous treatment effect: (1) the differential treatment effect among journals grouped by academic field, publisher, and tier; and (2) differential treatment effects of Open Access as a function of propensity to be treated. The results are robust to a number of sensitivity checks and falsification tests. Our findings shed new light on the effect of Open Access on journals and can help journal stakeholders decide whether to adopt an Open Access policy.
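
CiteScore, the outcome variable here, is a journal-level ratio. A minimal sketch under the original Scopus definition (citations received in a year to items published in the three prior years, divided by the count of those items); the figures below are invented:

```python
def citescore(cites_to_year, docs_by_year, year):
    """CiteScore for `year`: citations received in `year` to items published
    in the three prior years, divided by the number of those items."""
    window = [year - 3, year - 2, year - 1]
    cites = sum(cites_to_year.get(y, 0) for y in window)
    docs = sum(docs_by_year.get(y, 0) for y in window)
    return cites / docs if docs else 0.0

# Hypothetical journal: 300 citations in 2017 to 150 items from 2014-2016.
cites_2017 = {2014: 90, 2015: 110, 2016: 100}  # citations received in 2017
docs = {2014: 40, 2015: 50, 2016: 60}
print(citescore(cites_2017, docs, 2017))  # -> 2.0
```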

"Google Scholar, Web of Science, and Scopus: A Systematic Comparison of Citations in 252 Subject Categories"

Alberto Martín-Martín et al. have self-archived "Google Scholar, Web of Science, and Scopus: A Systematic Comparison of Citations in 252 Subject Categories."

Here's an excerpt:

Despite citation counts from Google Scholar (GS), Web of Science (WoS), and Scopus being widely consulted by researchers and sometimes used in research evaluations, there is no recent or systematic evidence about the differences between them. In response, this paper investigates 2,448,055 citations to 2,299 English-language highly-cited documents from 252 GS subject categories published in 2006, comparing GS, the WoS Core Collection, and Scopus. GS consistently found the largest percentage of citations across all areas (93%-96%), far ahead of Scopus (35%-77%) and WoS (27%-73%). GS found nearly all the WoS (95%) and Scopus (92%) citations. Most citations found only by GS were from non-journal sources (48%-65%), including theses, books, conference papers, and unpublished materials. Many were non-English (19%-38%). . . . The results suggest that in all areas GS citation data is essentially a superset of WoS and Scopus, with substantial extra coverage.
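
The comparison underlying these figures is set arithmetic over the citing documents each database knows for a target paper. A sketch with invented identifiers (the study's real pipeline involves careful matching of citing records across sources):

```python
# Invented citing-document IDs for one target paper in each database.
gs = {"c1", "c2", "c3", "c4", "c5", "c6"}
wos = {"c1", "c2", "c3"}
scopus = {"c1", "c2", "c4"}

union = gs | wos | scopus  # all known citations across the three sources
for name, found in [("GS", gs), ("WoS", wos), ("Scopus", scopus)]:
    print(f"{name} found {100 * len(found) / len(union):.0f}% of known citations")

# Share of WoS citations that GS also finds (95% in the study):
print(f"GS covers {100 * len(gs & wos) / len(wos):.0f}% of WoS citations")
```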

"Can Microsoft Academic Help to Assess the Citation Impact of Academic Books?"

Kayvan Kousha and Mike Thelwall have self-archived "Can Microsoft Academic Help to Assess the Citation Impact of Academic Books?"

Here's an excerpt:

Despite recent evidence that Microsoft Academic is an extensive source of citation counts for journal articles, it is not known if the same is true for academic books. This paper fills this gap by comparing citations to 16,463 books from 2013-2016 in the Book Citation Index (BKCI) against automatically extracted citations from Microsoft Academic and Google Books in 17 fields. About 60% of the BKCI books had records in Microsoft Academic, varying by year and field. Citation counts from Microsoft Academic were 1.5 to 3.6 times higher than from BKCI in nine subject areas across all years for books indexed by both. Microsoft Academic found more citations than BKCI because it indexes more scholarly publications and combines citations to different editions and chapters. In contrast, BKCI only found more citations than Microsoft Academic for books in three fields from 2013-2014. Microsoft Academic also found more citations than Google Books in six fields for all years. Thus, Microsoft Academic may be a useful source for the impact assessment of books when comprehensive coverage is not essential.

Scholarly Metrics Recommendations for Research Libraries: Deciphering the Trees in the Forest

LIBER has released Scholarly Metrics Recommendations for Research Libraries: Deciphering the Trees in the Forest.

Here's an excerpt from the announcement:

The report sets out recommendations on how research libraries and information infrastructures can deal with scholarly metrics, and how to get started with the development of services to support this. The recommendations are grouped into four important types of activities relating to metrics:

  1. Discovery and Discoverability
  2. Showcasing Achievements
  3. Service Development
  4. Research Assessment

"Social Media Metrics for New Research Evaluation"

Paul Wouters et al. have self-archived "Social Media Metrics for New Research Evaluation."

Here's an excerpt:

This chapter approaches, both from a theoretical and practical perspective, the most important principles and conceptual frameworks that can be considered in the application of social media metrics for scientific evaluation. We propose conceptually valid uses for social media metrics in research evaluation. The chapter discusses frameworks and uses of these metrics as well as principles and recommendations for the consideration and application of current (and potentially new) metrics in research evaluation.

"Collecting Inclusive Usage Metrics for Open Access Publications: the HIRMEOS Project"

Javier Arias has self-archived "Collecting Inclusive Usage Metrics for Open Access Publications: the HIRMEOS Project."

Here's an excerpt:

Open Access has matured for journals, but its uptake in the book market is still delayed, despite the fact that books continue to be the leading publishing format for the social sciences and humanities. The 30-month EU-funded project HIRMEOS (High Integration of Research Monographs in the European Open Science infrastructure) tackles the main obstacles to the full integration of five important digital platforms supporting open access monographs. The content of participating platforms will be enriched with tools that enable identification, authentication, and interoperability (via DOI, ORCID, Fundref); tools for information enrichment and entity extraction (INRIA (N)ERD); the ability to annotate monographs (Hypothes.is); and the gathering of usage and alternative metric data. This paper focuses on the development and implementation of Open Source Metrics Services that enable the collection of OA Metrics and Altmetrics from third-party platforms, and on how the architecture of these tools allows implementation in any external platform, particularly by start-up Open Access publishers.

Read more about it: "Shared Infrastructure for Next-Generation Books: HIRMEOS."
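
The claim that these metrics services can be implemented "in any external platform" suggests a driver-style design: one adapter per data source, all emitting DOI-keyed records to a shared aggregator. The sketch below illustrates that general pattern only; it is not the HIRMEOS API, and every name in it is hypothetical:

```python
from abc import ABC, abstractmethod

class MetricsDriver(ABC):
    """One driver per third-party source; each returns DOI-keyed records."""

    @abstractmethod
    def fetch(self, doi: str) -> dict:
        ...

class PageViewDriver(MetricsDriver):
    def fetch(self, doi: str) -> dict:
        # A real driver would query a platform's logs or API here.
        return {"measure": "page-views", "doi": doi, "count": 123}

def collect(doi: str, drivers: list) -> list:
    """Aggregate normalized metric records from every registered driver."""
    return [driver.fetch(doi) for driver in drivers]

print(collect("10.1234/hypothetical-monograph", [PageViewDriver()]))
```

A new platform would then only need to supply its own driver; the aggregation and display layers stay unchanged.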

"Google Scholar as a Data Source for Research Assessment"

Emilio Delgado López-Cózar et al. have self-archived "Google Scholar as a Data Source for Research Assessment."

Here's an excerpt:

The goal of this chapter is to lay the foundations for the use of GS as a supplementary source (and in some disciplines, arguably the best alternative) for scientific evaluation. First, we present a general overview of how GS works. Second, we present empirical evidence about its main characteristics (size, coverage, and growth rate). Third, we carry out a systematic analysis of the main limitations this search engine presents as a tool for the evaluation of scientific performance. Lastly, we discuss the main differences between GS and other more traditional bibliographic databases in light of the correlations found between their citation data. We conclude that Google Scholar presents a broader view of the academic world because it has brought to light a great number of sources that were not previously visible.

"Data-Level Metrics Now Available through Make Data Count"

DataONE has released "Data-Level Metrics Now Available through Make Data Count."

Here's an excerpt:

One year into our Sloan-funded Make Data Count project, the Make Data Count team, comprising DataONE, California Digital Library, and DataCite, is proud to release Version 1 of standardized data usage and citation metrics! . . .

Since the development of our COUNTER Code of Practice for Research Data we have implemented comparable, standardized data usage and citation metrics at Dash (CDL) and DataONE, two project team repositories. . . .

The Make Data Count project team works in an agile "minimum viable product" methodology. This first release has focused on developing a standard recommendation, processing our logs against that Code of Practice [COUNTER Code of Practice for Research Data] to develop comparable data usage metrics, and displaying both usage and citation metrics at the repository level.
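
"Processing our logs against that Code of Practice" rests on rules such as COUNTER's double-click filter, which collapses repeat hits on the same item from the same session within a roughly 30-second window. A minimal sketch of that single step (event data invented; the Code of Practice is the authoritative source for the full rules):

```python
from datetime import datetime, timedelta

DOUBLE_CLICK_WINDOW = timedelta(seconds=30)

def count_usage(events):
    """Count usage events, collapsing repeat hits on the same dataset by the
    same session that fall within the double-click window.
    `events` is an iterable of (session_id, dataset_id, datetime) tuples."""
    last_seen = {}
    total = 0
    for session, dataset, ts in sorted(events, key=lambda e: e[2]):
        key = (session, dataset)
        prev = last_seen.get(key)
        if prev is None or ts - prev > DOUBLE_CLICK_WINDOW:
            total += 1
        last_seen[key] = ts
    return total

events = [
    ("s1", "doi:10.1234/x", datetime(2018, 6, 1, 12, 0, 0)),
    ("s1", "doi:10.1234/x", datetime(2018, 6, 1, 12, 0, 10)),  # double-click
    ("s1", "doi:10.1234/x", datetime(2018, 6, 1, 12, 5, 0)),   # counted
]
print(count_usage(events))  # -> 2
```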

"Academic Information on Twitter: A User Survey"

Ehsan Mohammadi et al. have published "Academic Information on Twitter: A User Survey" in PLOS ONE.

Here's an excerpt:

Although counts of tweets citing academic papers are used as an informal indicator of interest, little is known about who tweets academic papers and who uses Twitter to find scholarly information. Without knowing this, it is difficult to draw useful conclusions from a publication being frequently tweeted. This study surveyed 1,912 users that have tweeted journal articles to ask about their scholarly-related Twitter uses. Almost half of the respondents (45%) did not work in academia, despite the sample probably being biased towards academics. Twitter was used most by people with a social science or humanities background. People tend to leverage social ties on Twitter to find information rather than searching for relevant tweets. Twitter is used in academia to acquire and share real-time information and to develop connections with others. Motivations for using Twitter vary by discipline, occupation, and employment sector, but not much by gender. These factors also influence the sharing of different types of academic information. This study provides evidence that Twitter plays a significant role in the discovery of scholarly information and cross-disciplinary knowledge spreading. Most importantly, the large numbers of non-academic users support the claims of those using tweet counts as evidence for the non-academic impacts of scholarly research.

Open Science: Altmetrics and Rewards

The Mutual Learning Exercise on Open Science has released Open Science: Altmetrics and Rewards.

Here's an excerpt from the announcement:

Its focus is on three topics: 1) The potential of altmetrics—alternative (i.e. non-traditional) metrics that go beyond citations of articles—to foster Open Science; 2) Incentives and rewards for researchers to engage in Open Science activities; 3) Guidelines for developing and implementing national policies for Open Science. It identifies good practices, lists priorities and outlines potential courses of action for the best possible transition to Open Science.

"Can Microsoft Academic Assess the Early Citation Impact of In-Press Articles? A Multi-Discipline Exploratory Analysis"

Kayvan Kousha et al. have self-archived "Can Microsoft Academic Assess the Early Citation Impact of In-Press Articles? A Multi-Discipline Exploratory Analysis."

Here's an excerpt:

For over 65,000 Scopus in-press articles from 2016 and 2017 across 26 fields, Microsoft Academic found 2-5 times as many citations as Scopus, depending on year and field. From manual checks of 1,122 Microsoft Academic citations not found in Scopus, Microsoft Academic's citation indexing was faster but not much wider than Scopus for journals. It achieved this by associating citations to preprints with their subsequent in-press versions and by extracting citations from in-press articles. In some fields its coverage of scholarly digital libraries, such as arXiv.org, was also an advantage. Thus, Microsoft Academic seems to be a more comprehensive automatic source of citation counts for in-press articles than Scopus.

"Prevalence and Citation Advantage of Gold Open Access n the Subject Areas of the Scopus Database"

Pablo Dorta-González and Yolanda Santana-Jiménez have self-archived "Prevalence and Citation Advantage of Gold Open Access in the Subject Areas of the Scopus Database."

Here's an excerpt:

In the present paper, an analysis of gold OA across all areas of research (the 27 subject areas of the Scopus database) is carried out. As a novel contribution, this paper takes a journal-level approach to assessing the OA citation advantage, whereas many others take a paper-level approach. Data were obtained from Scimago Lab, sorted using the Scopus database, and tagged as OA/non-OA using the DOAJ list. Alongside the OA citation advantage, OA prevalence and the differences between access types (OA vs. non-OA) in production and referencing are tested. A total of 3,737 OA journals (16.8%) and 18,485 non-OA journals (83.2%) published in 2015 are considered. As the main conclusion, there is no generalizable gold OA citation advantage at the journal level.
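
The journal-level approach described here reduces to tagging each journal as OA or non-OA by matching its ISSN against the DOAJ list and then comparing group citation rates. A minimal sketch with invented records:

```python
import statistics

# Invented journal records; the study used Scimago/Scopus data.
journals = [
    {"issn": "1111-1111", "cites_per_doc": 2.1},
    {"issn": "2222-2222", "cites_per_doc": 3.4},
    {"issn": "3333-3333", "cites_per_doc": 1.8},
]
doaj_issns = {"1111-1111"}  # ISSNs present in the DOAJ list -> gold OA

oa = [j["cites_per_doc"] for j in journals if j["issn"] in doaj_issns]
non_oa = [j["cites_per_doc"] for j in journals if j["issn"] not in doaj_issns]
print("OA mean:", statistics.mean(oa))
print("non-OA mean:", statistics.mean(non_oa))
```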

COUNTER Code of Practice, Release 5

COUNTER has released "COUNTER Code of Practice, Release 5."

Here's an excerpt:

Release 4 is the current Code of Practice and the requirement for COUNTER-compliance. The effective date for compliance with Release 5 is January 2019. The Transition Timeline and Transition Options graphics explain the details.

"DataCite as a Novel Bibliometric Source: Coverage, Strengths and Limitations"

Nicolas Robinson-Garcia et al. have self-archived "DataCite as a Novel Bibliometric Source: Coverage, Strengths and Limitations."

Here's an excerpt:

This paper explores the characteristics of DataCite to determine its possibilities and potential as a new bibliometric data source to analyze the scholarly production of open data. Open science and the increasing data-sharing requirements from governments, funding bodies, institutions, and scientific journals have led to a pressing demand for the development of data metrics. As a very first step towards reliable data metrics, we need to better comprehend the limitations and caveats of the information provided by sources of open data. In this paper, we critically examine records downloaded from DataCite's OAI API and elaborate a series of recommendations regarding the use of this source for bibliometric analyses of open data. We highlight issues related to metadata incompleteness, lack of standardization, and ambiguous definitions of several fields. Despite these limitations, we emphasize DataCite's value and potential to become one of the main sources for data metrics development.
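
Examining records downloaded from DataCite's OAI API means issuing standard OAI-PMH requests and paging through the results. A minimal sketch (the base URL is assumed from DataCite's OAI-PMH documentation; real harvests must also follow resumptionToken paging to fetch the full set):

```python
import requests
import xml.etree.ElementTree as ET

OAI_BASE = "https://oai.datacite.org/oai"  # assumed DataCite OAI-PMH endpoint
OAI_NS = {"oai": "http://www.openarchives.org/OAI/2.0/"}

# ListRecords with simple Dublin Core, the OAI-PMH baseline format.
resp = requests.get(OAI_BASE, params={"verb": "ListRecords",
                                      "metadataPrefix": "oai_dc"}, timeout=30)
resp.raise_for_status()

root = ET.fromstring(resp.content)
for record in root.iter("{http://www.openarchives.org/OAI/2.0/}record"):
    header = record.find("oai:header", OAI_NS)
    if header is not None:
        print(header.findtext("oai:identifier", namespaces=OAI_NS))
```

Incomplete or non-standard metadata in what comes back is exactly where the issues the authors highlight would surface.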
