"Social Media Metrics for New Research Evaluation"

Paul Wouters et al. have self-archived "Social Media Metrics for New Research Evaluation."

Here's an excerpt:

This chapter approaches, from both a theoretical and a practical perspective, the most important principles and conceptual frameworks that can be considered in the application of social media metrics to research evaluation. We propose conceptually valid uses for social media metrics in research evaluation. The chapter discusses frameworks and uses of these metrics, as well as principles and recommendations for the consideration and application of current (and potentially new) metrics in research evaluation.

Research Data Curation Bibliography, Version 9 | Digital Curation and Digital Preservation Works | Open Access Works | Digital Scholarship | Digital Scholarship Sitemap

"Collecting Inclusive Usage Metrics for Open Access Publications: the HIRMEOS Project"

Javier Arias has self-archived "Collecting Inclusive Usage Metrics for Open Access Publications: the HIRMEOS Project."

Here's an excerpt:

Open Access has matured for journals, but its uptake in the book market is still delayed, despite the fact that books continue to be the leading publishing format for the social sciences and humanities. The 30-month EU-funded project HIRMEOS (High Integration of Research Monographs in the European Open Science infrastructure) tackles the main obstacles to the full integration of five important digital platforms supporting open access monographs. The content of participating platforms will be enriched with tools that enable identification, authentication and interoperability (via DOI, ORCID, Fundref), tools for information enrichment and entity extraction (INRIA (N)ERD), the ability to annotate monographs (Hypothes.is), and the ability to gather usage and alternative metric data. This paper focuses on the development and implementation of Open Source Metrics Services that enable the collection of OA Metrics and Altmetrics from third-party platforms, and on how the architecture of these tools will allow implementation in any external platform, particularly by start-up Open Access publishers.

Read more about it: "Shared Infrastructure for Next-Generation Books: HIRMEOS."

"Google Scholar as a Data Source for Research Assessment"

Emilio Delgado López-Cózar et al. have self-archived "Google Scholar as a Data Source for Research Assessment."

Here's an excerpt:

The goal of this chapter is to lay the foundations for the use of GS as a supplementary source (and in some disciplines, arguably the best alternative) for scientific evaluation. First, we present a general overview of how GS works. Second, we present empirical evidence about its main characteristics (size, coverage, and growth rate). Third, we carry out a systematic analysis of the main limitations this search engine presents as a tool for the evaluation of scientific performance. Lastly, we discuss the main differences between GS and other more traditional bibliographic databases in light of the correlations found between their citation data. We conclude that Google Scholar presents a broader view of the academic world because it has brought to light a great number of sources that were not previously visible.

"Data-Level Metrics Now Available through Make Data Count"

DataONE has released "Data-Level Metrics Now Available through Make Data Count."

Here's an excerpt:

One year into our Sloan-funded Make Data Count project, the Make Data Count team, comprising DataONE, the California Digital Library, and DataCite, is proud to release Version 1 of standardized data usage and citation metrics! . . .

Since the development of our COUNTER Code of Practice for Research Data we have implemented comparable, standardized data usage and citation metrics at Dash (CDL) and DataONE, two project team repositories. . . .

The Make Data Count project team follows an agile "minimum viable product" methodology. This first release has focused on developing a standard recommendation, processing our logs against that Code of Practice [the COUNTER Code of Practice for Research Data] to develop comparable data usage metrics, and displaying both usage and citation metrics at the repository level.
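As a rough illustration of that log-processing step, the sketch below applies a COUNTER-style double-click rule (repeat hits on the same item by the same user within 30 seconds collapse into one event) to a toy event stream. The event tuple layout and identifiers are assumptions for this sketch, not the project's actual pipeline.

```python
from collections import defaultdict

DOUBLE_CLICK_WINDOW = 30  # seconds; COUNTER collapses repeat clicks inside this window

def count_unique_requests(log_events):
    """Count dataset requests after collapsing double-clicks.

    log_events: (timestamp_seconds, user_id, dataset_id) tuples, sorted by
    timestamp. The tuple layout is an assumption for this illustration.
    """
    last_seen = {}             # (user, dataset) -> timestamp of last click
    counts = defaultdict(int)
    for ts, user, dataset in log_events:
        key = (user, dataset)
        if key in last_seen and ts - last_seen[key] <= DOUBLE_CLICK_WINDOW:
            last_seen[key] = ts   # refresh the window, but do not re-count
            continue
        last_seen[key] = ts
        counts[dataset] += 1
    return dict(counts)

events = [
    (0,   "u1", "doi:10.5061/dryad.example"),
    (10,  "u1", "doi:10.5061/dryad.example"),  # double-click: collapsed
    (15,  "u2", "doi:10.5061/dryad.example"),  # different user: counted
    (100, "u1", "doi:10.5061/dryad.example"),  # outside the window: counted
]
print(count_unique_requests(events))  # {'doi:10.5061/dryad.example': 3}
```

Running the same filtered events through both repositories' pipelines is what makes the resulting usage counts comparable across platforms.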

"Academic Information on Twitter: A User Survey"

Ehsan Mohammadi et al. have published "Academic Information on Twitter: A User Survey" in PLOS ONE.

Here's an excerpt:

Although counts of tweets citing academic papers are used as an informal indicator of interest, little is known about who tweets academic papers and who uses Twitter to find scholarly information. Without knowing this, it is difficult to draw useful conclusions from a publication being frequently tweeted. This study surveyed 1,912 users that have tweeted journal articles to ask about their scholarly-related Twitter uses. Almost half of the respondents (45%) did not work in academia, despite the sample probably being biased towards academics. Twitter was used most by people with a social science or humanities background. People tend to leverage social ties on Twitter to find information rather than searching for relevant tweets. Twitter is used in academia to acquire and share real-time information and to develop connections with others. Motivations for using Twitter vary by discipline, occupation, and employment sector, but not much by gender. These factors also influence the sharing of different types of academic information. This study provides evidence that Twitter plays a significant role in the discovery of scholarly information and cross-disciplinary knowledge spreading. Most importantly, the large numbers of non-academic users support the claims of those using tweet counts as evidence for the non-academic impacts of scholarly research.

Open Science: Altmetrics and Rewards

The Mutual Learning Exercise on Open Science has released Open Science: Altmetrics and Rewards.

Here's an excerpt from the announcement:

Its focus is on three topics: 1) The potential of altmetrics—alternative (i.e. non-traditional) metrics that go beyond citations of articles—to foster Open Science; 2) Incentives and rewards for researchers to engage in Open Science activities; 3) Guidelines for developing and implementing national policies for Open Science. It identifies good practices, lists priorities and outlines potential courses of action for the best possible transition to Open Science.

"Can Microsoft Academic Assess the Early Citation Impact of In-Press Articles? A Multi-Discipline Exploratory Analysis"

Kayvan Kousha et al. have self-archived "Can Microsoft Academic Assess the Early Citation Impact of In-Press Articles? A Multi-Discipline Exploratory Analysis."

Here's an excerpt:

For over 65,000 Scopus in-press articles from 2016 and 2017 across 26 fields, Microsoft Academic found 2-5 times as many citations as Scopus, depending on year and field. From manual checks of 1,122 Microsoft Academic citations not found in Scopus, Microsoft Academic's citation indexing was faster but not much wider than Scopus for journals. It achieved this by associating citations to preprints with their subsequent in-press versions and by extracting citations from in-press articles. In some fields its coverage of scholarly digital libraries, such as arXiv.org, was also an advantage. Thus, Microsoft Academic seems to be a more comprehensive automatic source of citation counts for in-press articles than Scopus.

"Prevalence and Citation Advantage of Gold Open Access in the Subject Areas of the Scopus Database"

Pablo Dorta-González and Yolanda Santana-Jiménez have self-archived "Prevalence and Citation Advantage of Gold Open Access in the Subject Areas of the Scopus Database."

Here's an excerpt:

In the present paper, an analysis of gold OA across all areas of research (the 27 subject areas of the Scopus database) is presented. As a novel contribution, this paper takes a journal-level approach to assessing the OA citation advantage, whereas many others take a paper-level approach. Data were obtained from Scimago Lab, sorted using the Scopus database, and tagged as OA/non-OA using the DOAJ list. Jointly with the OA citation advantage, OA prevalence as well as the differences between access types (OA vs. non-OA) in production and referencing are tested. A total of 3,737 OA journals (16.8%) and 18,485 non-OA journals (83.2%) published in 2015 are considered. As the main conclusion, there is no generalizable gold OA citation advantage at the journal level.

COUNTER Code of Practice, Release 5

COUNTER has released "COUNTER Code of Practice, Release 5."

Here's an excerpt:

Release 4 is the current Code of Practice and the requirement for COUNTER-compliance. The effective date for compliance with Release 5 is January 2019. The Transition Timeline and Transition Options graphics explain the details.

"DataCite as a Novel Bibliometric Source: Coverage, Strengths and Limitations"

Nicolas Robinson-Garcia et al. have self-archived "DataCite as a Novel Bibliometric Source: Coverage, Strengths and Limitations."

Here's an excerpt:

This paper explores the characteristics of DataCite to determine its possibilities and potential as a new bibliometric data source to analyze the scholarly production of open data. Open science and the increasing data-sharing requirements from governments, funding bodies, institutions, and scientific journals have led to a pressing demand for the development of data metrics. As a very first step towards reliable data metrics, we need to better comprehend the limitations and caveats of the information provided by sources of open data. In this paper, we critically examine records downloaded from DataCite's OAI API and elaborate a series of recommendations regarding the use of this source for bibliometric analyses of open data. We highlight issues related to metadata incompleteness, lack of standardization, and ambiguous definitions of several fields. Despite these limitations, we emphasize DataCite's value and potential to become one of the main sources for data metrics development.
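The metadata-incompleteness checks described above can be illustrated with a minimal sketch: given a highly simplified, namespace-free DataCite-style record, flag required fields that are missing or empty. The sample record and the required-field list are illustrative assumptions; real DataCite records use the kernel XML namespaces and a much richer schema.

```python
import xml.etree.ElementTree as ET

# A simplified, invented record; real DataCite records are namespaced and richer.
SAMPLE = """
<resource>
  <identifier identifierType="DOI">10.1234/example</identifier>
  <creators><creator><creatorName>Doe, J.</creatorName></creator></creators>
  <titles><title>Example dataset</title></titles>
  <publicationYear></publicationYear>
  <resourceType resourceTypeGeneral=""></resourceType>
</resource>
"""

REQUIRED = ["identifier", "creators", "titles", "publicationYear", "resourceType"]

def completeness_report(xml_text):
    """Return the required fields that are missing or empty in one record."""
    root = ET.fromstring(xml_text)
    missing = []
    for field in REQUIRED:
        node = root.find(field)
        if node is None:
            missing.append(field)
            continue
        has_text = node.text is not None and node.text.strip()
        if not has_text and len(node) == 0:  # no text and no child elements
            missing.append(field)
    return missing

print(completeness_report(SAMPLE))  # ['publicationYear', 'resourceType']
```

Aggregating such per-record reports over a full OAI harvest is one way to quantify the incompleteness the authors describe before trusting the source for bibliometric counts.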

"A Century of Science: Globalization of Scientific Collaborations, Citations, and Innovations"

Yuxiao Dong, Hao Ma, Zhihong Shen, and Kuansan Wang have self-archived "A Century of Science: Globalization of Scientific Collaborations, Citations, and Innovations."

Here's an excerpt:

In this work, we study the evolution of scientific development over the past century by presenting an anatomy of 89 million digitalized papers published between 1900 and 2015. We find that science has benefited from the shift from individual work to collaborative effort, with over 90% of the world-leading innovations generated by collaborations in this century, nearly four times higher than they were in the 1900s. We discover that rather than the frequent myopic- and self-referencing that was common in the early 20th century, modern scientists instead tend to look for literature further back and farther around. Finally, we also observe the globalization of scientific development from 1900 to 2015, including 25-fold and 7-fold increases in international collaborations and citations, respectively, as well as a dramatic decline in the dominant accumulation of citations by the US, the UK, and Germany, from 95% to 50% over the same period.

"Transitioning from a Conventional to a 'Mega' Journal: A Bibliometric Case Study of the Journal Medicine"

Simon Wakeling et al. have published "Transitioning from a Conventional to a 'Mega' Journal: A Bibliometric Case Study of the Journal Medicine" in Publications.

Here's an excerpt:

This study compares the bibliometric profile of the journal Medicine before and after its transition to the OAMJ model. Three standard modes of bibliometric analysis are employed, based on data from Web of Science: journal output volume, author characteristics, and citation analysis. The journal’s article output is seen to have grown hugely since its conversion to an OAMJ, a rise driven in large part by authors from China. Articles published since 2015 have fewer citations, and are cited by lower impact journals than articles published before the OAMJ transition.

"What Makes Papers Visible on Social Media? An Analysis of Various Document Characteristics"

Zohreh Zahedi et al. have self-archived "What Makes Papers Visible on Social Media? An Analysis of Various Document Characteristics."

Here's an excerpt:

In this study we have investigated the relationship between different document characteristics and the number of Mendeley readership counts, tweets, Facebook posts, and mentions in blogs and mainstream media for 1.3 million papers published in journals covered by the Web of Science (WoS). The study aims to demonstrate how factors affecting various social media-based indicators differ from those influencing citations, and which document types are more popular across different platforms. Our results highlight the heterogeneous nature of altmetrics, which encompasses different types of uses and user groups engaging with research on social media.

"The Coverage of Microsoft Academic: Analyzing the Publication Output of a University"

Sven E. Hug and Martin P. Braendle have self-archived "The Coverage of Microsoft Academic: Analyzing the Publication Output of a University."

Here's an excerpt:

This is the first in-depth study on the coverage of Microsoft Academic (MA). The coverage of a verified publication list of a university was analyzed on the level of individual publications in MA, Scopus, and Web of Science (WoS). Citation counts were analyzed and issues related to data retrieval and data quality were examined. . . . MA surpasses Scopus and WoS clearly with respect to book-related document types and conference items but falls slightly behind Scopus with regard to journal articles. MA shows the same biases as Scopus and WoS with regard to the coverage of the social sciences and humanities, non-English publications, and open-access publications. Rank correlations of citation counts are high between MA and the benchmark databases. . . . Given the fast and ongoing development of MA, we conclude that MA is on the verge of becoming a bibliometric superpower. However, comprehensive studies on the quality of MA data are still lacking.

"A Data Citation Roadmap for Scientific Publishers"

Helena Cousijn et al. have self-archived "A Data Citation Roadmap for Scientific Publishers."

Here's an excerpt:

This article presents a practical roadmap for scholarly publishers to implement data citation in accordance with the Joint Declaration of Data Citation Principles (JDDCP), a synopsis and harmonization of the recommendations of major science policy bodies. It was developed by the Publishers Early Adopters Expert Group as part of the Data Citation Implementation Pilot (DCIP) project, an initiative of FORCE11.org and the NIH BioCADDIE program. The structure of the roadmap presented here follows the 'life of a paper' workflow and includes the categories Pre-submission, Submission, Production, and Publication. The roadmap is intended to be publisher-agnostic so that all publishers can use this as a starting point when implementing JDDCP-compliant data citation.

"CiteScore—Flawed but Still A Game Changer"

Phil Davis has published "CiteScore—Flawed but Still A Game Changer" in The Scholarly Kitchen.

Here's an excerpt:

The CiteScore metric is controversial because of its overt biases against journals that publish a lot of front-matter. Nevertheless, for most academic journals, CiteScore will provide rankings that are similar to the Impact Factor.
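The front-matter bias follows directly from how CiteScore was originally computed: citations in one year to items published in the prior three years, divided by all items published in those years, front matter included. A toy calculation with invented figures shows how rarely cited editorials and news dilute the score:

```python
def citescore(citations_to_prior_3yrs, docs_by_type):
    """CiteScore as originally launched: citations received in year Y to items
    published in Y-1..Y-3, divided by ALL items published in those years,
    front matter included. All figures here are invented, not real journal data."""
    return citations_to_prior_3yrs / sum(docs_by_type.values())

docs = {"research articles": 300, "reviews": 40, "editorials": 60, "news": 100}
print(citescore(2500, docs))  # 2500 / 500 = 5.0

# Excluding rarely cited front matter from the denominator (roughly what the
# Impact Factor does) raises the score for a front-matter-heavy journal:
peer_reviewed = {k: v for k, v in docs.items() if k in ("research articles", "reviews")}
print(round(citescore(2500, peer_reviewed), 2))  # 2500 / 340 = 7.35
```

A journal publishing mostly research articles sees little difference between the two denominators, which is consistent with the post's claim that rankings stay similar for most academic journals.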

"Does Evaluative Scientometrics Lose Its Main Focus on Scientific Quality by the New Orientation towards Societal Impact?"

Lutz Bornmann and Robin Haunschild have published "Does Evaluative Scientometrics Lose Its Main Focus on Scientific Quality by the New Orientation towards Societal Impact?" in Scientometrics.

Here's an excerpt:

In this Short Communication, we have outlined that the current revolution in scientometrics implies not only a broadening of the impact perspective, but also a devaluation of quality considerations in evaluative contexts. Impact might no longer be seen as a proxy for quality, but in its original sense: simple resonance in some sectors of society. This is an alarming development, because fraudulent research is definitely of low quality, but can be expected to have great resonance if measured in terms of altmetrics.

"Altmetrics and Grey Literature: Perspectives and Challenges"

Joachim Schöpfel and Hélène Prost have self-archived "Altmetrics and Grey Literature: Perspectives and Challenges."

Here's an excerpt:

The topic of our paper is the connection between altmetrics and grey literature. Do altmetrics offer new opportunities for the development and impact of grey literature? In particular, the paper explores how altmetrics could add value to grey literature, especially how reference managers, repositories, academic search engines, and social networks can produce altmetrics for dissertations, reports, conference papers, etc. We also explore whether, and how, new altmetric tools incorporate grey literature as a source for impact assessment. The discussion analyses the potential but also the limits of the actual application of altmetrics to grey literature and highlights the importance of unique identifiers, above all the DOI.

"Undercounting File Downloads from Institutional Repositories"

Patrick Obrien et al. have published "Undercounting File Downloads from Institutional Repositories" in the Journal of Library Administration.

Here's an excerpt:

A primary impact metric for institutional repositories (IR) is the number of file downloads, which are commonly measured through third-party Web analytics software. Google Analytics, a free service used by most academic libraries, relies on HTML page tagging to log visitor activity on Google's servers. However, Web aggregators such as Google Scholar link directly to high-value content (usually PDF files), bypassing the HTML page and failing to register these direct access events. This article presents evidence from a study of four institutions demonstrating that the majority of IR activity is not counted by page-tagging Web analytics software, and proposes a practical solution for significantly improving the reporting relevancy and accuracy of IR performance metrics using Google Analytics.
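A minimal sketch of the server-log alternative: count successful PDF requests directly from access logs, which do register the direct hits from aggregators such as Google Scholar that page-tagging analytics never see. The log lines below are fabricated examples in Apache's combined format; this is an illustration of the undercounting mechanism, not the article's proposed solution.

```python
import re

# Minimal pattern over Apache "combined" log lines; we only need path and status.
LOG_RE = re.compile(r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_direct_pdf_downloads(log_lines):
    """Count successful PDF requests in raw server logs. A direct PDF hit
    never loads the HTML landing page, so page-tagging analytics such as
    Google Analytics cannot observe it; the server log still can."""
    count = 0
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("status") == "200" and m.group("path").lower().endswith(".pdf"):
            count += 1
    return count

lines = [
    '1.2.3.4 - - [01/Jan/2018:00:00:01 +0000] "GET /ir/1234/thesis.pdf HTTP/1.1" 200 52311 "-" "Mozilla"',
    '1.2.3.4 - - [01/Jan/2018:00:00:02 +0000] "GET /ir/1234/ HTTP/1.1" 200 8311 "-" "Mozilla"',
    '5.6.7.8 - - [01/Jan/2018:00:00:03 +0000] "GET /ir/999/report.pdf HTTP/1.1" 404 211 "-" "Mozilla"',
]
print(count_direct_pdf_downloads(lines))  # 1
```

Comparing such log-derived counts against what page tagging reports for the same period is one straightforward way to estimate the undercounting the study documents.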

Outputs of the NISO Alternative Assessment Project

NISO has released Outputs of the NISO Alternative Assessment Project.

Here's an excerpt from the announcement:

The National Information Standards Organization has published NISO RP-25-2016, Outputs of the NISO Alternative Assessment Project. This recommended practice on altmetrics, an expansion of the tools available for measuring the scholarly impact of research in the knowledge environment, was developed by working groups that were part of NISO's Altmetrics Initiative, a project funded by the Alfred P. Sloan Foundation. The document outlines altmetrics definitions and use cases, alternative outputs in scholarly communications, data metrics, and persistent identifiers in scholarly communications. This guidance was necessary because, before the project began, scholars had long expressed dissatisfaction with traditional measures of success, such as the Impact Factor, but needed standards relating to other viable assessment methods.

"Citation Analysis with Microsoft Academic"

Sven E. Hug, Michael Ochsner, and Martin P. Braendle have self-archived "Citation Analysis with Microsoft Academic."

Here's an excerpt:

We explored if and how Microsoft Academic (MA) could be used for bibliometric analyses. First, we examined the Academic Knowledge API (AK API), an interface to access MA data. Second, we performed a comparative citation analysis of researchers by normalizing data from MA and Scopus. We found that MA offers structured and rich metadata, which facilitates data retrieval, handling and processing. In addition, the AK API allows retrieving histograms. These features have to be considered a major advantage of MA over Google Scholar. However, there are two serious limitations regarding the available metadata. First, MA does not provide the document type of a publication and, second, the 'fields of study' are dynamic, too fine-grained and field-hierarchies are incoherent. Nevertheless, we showed that average-based indicators as well as distribution-based indicators can be calculated with MA data. We postulate that MA has the potential to be used for fully-fledged bibliometric analyses.
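The average-based indicators the authors mention can be sketched as a Mean Normalized Citation Score: each paper's citation count divided by its field's average, then averaged over the paper set. The field labels and baseline values below are invented for illustration; the authors' actual normalization procedure is not specified in the excerpt.

```python
def mncs(papers, field_baselines):
    """Mean Normalized Citation Score: each paper's citations divided by its
    field's average citation rate (the baseline), averaged over all papers.
    A score of 1.0 means citation impact at the average for the field."""
    scores = [cites / field_baselines[field] for cites, field in papers]
    return sum(scores) / len(scores)

# Invented baselines: average citations per paper in each field.
baselines = {"physics": 10.0, "history": 2.0}
papers = [(20, "physics"), (5, "physics"), (4, "history")]
print(mncs(papers, baselines))  # (2.0 + 0.5 + 2.0) / 3 = 1.5
```

The metadata limitations the excerpt notes matter precisely here: incoherent field hierarchies make the baseline assignment, and hence the normalized score, unstable.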

"Measuring Scientific Impact Beyond Citation Counts"

Robert M. Patton, Christopher G. Stahl and Jack C. Wells have published "Measuring Scientific Impact Beyond Citation Counts" in D-Lib Magazine.

Here's an excerpt:

The measurement of scientific progress remains a significant challenge, exacerbated by the use of multiple different types of metrics that are often incorrectly used, overused, or even explicitly abused. Several metrics such as the h-index or journal impact factor (JIF) are often used as a means to assess whether an author, article, or journal creates an "impact" on science. Unfortunately, external forces can be used to manipulate these metrics, thereby diluting the value of their intended, original purpose. This work highlights these issues and the need to more clearly define "impact," as well as the need for better metrics that leverage full content analysis of publications.

"Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level"

B. Ian Hutchins et al. have published "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level" in PLoS Biology.

Here's an excerpt:

Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network to field-normalize the number of citations it has received. Article citation rates are divided by an expected citation rate that is derived from performance of articles in the same field and benchmarked to a peer comparison group. The resulting Relative Citation Ratio is article level and field independent and provides an alternative to the invalid practice of using journal impact factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010 and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
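Stripped of its benchmarking regression, the core of the method is a ratio of observed to expected citation rates, with the expectation drawn from the article's co-citation network. The sketch below is a simplified reading of that idea; all numbers are invented, and the published RCR additionally benchmarks the expectation against NIH-funded peer articles, a step omitted here.

```python
def relative_citation_ratio(article_cites_per_year, cocitation_field_rates):
    """Simplified RCR sketch: observed citations per year divided by the
    expected rate, where the expectation is the mean citation rate of the
    journals appearing in the article's co-citation network. The published
    method adds a regression-based benchmark against NIH-funded articles."""
    expected = sum(cocitation_field_rates) / len(cocitation_field_rates)
    return article_cites_per_year / expected

# Invented journal-level rates from a hypothetical co-citation network:
field_rates = [4.0, 5.0, 6.0]  # expected rate: 5.0 citations/year
print(relative_citation_ratio(10.0, field_rates))  # 10.0 / 5.0 = 2.0
```

Because the field definition comes from each article's own co-citation network rather than a fixed journal category, the denominator adapts to interdisciplinary work, which is what makes the ratio field-independent.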
