"Practice Meets Principle: Tracking Software and Data Citations to Zenodo DOIs"

Stephanie van de Sandt et al. have self-archived "Practice Meets Principle: Tracking Software and Data Citations to Zenodo DOIs."

Here's an excerpt:

Data and software citations are crucial for the transparency of research results and for the transmission of credit. But they are hard to track because there is no common citation standard. As a consequence, FORCE11 recently proposed data and software citation principles as guidance for authors. Zenodo is recognized for implementing DOIs for software on a large scale. Minting complementary DOIs for each version and for the overall concept makes it possible to measure the impact of dynamic software. This article investigates characteristics of 5,456 citations to Zenodo data and software that were captured by the Asclepias Broker in January 2019. We analyzed the current state of data and software citation practices and the quality of software citation recommendations with regard to the impact of recent standardization efforts. Our findings show that current citation practices and recommendations do not match the proposed citation standards. We consequently suggest practical first steps towards the implementation of the software citation principles.

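As a rough illustration of the version/concept DOI mechanism the paper builds on (this sketch is ours, not the authors'): Zenodo's public REST API can list every version DOI that hangs off a single concept DOI. The concept DOI below is a placeholder, and the query and field names are assumptions based on the public API as of this writing.

```python
# A minimal sketch (not from the paper): list every version DOI that shares
# one Zenodo concept DOI, via the public Zenodo REST API. The concept DOI is
# a placeholder; field names are assumptions based on the API as of writing.
import requests

CONCEPT_DOI = "10.5281/zenodo.1234"  # hypothetical concept DOI

resp = requests.get(
    "https://zenodo.org/api/records",
    params={
        "q": f'conceptdoi:"{CONCEPT_DOI}"',  # match every record under the concept
        "all_versions": "true",              # include superseded versions too
        "size": 100,
    },
    timeout=30,
)
resp.raise_for_status()

for hit in resp.json()["hits"]["hits"]:
    # Each version carries its own DOI; all versions share the same concept DOI,
    # so citations can be aggregated per version or for the software as a whole.
    print(hit.get("doi"), "->", hit.get("conceptdoi"))
```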

"Institutional Repositories and the Item and Research Data Metrics Landscape"

Paul Needham and Jo Lambert have published "Institutional Repositories and the Item and Research Data Metrics Landscape" in Insights.

Here's an excerpt:

The success of COUNTER in supporting the adoption of a standard to measure e-resource usage over the past 15 years is apparent within the scholarly communications community. The prevalence of global OA policies and mandates, and the role of institutional repositories within this context, prompts demand for more granular metrics. It also raises the profile of sharing item-level usage data and research data metrics. The need for reliable and authoritative measures is paramount. This burgeoning interest is complemented by a number of initiatives to explore the measurement and tracking of usage of a broad range of objects outside traditional publisher platforms. Drawing on examples such as OpenAIRE, IRUS-UK, Crossref's Distributed Usage Logging and Event Data service, and COAR Next Generation Repositories, this article provides a brief introduction to and overview of developments in this area.

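For readers who want to see what one of these services exposes, here is a minimal sketch (ours, not the article's) that pulls events referencing a single DOI from Crossref's Event Data service. The DOI and contact email are placeholders; the response shape reflects the public API as of this writing.

```python
# A minimal sketch (not from the article): fetch events that reference a DOI
# from Crossref's Event Data service. DOI and email are placeholders; the
# response shape ("message" -> "events") reflects the public API as of writing.
import requests

DOI = "10.1000/example.doi"  # hypothetical target DOI

resp = requests.get(
    "https://api.eventdata.crossref.org/v1/events",
    params={
        "obj-id": DOI,                # events whose object is this DOI
        "mailto": "you@example.org",  # polite identification, per API etiquette
        "rows": 100,
    },
    timeout=30,
)
resp.raise_for_status()

for event in resp.json()["message"]["events"]:
    # Each event records which source referenced the DOI, how, and when.
    print(event["source_id"], event["relation_type_id"], event["occurred_at"])
```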

"Citation Counts and Journal Impact Factors Do Not Capture Research Quality in the Behavioral and Brain Sciences"

Michael Dougherty and Zachary Horne have self-archived "Citation Counts and Journal Impact Factors Do Not Capture Research Quality in the Behavioral and Brain Sciences."

Here's an excerpt:

Citation data and journal impact factors are important components of faculty dossiers and figure prominently in both promotion decisions and assessments of a researcher's broader societal impact. Although these metrics play a large role in high-stakes decisions, the evidence is mixed regarding whether they are valid proxies for research quality. We use data from three large-scale studies to assess the degree to which citation counts and impact factors predict four indicators of research quality: (1) the number of statistical reporting errors in a paper, (2) the evidential value of the reported data, (3) the expected replicability of research findings reported in peer-reviewed journals, and (4) the actual replicability of a given experimental result. Both citation counts and impact factors were weak and inconsistent predictors of research quality and were sometimes negatively related to quality. Our findings impugn the validity of citation data and impact factors as indices of research quality and call into question their usefulness in evaluating scientists and their research. In light of these results, we argue that research evaluation should instead focus on the process of how research is conducted and incentivize behaviors that support open, transparent, and reproducible research.

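To make the kind of analysis concrete (this sketch is ours, with made-up toy numbers, not the study's data): a rank correlation between citation counts and a quality indicator such as statistical reporting errors is one simple way to test whether citations track quality.

```python
# A toy sketch of the kind of test the authors describe, with made-up numbers:
# does a paper's citation count rank-predict a quality indicator (here, the
# number of statistical reporting errors)? These arrays are illustrative only.
from scipy.stats import spearmanr

citations = [3, 12, 45, 7, 150, 22, 0, 88]   # hypothetical citation counts
errors = [2, 0, 3, 1, 4, 0, 1, 2]            # hypothetical reporting errors

rho, p_value = spearmanr(citations, errors)
# If citations tracked quality, rho would be clearly negative (more citations,
# fewer errors); a rho near zero or positive would echo the paper's finding.
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```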

"Citation Advantage for Open Access Articles in European Radiology"

Rayan H. M. Alkhawtani et al. have published "Citation Advantage for Open Access Articles in European Radiology" in European Radiology.

Here's an excerpt:

The results of our study show that open access articles in European Radiology are cited significantly and independently more frequently than subscription access articles. Several factors can explain this: open access by definition requires no journal subscription or fee to read an article; it offers potentially faster and easier access even to subscribers, because there is no need to log in; and open access articles are also published in PubMed Central, which improves their visibility. Altogether, this may increase the number of article reads and subsequent citations.
