"Practice Meets Principle: Tracking Software and Data Citations to Zenodo DOIs"

Stephanie van de Sandt et al. have self-archived "Practice Meets Principle: Tracking Software and Data Citations to Zenodo DOIs."

Here's an excerpt:

Data and software citations are crucial for the transparency of research results and for the assignment of credit. But they are hard to track because there is no common citation standard. As a consequence, FORCE11 recently proposed data and software citation principles as guidance for authors. Zenodo is recognized for implementing DOIs for software on a large scale. The minting of complementary DOIs for the version and the concept allows measuring the impact of dynamic software. This article investigates characteristics of 5,456 citations to Zenodo data and software that were captured by the Asclepias Broker in January 2019. We analyzed the current state of data and software citation practices and the quality of software citation recommendations with regard to the impact of recent standardization efforts. Our findings show that current citation practices and recommendations do not match proposed citation standards. We consequently suggest practical first steps towards the implementation of the software citation principles.

Research Data Curation Bibliography, Version 10 | Digital Curation and Digital Preservation Works | Open Access Works | Digital Scholarship | Digital Scholarship Sitemap

"Institutional Repositories and the Item and Research Data Metrics Landscape"

Paul Needham and Jo Lambert have published "Institutional Repositories and the Item and Research Data Metrics Landscape" in Insights.

Here's an excerpt:

The success of COUNTER in supporting adoption of a standard to measure e-resource usage over the past 15 years is apparent within the scholarly communications community. The prevalence of global OA policies and mandates, and the role of institutional repositories within this context, prompts demand for more granular metrics. It also raises the profile of data sharing of item-level usage and research data metrics. The need for reliable and authoritative measures is paramount. This burgeoning interest is complemented by a number of initiatives to explore the measurement and tracking of usage of a broad range of objects outside traditional publisher platforms. Drawing on examples such as OpenAIRE, IRUS-UK, Crossref's Distributed Usage Logging and Event Data service and COAR Next Generation Repositories, this article provides a brief introduction and overview of developments in this area.


"Citation Counts and Journal Impact Factors Do Not Capture Research Quality in the Behavioral and Brain Sciences"

Michael Dougherty and Zachary Horne have self-archived "Citation Counts and Journal Impact Factors Do Not Capture Research Quality in the Behavioral and Brain Sciences."

Here's an excerpt:

Citation data and journal impact factors are important components of faculty dossiers and figure prominently in both promotion decisions and assessments of a researcher's broader societal impact. Although these metrics play a large role in high-stakes decisions, the evidence is mixed regarding whether they are valid proxies for research quality. We use data from three large scale studies to assess the degree to which citation counts and impact factors predict four indicators of research quality: (1) the number of statistical reporting errors in a paper, (2) the evidential value of the reported data, (3) the expected replicability of reported research findings in peer reviewed journals, and (4) the actual replicability of a given experimental result. Both citation counts and impact factors were weak and inconsistent predictors of research quality and sometimes negatively related to quality. Our findings impugn the validity of citation data and impact factors as indices of research quality and call into question their usefulness in evaluating scientists and their research. In light of these results, we argue that research evaluation should instead focus on the process of how research is conducted and incentivize behaviors that support open, transparent, and reproducible research.


"Citation Advantage for Open Access Articles in European Radiology"

Rayan H. M. Alkhawtani et al. have published "Citation Advantage for Open Access Articles in European Radiology" in European Radiology.

Here's an excerpt:

The results of our study show that open access articles in European Radiology are significantly and independently more frequently cited than subscription access articles. This can be explained by several factors: open access by definition does not require a journal subscription or payment of a fee to read the article; open access offers potentially faster and easier article access even to subscribers, because there is no need to log in; and open access articles are also published in PubMed Central, which improves article visibility. Altogether, this may increase the number of article reads and subsequent citations.


"Do Download Reports Reliably Measure Journal Usage? Trusting the Fox to Count Your Hens?"

Alex Wood-Doughty, Ted Bergstrom, and Douglas G. Steigerwald have published "Do Download Reports Reliably Measure Journal Usage? Trusting the Fox to Count Your Hens?" in College & Research Libraries.

Here's an excerpt:

Download rates of academic journals have joined citation counts as commonly used indicators of the value of journal subscriptions. While citations reflect worldwide influence, the value of a journal subscription to a single library is more reliably measured by the rate at which it is downloaded by local users. If reported download rates accurately measure local usage, there is a strong case for using them to compare the cost-effectiveness of journal subscriptions. We examine data for nearly 8,000 journals downloaded at the ten universities in the University of California system over a period of six years. We find that, controlling for the number of articles, publisher, and year of download, the ratio of downloads to citations differs substantially among academic disciplines. After adding academic disciplines to the control variables, there remain substantial "publisher effects," with some publishers reporting significantly more downloads than would be predicted by the characteristics of their journals. These cross-publisher differences suggest that the currently available download statistics, which are supplied by publishers, are not sufficiently reliable to allow libraries to make subscription decisions based on price and reported downloads, at least without making an adjustment for publisher effects in download reports.


"Publication Modalities ‘Article in Press’ and ‘Open Access’ in Relation to Journal Average Citation"

Sara M. González-Betancor and Pablo Dorta-González have self-archived "Publication Modalities 'Article in Press' and 'Open Access' in Relation to Journal Average Citation."

Here's an excerpt:

Two publication practices have become widespread among scientific journals during the past decade: 1. 'article in press' or early view, which allows access to the accepted paper before its formal publication in an issue; 2. 'open access', which allows readers to obtain it freely and free of charge. This paper studies the influence of both publication modalities on the average impact of the journal and its evolution over time. It tries to separate the effect of access on citation into two major parts, the early view effect and the selection effect, and provides some evidence of the positive effect of both. Scopus is used as the database and CiteScore as the measure of journal impact. The prevalence of both publication modalities is quantified. Differences in the average impact factor of groups of journals, according to their publication modalities, are tested. The evolution over time of the citation influence, from 2011 to 2016, is also analysed. Finally, a linear regression to explain the correlation of these publication practices with the CiteScore in 2016, in a ceteris paribus context, is estimated. Our main findings show evidence of a positive correlation between average journal impact and advancing the publication of accepted articles; moreover, this correlation increases over time. The open access modality, in a ceteris paribus context, also correlates positively with average journal impact.


"Developing a Research Data Policy Framework for All Journals and Publishers"

Iain Hrynaszkiewicz et al. have self-archived "Developing a Research Data Policy Framework for All Journals and Publishers."

Here's an excerpt:

More journals and publishers—and funding agencies and institutions—are introducing research data policies. But as the prevalence of policies increases, there is potential to confuse researchers and support staff with numerous or conflicting policy requirements. We define and describe 14 features of journal research data policies and arrange these into a set of six standard policy types or tiers, which can be adopted by journals and publishers to promote data sharing in a way that encourages good practice and is appropriate for their audience's perceived needs. Policy features include coverage of topics such as data citation, data repositories, data availability statements, data standards and formats, and peer review of research data. These policy features and types have been created by reviewing the policies of multiple scholarly publishers, which collectively publish more than 10,000 journals, and through discussions and consensus building with multiple stakeholders in research data policy via the Data Policy Standardisation and Implementation Interest Group of the Research Data Alliance. Implementation guidelines for the standard research data policies for journals and publishers are also provided, along with template policy texts which can be implemented by journals in their Information for Authors and publishing workflows. We conclude with a call for collaboration across the scholarly publishing and wider research community to drive further implementation and adoption of consistent research data policies.


"The Citation Advantage of Linking Publications to Research Data"

Giovanni Colavizza et al. have self-archived "The Citation Advantage of Linking Publications to Research Data."

Here's an excerpt:

We consider 531,889 journal articles published by PLOS and BMC which are part of the PubMed Open Access collection, categorize their data availability statements according to their content and analyze the citation advantage of different statement categories via regression. We find that, following mandated publisher policies, data availability statements have become common by now, yet statements containing a link to a repository are still just a fraction of the total. We also find that articles with these statements, in particular, can have up to 25.36% higher citation impact on average: an encouraging result for all publishers and authors who make the effort of sharing their data.


"A Bibliometrics Analysis on Big Data Research (2009–2018)"

Zeshui Xu and Dejian Yu have published "A Bibliometrics Analysis on Big Data Research (2009–2018)" in the Journal of Data, Information and Management.

Here's an excerpt:

This paper uses bibliometric analysis and visual analysis methods to systematically study and analyze the big data publications included in the Science Citation Index (SCI) and Social Science Citation Index (SSCI) databases.
