"Research Productivity Among Scholarly Communication Librarians"


Introduction: A growing number of academic libraries have specialized their support for scholarly communication by creating new positions or by expanding units with a focus on providing relevant services. This study was undertaken to explore the extent to which librarians with scholarly communication responsibilities produce research and scholarship, their motivations for doing so, the nature of that productivity, and the perceived impact of that activity on their professional responsibilities. Methods: The authors administered a survey to librarians who identified scholarly communication as their primary job responsibility. Results: Almost all study participants produced their own scholarly work. However, a high percentage indicated that they received no relevant training in their library degree programs, and the majority experienced imposter syndrome pertaining to their own scholarship. Although most respondents were motivated to produce research by institutional expectations for promotion and tenure, greater percentages were driven by personal or professional interests. In addition, participants indicated a strong connection between producing their own scholarship and their ability to effectively carry out their professional responsibilities. Discussion: There may be an emerging convention for scholarly communication librarianship, i.e., one that includes open education services. Findings suggest a need for scholarly communication training to be more prominent in library degree programs. They also point to the utility of making research production a job requirement, regardless of institutional expectations for professional advancement. Conclusion: The authors argue for adjustments in library education curricula and the inclusion of research production in the portfolios of scholarly communication librarians. Future research directions are proposed.

https://doi.org/10.31274/jlsc.15621

| Research Data Publication and Citation Bibliography | Research Data Sharing and Reuse Bibliography | Research Data Curation and Management Bibliography | Digital Scholarship |

"Open Science, Closed Doors: The Perils and Potential of Open Science for Research in Practice"


This paper advocates for the value of open science in many areas of research. However, after briefly reviewing the fundamental principles underlying open science practices and their use and justification, the paper identifies four incompatibilities between those principles and scientific progress through applied research. The incompatibilities concern barriers to sharing and disclosure, limitations and deficiencies of overidentifying with hypothetico-deductive methods of inference, the paradox of replication efforts resulting in less robust findings, and changes to the professional research and publication culture such that it will narrow in favor of a specific style of research. Seven recommendations are presented to maximize the value of open science while minimizing its adverse effects on the advancement of science in practice.

https://doi.org/10.1017/iop.2022.61


"Community Consensus on Core Open Science Practices to Monitor in Biomedicine"


The state of open science needs to be monitored to track changes over time and identify areas to create interventions to drive improvements. In order to monitor open science practices, they first need to be well defined and operationalized. To reach consensus on what open science practices to monitor at biomedical research institutions, we conducted a modified 3-round Delphi study. Participants were research administrators, researchers, specialists in dedicated open science roles, and librarians. In rounds 1 and 2, participants completed an online survey evaluating a set of potential open science practices, and for round 3, we hosted two half-day virtual meetings to discuss and vote on items that had not reached consensus. Ultimately, participants reached consensus on 19 open science practices. This core set of open science practices will form the foundation for institutional dashboards and may also be of value for the development of policy, education, and interventions.

https://doi.org/10.1371/journal.pbio.3001949


"Challenges of Qualitative Data Sharing in Social Sciences"


Open science offers hope for new accountability and transparency in the social sciences. Nevertheless, it still fails to fully consider the complexities of qualitative research, as exemplified by a reflection on sensitive qualitative data sharing. As a result, the developing patterns of rewards and sanctions promoting open science raise concern that quantitative research, whose "replication crisis" brought the open science movement to life, will benefit from "good science" re-evaluations at the expense of other research epistemologies. Accountability and transparency in the social sciences instead need to be defined more widely, and not conflated with either reproducibility or data sharing.

bit.ly/3j6NTTV


"Phase 1 of the NIH Preprint Pilot: Testing the Viability of Making Preprints Discoverable in PubMed Central and PubMed"


Introduction: The National Library of Medicine (NLM) launched a pilot in June 2020 to 1) explore the feasibility and utility of adding preprints to PubMed Central (PMC) and making them discoverable in PubMed and 2) support accelerated discoverability of NIH-supported research without compromising user trust in NLM’s widely used literature services. Methods: The first phase of the Pilot focused on archiving preprints reporting NIH-supported SARS-CoV-2 virus and COVID-19 research. To launch Phase 1, NLM identified eligible preprint servers and developed processes for identifying NIH-supported preprints within scope in these servers. Processes were also developed for the ingest and conversion of preprints in PMC and to send corresponding records to PubMed. User interfaces were modified for display of preprint records. NLM collected data on the preprints ingested and discovery of preprint records in PMC and PubMed and engaged users through focus groups and a survey to obtain direct feedback on the Pilot and perceptions of preprints. Results: Between June 2020 and June 2022, NLM added more than 3,300 preprint records to PMC and PubMed, which were viewed 4 million times and 3 million times, respectively. Nearly a quarter of preprints in the Pilot were not associated with a peer-reviewed published journal article. User feedback revealed that the inclusion of preprints did not have a notable impact on trust in PMC or PubMed. Discussion: NIH-supported preprints can be identified and added to PMC and PubMed without disrupting existing operations processes. Additionally, inclusion of preprints in PMC and PubMed accelerates discovery of NIH research without reducing trust in NLM literature services. Phase 1 of the Pilot provided a useful testbed for studying NIH investigator preprint posting practices, as well as knowledge gaps among user groups, during the COVID-19 public health emergency, an unusual time with heightened interest in immediate access to research results.

https://doi.org/10.1101/2022.12.12.520156


"Guest Post — How Do We Measure Success for Open Science?"


If the success of an innovation relates to the practice of Open Science — which at PLOS is about much more than reputation; it’s central to our mission — then what does success look like? And how do you measure it at the publisher scale? Indeed, to make progress towards any goal, good data are needed, including a view of your current and desired future states. Unfortunately, as recently as last year, there were no tools or services that could tell us everything we wanted to know, at PLOS, about Open Science practices. . . . This is, in part, why we developed and have recently shared the initial results of our "Open Science Indicators" initiative.

bit.ly/3PlXWAR


"Evaluation of Publication of COVID-19–Related Articles Initially Presented as Preprints"


In this study, we identified 3343 COVID-19–related preprints posted on medRxiv in 2020. Our March 2022 search indicated that 1712 of those preprints (51.2%) were subsequently published in the peer-reviewed literature; this number increased to 1742 (52.1%) when we repeated the search in October 2022. Not considering January 2020, in which only 1 article on COVID-19 was posted, the rate of subsequent publication in a scientific journal ranged from 43.5% (94 of 216 preprints; observed in March 2020) to 60.6% (177 of 292 preprints posted in August 2020). The Table shows the top 25 of 579 peer-reviewed journals in which these preprints were published; 827 preprints (47.5%) were subsequently published in quartile 1 journals (Figure).
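
The percentages above follow directly from the counts quoted in the abstract; as a quick arithmetic check (using only numbers reported there):

```python
# Counts quoted in the abstract: COVID-19 preprints posted on medRxiv in 2020
# and how many were subsequently published in peer-reviewed journals.
TOTAL_PREPRINTS = 3343

def pct(part, whole):
    """Share of `whole` as a percentage, rounded to one decimal place."""
    return round(100 * part / whole, 1)

print(pct(1712, TOTAL_PREPRINTS))  # March 2022 search        -> 51.2
print(pct(1742, TOTAL_PREPRINTS))  # October 2022 search      -> 52.1
print(pct(94, 216))                # lowest month, March 2020 -> 43.5
print(pct(177, 292))               # highest month, Aug. 2020 -> 60.6
print(pct(827, 1742))              # share of published preprints in Q1 journals -> 47.5
```

Note that the 47.5% figure for quartile 1 journals is a share of the 1,742 published preprints, not of all 3,343 preprints posted.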

bit.ly/3HprhIq


Wolters Kluwer: The Path to Open Medicine: Driving Global Health Equity through Medical Research


The paper is divided into three parts. Part 1 traces the historical events that led to the modern system of scientific research, funding, knowledge dissemination, and recognition, which largely confines health and medical knowledge production to those in HICs [high income countries]. By understanding our shared past and the rise of structural barriers to global health equity, we can better inform our shared path to dismantle them. Part 2 takes a clear-eyed look at where the scientific community is now. Are the ideals of Open Medicine playing out as envisioned? Are the benefits of Open Medicine shared amongst all of humanity, or with only a select few? Lastly, Part 3 offers ideas and recommendations for all stakeholders to chart a path to bring Open Medicine into alignment with its goals and aspirations.

https://cutt.ly/E15vETj


"Why We Need Open-Source Science Innovation — Not Patents and Paywalls"


A survey of university professors in Canada found that 81.1 percent of Canadian faculty would trade all IP for an open-source endowed chair, and that 34.4 percent of these faculty would require no additional compensation. Surprisingly, even more American faculty (86.7 percent) would be willing to accept an open-source endowed professorship.

https://cutt.ly/x190Hso


"Are We Undervaluing Open Access by Not Correctly Factoring in the Potentially Huge Impacts of Machine Learning? — An Academic Librarian’s View (I)"


Synopsis: I have recently adjusted my view to the position that the benefits of machine learning techniques are more likely to be real and large. This is based on the recent incredible results of LLMs (large language models) and about a year of experimenting with some of the newly emerging tools based on such technologies.

If I am right about this, are we academic librarians systematically undervaluing open access by not taking this into account sufficiently when negotiating with publishers? Given that we control the purse strings, we are one of the most impactful parties (next to publishers and researchers) in deciding how fast, if at all, the transition to an open access world occurs.

https://cutt.ly/U19MZzK


"Access to Research Data and EU Copyright"


The article seeks to contribute to this aim by exploring the legal framework in which research data can be accessed and used in EU copyright law. First, it delineates the authors’ understanding of research data. It then examines the protection research data currently receives under EU and Member State law via copyright and related rights, as well as the ownership of these rights by different stakeholders in the scientific community. After clarifying relevant conflict-of-laws issues that surround research data, it maps ways to legally access and use them, including statutory exceptions, the open science movement and current developments in law and practice.

bit.ly/3VVx7pg


"New Report on Value and Utility of FAIR Implementation Profiles (FIPs) Available from the WorldFAIR project"


In the WorldFAIR project, CODATA (the Committee on Data of the International Science Council), with the RDA (Research Data Alliance) Association as a major partner, is working with a set of eleven disciplinary and cross-disciplinary case studies to advance implementation of the FAIR principles and, in particular, to improve interoperability and reusability of digital research objects, including data.

To that end, the WorldFAIR project created a range of FAIR Implementation Profiles (FIPs) between July and October 2022 to better understand current FAIR data-related practices. The report, "FAIR Implementation Profiles (FIPs) in WorldFAIR: What Have We Learnt?", is published this week and available at https://doi.org/10.5281/zenodo.7378109.

The report describes the WorldFAIR project, its objectives and its rich set of Case Studies; and it introduces FIPs as a methodology for listing the FAIR implementation decisions made by a given community of practice. Subsequently, the report gives an overview of the initial feedback and findings from the Case Studies, and considers a number of issues and points of discussion that emerged from this exercise. Finally, and most importantly, we describe how we think the experience of using FIPs will assist each Case Study in its work to implement FAIR, and will assist the project as a whole in the development of two key outputs: the Cross-Domain Interoperability Framework (CDIF), and domain-sensitive recommendations for FAIR assessment.

https://cutt.ly/x1NDUAd


"A Landscape of Open Science Policies Research"


This literature review examines how different studies approach open science policy. The main findings are that open science is approached from several angles: policy framing and its geopolitical aspects are described as a tool for replicating asymmetries and for epistemic governance. The main geopolitical aspects of open science policies described in the literature are the relations between international, regional, and national policies. Different components of open science are also covered in the literature: open data is much discussed in English-language works, while open access is the main component discussed in Portuguese- and Spanish-language papers. Finally, the relationship between open science policies and science policy more broadly is framed by highlighting the innovation and transparency that open science can bring to it.

https://doi.org/10.1177/21582440221140358


Federating Research Infrastructures in Europe for Fair Access to Data: Science Europe Briefing on EOSC

The European research and innovation ecosystem is going through a period of profound change. Researchers, organisations that fund or perform research, and policymakers are reshaping the research process and its outputs based on the opportunities offered by the digital transition. The findability, accessibility, interoperability, and reusability (FAIRness) of research publications, data, and software in the digital space will define research and innovation going forward. Closely related, the transition to an open research process and Open Access of its outputs is becoming the ‘new normal’. One of the most prominent initiatives in the digital and open transition of research is the European Open Science Cloud (EOSC). This federation of existing research data infrastructures in Europe aims to create a web of FAIR data and related services for research.

https://doi.org/10.5281/zenodo.7346887


"The Impact of Open and Reproducible Scholarship on Students’ Scientific Literacy, Engagement, and Attitudes Towards Science: A Review and Synthesis of the Evidence"


Currently, the impact of integrating an open and reproducible approach into the curriculum on student outcomes is not well articulated in the literature. Therefore, in this paper, we provide the first comprehensive review of how integrating open and reproducible scholarship into teaching and learning may impact students, using a large-scale, collaborative, team-science approach. Our review highlighted how embedding open and reproducible scholarship may impact: (1) students’ scientific literacies (i.e., students’ understanding of open research, consumption of science, and the development of transferable skills); (2) student engagement (i.e., motivation and engagement with learning, collaboration, and engagement in open research), and (3) students’ attitudes towards science (i.e., trust in science and confidence in research findings). Our review also identified a need for more robust and rigorous methods within evaluations of teaching practice. We discuss implications for teaching and learning scholarship in this area.

https://doi.org/10.31222/osf.io/9e526


"Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals"


Journal policies continuously evolve to enable knowledge sharing and support reproducible science. However, that change happens within a certain framework. The Transparency and Openness Promotion (TOP) guidelines comprise eight modular standards, each with three levels of increasing stringency, which can be used to evaluate the extent to which, and how stringently, journals promote open science. The guidelines define standards for data citation; transparency of data, materials, code, and design and analysis; replication; and plan and study preregistration, as well as two effective interventions: "Registered Reports" and "Open Science Badges". The levels of adoption, summed across standards, define a journal's TOP Factor. In this paper, we analysed the status of adoption of TOP guidelines across two thousand journals reported in the TOP Factor metrics. We show that the majority of journals' policies align with at least one of the TOP standards, most commonly "Data citation" (70%), followed by "Data transparency" (19%). Two-thirds of adoptions of TOP standards are at stringency Level 1 (least stringent), whereas only 9% are at stringency Level 3. Adoption of TOP standards differs across scientific disciplines; multidisciplinary journals (N = 1505) and social science journals (N = 1077) show the greatest number of adoptions. The measures journals take to implement open science practices could be improved in several ways: (1) improvements could be discipline-specific, (2) journals that have not yet adopted TOP guidelines could do so, and (3) the stringency of adoptions could be increased.
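
As described above, a journal's TOP Factor is the sum of its adoption levels (0 to 3) across the modular standards. A minimal sketch, using a hypothetical journal policy for illustration (the standard names below are paraphrased from the abstract, not quoted from the guidelines):

```python
# The eight modular TOP standards, as paraphrased from the abstract. A journal
# policy adopts each at a level from 0 (not implemented) to 3 (most stringent).
TOP_STANDARDS = [
    "Data citation",
    "Data transparency",
    "Code transparency",
    "Materials transparency",
    "Design and analysis transparency",
    "Study preregistration",
    "Analysis plan preregistration",
    "Replication",
]

def top_factor(levels):
    """Sum adoption levels across all standards; unadopted standards count as 0."""
    return sum(levels.get(standard, 0) for standard in TOP_STANDARDS)

# Hypothetical journal matching the common pattern reported above:
# Level 1 adoption of the two most frequently adopted standards, nothing else.
journal = {"Data citation": 1, "Data transparency": 1}
print(top_factor(journal))  # -> 2
```

A journal adopting every standard at Level 3 would score the maximum of 24, while most journals in the study cluster near the bottom of that range.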

https://doi.org/10.3390/publications10040046


"Open Science Infrastructure as a Key Component of Open Science"


The Open Science movement is a response to accumulated problems in scholarly communication, such as the "reproducibility crisis", "serials crisis", and "peer review crisis". The European Commission defines the priorities of Open Science as: Findable, Accessible, Interoperable and Reusable (FAIR) data; infrastructure and services in the European Open Science Cloud (EOSC); next-generation metrics, altmetrics, and rewards; the future of scientific communication; research integrity and reproducibility; education and skills; and citizen science. Open Science Infrastructure is also one of four key components of Open Science defined by UNESCO.

Mainly represented among Open Science Infrastructures are institutional and thematic repositories for publications, research data, software and code. Furthermore, the Open Science Infrastructure services range may include discovery, mining, publishing, the peer review process, archiving and preservation, social networking tools, training, high-performance computing, and tools for processing and analysis. Successful Open Science Infrastructure should be based on community values and responsive to needed changes. Preferably the Open Science Infrastructure should be distributed, enabling machine-actionable tools and services, supporting reusability and reproducibility, quality FAIR data, interoperability, sustainability, long-term preservation and funding.

https://doi.org/10.7557/5.6777


"Why Don’t We Share Data and Code? Perceived Barriers and Benefits to Public Archiving Practices"


Here, we define, categorize and discuss barriers to data and code sharing that are relevant to many research fields. We explore how real and perceived barriers might be overcome or reframed in the light of the benefits relative to costs. By elucidating these barriers and the contexts in which they arise, we can take steps to mitigate them and align our actions with the goals of open science, both as individual scientists and as a scientific community.

https://doi.org/10.1098/rspb.2022.1113


"Reducing Barriers to Open Science by Standardizing Practices and Realigning Incentives"


In this policy position paper, we outline current open science practices and key bottlenecks in their broader adoption. We propose that national science agencies create a digital infrastructure framework that would standardize open science principles and make them actionable. We also suggest ways of redefining research success to align better with open science, and to incentivize a system where sharing various research outputs is beneficial to researchers.

https://doi.org/10.38126/JSPG210201


"Data Quality Assurance at Research Data Repositories"


This paper presents findings from a survey on the status quo of data quality assurance practices at research data repositories.

The personalised online survey was conducted among repositories indexed in re3data in 2021. It covered the scope of the repository, types of data quality assessment, quality criteria, responsibilities, details of the review process, and data quality information and yielded 332 complete responses.

The results demonstrate that most repositories perform data quality assurance measures, and overall, research data repositories significantly contribute to data quality. Quality assurance at research data repositories is multifaceted and nonlinear, and although there are some common patterns, individual approaches to ensuring data quality are diverse. The survey showed that data quality assurance sets high expectations for repositories and requires a lot of resources. Several challenges were discovered: for example, the adequate recognition of the contribution of data reviewers and repositories, the path dependence of data review on review processes for text publications, and the lack of data quality information. The study could not confirm that the certification status of a repository is a clear indicator of whether a repository conducts in-depth quality assurance.

http://doi.org/10.5334/dsj-2022-018


Paywall: "A Comprehensive Review of Open Data Platforms, Prevalent Technologies, and Functionalities"


We will discuss seven major open data platforms: (1) CKAN, (2) DKAN, (3) Socrata, (4) OpenDataSoft, (5) GitHub, (6) Google datasets, and (7) Kaggle. We will evaluate the technological commons, techniques, features, methods, and visualization offered by each tool. In addition, why are these platforms important to users such as providers, curators, and end-users? And what are the key options available on these platforms to publish open data?

https://doi.org/10.1145/3560107.3560142


"Producing Open Data"


Mainly building on our own experience as scholars from different research traditions (life sciences, social sciences and humanities), we describe best-practice approaches for opening up research data. We reflect on common barriers and strategies to overcome them, condensed into a step-by-step guide focused on actionable advice in order to mitigate the costs and promote the benefit of open data on three levels at once: society, the disciplines and individual researchers.

https://doi.org/10.3897/rio.8.e86384


Open Source "Academic Tracker: Software for Tracking and Reporting Publications Associated with Authors and Grants"


In recent years, United States federal funding agencies, including the National Institutes of Health (NIH) and the National Science Foundation (NSF), have implemented public access policies to make research supported by funding from these federal agencies freely available to the public. Enforcement is primarily through annual and final reports submitted to these funding agencies, where all peer-reviewed publications must be registered through the appropriate mechanism as required by the specific federal funding agency. Unreported and/or incorrectly reported papers can result in delayed acceptance of annual and final reports and even funding delays for current and new research grants. So, it’s important to make sure every peer-reviewed publication is reported properly and in a timely manner. For large collaborative research efforts, the tracking and proper registration of peer-reviewed publications along with generation of accurate annual and final reports can create a large administrative burden. With large collaborative teams, it is easy for these administrative tasks to be overlooked, forgotten, or lost in the shuffle. In order to help with this reporting burden, we have developed the Academic Tracker software package, implemented in the Python 3 programming language and supporting Linux, Windows, and Mac operating systems. Academic Tracker helps with publication tracking and reporting by comprehensively searching major peer-reviewed publication tracking web portals, including PubMed, Crossref, ORCID, and Google Scholar, given a list of authors. Academic Tracker provides highly customizable reporting templates so information about the resulting publications is easily transformed into appropriate formats for tracking and reporting purposes. 
The source code and extensive documentation are hosted on GitHub (https://moseleybioinformaticslab.github.io/academic_tracker/) and are also available on the Python Package Index (https://pypi.org/project/academic_tracker) for easy installation.
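
Academic Tracker's own interface is documented at the links above. Purely to illustrate the kind of author-based publication query such a tool automates, here is a minimal sketch against the public Crossref REST API; the helper names are ours, not the package's, and the sample response is trimmed to the relevant fields:

```python
import urllib.parse

CROSSREF_WORKS = "https://api.crossref.org/works"

def author_query_url(author, rows=20):
    """Build a Crossref works query URL for publications matching an author name."""
    params = {"query.author": author, "rows": rows}
    return CROSSREF_WORKS + "?" + urllib.parse.urlencode(params)

def extract_publications(response):
    """Pull (DOI, title) pairs out of a Crossref works response dict."""
    items = response.get("message", {}).get("items", [])
    return [(item["DOI"], item["title"][0]) for item in items if item.get("title")]

# A trimmed-down response of the shape the Crossref works endpoint returns:
sample = {"message": {"items": [
    {"DOI": "10.1371/journal.pone.0277834",
     "title": ["Academic Tracker: Software for Tracking and Reporting "
               "Publications Associated with Authors and Grants"]},
]}}
print(author_query_url("Moseley"))
print(extract_publications(sample))
```

Academic Tracker performs this kind of search across several portals (PubMed, Crossref, ORCID, Google Scholar) for a whole author list and then feeds the results into customizable report templates.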

https://doi.org/10.1371/journal.pone.0277834


"Nature Authors Can Now Seamlessly Share Their Data"


In April of this year, Springer Nature and Figshare announced a new integrated route for data deposition at Nature Portfolio titles to help address this problem and encourage researchers to share data rather than seeing it as a hurdle to article publication.

Following the success of the pilot, this streamlined integration is now being extended. Authors submitting to the Nature Portfolio journals, including Nature, in the fields of life, health, chemical and physical sciences will now be able to easily opt into data sharing, via Figshare, as part of one integrated submission process.

https://cutt.ly/RMTKcpo


"Research Data Management Needs Assessment of Clemson University"


The faculty, staff, and graduate students at Clemson University were surveyed by the library about their RDM needs in the spring of 2021. The survey was based on previous surveys from 2012 and 2016 to allow for comparison, but language was updated, and additional questions were added because the field of RDM has evolved. Survey findings indicated that researchers are overall more likely to back up and share their data, but the process of cleaning and preparing the data for sharing was an obstacle. Few researchers reported including metadata when sharing or consulting the library for help with writing a Data Management Plan (DMP). Researchers want RDM resources; offering and effectively marketing those resources will enable libraries to both support researchers and encourage best practices. Understanding researcher needs and offering time-saving services and convenient training options makes following RDM best practices easier for researchers. Outreach and integrated partnerships that support the research life cycle are crucial next steps for ensuring effective data management.

https://doi.org/10.31274/jlsc.13970
