Open access is a scholarly publishing model that has emerged as an alternative to traditional subscription-based journal publishing. This study explores the adoption of the open access movement worldwide and the role that libraries can play in addressing the factors slowing its progress within developing countries. The study draws upon both qualitative data from a focused literature review and quantitative data from major open access platforms. The results indicate that while the open access movement is steadily gaining acceptance worldwide, progress in developing countries within geographical areas such as Africa, Asia and Oceania is considerably slower. Two significant factors are the cost of publishing fees and the lack of institutional open access mandates and policies to encourage uptake. The study provides suggested strategies for academic libraries to help overcome current challenges.
In November, the US Repository Network (USRN) will launch a pilot project aimed at improving the discoverability of articles in repositories. This pilot project involves the use of services from CORE, a not-for-profit aggregator based at The Open University in the UK, to evaluate and improve local repository practices. Additional technical support will be provided by Antleaf Ltd.
As part of the project, CORE will aggregate the metadata and full text of articles from a subset of US repositories, allowing them to be findable through a centralized discovery service with prominent links back to the full text in the original repository. At the same time, the project will assess current practices related to metadata quality, the tracking of Open Access deposits, the use of PIDs, technical support for OAI-PMH, and the adoption of more recent protocols, such as FAIR Signposting. At the level of the centralized aggregation, CORE will enrich the existing US metadata with information from its larger international aggregation. A Dashboard service for participating institutions will be provided, enabling them to assess, validate and monitor their practices.
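To make the OAI-PMH harvesting mentioned above concrete, the following is a minimal sketch of how an aggregator pages through a repository's records: a first `ListRecords` request carries a `metadataPrefix`, and follow-up requests carry only the `resumptionToken` from the previous response. The base URL and token below are placeholders, not real services.

```python
# Minimal OAI-PMH ListRecords request builder. The endpoint URL is a
# hypothetical placeholder; the parameter names follow the OAI-PMH protocol.
from urllib.parse import urlencode

def build_listrecords_url(base_url, metadata_prefix="oai_dc",
                          set_spec=None, resumption_token=None):
    """Build an OAI-PMH ListRecords URL.

    A harvester first sends verb=ListRecords with a metadataPrefix; if the
    response contains a resumptionToken, subsequent requests carry only the
    token until the full record set has been paged through.
    """
    if resumption_token:
        params = {"verb": "ListRecords", "resumptionToken": resumption_token}
    else:
        params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
        if set_spec:
            params["set"] = set_spec
    return f"{base_url}?{urlencode(params)}"

# Initial request, then a follow-up using a (made-up) resumption token.
first = build_listrecords_url("https://repository.example.edu/oai")
follow_up = build_listrecords_url("https://repository.example.edu/oai",
                                  resumption_token="token123")
```

Per the protocol, the token-bearing request deliberately omits `metadataPrefix`; the token alone encodes the harvester's position in the result set.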
This study investigates the use of institutional repositories (IRs) for self-archiving peer-reviewed work in the U15 (an association of fifteen Canadian research-intensive universities). It relates usage to university open access (OA) policy types and publisher policy embargoes. We show that of all articles found in OpenAlex attributed to U15 researchers, 45.1 to 56.6% are available as Gold or Green OA, yet only 0.5 to 10.7% (mean 4.2%) of these can be found on their respective U15 IRs. Our investigation shows a lack of OA policies at most institutions, journal policies with embargoes exceeding 12 months, and incomplete policy information.
The FAIR Principles are a set of good practices to improve the reproducibility and quality of data in an Open Science context. Different sets of indicators have been proposed to evaluate the FAIRness of digital objects, including datasets that are usually stored in repositories or data portals. However, indicators like those proposed by the Research Data Alliance are defined from a high-level perspective that is open to interpretation, and they are not always realistic for particular environments like multidisciplinary repositories. This paper describes FAIR EVA, a new tool developed within the European Open Science Cloud context that is oriented to particular data management systems like open repositories and can be customized to a specific case in a scalable and automatic environment. It aims to be adaptive enough to work for different environments, repository software and disciplines, taking into account the flexibility of the FAIR Principles. As an example, we present the DIGITAL.CSIC repository as the first target of the tool, gathering the particular needs of a multidisciplinary institution as well as its institutional repository.
What’s more, content submitted to Zenodo was previously published automatically within the repository, whether or not it was accepted into a community. Now, when researchers go to publish their outputs, they must select their community and submit their work for peer review before it is made public. Community curators will then review the content to see if it fits within the community, and they even have the capability to improve and correct the metadata to ensure that it meets quality standards. Once the metadata is approved, the content will be published in Zenodo and, consequently, integrated into the OpenAIRE Graph.
Since 2016, the [MSUL] digital repository has been using Faceted Application of Subject Terminology (FAST) subject headings as its primary subject vocabulary. . . The MSUL FAST use case presents some challenges that are not addressed by existing MARC-focused FAST tools. This paper will outline the MSUL digital repository team’s justification for including FAST headings in the digital repository as well as workflows for adding FAST headings to Metadata Object Description Schema (MODS) metadata, their maintenance, and utilization for discovery.
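The paper above describes workflows for adding FAST headings to MODS metadata. As a hedged illustration of what such a step can look like, the sketch below appends a `<subject authority="fast">` element to a MODS record using the standard library. The element names follow the MODS schema; the specific heading, FAST identifier, and workflow shape are illustrative, not MSUL's actual implementation.

```python
# Illustrative sketch: append a FAST subject heading to a MODS record.
# Element names follow the MODS schema; the topic and valueURI are made up
# for demonstration and do not reproduce MSUL's workflow.
import xml.etree.ElementTree as ET

MODS_NS = "http://www.loc.gov/mods/v3"
ET.register_namespace("mods", MODS_NS)

def add_fast_heading(mods_root, topic, fast_uri):
    """Append a <subject authority="fast"> element carrying a topic term
    and a valueURI pointing at the FAST authority record."""
    subject = ET.SubElement(mods_root, f"{{{MODS_NS}}}subject",
                            {"authority": "fast", "valueURI": fast_uri})
    ET.SubElement(subject, f"{{{MODS_NS}}}topic").text = topic
    return subject

record = ET.Element(f"{{{MODS_NS}}}mods")
add_fast_heading(record, "Agriculture",
                 "http://id.worldcat.org/fast/801355")  # illustrative FAST URI
xml_out = ET.tostring(record, encoding="unicode")
```

Storing the FAST URI alongside the term is what makes later maintenance (e.g., reconciling against the authority file) and faceted discovery possible.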
COAR strongly objects to this charge for the following reasons:
- Authors own their manuscripts and should retain their rights. Authors typically hold the copyright to their research, but too often transfer those rights to publishers when publishing their manuscript. When authors retain the copyright to their manuscript, they have the right to disseminate and use their own manuscript as they choose. If authors’ rights are retained, publishers do not own an article’s accepted manuscript (AAM) and researchers should not be duped into paying a fee to exercise a right they already have.
- This fee is in direct contravention with the ethos of open science and scholarship and equity. . .
- ACS is charging $2,500 while providing no added value. The fee is not for any extra service offered. It requires no extra work on the publisher’s side, but is rather an attempt to develop a new revenue stream, while at the same time ACS will be receiving funds from subscriptions and pay-to-access for this same article.
- ACS is creating a false impression about compliance with funder policies. . . . A fee is only required if you want to publish in an ACS journal and sign over your rights.
See ACS’ "Open Access Pricing for Authors: The Power of Choice" for more fee details.
The most important tension that we identified relates to anonymisation of reviewers and authors. In line with the ideas of the Democracy & Transparency school, preprint review services promote more open forms of peer review in which authors and reviewers participate on a more equal basis. However, from the perspective of the Equity & Inclusion school, this raises concerns. To make peer review processes more equitable and inclusive, this school emphasises the importance of enabling anonymisation of reviewers and possibly also authors, which is in tension with the focus on openness and transparency of preprint review services.
While the goal of the OA movement remains sound, the epistemic imbalance in global knowledge creation and access has not abated. This paper reviews the promise of OA, the motivation on which it stands, its consequences and its current state, with particular focus on the contribution of Africa to the global OA movement. The emergence of OA on the continent has been slow, with a mixed fortune of both progress and challenges. Notwithstanding, open access is seen as a development imperative for Africa that offers tremendous opportunities for the continent to actively contribute to global knowledge. A number of universities and research institutions in Africa have adopted open access policies that require their researchers to publish their work in open access journals or repositories. The paper presents a number of open access initiatives and platforms that are actively being deployed to achieve the OA mandate on the continent and concludes with recommendations.
The health of the research enterprise is closely tied to the effectiveness of the scientific and scholarly publishing ecosystem. Policy-, technology-, and market-driven changes in publishing models over the last two decades have triggered a number of disruptions within this ecosystem:
- Ongoing increases in the cost of journal publishing, with dominant open access models shifting costs from subscribers to authors
- Significant consolidation and vertical (supply chain) integration in the publishing industry, and a decline in society-owned subscription journals that have long subsidized scientific and scholarly societies
- A dramatic increase in the number of "predatory" journals with substandard peer review
- Decline in the purchasing power of academic libraries relative to the quantity and cost of published research
To illustrate how researcher behavior, funder policies, and publisher business models and incentives interact, this report presents an historical overview of open access publishing. The report also provides a list of key questions for further investigation to understand, measure, and best prepare for the impact of new policies related to open access in research publishing, categorized into six general areas: access and business models, research data, preprint publishing, peer review, costs to researchers and universities, and infrastructure.
Currently, there is limited research investigating the phenomenon of research data repositories being shut down, and the impact this has on the long-term availability of data. This paper takes an infrastructure perspective on the preservation of research data by using a registry to identify 191 research data repositories that have been closed and presenting information on the shutdown process. The results show that 6.2% of research data repositories indexed in the registry were shut down. The risks resulting in repository shutdown are varied. The median age of a repository when shutting down is 12 years. Strategies to prevent data loss at the infrastructure level are pursued to varying extents. 44% of the repositories in the sample migrated data to another repository, and 12% maintain limited access to their data collection. However, neither strategy is a permanent solution. Finally, the general lack of information on repository shutdown events, as well as the effect on the findability of data and the permanence of the scholarly record, is discussed.
In total, we have 1.8M preprint records in Scopus (as of June 2023) from the following seven preprint servers:
- Research Square
ACS and Elsevier, members of the Coalition for Responsible Sharing, have agreed to a legal settlement with ResearchGate that ensures copyright-compliant sharing of research articles published with ACS or Elsevier on the ResearchGate site. The lawsuits pending against ResearchGate in Germany and the United States are now resolved. The specific terms of the parties’ settlement are confidential.
This survey analyzes the quality of the portable document format (PDF) documents in online repositories in Switzerland, examining their accessibility for people with visual impairments. Two minimal accessibility features were analysed: the PDFs had to have tags and a hierarchical heading structure. The survey also includes interviews with the managers or heads of multiple Swiss universities’ repositories . . . An analysis of interviewee responses indicates an overall lack of awareness of PDF accessibility, and shows that online repositories currently have no concrete plans to address the issue. This paper concludes by presenting a set of recommendations for online repositories to improve the accessibility of their PDF documents.
The growing impact of preprint servers enables the rapid sharing of time-sensitive research. At the same time, it is becoming increasingly difficult to distinguish high-quality, peer-reviewed research from preprints. Although preprints are often later published in peer-reviewed journals, this information is often missing from preprint servers. To overcome this problem, the PreprintResolver was developed, which uses four literature databases (DBLP, SemanticScholar, OpenAlex, and CrossRef / CrossCite) to identify preprint-publication pairs for the arXiv preprint server. . . . Experiments were performed on a sample of 1,000 arXiv preprints from the research field of computer science and without any publication information. . . . The results show that the PreprintResolver was able to resolve 603 out of 1,000 (60.3%) of these preprints. . . . In conclusion, the PreprintResolver is suitable for individual, manually reviewed requests, but less suitable for bulk requests. The PreprintResolver tool (this https URL, Available from 2023-08-01) and source code (this https URL, Accessed: 2023-07-19) are available online.
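At the core of resolving preprint-publication pairs is deciding whether a candidate record returned by a bibliographic database matches the preprint's title. The sketch below shows one simple way such a matching step can work, using fuzzy string similarity over normalized titles; the candidate records, threshold, and matching strategy are illustrative assumptions, not the PreprintResolver's actual algorithm.

```python
# Simplified title-matching step for preprint-publication resolution.
# Candidates stand in for records returned by a bibliographic API; the
# 0.9 threshold is an arbitrary illustrative choice.
from difflib import SequenceMatcher

def normalize(title):
    """Lowercase and collapse whitespace so cosmetic differences don't count."""
    return " ".join(title.lower().split())

def best_match(preprint_title, candidates, threshold=0.9):
    """Return the candidate whose title is most similar, if above threshold."""
    best, best_score = None, 0.0
    for cand in candidates:
        score = SequenceMatcher(None, normalize(preprint_title),
                                normalize(cand["title"])).ratio()
        if score > best_score:
            best, best_score = cand, score
    return best if best_score >= threshold else None

candidates = [
    {"title": "A Survey of Graph Neural Networks", "doi": "10.0000/example.1"},
    {"title": "Deep Learning for Tabular Data", "doi": "10.0000/example.2"},
]
match = best_match("A survey of graph neural networks", candidates)
```

A pure string-similarity approach like this fails exactly in the case the next abstract addresses: preprints published under substantially different titles.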
The COVID-19 pandemic caused a rise in preprinting, triggered by the need for open and rapid dissemination of research outputs. We surveyed authors of COVID-19 preprints to learn about their experiences with preprinting their work and also with publishing their work in a peer-reviewed journal. Our research had the following objectives: 1. to learn about authors’ experiences with preprinting, their motivations, and future intentions; 2. to consider preprints in terms of their effectiveness in enabling authors to receive feedback on their work; 3. to compare the impact of feedback on preprints with the impact of comments of editors and reviewers on papers submitted to journals. In our survey, 78% of the new adopters of preprinting reported the intention to also preprint their future work. The boost in preprinting may therefore have a structural effect that will last after the pandemic, although future developments will also depend on other factors, including the broader growth in the adoption of open science practices. A total of 53% of the respondents reported that they had received feedback on their preprints. However, more than half of the feedback was received through "closed" channels–privately to the authors. This means that preprinting was a useful way to receive feedback on research, but the value of feedback could be increased further by facilitating and promoting "open" channels for preprint feedback. Almost a quarter of the feedback received by respondents consisted of detailed comments, showing the potential of preprint feedback to provide valuable comments on research. Respondents also reported that, compared to preprint feedback, journal peer review was more likely to lead to major changes to their work, suggesting that journal peer review provides significant added value compared to feedback received on preprints.
This article describes a method for copying open access articles and corresponding descriptive metadata from open repositories for archiving in an institutional repository using Beautiful Soup and Selenium as web scraping tools. This method quickly added hundreds of articles to an IR without relying on faculty participation or consulting publisher policies, increasing repository downloads and usage.
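Repository landing pages commonly expose descriptive metadata in Highwire-style `<meta name="citation_*">` tags, which is one of the things a scraper of this kind can target. The article's method uses Beautiful Soup and Selenium; the self-contained sketch below illustrates the same extraction idea with only the standard library's `html.parser`, run against a made-up sample page.

```python
# Extract citation_* <meta> tags from a repository landing page. The sample
# HTML is fabricated for illustration; a real harvest would fetch live pages
# (the article uses Beautiful Soup and Selenium for that).
from html.parser import HTMLParser

class CitationMetaParser(HTMLParser):
    """Collect citation_* <meta> tags into a dict of name -> content."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            name = attrs.get("name", "")
            if name.startswith("citation_"):
                self.meta[name] = attrs.get("content", "")

sample_page = """
<html><head>
  <meta name="citation_title" content="An Example Open Access Article">
  <meta name="citation_pdf_url" content="https://repo.example.org/article.pdf">
</head><body></body></html>
"""
parser = CitationMetaParser()
parser.feed(sample_page)
# parser.meta now maps tag names to values, e.g. the title and PDF link,
# which can be mapped onto the target IR's metadata schema on deposit.
```

Fields like `citation_pdf_url` are what allow the corresponding full-text file to be retrieved and deposited alongside the descriptive record.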
Purpose: The recent proliferation of preprints could be a way for researchers worldwide to increase the availability and visibility of their research findings. Against the background of rising publication costs caused by the increasing prevalence of article processing fees, the search for other ways to publish research results besides traditional journal publication may increase. This could be especially true for lower-income countries. Design/methodology/approach: We are therefore interested in the experiences of and attitudes towards posting and using preprints in the Global South as opposed to the Global North. To explore whether motivations and concerns about posting preprints differ, we adopted a mixed-methods approach, combining a quantitative survey of researchers with focus group interviews. Findings: We found that respondents from the Global South were more likely to agree to adhere to policies and to emphasise that mandates could change publishing behaviour towards open access. They were also more likely to agree that posting preprints has a positive impact. Respondents from both the Global South and the Global North emphasised the importance of peer-reviewed research for career advancement. Originality: The study has identified a wide range of experiences with and attitudes towards posting preprints among researchers in the Global South and the Global North. To our knowledge, this has hardly been studied before, in part because preprints have only recently emerged in many disciplines and countries.
In this paper, a case study of computer science preprints submitted to arXiv from 2008 to 2017 is conducted to quantify how many preprints were eventually published in peer-reviewed venues. Among those published manuscripts, some appeared under different titles and without an update to their preprints on arXiv. For these manuscripts, the traditional fuzzy matching method is incapable of mapping the preprint to the final published version. In view of this issue, we introduce a semantics-based mapping method employing Bidirectional Encoder Representations from Transformers (BERT). With this new mapping method and a plurality of data sources, we find that 66% of all sampled preprints were published under unchanged titles and 11% under different titles and with other modifications. A further analysis was then performed to investigate why these preprints, but not others, were accepted for publication. Our comparison reveals that in the field of computer science, published preprints feature adequate revisions, multiple authorship, detailed abstracts and introductions, extensive and authoritative references, and available source code.
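The semantics-based mapping idea above can be sketched in miniature: embed the preprint title and each candidate published title as vectors, then match by cosine similarity. The toy `embed()` below is a bag-of-words stand-in so the example is self-contained; the paper itself uses BERT representations, which capture paraphrases that word overlap cannot.

```python
# Match a preprint to a published version by vector similarity of titles.
# embed() is a deliberately naive bag-of-words stand-in for a BERT encoder;
# the example titles are invented.
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words vector; a real pipeline would use a BERT encoder."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm = lambda x: math.sqrt(sum(c * c for c in x.values()))
    return dot / (norm(u) * norm(v)) if u and v else 0.0

preprint = "Efficient transformers for long documents"
published = ["Long-document transformers made efficient",
             "A study of convolutional networks"]
scores = [cosine(embed(preprint), embed(t)) for t in published]
best = published[scores.index(max(scores))]
```

Even this crude embedding separates the related title from the unrelated one; the point of swapping in BERT is to keep that separation when the published title shares no surface words with the preprint at all.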
Open science is receiving widespread attention globally, and preprinting offers an important way to implement open science practices in scholarly publishing. To develop a systematic understanding of researchers’ adoption of and attitudes toward preprinting, we conducted a survey of authors of research papers published in 2021 and early 2022. Our survey results show that the US and Europe lead the way in the adoption of preprinting. US and European respondents reported a higher familiarity with and a stronger commitment to preprinting than their colleagues elsewhere in the world. The adoption of preprinting is much stronger in physics and astronomy as well as mathematics and computer science than in other research areas. Respondents identified free accessibility of preprints and acceleration of research communication as the most important benefits of preprinting. Low reliability and credibility of preprints, sharing results before peer review and premature media coverage are the most significant concerns about preprinting, emphasized in particular by respondents in the life and health sciences. According to respondents, the most crucial strategies to encourage preprinting are integrating preprinting into journal submission workflows and providing recognition for posting preprints.
The project will put in place the basic infrastructure and protocols needed for all-round and standardised connections between preprint repositories, community-led preprint review platforms, journals, and preprint review aggregation and curation platforms. The aim is to lower existing technological and cost barriers so that as many of these services as possible can more easily participate in the ‘publish, review, curate’ future for research.
Ultimately, we might be forced to rethink publication. If scientific research is mostly read by machines, the question arises of whether it is relevant to package it into a single coherent narrative that is adapted to the limitations of human cognition. This seems like a lot of busywork for scientists. We could unbundle scientific research from the constraints of journal formatting, as suggested by Neuromatch Open Publishing. In this view, research will be a living compendium of code, datasets, graphs and narrative content remixable and always up to date. Open and freely accessible research will be more valuable and influential because it will be seen by LLMs.
It has been argued that preprint coverage during the COVID-19 pandemic constituted a paradigm shift in journalism norms and practices. This study examines whether, in what ways, and to what extent this is the case using a sample of 11,538 preprints posted on four preprint servers—bioRxiv, medRxiv, arXiv, and SSRN—that received coverage in 94 English-language media outlets between 2014 and 2021. We compared mentions of these preprints with mentions of a comparison sample of 397,446 peer-reviewed research articles indexed in the Web of Science to identify changes in the share of media coverage that mentioned preprints before and during the pandemic. We found that preprint media coverage increased at a slow but steady rate pre-pandemic, then spiked dramatically. This increase applied only to COVID-19-related preprints, with minimal or no change in coverage of preprints on other topics. In addition, the rise in preprint coverage was most pronounced among health- and medicine-focused media outlets, which barely covered preprints before the pandemic but mentioned more COVID-19 preprints than outlets focused on any other topic. These results suggest that the growth in coverage of preprints seen during the pandemic may imply a shift in journalistic norms, including a changing outlook on reporting preliminary, unvetted research.
This book analyzes the various economic and marketing strategies utilized by the five major STM commercial scholarly journal publishers since 2000. This period has witnessed tremendous economic, marketing, and technological growth, including the migration from a print-only to a hybrid publishing format. With this growth, the industry has also seen the rise of open access publishing, copyright challenges by websites such as Sci-Hub, the emergence of sharing platforms such as ResearchGate and Academia.edu, as well as the impact of Plan S on publishers, universities, and authors. . . . Scrutinizing the different managerial, marketing, technology, and economic-financial strategies crafted by scholarly journal publishers between 2000 and 2020, this book offers a comprehensive assessment of the industry’s attempts to identify, understand, cope with, and minimize or defeat the herculean threats to its business model.
The study found that 125 nations contributed a total of 4,045 research repositories, with the USA leading the list with the most repositories. Most repositories were operated by institutions with multidisciplinary approaches. DSpace and EPrints were the preferred repository software. The content types most frequently uploaded by contributors were "research articles" and "electronic theses and dissertations."