These findings indicate that Facebook mentions of LIS papers mainly reflect institutional-level advocacy and attention, with a low level of engagement, and may be influenced by several features, including collaboration patterns and research topics.
Nearly two dozen journals from two of the fastest growing open-access publishers, including one of the world’s largest journals by volume, will no longer receive a key scholarly imprimatur. On 20 March, the Web of Science database said it had delisted the journals along with dozens of others, stripping them of an impact factor, the citation-based measure of quality that, although controversial, carries weight with authors and institutions. . . . Clarivate initially did not name any of the delisted journals or provide specific reasons. But it confirmed to Science the identities of 19 Hindawi journals and two MDPI titles after reports circulated about their removals.
Biomedical fields have seen a remarkable increase in hybrid Gold open access articles. However, it is uncertain whether the hybrid Gold open access option confers a citation advantage (an increase in the citations of articles made immediately available as open access) regardless of an article’s quality or whether it addresses a trending topic. This study compared the citation counts of hybrid Gold open access articles with those of subscription articles published in hybrid journals, to ascertain whether hybrid Gold open access publication yields a citation advantage. This cross-sectional study included the hybrid journals listed under 59 categories in the "Clinical Medicine" group of Clarivate’s Journal Citation Reports (JCR) during 2018–2021. The number of citable items marked ‘Gold Open Access’ and ‘Subscription and Free to Read’ in each journal, as well as the number of citations to those citable items, were extracted from JCR. The hybrid Gold open access citation advantage was computed by dividing the number of citations per citable item with hybrid Gold open access by the number of citations per citable item with a subscription. A total of 498, 636, 1009, and 1328 hybrid journals in the 2018, 2019, 2020, and 2021 JCR, respectively, were included in this study. The citation advantage of hybrid Gold open access articles over subscription articles was 1.45 (95% confidence interval (CI), 1.24–1.65) in 2018; 1.31 (95% CI, 1.20–1.41) in 2019; 1.30 (95% CI, 1.20–1.39) in 2020; and 1.31 (95% CI, 1.20–1.42) in 2021. In the ‘Clinical Medicine’ discipline, articles published in hybrid journals as hybrid Gold open access received more citations than those published under a subscription, self-archived, or otherwise openly accessible option.
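The citation-advantage ratio described in this abstract is a simple quotient of per-item citation rates. A minimal sketch in Python, using invented illustrative journal figures rather than values from the study:

```python
def citation_advantage(oa_citations, oa_items, sub_citations, sub_items):
    """Ratio of citations per citable item: hybrid Gold OA vs. subscription."""
    return (oa_citations / oa_items) / (sub_citations / sub_items)

# Hypothetical journal: 600 citations to 100 OA items,
# 2000 citations to 500 subscription items.
ratio = citation_advantage(600, 100, 2000, 500)
print(round(ratio, 2))  # 1.5 -> OA items cited 1.5x as often per item
```

A ratio above 1.0 (as in the 1.30–1.45 values reported above) indicates that open access items were cited more often per item than subscription items.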
This study examines the open access citation advantage of gold open access (OA) journal articles published at a large U.S. research university. Most studies that examine the open access citation advantage focus on specific journals, disciplines, countries or global output. Local citation patterns may differ from these larger patterns. . . . This study reports on a method and compares average citation counts for subscription and gold OA journal articles using Web of Science. Gold OA physics journals showed a definite open access citation advantage, whereas other disciplines showed no difference or no open access citation advantage.
Altmetrics are web-based quantitative impact or attention indicators for academic articles that have been proposed to supplement citation counts. This article reports the first assessment of the extent to which mature altmetrics from Altmetric.com and Mendeley associate with individual article quality scores. It exploits expert norm-referenced peer review scores from the UK Research Excellence Framework 2021 for 67,030+ journal articles in all fields 2014–2017/2018, split into 34 broadly field-based Units of Assessment (UoAs). Altmetrics correlated more strongly with research quality than previously found, although less strongly than raw and field normalized Scopus citation counts. Surprisingly, field normalizing citation counts can reduce their strength as a quality indicator for articles in a single field. For most UoAs, Mendeley reader counts are the best altmetric (e.g., three Spearman correlations with quality scores above 0.5), tweet counts are also a moderate strength indicator in eight UoAs (Spearman correlations with quality scores above 0.3), ahead of news (eight correlations above 0.3, but generally weaker), blogs (five correlations above 0.3), and Facebook (three correlations above 0.3) citations, at least in the United Kingdom. In general, altmetrics are the strongest indicators of research quality in the health and physical sciences and weakest in the arts and humanities.
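The Spearman correlations reported above measure the monotonic association between peer-review quality scores and an altmetric. A minimal standard-library sketch of the no-ties Spearman formula, with made-up illustrative data (not values from the study):

```python
def spearman(x, y):
    """Spearman rank correlation via the no-ties formula:
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical quality scores vs. Mendeley reader counts for five articles
print(round(spearman([1, 2, 3, 4, 5], [10, 8, 40, 35, 50]), 2))  # 0.8
```

A value above 0.5, as found for Mendeley reader counts in three Units of Assessment, indicates a fairly strong monotonic relationship; real data with tied ranks would need the tie-corrected formula instead.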
We sought to evaluate the performance of open-source artificial intelligence to predict the impact factor or Eigenfactor score tertile using academic article abstracts.
PubMed-indexed articles published between 2016 and 2021 were identified with the Medical Subject Headings (MeSH) terms "ophthalmology," "radiology," and "neurology." Journals, titles, abstracts, author lists, and MeSH terms were collected. Journal impact factor and Eigenfactor scores were sourced from the 2020 Clarivate Journal Citation Report. The journals included in the study were allocated percentile ranks based on impact factor and Eigenfactor scores, compared with other journals that released publications in the same year. All abstracts were preprocessed, which included the removal of the abstract structure, and combined with titles, authors, and MeSH terms as a single input. The input data underwent preprocessing with the inbuilt ktrain Bidirectional Encoder Representations from Transformers (BERT) preprocessing library before analysis with BERT. Before use for logistic regression and XGBoost models, the input data underwent punctuation removal, negation detection, stemming, and conversion into a term frequency-inverse document frequency array. Following this preprocessing, data were randomly split into training and testing data sets with a 3:1 train:test ratio. Models were developed to predict whether a given article would be published in a first, second, or third tertile journal (0-33rd centile, 34th-66th centile, or 67th-100th centile), as ranked either by impact factor or Eigenfactor score. BERT, XGBoost, and logistic regression models were developed on the training data set before evaluation on the hold-out test data set. The primary outcome was overall classification accuracy for the best-performing model in the prediction of accepting journal impact factor tertile.
There were 10,813 articles from 382 unique journals. The median impact factor and Eigenfactor score were 2.117 (IQR 1.102-2.622) and 0.00247 (IQR 0.00105-0.03), respectively. The BERT model achieved the highest impact factor tertile classification accuracy of 75.0%, followed by an accuracy of 71.6% for XGBoost and 65.4% for logistic regression. Similarly, BERT achieved the highest Eigenfactor score tertile classification accuracy of 73.6%, followed by an accuracy of 71.8% for XGBoost and 65.3% for logistic regression.
Open-source artificial intelligence can predict the impact factor and Eigenfactor score tertiles of accepting peer-reviewed journals. Further studies are required to examine the effect of such recommender systems on publication success and time-to-publication.
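The tertile labeling step in the study above (ranking journals by impact factor or Eigenfactor score, then binning into thirds) can be sketched with the standard library. The impact factors below are invented illustrative values, not data from the study, and the convention that tertile 1 is the top third is an assumption:

```python
def tertile_labels(scores):
    """Assign each journal a tertile (1 = top third) by rank of its score."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    n = len(scores)
    labels = [0] * n
    for rank, i in enumerate(ranked):
        frac_above = rank / n  # fraction of journals ranked above this one
        labels[i] = 1 if frac_above < 1 / 3 else 2 if frac_above < 2 / 3 else 3
    return labels

# Hypothetical impact factors for six journals
impact_factors = [5.2, 0.9, 2.1, 3.8, 1.4, 2.6]
print(tertile_labels(impact_factors))  # [1, 3, 2, 1, 3, 2]
```

These tertile labels would then serve as the classification targets for models such as BERT, XGBoost, or logistic regression, as in the study.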
This study revisited the methodology for identifying the effects of open access and revealed the causes for contradictory conclusions using four indices for journals that transitioned from subscription to open access. . . . Although the aggregated data of the eight journals indicated that open access had a positive effect, the effect varied across journals. A few journals produced different results between the two citation scores as well as between citation scores and number of citations or articles. Furthermore, a publisher’s choice of which journal to shift to open access influenced their performance after the shift.
Results also show that the author collaboration network is very sparsely connected, indicating the absence of close collaboration among the authors in the field. Furthermore, results reveal that the Wikipedia research institutions’ collaboration network reflects a North–South divide, as very limited cooperation occurs between institutions in developed and developing countries. Finally, the multiple correspondence analysis applied to obtain the Wikipedia research conceptual map reveals the breadth, diversity, and intellectual thrust of Wikipedia’s scholarly publications.
Twitter attention both starts and ends quickly. Mendeley readers accumulate quickly, and continue to grow over the following years. News and blog attention is quick to start, although news attention persists over a longer timeframe. Citations in policy documents are slow to start, and are observed to be growing over a decade after publication. Over time, growth in Twitter activity is confirmed, alongside an apparent decline in blogging attention. Mendeley usage is observed to grow, but shows signs of recent decline. Policy attention is identified as the slowest form of impact studied by altmetrics, and one that strongly favours the Humanities and Social Sciences. The Open Access Altmetrics Advantage is seen to emerge and evolve over time, with each attention source showing different trends.
The mechanisms through which this network status can be exchanged into academic advantage are not straightforward, but any academic who has achieved a degree of popularity online can attest to the direct and indirect advantages which this has brought to their career. . . . What if that capital is now worthless? It’s a strange position that has the potential to leave academics clinging on to their Twitter accounts long after the beneficial impact of the platform has evaporated in a mushroom cloud of moving fast and breaking things. The collapse of Twitter would be a significant event within higher education, analogous to (though not on the same scale as) citational rankings being reset overnight.
This case study looks at approaches to engaging with users of the National Library of Scotland (NLS) maps website, and how this informs digital preservation decisions. After a brief description of the NLS maps website structure, it examines user expectations of the website, how these have developed over time, and the main purposes users have for visiting it. The main research methods that have been employed to consult with users are then outlined, including user surveys, web analytics, mystery visitor reports, and enquiries.
We investigate gender- and country-based biases in Wikipedia citation practices using linked data from the Web of Science and a Wikipedia citation dataset. . . . we show that publications by women are cited less by Wikipedia than expected, and publications by women are less likely to be cited than those by men. Scholarly publications by authors affiliated with non-Anglosphere countries are also disadvantaged in getting cited by Wikipedia. . . . The level of gender- or country-based inequalities varies by research field, and the gender-country intersectional bias is prominent in math-intensive STEM fields.
A well-written Wikipedia page will cite scholarly publications with links to the articles in those citations that can be accessed immediately by users. At the 2019 Charleston Conference keynote, Internet Archive founder Brewster Kahle claimed that 6% of Wikipedia readers click on a link in the footnotes (although another study found that it was more like 0.03%). In 2016, Wikipedia was the 6th-largest referrer for DOIs, with half of referrals successfully authenticating to access the article. External links on Wikipedia generate an estimated $7 million in revenue per month. Given that Wikipedia is such a popular website, it’s unsurprising that academic publishers are actively pursuing ways to promote their work on Wikipedia.
The study asks how choices of immediate gold and hybrid open access are related to journal ranking and how the uptake of immediate open access is affected by the transformative publish-and-read deals pushed by recent science policy. The data consist of 186,621 articles published with a Norwegian affiliation in the period 2013–2021, all in journals placed in a nation-specific ranking on one of two levels according to their importance, prestige, and perceived quality within a discipline. The results show that researchers chose to publish their articles as hybrid open access twice as often in journals on the most prestigious level as in journals on the normal level. The opposite effect was found with gold open access, where publishing on the normal level was chosen three times more often than on the high level. This can be explained by the absence of highly ranked gold open access journals in many disciplines. With the introduction of publish-and-read deals, hybrid open access has been boosted and has become a popular choice, enabling researchers to publish open access in legacy journals.
Digital Scholarship has released its third bibliography about research data, the Research Data Publication and Citation Bibliography.
Building on the base bibliography, the Research Data Curation and Management Bibliography (over 800 works) and the Research Data Sharing and Reuse Bibliography (over 200 works), the Research Data Publication and Citation Bibliography includes over 225 selected English-language articles and books that are useful in understanding the publication and citation of research data. It also provides limited coverage of closely related topics, such as research data identifiers (e.g., DOI) and scholarly metrics. Most sources have been published from January 2009 through December 2021; however, a limited number of earlier key sources are also included. Abstracts are included in this bibliography if a work is under a Creative Commons Attribution License or under one of the Creative Commons public domain licenses.
It is licensed under a Creative Commons Attribution 4.0 International License.