"Amplifying Academic Research through YouTube: Engagement Metrics as Predictors of Citation Impact"


The preliminary findings from the linear regression analysis (Table 1) suggest a meaningful relationship between the online engagement metrics of YouTube videos and the academic impact of the publications referenced within them. Specifically, the analysis found positive correlations with citation impact for three key metrics: the number of videos referencing publications, the ratio of likes to dislikes on videos, and the number of comments containing references to other publications. These positive correlations point to a selective amplification process: publications mentioned in videos that garner attention in the form of likes and active discussion in comments are likely being selectively chosen for their relevance or quality. This selection by content creators and the subsequent engagement by viewers may serve as an “informal peer review”, signaling the value and impact of the research. The findings suggest that social media, particularly YouTube in this context, acts as a filter that can amplify the visibility of impactful research.

https://arxiv.org/abs/2405.12734

| Artificial Intelligence |
| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Relationships between Expert Ratings of Business/Economics Journals and Key Citation Metrics: The Impact of Size-Independence, Citing-Journal Weighting, and Subject-Area Normalization"


This study uses data for >3300 business and economics journals to explore the relationships between 5 subjective (expert) journal ratings and 10 citation metrics including 5IF (5-year Impact Factor), Article Influence (AI) score, CiteScore, Eigenfactor, Impact per Publication, SJR, and SNIP. Overall, AI and SJR are the citation metrics most closely related to the expert journal ratings. . . . These results, which are consistent across the 5 expert ratings, suggest that evaluators consider the average impact of an article in each journal rather than the total impact of the journal as a whole, that they give more credit for citations in high-impact journals than for citations in lesser journals, and that they assess each journal’s relative standing within its own field or subfield rather than its broader scholarly impact.

https://doi.org/10.1016/j.acalib.2024.102882

| Artificial Intelligence |
| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Different Open Access Routes, Varying Societal Impacts: Evidence from the Royal Society Biological Journals"


In this article, we explore different OA routes (i.e., gold OA, hybrid OA, and bronze OA) and their varying effects on multiple types of societal impacts (i.e., social media and web) by using the case of four biological journals founded by the Royal Society. The results show that (1) gold OA is significantly and positively related to social media indicators (Twitter counts and Facebook counts), but significantly and negatively associated with web indicators (Blog counts and News counts); (2) hybrid OA has a significant and positive effect on both social media and web indicators; and (3) bronze OA is significantly and positively associated with social media indicators, but its association with web indicators turns out to be negative, albeit nonsignificant.

https://doi.org/10.1007/s11192-024-05032-0

| Artificial Intelligence |
| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

AI-native Platform: "Reimagining Research Impact: Introducing Web of Science Research Intelligence"


Currently being developed in partnership with leading academic institutions, Web of Science Research Intelligence is an AI-native platform that embodies a vision centered on three pillars: unification, innovation and impact. It seamlessly integrates funding data with research outputs that include publications, patents, conference proceedings, books, policy documents and more. Based on these data, the platform identifies relevant funding opportunities within emerging research areas, equipping institutions and researchers to innovate.

  • A conversational assistant powered by generative AI enables all users, from data scientists to those with limited analysis experience, to gain insights and create qualitative narratives for more balanced impact assessment.
  • Tailored recommendations for collaboration and funding help early career researchers build their networks and all researchers position themselves to win.
  • A new framework for measuring societal impact beyond traditional citation metrics will empower researchers and institutions to showcase the broader impacts of their work.

https://tinyurl.com/2zdshm6b

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

Next Generation Metrics for Scientific and Scholarly Research in Europe


The field of evaluating academic activities is vast, complex, and highly dynamic, as are the roles of any data and indicators used to support these evaluations. This Next Generation Metrics for Scientific and Scholarly Research in Europe paper explores how universities can and should use currently available metrics and data to assess their research evaluation processes, in conjunction with qualitative expertise and information.

The authors have chosen to focus on the aspect of academic evaluation that shows great potential for significant advancements in the coming years: the use and advancement of next-generation metrics for responsible research evaluation, encompassing open science, societal impact, and innovation.

The paper aims to support universities in shaping their metric policies in alignment with their own missions, rather than relying solely on standard metrics and data availability. The paper furthermore intends to serve as a framework for universities to determine priorities to work on in specific domains for the application of contextually relevant indicators and metrics.

The authors place strong emphasis on the reuse of existing expertise on metrics as well as on collaboration, both among universities and between universities and funding agencies to achieve these goals.

https://doi.org/10.5281/zenodo.11123148

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

Paywall: "Open Peer Review Correlates with Altmetrics but Not with Citations: Evidence from Nature Communications and PLoS One"


The analysis reveals articles subjected to OPR [Open Peer Review] have no obvious advantage in citations but a notably higher score in altmetrics. The distribution of data variation across most disciplines, displaying a statistically significant difference between OPR and non-OPR, mirrors the overall trend. Two potential explanations for the disparity in OPR’s impact on citations compared to altmetrics are proposed. The first relates to the quality heterogeneity between OPR and non-OPR research, while the second is related to the diverse authors citing and mentioning articles in distinct communities.

https://doi.org/10.1016/j.joi.2024.101540

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"An Analysis of the Effects of Sharing Research Data, Code, and Preprints on Citations"


In this study, we investigate whether adopting one or more Open Science practices leads to significantly higher citations for an associated publication, which is one form of academic impact. We use a novel dataset known as Open Science Indicators, produced by PLOS and DataSeer, which includes all PLOS publications from 2018 to 2023 as well as a comparison group sampled from the PMC Open Access Subset. In total, we analyze circa 122,000 publications. We calculate publication and author-level citation indicators and use a broad set of control variables to isolate the effect of Open Science Indicators on received citations. We show that Open Science practices are adopted to different degrees across scientific disciplines. We find that the early release of a publication as a preprint correlates with a significant positive citation advantage of about 20.2% on average. We also find that sharing data in an online repository correlates with a smaller yet still positive citation advantage of 4.3% on average. However, we do not find a significant citation advantage for sharing code.

https://arxiv.org/abs/2404.16171

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Forensic Scientometrics — An Emerging Discipline to Protect the Scholarly Record"


Forensic Scientometrics (FoSci) is emerging as a vital discipline at the intersection of scientific integrity and security. Scholarship and scholarly communication are critical for maintaining scientific integrity, influencing public trust in science, health, technology, policy, and law. Yet, these foundations are threatened by the misuse of scientific research for personal, commercial, ideological, and geopolitical gains, including questionable practices and misconduct. The rise of paper mills and predatory publishers, along with ideological and geopolitical motivations, undermines academic integrity. This field pioneers the integration of traditional scientometric methods with ethics to address pressing challenges in research integrity and security, crucial in an era of heightened scrutiny over science’s reliability. FoSci’s development signifies a collective commitment to maintaining scientific trust, marked by a call for official recognition and support from stakeholders across the scientific ecosystem.

https://arxiv.org/abs/2404.00478

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Unleashing the Power of AI. A Systematic Review of Cutting-Edge Techniques in AI-Enhanced Scientometrics, Webometrics, and Bibliometrics"


Findings: (i) Regarding scientometrics, the application of AI yields various distinct advantages, such as conducting analyses of publications, citations, research impact prediction, collaboration, research trend analysis, and knowledge mapping, in a more objective and reliable framework. (ii) In terms of webometrics, AI algorithms are able to enhance web crawling and data collection, web link analysis, web content analysis, social media analysis, web impact analysis, and recommender systems. (iii) Moreover, automation of data collection, analysis of citations, disambiguation of authors, analysis of co-authorship networks, assessment of research impact, text mining, and recommender systems are considered potential applications of AI integration in the field of bibliometrics.

https://arxiv.org/abs/2403.18838

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Controlled Experiment Finds No Detectable Citation Bump from Twitter Promotion"


Multiple studies across a variety of scientific disciplines have shown that the number of times that a paper is shared on Twitter (now called X) is correlated with the number of citations that paper receives. However, these studies were not designed to answer whether tweeting about scientific papers causes an increase in citations, or whether they were simply highlighting that some papers have higher relevance, importance or quality and are therefore both tweeted about more and cited more. The authors of this study are leading science communicators on Twitter from several life science disciplines, with substantially higher follower counts than the average scientist, making us uniquely placed to address this question. We conducted a three-year-long controlled experiment, randomly selecting five articles published in the same month and journal, and randomly tweeting one while retaining the others as controls. This process was repeated for 10 articles from each of 11 journals, recording Altmetric scores, number of tweets, and citation counts before and after tweeting. Randomization tests revealed that tweeted articles were downloaded 2.6–3.9 times more often than controls immediately after tweeting, and retained significantly higher Altmetric scores (+81%) and number of tweets (+105%) three years after tweeting. However, while some tweeted papers were cited more than their respective control papers published in the same journal and month, the overall increase in citation counts after three years (+7% for Web of Science and +12% for Google Scholar) was not statistically significant (p > 0.15). Therefore while discussing science on social media has many professional and societal benefits (and has been a lot of fun), increasing the citation rate of a scientist’s papers is likely not among them.
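The study's inference rests on randomization tests comparing each tweeted article against its matched controls. The sketch below is a minimal one-sided randomization (permutation) test in Python; the citation-gain numbers are invented for illustration and are not the study's data:

```python
import random

def randomization_test(treated, controls, n_perm=10_000, seed=42):
    """One-sided randomization test: estimate the p-value of the
    observed difference in mean gains under random relabeling."""
    rng = random.Random(seed)
    observed = sum(treated) / len(treated) - sum(controls) / len(controls)
    pooled = list(treated) + list(controls)  # copy; inputs stay intact
    k = len(treated)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = sum(pooled[:k]) / k - sum(pooled[k:]) / (len(pooled) - k)
        if diff >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0

# Hypothetical three-year citation gains (not the paper's data)
tweeted_gains = [12, 9, 15, 7, 11, 14, 8, 10, 13, 6]
control_gains = [10, 8, 12, 9, 7, 11, 6, 9, 10, 8]
p = randomization_test(tweeted_gains, control_gains)
```

A randomization test suits this design because the samples are small and matched by journal and month, so no distributional assumptions about citation counts are needed.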

https://doi.org/10.1371/journal.pone.0292201

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"OurResearch Receives $7.5M grant from Arcadia to Establish OpenAlex, a Milestone Development for Open Science"


OurResearch is proud to announce a $7.5M grant from Arcadia to establish a sustainable and completely open index of the world’s research ecosystem. With this 5-year grant, OurResearch expands their open science ambitions to replace paywalled knowledge graphs with OpenAlex. . . .

OpenAlex indexes more than twice as many scholarly works as the leading proprietary products, and the entirety of the knowledge graph and its source code are openly licensed and freely available through data snapshots, an easy-to-use API, and a nascent user interface. . . .

Development of OpenAlex started only two years ago and it already serves 115M API calls per month; underlies a major university ranking; is displacing proprietary products at Universities; and has established partnerships with national governments. We are excited by these early successes of OpenAlex and its promise to revolutionize scholarly communication and democratize the world’s research.

You can use OpenAlex’s Author Profile Change Request Form to correct certain types of errors about your publications, such as "My work has been incorrectly attributed to another author."

There is also a Submit a Request form, but it is not clear if this can be used to correct citation count or other types of errors not covered by the above form.

https://tinyurl.com/3396s27m

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Information Accessibility and Knowledge Creation: The Impact of Google’s Withdrawal from China on Scientific Research"


How important is Google for scientific research? This paper exploits the exogenous shock represented by Google’s sudden withdrawal of its services from mainland China to assess the importance of access to information for the knowledge production function of scientific scholars in the field of economics. For economists, a type of scholar with a simple knowledge production function, results from difference-in-differences analyses, which compare their scientific output to that of scholars located in the neighbouring regions, show that scientific productivity declines by about 28% in volume and 30% in terms of citations. These results are consistent with the view that information accessibility is an important driver of scientific progress. Considering that the negative effect of the shock is stronger for top scholars located in China, Google’s sudden exit bears the risk that researchers lose touch with the research frontier and persistently lag behind their foreign peers.

https://doi.org/10.1080/13662716.2023.2298293

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Citation Amnesia: NLP and Other Academic Fields Are in a Citation Age Recession"


This study examines the tendency to cite older work across 20 fields of study over 43 years (1980–2023). . . . Our analysis, based on a dataset of approximately 240 million papers, reveals a broader scientific trend: many fields have markedly declined in citing older works (e.g., psychology, computer science). . . . Our results suggest that citing more recent works is not directly driven by the growth in publication rates. . . even when controlling for an increase in the volume of papers. Our findings raise questions about the scientific community’s engagement with past literature, particularly for NLP, and the potential consequences of neglecting older but relevant research.

https://arxiv.org/abs/2402.12046

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Can ChatGPT Be Used to Predict Citation Counts, Readership, and Social Media Interaction? An Exploration among 2222 Scientific Abstracts"


This study explores the potential of ChatGPT, a large language model, in scientometrics by assessing its ability to predict citation counts, Mendeley readers, and social media engagement. In this study, 2222 abstracts from PLOS ONE articles published during the initial months of 2022 were analyzed using ChatGPT-4, which used a set of 60 criteria to assess each abstract. Using a principal component analysis, three components were identified: Quality and Reliability, Accessibility and Understandability, and Novelty and Engagement. The Accessibility and Understandability of the abstracts correlated with higher Mendeley readership, while Novelty and Engagement and Accessibility and Understandability were linked to citation counts (Dimensions, Scopus, Google Scholar) and social media attention. Quality and Reliability showed minimal correlation with citation and altmetrics outcomes. Finally, it was found that the predictive correlations of ChatGPT-based assessments surpassed traditional readability metrics. The findings highlight the potential of large language models in scientometrics and possibly pave the way for AI-assisted peer review.

https://doi.org/10.1007/s11192-024-04939-y

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Google Scholar is Manipulatable"


Citations are widely considered in scientists’ evaluation. As such, scientists may be incentivized to inflate their citation counts. While previous literature has examined self-citations and citation cartels, it remains unclear whether scientists can purchase citations. Here, we compile a dataset of ~1.6 million profiles on Google Scholar to examine instances of citation fraud on the platform. We survey faculty at highly-ranked universities, and confirm that Google Scholar is widely used when evaluating scientists. Intrigued by a citation-boosting service that we unravelled during our investigation, we contacted the service while undercover as a fictional author, and managed to purchase 50 citations. These findings provide conclusive evidence that citations can be bought in bulk, and highlight the need to look beyond citation counts.

https://arxiv.org/abs/2402.04607

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Is Gold Open Access Helpful for Academic Purification? A Causal Inference Analysis Based on Retracted Articles in Biochemistry"


The results showed that compared to non-OA, Gold OA is advantageous in reducing the retraction time of flawed articles, but does not demonstrate a significant advantage in reducing citations after retraction. This indicates that Gold OA may help expedite the detection and retraction of flawed articles, ultimately promoting the practice of responsible research.

https://doi.org/10.1016/j.ipm.2023.103640

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Open-Access Papers Draw More Citations from a Broader Readership"


Now, after years of little conclusive evidence to support these assertions, researchers report that open-access papers have a greater reach than paywalled ones in two key ways: They attract more total citations, and those citations come from scholars in a wider range of locations, institutions, and fields of research. The study also reports a "citation diversity advantage" for a controversial type of open-access article, those deposited in "green" public repositories.

http://tinyurl.com/27p6pfje

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Promotion of Scientific Publications on ArXiv and X Is on the Rise and Impacts Citations"


Here, based on a large dataset of computer science publications, we study trends in the use of early preprint publications and revisions on ArXiv and the use of X (formerly Twitter) for promotion of such papers in the last 10 years. We find that early submission to ArXiv and promotion on X have soared in recent years. Estimating the effect that the use of each of these modern affordances has on the number of citations of scientific publications, we find that in the first 5 years from an initial publication peer-reviewed conference papers submitted early to ArXiv gain on average 21.1±17.4 more citations, revised on ArXiv gain 18.4±17.6 more citations, and promoted on X gain 44.4±8 more citations. Our results show that promoting one’s work on ArXiv or X has a large impact on the number of citations, as well as the number of influential citations computed by Semantic Scholar, and thereby on the career of researchers. We discuss the far-reaching implications of these findings for future scientific publishing systems and measures of scientific impact.

https://arxiv.org/abs/2401.11116

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Self-Archiving Adoption in Legal Scholarly Communication: A Literature Review;"


This article explores the current Library and Information Science (LIS) literature on open access and self-archiving and related studies. . . It further investigates the open access and self-archiving practices in disciplinary . . . Finally, it examines self-archiving in law and concludes that the research gap and lack of literature on self-archiving in the discipline of law makes this study worthwhile.

https://doi.org/10.1080/13614576.2023.2279760

| Research Data Publication and Citation Bibliography | Research Data Sharing and Reuse Bibliography | Research Data Curation and Management Bibliography | Digital Scholarship |

Paywall: "Analyzing the Relationship between Citation-Based Impact Metrics and Electronic Journal Usage: A Case Study"


We focus on the impact of major JIFs on local e-journal usage and propose an alternative approach to conventional methods for collection selectors. By treating journal usage patterns as panel data and employing fixed-effects regression models, we find that journal popularity has the greatest influence on local e-journal usage and the effects of impact factors on academic article usage can vary across different disciplines.

https://doi.org/10.1080/01462679.2023.2230166

| Research Data Publication and Citation Bibliography | Research Data Sharing and Reuse Bibliography | Research Data Curation and Management Bibliography | Digital Scholarship |

"The Impacts of Changes in Journal Data Policies: A Cross-disciplinary Survey"


This discipline-specific survey of journal DSP and SMP highlighted the increasing adoption rates and rankings of DSP over time. Furthermore, the findings suggest that DSP adoption may have a notable impact on the increase in JIF. The adoption of DSP by journals may be associated with the increased attention and credibility of the articles.

https://doi.org/10.1002/pra2.924

| Research Data Publication and Citation Bibliography | Research Data Sharing and Reuse Bibliography | Research Data Curation and Management Bibliography | Digital Scholarship |

Paywall: "Trends in Research Impact Librarianship: Developing a New Program and Services"


Research impact librarianship is an area within the profession that continues to grow out of the need for dedicated expertise in bibliometrics and various other assessment measures. . . . The Libraries at the University of Houston is in the midst of creating a research visibility and impact program born out of an initiative to elevate the university’s level of prestige and impact by developing personnel, programs, and practices to support research visibility and impact across the institution. This article discusses the University of Houston Libraries’ process and progress toward formalizing research impact services.

https://doi.org/10.1080/01930826.2023.2262364

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

"Measured in a Context: Making Sense of Open Access Book Data"


Open access (OA) book platforms, such as JSTOR, OAPEN Library or Google Books, have been available for over a decade. Each platform shows usage data, but this results in confusion about how well an individual book is performing overall. Even within one platform, there are considerable usage differences between subjects and languages. Some context is therefore necessary to make sense of OA books usage data. A possible solution is a new metric — the Transparent Open Access Normalized Index (TOANI) score. It is designed to provide a simple answer to the question of how well an individual open access book or chapter is performing. The transparency is based on clear rules, and by making all of the data used visible. The data is normalized, using a common scale for the complete collection of an open access book platform and, to keep the level of complexity as low as possible, the score is based on a simple metric. As a proof of concept, the usage of over 18,000 open access books and chapters in the OAPEN Library has been analysed to determine whether each individual title has performed as well as can be expected compared to similar titles.
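The article specifies the TOANI score in full; since only its design principles are quoted above, the sketch below illustrates just the core idea — normalizing a title's usage against similar titles (same subject and language) on a common scale — using invented numbers and a simple median-ratio stand-in, not the actual TOANI formula:

```python
from statistics import median

# Hypothetical usage records: (title, subject, language, downloads).
books = [
    ("A", "History", "en", 1200), ("B", "History", "en", 300),
    ("C", "History", "en", 700),  ("D", "Biology", "de", 90),
    ("E", "Biology", "de", 260),  ("F", "Biology", "de", 150),
]

def normalized_scores(records):
    """Score each title by its usage relative to the median usage
    of titles sharing its subject and language."""
    groups = {}
    for title, subject, lang, usage in records:
        groups.setdefault((subject, lang), []).append(usage)
    scores = {}
    for title, subject, lang, usage in records:
        scores[title] = usage / median(groups[(subject, lang)])
    return scores

scores = normalized_scores(books)
```

Grouping before scoring is what supplies the "context" the abstract calls for: a score of 1.0 means a title performs exactly as well as the typical comparable title, regardless of how popular its subject or language is overall.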

https://doi.org/10.1629/uksg.627

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |

Using Altmetric Data Responsibly: A Guide to Interpretation and Good Practice

This guide focuses specifically on data from the data provider and company Altmetric, but other types of altmetrics are mentioned and occasionally used as a comparison, such as the Open Syllabus database for finding educational engagement with scholarly outputs. The guide opens with an introduction, followed by an overview of Altmetric and the Altmetric Attention Score; Altmetrics and Responsible Research Assessment; Output Types Tracked by Altmetric; and the Altmetric Sources of Attention, which include News and Mainstream Media; Social Media (X (formerly Twitter), Facebook, Reddit, and historical data from Google+, Pinterest, LinkedIn, and Sina Weibo); Patents; Peer Review; Syllabi (historical data only); Multimedia; Public Policy Documents; Wikipedia; Research Highlights; Reference Managers; and Blogs. It closes with a conclusion, a list of related resources and readings, two appendices, and references.

This guide is intended for librarians, practitioners, funders, and other users of Altmetric data, as well as anyone interested in incorporating altmetrics into their bibliometric practice and/or research analytics. It can also help researchers preparing for annual evaluations and promotion and tenure reviews, who can use the data in informed and practical applications. Finally, it can serve as a useful reference for research managers and university administrators who want to understand the broader online engagement with research publications beyond traditional scholarly citations (also known as bibliometrics), but who also want to avoid misusing, misinterpreting, or abusing Altmetric data when making decisions, creating policies, and evaluating faculty members and researchers at their institutions.

http://hdl.handle.net/10919/116448

| Research Data Curation and Management Works |
| Digital Curation and Digital Preservation Works |
| Open Access Works |
| Digital Scholarship |