"A Century of Science: Globalization of Scientific Collaborations, Citations, and Innovations"

Yuxiao Dong, Hao Ma, Zhihong Shen, and Kuansan Wang have self-archived "A Century of Science: Globalization of Scientific Collaborations, Citations, and Innovations."

Here's an excerpt:

In this work, we study the evolution of scientific development over the past century by presenting an anatomy of 89 million digitalized papers published between 1900 and 2015. We find that science has benefited from the shift from individual work to collaborative effort, with over 90% of the world-leading innovations generated by collaborations in this century, nearly four times higher than in the 1900s. We discover that rather than the frequent myopic- and self-referencing that was common in the early 20th century, modern scientists instead tend to look for literature further back and farther around. Finally, we also observe the globalization of scientific development from 1900 to 2015, including 25-fold and 7-fold increases in international collaborations and citations, respectively, as well as a dramatic decline in the dominant accumulation of citations by the US, the UK, and Germany, from 95% to 50% over the same period.


"Transitioning from a Conventional to a’‘Mega’ Journal: A Bibliometric Case Study of the Journal Medicine"

Simon Wakeling et al. have published "Transitioning from a Conventional to a ‘Mega’ Journal: A Bibliometric Case Study of the Journal Medicine" in Publications.

Here's an excerpt:

This study compares the bibliometric profile of the journal Medicine before and after its transition to the OAMJ model. Three standard modes of bibliometric analysis are employed, based on data from Web of Science: journal output volume, author characteristics, and citation analysis. The journal’s article output is seen to have grown hugely since its conversion to an OAMJ, a rise driven in large part by authors from China. Articles published since 2015 have fewer citations, and are cited by lower impact journals than articles published before the OAMJ transition.


"What Makes Papers Visible on Social Media? An Analysis of Various Document Characteristics"

Zohreh Zahedi et al. have self-archived "What Makes Papers Visible on Social Media? An Analysis of Various Document Characteristics."

Here's an excerpt:

In this study we have investigated the relationship between different document characteristics and the number of Mendeley readership counts, tweets, Facebook posts, and mentions in blogs and mainstream media for 1.3 million papers published in journals covered by the Web of Science (WoS). It aims to demonstrate how factors affecting various social media-based indicators differ from those influencing citations, and which document types are more popular across different platforms. Our results highlight the heterogeneous nature of altmetrics, which encompasses different types of uses and user groups engaging with research on social media.


"The Coverage of Microsoft Academic: Analyzing the Publication Output of a University"

Sven E. Hug and Martin P. Braendle have self-archived "The Coverage of Microsoft Academic: Analyzing the Publication Output of a University."

Here's an excerpt:

This is the first in-depth study on the coverage of Microsoft Academic (MA). The coverage of a verified publication list of a university was analyzed on the level of individual publications in MA, Scopus, and Web of Science (WoS). Citation counts were analyzed and issues related to data retrieval and data quality were examined. . . . MA clearly surpasses Scopus and WoS with respect to book-related document types and conference items but falls slightly behind Scopus with regard to journal articles. MA shows the same biases as Scopus and WoS with regard to the coverage of the social sciences and humanities, non-English publications, and open-access publications. Rank correlations of citation counts are high between MA and the benchmark databases. . . . Given the fast and ongoing development of MA, we conclude that MA is on the verge of becoming a bibliometric superpower. However, comprehensive studies on the quality of MA data are still lacking.


"A Data Citation Roadmap for Scientific Publishers"

Helena Cousijn et al. have self-archived "A Data Citation Roadmap for Scientific Publishers."

Here's an excerpt:

This article presents a practical roadmap for scholarly publishers to implement data citation in accordance with the Joint Declaration of Data Citation Principles (JDDCP), a synopsis and harmonization of the recommendations of major science policy bodies. It was developed by the Publishers Early Adopters Expert Group as part of the Data Citation Implementation Pilot (DCIP) project, an initiative of FORCE11.org and the NIH BioCADDIE program. The structure of the roadmap presented here follows the 'life of a paper' workflow and includes the categories Pre-submission, Submission, Production, and Publication. The roadmap is intended to be publisher-agnostic so that all publishers can use this as a starting point when implementing JDDCP-compliant data citation.


"CiteScore—Flawed but Still A Game Changer"

Phil Davis has published "CiteScore—Flawed but Still A Game Changer" in The Scholarly Kitchen.

Here's an excerpt:

The CiteScore metric is controversial because of its overt biases against journals that publish a lot of front-matter. Nevertheless, for most academic journals, CiteScore will provide rankings that are similar to the Impact Factor.
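
As a rough illustration of the bias Davis describes (not taken from the post): under the 2016 definitions, CiteScore divides citations by all documents published in its three-year window, including rarely cited front matter such as editorials and news items, while the Impact Factor's denominator counts only citable items. A simplified sketch with hypothetical numbers (the differing citation windows are ignored):

    # Simplified sketch of the front-matter bias: CiteScore divides citations
    # by all documents (including editorials, news items, and letters), while
    # the Impact Factor divides by citable items only. The differing citation
    # windows (three years vs. two) are ignored; all numbers are hypothetical.
    citations_received = 3000   # citations counted in the census year
    citable_items = 1000        # research articles and reviews
    front_matter = 500          # editorials, news items, letters (rarely cited)

    citescore_style = citations_received / (citable_items + front_matter)
    impact_factor_style = citations_received / citable_items

    print(round(citescore_style, 2))      # -> 2.0
    print(round(impact_factor_style, 2))  # -> 3.0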


"Does Evaluative Scientometrics Lose Its Main Focus on Scientific Quality by the New Orientation towards Societal Impact?"

Lutz Bornmann and Robin Haunschild have published "Does Evaluative Scientometrics Lose Its Main Focus on Scientific Quality by the New Orientation towards Societal Impact?" in Scientometrics.

Here's an excerpt:

In this Short Communication, we have outlined that the current revolution in scientometrics implies not only a broadening of the impact perspective but also a devaluation of quality considerations in evaluative contexts. Impact might no longer be seen as a proxy for quality, but in its original sense: the simple resonance in some sectors of society. This is an alarming development, because fraudulent research is definitely of low quality, but is expected to have great resonance if measured in terms of altmetrics.


"Altmetrics and Grey Literature: Perspectives and Challenges"

Joachim Schöpfel and Hélène Prost have self-archived "Altmetrics and Grey Literature: Perspectives and Challenges."

Here's an excerpt:

The topic of our paper is the connection between altmetrics and grey literature. Do altmetrics offer new opportunities for the development and impact of grey literature? In particular, the paper explores how altmetrics could add value to grey literature, especially how reference managers, repositories, academic search engines, and social networks can produce altmetrics for dissertations, reports, conference papers, etc. We also explore whether and how new altmetric tools incorporate grey literature as a source for impact assessment. The discussion analyses the potential but also the limits of the actual application of altmetrics to grey literature and highlights the importance of unique identifiers, above all the DOI.


"Undercounting File Downloads from Institutional Repositories"

Patrick Obrien et al. have published "Undercounting File Downloads from Institutional Repositories" in the Journal of Library Administration.

Here's an excerpt:

A primary impact metric for institutional repositories (IR) is the number of file downloads, which are commonly measured through third-party Web analytics software. Google Analytics, a free service used by most academic libraries, relies on HTML page tagging to log visitor activity on Google's servers. However, Web aggregators such as Google Scholar link directly to high-value content (usually PDF files), bypassing the HTML page and failing to register these direct access events. This article presents the results of a study of four institutions demonstrating that the majority of IR activity is not counted by page-tagging Web analytics software, and proposes a practical solution for significantly improving the reporting relevancy and accuracy of IR performance metrics using Google Analytics.
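
One way to register such direct downloads, offered here only as an illustrative sketch and not necessarily the article's exact solution, is for the repository's web server to report each PDF request to Google Analytics through the Measurement Protocol (v1 "collect" endpoint); the tracking ID below is a placeholder:

    # Minimal sketch: report a direct PDF download to Google Analytics via
    # the Measurement Protocol so that downloads which bypass the tagged
    # HTML landing page are still counted. The tracking ID is a placeholder,
    # and this is one possible approach rather than the article's own method.
    import uuid
    import requests

    def report_download(file_url, tracking_id="UA-XXXXXXX-Y"):
        payload = {
            "v": "1",                  # protocol version
            "tid": tracking_id,        # GA property ID (placeholder)
            "cid": str(uuid.uuid4()),  # anonymous client identifier
            "t": "event",              # hit type
            "ec": "IR file download",  # event category
            "ea": "download",          # event action
            "el": file_url,            # event label: the requested file
        }
        requests.post("https://www.google-analytics.com/collect", data=payload)

    # Example: called by the repository when a PDF is served directly.
    # report_download("https://example.edu/ir/bitstream/12345/article.pdf")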


Outputs of the NISO Alternative Assessment Project

NISO has released Outputs of the NISO Alternative Assessment Project.

Here's an excerpt from the announcement:

The National Information Standards Organization has published NISO RP-25-2016, Outputs of the NISO Alternative Assessment Project. This recommended practice on altmetrics, an expansion of the tools available for measuring the scholarly impact of research in the knowledge environment, was developed by working groups that were part of NISO's Altmetrics Initiative, a project funded by the Alfred P. Sloan Foundation. The document outlines altmetrics definitions and use cases, alternative outputs in scholarly communications, data metrics, and persistent identifiers in scholarly communications. This guidance was necessary because, before the project began, scholars had long expressed dissatisfaction with traditional measures of success, such as the Impact Factor, but needed standards relating to other viable assessment methods.


"Citation Analysis with Microsoft Academic"

Sven E. Hug, Michael Ochsner, and Martin P. Braendle have self-archived "Citation Analysis with Microsoft Academic."

Here's an excerpt:

We explored whether and how Microsoft Academic (MA) could be used for bibliometric analyses. First, we examined the Academic Knowledge API (AK API), an interface to access MA data. Second, we performed a comparative citation analysis of researchers by normalizing data from MA and Scopus. We found that MA offers structured and rich metadata, which facilitates data retrieval, handling, and processing. In addition, the AK API allows retrieving histograms. These features have to be considered a major advantage of MA over Google Scholar. However, there are two serious limitations regarding the available metadata. First, MA does not provide the document type of a publication and, second, the 'fields of study' are dynamic, too fine-grained, and their hierarchies are incoherent. Nevertheless, we showed that average-based indicators as well as distribution-based indicators can be calculated with MA data. We postulate that MA has the potential to be used for fully fledged bibliometric analyses.
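
For readers curious what such AK API calls look like, here is a minimal sketch of a query for a researcher's papers and citation counts; the endpoint, query-expression syntax, and attribute names (Ti, Y, CC) are assumptions based on the API documentation of the time, and the subscription key is a placeholder:

    # Minimal sketch of querying the Academic Knowledge API (AK API) for a
    # researcher's papers and citation counts. The endpoint, query-expression
    # syntax, and attribute names (Ti = title, Y = year, CC = citation count)
    # are assumptions; the subscription key is a placeholder.
    import requests

    AK_EVALUATE = "https://api.labs.cognitive.microsoft.com/academic/v1.0/evaluate"

    def fetch_citation_counts(author_name, api_key, count=100):
        params = {
            "expr": f"Composite(AA.AuN='{author_name}')",
            "attributes": "Ti,Y,CC",
            "count": count,
        }
        headers = {"Ocp-Apim-Subscription-Key": api_key}
        response = requests.get(AK_EVALUATE, params=params, headers=headers)
        response.raise_for_status()
        return [(e.get("Ti"), e.get("Y"), e.get("CC", 0))
                for e in response.json().get("entities", [])]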


"Measuring Scientific Impact Beyond Citation Counts"

Robert M. Patton, Christopher G. Stahl and Jack C. Wells have published "Measuring Scientific Impact Beyond Citation Counts" in D-Lib Magazine.

Here's an excerpt:

The measurement of scientific progress remains a significant challenge, exacerbated by the use of multiple different types of metrics that are often incorrectly used, overused, or even explicitly abused. Several metrics such as h-index or journal impact factor (JIF) are often used as a means to assess whether an author, article, or journal creates an "impact" on science. Unfortunately, external forces can be used to manipulate these metrics, thereby diluting the value of their intended, original purpose. This work highlights these issues and the need to more clearly define "impact" as well as emphasize the need for better metrics that leverage full content analysis of publications.
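
For readers unfamiliar with how one of the author-level indicators named above is computed, a minimal sketch of the h-index calculation follows (the citation counts are hypothetical):

    # Minimal sketch: the h-index is the largest h such that the author has
    # at least h papers with at least h citations each.
    def h_index(citation_counts):
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Hypothetical example: ten papers with these citation counts.
    print(h_index([25, 18, 12, 9, 7, 6, 4, 2, 1, 0]))  # -> 6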


"Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level"

B. Ian Hutchins et al. have published "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level" in PLoS Biology.

Here's an excerpt:

Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network to field-normalize the number of citations it has received. Article citation rates are divided by an expected citation rate that is derived from performance of articles in the same field and benchmarked to a peer comparison group. The resulting Relative Citation Ratio is article level and field independent and provides an alternative to the invalid practice of using journal impact factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010 and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
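
A minimal sketch of the ratio the excerpt describes follows; deriving the expected citation rate from the co-citation network and the peer comparison group is the substance of the published method and is not reproduced here, so the expected rate is simply supplied as a hypothetical number:

    # Minimal sketch of the ratio described above: an article's citation rate
    # divided by an expected, field-derived citation rate. Deriving that
    # expected rate from the co-citation network and the peer comparison
    # group is not reproduced here; all numbers are hypothetical.
    def relative_citation_ratio(citations, years_since_publication,
                                expected_citations_per_year):
        article_citation_rate = citations / years_since_publication
        return article_citation_rate / expected_citations_per_year

    # Hypothetical example: 40 citations over 5 years in a field whose
    # comparable articles draw 4 citations per year on average.
    print(relative_citation_ratio(40, 5, 4.0))  # -> 2.0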


"Scholarly Use of Social Media and Altmetrics: A Review of the Literature"

Cassidy R. Sugimoto et al. have self-archived "Scholarly Use of Social Media and Altmetrics: A Review of the Literature."

Here's an excerpt:

This review provides an extensive account of the state-of-the art in both scholarly use of social media and altmetrics. The review consists of two main parts: the first examines the use of social media in academia, examining the various functions these platforms have in the scholarly communication process and the factors that affect this use. The second part reviews empirical studies of altmetrics, discussing the various interpretations of altmetrics, data collection and methodological limitations, and differences according to platform. The review ends with a critical discussion of the implications of this transformation in the scholarly communication system.


"2016 Scholar Metrics Released"

Google has published "2016 Scholar Metrics Released."

Here's an excerpt:

Scholar Metrics provide an easy way for authors to quickly gauge the visibility and influence of recent articles in scholarly publications. Today, we are releasing the 2016 version of Scholar Metrics. This release covers articles published in 2011-2015 and includes citations from all articles that were indexed in Google Scholar as of June 2016.

The top 100 publications include e-print servers and open access journals, such as arXiv Cosmology and Extragalactic Astrophysics (astro-ph.CO), arXiv High Energy Physics – Experiment (hep-ex), PLoS ONE, and PLoS Genetics.


"A Two-Sided Academic Landscape: Portrait of Highly-Cited Documents in Google Scholar (1950-2013)"

Alberto Martín-Martín et al. have self-archived "A Two-Sided Academic Landscape: Portrait of Highly-Cited Documents in Google Scholar (1950-2013)."

Here's an excerpt:

Since the existence of a full-text link does not guarantee the availability of the full text (some links actually refer to publishers' abstracts), the results (40% of the documents had a free full-text link) might be somewhat overestimated. In any case, these values are consistent with those published by Archambault et al. (2013), who found that over 40% of the articles from their sample were freely accessible; higher than those by Khabsa and Giles (2014) and Björk et al. (2010), who found only 24% and 20.4% of open access documents respectively; and much lower than Jamali and Nabavi (2015) and Pitol and De Groote (2014), who found 61.1% and 70% respectively.

The different nature of the samples makes it difficult to draw comparisons among these studies. Nonetheless, the sample used in this study (64,000 documents) is the largest used to date.


"A Simple Proposal for the Publication of Journal Citation Distributions"

Vincent Larivière et al. have self-archived "A Simple Proposal for the Publication of Journal Citation Distributions."

Here's an excerpt:

Although the Journal Impact Factor (JIF) is widely acknowledged to be a poor indicator of the quality of individual papers, it is used routinely to evaluate research and researchers. Here, we present a simple method for generating the citation distributions that underlie JIFs. Application of this straightforward protocol reveals the full extent of the skew of distributions and variation in citations received by published papers that is characteristic of all scientific journals. Although there are differences among journals across the spectrum of JIFs, the citation distributions overlap extensively, demonstrating that the citation performance of individual papers cannot be inferred from the JIF. We propose that this methodology be adopted by all journals as a move to greater transparency, one that should help to refocus attention on individual pieces of work and counter the inappropriate usage of JIFs during the process of research assessment.
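
A minimal sketch of the kind of distribution the authors propose publishing, alongside the JIF-style mean it underlies, follows (the per-article citation counts are hypothetical):

    # Minimal sketch: tabulate a journal's citation distribution and the
    # JIF-style mean from per-article citation counts (hypothetical data).
    from collections import Counter
    from statistics import mean, median

    citations = [0, 0, 0, 1, 1, 2, 2, 3, 5, 8, 13, 40, 120]  # hypothetical

    distribution = Counter(citations)
    for cites in sorted(distribution):
        print(f"{cites:>3} citations: {distribution[cites]} article(s)")

    print("JIF-style mean:", round(mean(citations), 2))  # 15.0
    print("Median:", median(citations))                  # 2, far below the mean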


"Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles"

Kathleen Reed, Dana McFarland, and Rosie Croft have published "Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles" in Evidence Based Library and Information Practice.

Here's an excerpt:

While all participants had Googled themselves, few were strategic about their online scholarly identity. Participants affirmed the perception that altmetrics can be of value in helping to craft a story of the value of their research and its diverse outputs. When participants had prior knowledge of altmetrics tools, it tended to be very narrow and deep, and perhaps field-specific. Participants identified time as the major barrier to use of scholarly profile and altmetrics tools.


"Open Access Publishing Trend Analysis: Statistics Beyond the Perception"

Elisabetta Poltronieri et al. have published "Open Access Publishing Trend Analysis: Statistics Beyond the Perception" in Information Research.

Here's an excerpt:

The purpose of this analysis was twofold: to track the number of open access journals acquiring impact factor, and to investigate the distribution of subject categories pertaining to these journals. As a case study, journals in which the researchers of the National Institute of Health (Istituto Superiore di Sanità) in Italy have published were surveyed.


"Back to the Past: On the Shoulders of an Academic Search Engine Giant"

Alberto Martín-Martín et al. have self-archived "Back to the Past: On the Shoulders of an Academic Search Engine Giant."

Here's an excerpt:

A study released by the Google Scholar team found an apparently increasing fraction of citations to old articles in studies published in the last 24 years (1990-2013). To test this finding, we conducted a complementary study using a different data source (Journal Citation Reports), metric (aggregate cited half-life), time span (2003-2013), and set of categories (53 Social Science subject categories and 167 Science subject categories). Although the results obtained confirm and reinforce the previous findings, the possible causes of this phenomenon remain unclear. We finally hypothesize that the first-page-results syndrome, in conjunction with the fact that Google Scholar favours the most cited documents, suggests that the growing trend of citing old documents is partly caused by Google Scholar itself.
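
As a rough illustration of the aggregate cited half-life metric mentioned in the excerpt, the following simplified sketch computes the number of years, counting back from the census year, needed to account for half of a journal's received citations (the age distribution is hypothetical, and JCR's exact interpolation rules are not reproduced):

    # Simplified sketch of a cited half-life: the number of years, counting
    # back from the most recent year, needed to account for half of the
    # citations a journal received. Hypothetical age distribution; JCR's
    # interpolation within the crossing year is omitted.
    def cited_half_life(citations_by_age):
        # citations_by_age[0]: citations to items published in the census
        # year, citations_by_age[1]: to items one year old, and so on.
        total = sum(citations_by_age)
        running = 0
        for age, count in enumerate(citations_by_age):
            running += count
            if running >= total / 2:
                return age + 1
        return len(citations_by_age)

    print(cited_half_life([10, 40, 60, 50, 30, 20, 10, 5]))  # -> 4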


"Grand Challenges in Measuring and Characterizing Scholarly Impact"

Chaomei Chen has self-archived "Grand Challenges in Measuring and Characterizing Scholarly Impact."

Here's an excerpt:

The constantly growing body of scholarly knowledge of science, technology, and the humanities is an asset of humankind. While new discoveries expand the existing knowledge, they may simultaneously render some of it obsolete. It is crucial for scientists and other stakeholders to keep their knowledge up to date. Policy makers, decision makers, and the general public also need an efficient communication of scientific knowledge. Several grand challenges concerning the creation, adaptation, and diffusion of scholarly knowledge, and the advancement of quantitative and qualitative approaches to its study, are identified.


"Bibliometric and Benchmark Analysis of Gold Open Access in Spain: Big Output and Little Impact"

Daniel Torres-Salinas et al. have published "Bibliometric and Benchmark Analysis of Gold Open Access in Spain: Big Output and Little Impact" in El Profesional de la Información.

Here's an excerpt:

This bibliometric study analyzes the research output produced by Spain during the 2005-2014 time period in Open Access (OA) journals indexed in Web of Science. . . . Spain is the second-highest-ranking European country in gold OA publication output and the fourth highest in Open Access output (9%). . . . Spain's normalized citation impact in Open Access (0.72) is lower than the world average and that of the main European countries.


"Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles"

Sandra L. De Groote et al. have published "Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles" in PLoS One.

Here's an excerpt:

Purpose

To examine whether National Institutes of Health (NIH) funded articles that were archived in PubMed Central (PMC) after the release of the 2008 NIH Public Access Policy show greater scholarly impact than comparable articles not archived in PMC. . . .

Results

A total of 45,716 articles were examined, including 7,960 with NIH funding. An analysis of the number of times these articles were cited found that NIH-funded 2006 articles in PMC were not cited significantly more than NIH-funded non-PMC articles. However, 2009 NIH-funded articles in PMC were cited 26% more than 2009 NIH-funded articles not in PMC, 5 years after publication. This result is highly significant even after controlling for journal (as a proxy of article quality and topic).


"Crowdsourcing Metrics of Digital Collections"

LIBER Quarterly has released "Crowdsourcing Metrics of Digital Collections" by Tuula Pääkkönen.

Here's an excerpt:

In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with a main aim of adding crowdsourcing features, e.g., by giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted.


"A Review of the Literature on Citation Impact Indicators"

Ludo Waltman has self-archived "A Review of the Literature on Citation Impact Indicators."

Here's an excerpt:

This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar). Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences. Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.
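
As a rough illustration of the normalization topic the review covers, one widely used approach divides each publication's citation count by the average citation count of publications from the same field and year, and averages the ratios (a simplified, MNCS-style sketch; the field baselines are hypothetical and the review discusses many variants):

    # Simplified sketch of field normalization: each paper's citation count
    # is divided by the average citation count of papers from the same field
    # and publication year, and the mean of these ratios gives an MNCS-style
    # score. Field baselines and papers below are hypothetical.
    def mean_normalized_citation_score(papers, field_year_baselines):
        ratios = [
            cites / field_year_baselines[(field, year)]
            for field, year, cites in papers
        ]
        return sum(ratios) / len(ratios)

    baselines = {("oncology", 2012): 10.0, ("mathematics", 2012): 2.0}
    papers = [("oncology", 2012, 20), ("mathematics", 2012, 2)]
    print(mean_normalized_citation_score(papers, baselines))  # -> 1.5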
