"Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles"

Kathleen Reed, Dana McFarland, and Rosie Croft have published "Laying the Groundwork for a New Library Service: Scholar-Practitioner & Graduate Student Attitudes Toward Altmetrics and the Curation of Online Profiles" in Evidence Based Library and Information Practice.

Here's an excerpt:

While all participants had Googled themselves, few were strategic about their online scholarly identity. Participants affirmed the perception that altmetrics can be of value in helping to craft a story of the value of their research and its diverse outputs. When participants had prior knowledge of altmetrics tools, it tended to be very narrow and deep, and perhaps field-specific. Participants identified time as the major barrier to use of scholarly profile and altmetrics tools.

Digital Curation and Digital Preservation Works | Open Access Works | Digital Scholarship | Digital Scholarship Sitemap

"Open Access Publishing Trend Analysis: Statistics Beyond the Perception"

Elisabetta Poltronieri et al. have published "Open Access Publishing Trend Analysis: Statistics Beyond the Perception" in Information Research.

Here's an excerpt:

The purpose of this analysis was twofold: to track the number of open access journals acquiring impact factor, and to investigate the distribution of subject categories pertaining to these journals. As a case study, journals in which the researchers of the National Institute of Health (Istituto Superiore di Sanità) in Italy have published were surveyed.

"Back to the Past: On the Shoulders of an Academic Search Engine Giant"

Alberto Martin-Martin et al. have self-archived "Back to the Past: On the Shoulders of an Academic Search Engine Giant."

Here's an excerpt:

A study released by the Google Scholar team found an apparently increasing fraction of citations to old articles from studies published in the last 24 years (1990-2013). To verify this finding we conducted a complementary study using a different data source (Journal Citation Reports), metric (aggregate cited half-life), time span (2003-2013), and set of categories (53 Social Science subject categories and 167 Science subject categories). Although the results obtained confirm and reinforce the previous findings, the possible causes of this phenomenon remain unclear. We finally hypothesize that the first-page-results syndrome, in conjunction with the fact that Google Scholar favours the most cited documents, suggests that the growing trend of citing old documents is partly caused by Google Scholar.
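The aggregate cited half-life metric used in this comparison can be made concrete with a short sketch. The function below is a hypothetical illustration of the JCR-style calculation: the number of years, counting back from the citing year, needed to account for half of all citations received, with linear interpolation inside the boundary year. The data are invented, not the study's.

```python
def cited_half_life(cites_by_age):
    """Cited half-life in years.

    cites_by_age[i] = citations received by items published i years
    before the citing year (i = 0 is the citing year itself).
    """
    total = sum(cites_by_age)
    half = total / 2
    cum = 0.0
    for age, c in enumerate(cites_by_age):
        if cum + c >= half:
            # Linear interpolation within the year that crosses 50%.
            return age + (half - cum) / c
        cum += c
    return float(len(cites_by_age))  # half-life exceeds the observed window

# Hypothetical citation age profile: an even spread over four years.
print(cited_half_life([10, 10, 10, 10]))  # citations split evenly
```

A journal whose citations skew toward recent articles gets a shorter half-life; a rising aggregate half-life across a category is the signal the paper investigates.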

Digital Scholarship | Digital Scholarship Sitemap

"Grand Challenges in Measuring and Characterizing Scholarly Impact"

Chaomei Chen has self-archived "Grand Challenges in Measuring and Characterizing Scholarly Impact."

Here's an excerpt:

The constantly growing body of scholarly knowledge of science, technology, and the humanities is an asset of humankind. While new discoveries expand the existing knowledge, they may simultaneously render some of it obsolete. It is crucial for scientists and other stakeholders to keep their knowledge up to date. Policy makers, decision makers, and the general public also need efficient communication of scientific knowledge. Several grand challenges concerning the creation, adaptation, and diffusion of scholarly knowledge, and the advancement of quantitative and qualitative approaches to the study of scholarly knowledge, are identified.

"Bibliometric and Benchmark Analysis of Gold Open Access in Spain: Big Output and Little Impact"

Daniel Torres-Salinas et al. have published "Bibliometric and Benchmark Analysis of Gold Open Access in Spain: Big Output and Little Impact" in El Profesional de la Información.

Here's an excerpt:

This bibliometric study analyzes the research output produced by Spain during the 2005-2014 time period in Open Access (OA) journals indexed in Web of Science. . . . Spain is the second highest ranking European country in gold OA publication output and the fourth highest in Open Access output (9%). . . . Spain's normalized citation impact in Open Access (0.72) is lower than the world average and that of the main European countries.
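The normalized citation impact figure (0.72) divides each paper's citations by the world-average citations for papers of the same field and year, then averages the ratios; a value below 1 means below world average. A minimal sketch of this item-level normalization, with a hypothetical baseline table (field names and values are illustrative only):

```python
def normalized_citation_impact(papers, world_baseline):
    """Mean of per-paper citation ratios.

    papers: list of (citations, field, year) tuples.
    world_baseline: world-average citations keyed by (field, year).
    """
    ratios = [c / world_baseline[(f, y)] for c, f, y in papers]
    return sum(ratios) / len(ratios)

# Hypothetical data: each paper is compared with its own field-year average.
papers = [(2, "oncology", 2010), (4, "mathematics", 2010)]
baseline = {("oncology", 2010): 4.0, ("mathematics", 2010): 4.0}
print(normalized_citation_impact(papers, baseline))
```

Normalizing per field and year is what allows a single national score, such as Spain's 0.72, to be compared against a world average of 1.0 despite very different citation cultures across disciplines.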

"Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles"

Sandra L. De Groote et al. have published "Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles" in PLoS One.

Here's an excerpt:

Purpose

To examine whether National Institutes of Health (NIH) funded articles that were archived in PubMed Central (PMC) after the release of the 2008 NIH Public Access Policy show greater scholarly impact than comparable articles not archived in PMC. . . .

Results

A total of 45,716 articles were examined, including 7,960 with NIH funding. An analysis of the number of times these articles were cited found that NIH-funded 2006 articles in PMC were not cited significantly more than NIH-funded non-PMC articles. However, 2009 NIH-funded articles in PMC were cited 26% more than 2009 NIH-funded articles not in PMC five years after publication. This result is highly significant even after controlling for journal (as a proxy for article quality and topic).
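The "cited 26% more" figure is a relative difference in mean citation rates between the archived and non-archived groups. A toy sketch of that comparison (the per-article counts below are hypothetical, not the study's data):

```python
def citation_advantage(pmc_citations, non_pmc_citations):
    """Percent by which mean citations of PMC-archived articles
    exceed those of comparable non-PMC articles."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(pmc_citations) / mean(non_pmc_citations) - 1.0) * 100.0

# Hypothetical per-article citation counts, five years after publication.
pmc = [70, 56, 63]       # mean 63
non_pmc = [55, 45, 50]   # mean 50
print(citation_advantage(pmc, non_pmc))
```

The study's actual analysis controls for journal before drawing this comparison; a raw ratio like the one above would conflate the archiving effect with journal quality and topic.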

"Crowdsourcing Metrics of Digital Collections"

LIBER Quarterly has released "Crowdsourcing Metrics of Digital Collections" by Tuula Pääkkönen.

Here's an excerpt:

In the National Library of Finland (NLF) there are millions of digitized newspaper and journal pages, which are openly available via the public website http://digi.kansalliskirjasto.fi. To serve users better, last year the front end was completely overhauled, with the main aim of adding crowdsourcing features, e.g., giving end-users the opportunity to create digital clippings and a personal scrapbook from the digital collections. But how can you know whether crowdsourcing has had an impact? How much have the crowdsourcing functionalities been used so far? Did crowdsourcing work? In this paper the statistics and metrics of a recent crowdsourcing effort are analysed across the different digitized material types (newspapers, journals, ephemera). The subjects, categories, and keywords given by the users are analysed to see which topics are the most appealing. Some notable public uses of the crowdsourced article clippings are highlighted.

"A Review of the Literature on Citation Impact Indicators"

Ludo Waltman has self-archived "A Review of the Literature on Citation Impact Indicators."

Here's an excerpt:

This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar). Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences. Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.

| New: Research Data Curation Bibliography, Version 5 | Digital Scholarship | Digital Scholarship Sitemap

The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management

The Higher Education Funding Council for England has released The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management.

Here's an excerpt:

Our report starts by tracing the history of metrics in research management and assessment, in the UK and internationally. It looks at the applicability of metrics within different research cultures, compares the peer review system with metric-based alternatives, and considers what balance might be struck between the two. It charts the development of research management systems within institutions, and examines the effects of the growing use of quantitative indicators on different aspects of research culture, including performance management, equality, diversity, interdisciplinarity, and the 'gaming' of assessment systems. The review looks at how different funders are using quantitative indicators, and considers their potential role in research and innovation policy.

"Scholarly Social Media Profiles and Libraries: A Review"

Judit Ward et al. have published "Scholarly Social Media Profiles and Libraries: A Review" in LIBER Quarterly.

Here's an excerpt:

This article aims to point out emerging roles and responsibilities for academic librarians with the potential of better integrating the library into the research process. In order to find out how to enhance the online reputation and discoverability of individual faculty members as well as their affiliated institutions, the authors worked side-by-side with researchers in the United States and Europe to explore, create, revise, and disambiguate scholarly profiles in various software applications. In an attempt to understand and organize scholarly social media, including new, alternative metrics, the authors reviewed and classified the major academic profile platforms, highlighting the overlapping elements, benefits, and drawbacks inherent in each. The consensus is that it would be time-consuming to keep one's profile current and accurate on all of these platforms, given the plethora of underlying problems, also discussed in detail in the article. However, it came as a startling discovery that reluctance to engage with scholarly social media may cause a misrepresentation of a researcher's academic achievements and may come with unforeseen consequences. The authors claim that current skills and competencies can secure an essential role for academic librarians in the research workflow by means of monitoring and navigating researcher profiles in scholarly social media in order to best represent the scholarship of their host institutions.

Altmetric Mentions and the Communication of Medical Research

Digital Science has released Altmetric Mentions and the Communication of Medical Research.

Here's an excerpt:

Social and mainstream media mentions of research publications appear much more rapidly than conventional academic citations and are generated by a wider range of users. They therefore offer the potential for early and complementary indicators of research impact. Such indicators could also identify new kinds of economic and social impact.

In this report we explore the relevance of such new indicators to research in medical and health sciences.

"A Review of Theory and Practice in Scientometrics"

John Mingers and Loet Leydesdorff have self-archived "A Review of Theory and Practice in Scientometrics."

Here's an excerpt:

Scientometrics is the study of the quantitative aspects of the process of science as a communication system. It is centrally, but not only, concerned with the analysis of citations in the academic literature. In recent years it has come to play a major role in the measurement and evaluation of research performance. In this review we consider: the historical development of scientometrics, sources of citation data, citation metrics and the "laws" of scientometrics, normalisation, journal impact factors and other journal metrics, visualising and mapping science, evaluation and policy, and future developments.

"Does Google Scholar Contain All Highly Cited Documents (1950-2013)?"

Alberto Martín-Martín et al. have self-archived "Does Google Scholar Contain All Highly Cited Documents (1950-2013)?"

Here's an excerpt:

The study of highly cited documents on Google Scholar (GS) has never been addressed to date in a comprehensive manner. The objective of this work is to identify the set of highly cited documents in Google Scholar and define their core characteristics: the languages they are written in, their file formats, and how many of them can be accessed free of charge. We will also try to answer some additional questions that hopefully shed some light on the use of GS as a tool for assessing scientific impact through citations. The decalogue of research questions is shown below:

1. Which are the most cited documents in GS?
2. Which are the most cited document types in GS?
3. In which languages are the most cited documents in GS written?
4. How many highly cited documents are freely accessible?
4.1 What file types are the most commonly used to store these highly cited documents?
4.2 Which are the main providers of these documents?
5. How many of the highly cited documents indexed by GS are also indexed by WoS?
6. Is there a correlation between the number of citations that these highly cited documents have received in GS and the number of citations they have received in WoS?
7. How many versions of these highly cited documents has GS detected?
8. Is there a correlation between the number of versions GS has detected for these documents and the number of citations they have received?
9. Is there a correlation between the number of versions GS has detected for these documents, and their position in the search engine result pages?
10. Is there some relation between the positions these documents occupy in the search engine result pages, and the number of citations they have received?

Digital Scholarship | "A Quarter-Century as an Open Access Publisher"

"Tweets as Impact Indicators: Examining the Implications of Automated Bot Accounts on Twitter"

Stefanie Haustein et al. have self-archived "Tweets as Impact Indicators: Examining the Implications of Automated Bot Accounts on Twitter."

Here's an excerpt:

This brief communication presents preliminary findings on automated Twitter accounts distributing links to scientific papers deposited in the preprint repository arXiv. It discusses the implications of the presence of such bots from the perspective of social media metrics (altmetrics), where mentions of scholarly documents on Twitter have been suggested as a means of measuring impact that is both broader and timelier than citations. We present preliminary findings that automated Twitter accounts create a considerable number of tweets linking to scientific papers and that they behave differently than common social bots, which has critical implications for the use of raw tweet counts in research evaluation and assessment. We discuss some definitions of Twitter cyborgs and bots in scholarly communication and propose differentiating between levels of engagement, from tweeting only bibliographic information to discussing or commenting on the content of a paper.

"The Imperative for Open Altmetrics"

Stacy Konkiel, Heather Piwowar, and Jason Priem have published "The Imperative for Open Altmetrics" in The Journal of Electronic Publishing.

Here's an excerpt:

If scholarly communication is broken, how will we fix it? At Impactstory—a non-profit devoted to helping scholars gather and share evidence of their research impact by tracking online usage of scholarship via blogs, Wikipedia, Mendeley, and more—we believe that incentivizing web-native research via altmetrics is the place to start. In this article, we describe the current state of the art in altmetrics and its effects on publishing, we share Impactstory's plan to build an open infrastructure for altmetrics, and describe our company's ethos and actions.

"Which Kind of Papers Has Higher or Lower Altmetric Counts? A Study Using Article-Level Metrics from PLOS and F1000Prime"

Lutz Bornmann has self-archived "Which Kind of Papers Has Higher or Lower Altmetric Counts? A Study Using Article-Level Metrics from PLOS and F1000Prime."

Here's an excerpt:

The present study investigates the usefulness of altmetrics for measuring the broader impact of research. Methods: This study is essentially based on a dataset of papers obtained from F1000. This dataset was augmented with altmetrics (such as Twitter counts), which were downloaded from the homepage of PLOS (the Public Library of Science). This study covers a total of 1,082 papers. Findings: The results from regression models indicate that Facebook and Twitter, but not Figshare or Mendeley, can provide indications of papers which are of interest to a broader circle of readers (and not only to the peers in a specialist area), and therefore seem to be useful for societal impact measurement.

"Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics"

Lutz Bornmann has self-archived "Do Altmetrics Point to the Broader Impact of Research? An Overview of Benefits and Disadvantages of Altmetrics."

Here's an excerpt:

Today, it is not clear how the impact of research on areas of society other than science should be measured. While peer review and bibliometrics have become standard methods for measuring the impact of research within science, there is not yet an accepted framework within which to measure societal impact. Alternative metrics (called altmetrics to distinguish them from bibliometrics) are considered an interesting option for assessing the societal impact of research, as they offer new ways to measure (public) engagement with research output. Altmetrics is a term describing web-based metrics for the impact of publications and other scholarly material that use data from social media platforms (e.g., Twitter or Mendeley). This overview of studies explores the potential of altmetrics for measuring societal impact. It deals with the definition and classification of altmetrics. Furthermore, their benefits and disadvantages for measuring impact are discussed.

"A Review of the Characteristics of 108 Author-Level Bibliometric Indicators"

Lorna Wildgaard, Jesper W. Schneider, and Birger Larsen have self-archived "A Review of the Characteristics of 108 Author-Level Bibliometric Indicators."

Here's an excerpt:

An increasing demand for bibliometric assessment of individuals has led to a growth of new bibliometric indicators as well as new variants or combinations of established ones. The aim of this review is to contribute with objective facts about the usefulness of bibliometric indicators of the effects of publication activity at the individual level. This paper reviews 108 indicators that can potentially be used to measure performance on the individual author level, and examines the complexity of their calculations in relation to what they are supposed to reflect and ease of end-user application.

"Measuring the Broader Impact of Research: The Potential of Altmetrics"

Lutz Bornmann has self-archived "Measuring the Broader Impact of Research: The Potential of Altmetrics."

Here's an excerpt:

Today, it is not clear how the impact of research on areas of society other than science should be measured. While peer review and bibliometrics have become standard methods for measuring the impact of research within science, there is not yet an accepted framework within which to measure societal impact. Alternative metrics (called altmetrics to distinguish them from bibliometrics) are considered an interesting option for assessing the societal impact of research, as they offer new ways to measure (public) engagement with research output. Altmetrics is a term describing web-based metrics for the impact of publications and other scholarly material that use data from social media platforms (e.g., Twitter or Mendeley). This overview of studies explores the potential of altmetrics for measuring societal impact. It deals with the definition and classification of altmetrics. Furthermore, their benefits and disadvantages for measuring impact are discussed.

Digital Scholarship | Digital Scholarship Publications Overview | Sitemap

"The Multidimensional Assessment of Scholarly Research Impact"

Henk F. Moed and Gali Halevi have self-archived "The Multidimensional Assessment of Scholarly Research Impact."

Here's an excerpt:

This article introduces the Multidimensional Research Assessment Matrix of scientific output. Its base notion holds that the choice of metrics to be applied in a research assessment process depends upon the unit of assessment, the research dimension to be assessed, and the purposes and policy context of the assessment. An indicator may be highly useful within one assessment process, but less so in another. For instance, publication counts are useful tools to help discriminate between those staff members who are research active and those who are not, but are of little value if active scientists are to be compared with one another according to their research performance. This paper gives a systematic account of the potential usefulness and limitations of a set of 10 important metrics, including altmetrics, applied at the level of individual articles, individual researchers, research groups, and institutions. It presents a typology of research impact dimensions, and indicates which metrics are the most appropriate to measure each dimension. It introduces the concept of a meta-analysis of the units under assessment in which metrics are not used as tools to evaluate individual units, but to reach policy inferences regarding the objectives and general setup of an assessment process.

"A Comparison of Citations, Downloads And Readership Data for an Information Systems Journal"

Christian Schlögl et al. have published "A Comparison of Citations, Downloads And Readership Data for an Information Systems Journal" in a special issue on altmetrics of Research Trends.

Here's an excerpt:

In our analysis we identified a high (though not perfect) correlation between citations and downloads, which was slightly lower between downloads and readership frequencies, and lower again between citations and readership counts. This is mainly because the data sources used are related either to research or at least to teaching in higher education institutions.

"Teaching an Old University Press Publisher New Tricks: Living in the Present and Preparing for the Future of Scholarly Communications"

Patrick H. Alexander has published "Teaching an Old University Press Publisher New Tricks: Living in the Present and Preparing for the Future of Scholarly Communications" in the Journal of Electronic Publishing.

Here's an excerpt:

University presses currently exist in the dual worlds of print and digital publishing. Current staffing needs require that they hire personnel with skills and experience that mirror that present duality. Training and maintaining a skilled workforce requires a commitment to flexibility and an openness to the ever-changing nature of scholarly communication. As the scholarly publishing ecosystem continues to evolve, university presses will need to look to a future workforce that has additional training, knowledge, and experience beyond the traditional skills associated with academic publishing, one that fully embraces the realities of a digital world, the habits of new generations of researchers, and the increasing role of technology in scholarly communication. This article looks at what the future might look like, what skills might be required, and how one might prepare for that future.

"Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search Dead?"

Enrique Orduna-Malea et al. have self-archived "Empirical Evidences in Citation-Based Search Engines: Is Microsoft Academic Search Dead?"

Here's an excerpt:

The goal of this working paper is to summarize the main empirical evidence provided by the scientific community as regards the comparison between the two main citation-based academic search engines, Google Scholar and Microsoft Academic Search, paying special attention to the following issues: coverage, correlations between journal rankings, and usage of these academic search engines. Additionally, self-elaborated data are offered, which are intended to provide current evidence about the popularity of these tools on the Web by measuring the number of rich files (PDF, PPT, and DOC) in which these tools are mentioned, the number of external links that both products receive, and the search query frequency from Google Trends. The poor results obtained by MAS led us to an unexpected and unnoticed discovery: Microsoft Academic Search has been outdated since 2013.

"How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of ‘Alternative Metrics’ in Scientific Publications"

Zohreh Zahedi, Rodrigo Costas, and Paul Wouters have self-archived "How Well Developed Are Altmetrics? A Cross-Disciplinary Analysis of the Presence of 'Alternative Metrics' in Scientific Publications."

Here's an excerpt:

In this paper an analysis of the presence and possibilities of altmetrics for bibliometric and performance analysis is carried out. Using the web-based tool Impact Story, we collected metrics for 20,000 random publications from the Web of Science. We studied both the presence and distribution of altmetrics in the set of publications across fields, document types, and publication years, as well as the extent to which altmetrics correlate with citation indicators. The main result of the study is that the altmetrics source that provides the most metrics is Mendeley, with readership metrics for 62.6% of all the publications studied; other sources provide only marginal information. In terms of relation with citations, a moderate Spearman correlation (r=0.49) has been found between Mendeley readership counts and citation indicators.
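The Spearman correlation reported here (r=0.49) is a Pearson correlation computed on ranks rather than raw counts, which makes it robust to the heavy skew of citation and readership distributions. A self-contained sketch in pure Python (the inputs are toy data, not the study's):

```python
def rankdata(xs):
    """Assign 1-based ranks, averaging ranks over ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Extend j over the run of tied values.
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy example: readership counts vs. citation counts for five papers.
print(spearman([5, 0, 12, 3, 40], [2, 0, 9, 1, 30]))
```

Because only rank order matters, a single extremely cited paper cannot dominate the statistic the way it would in a Pearson correlation on raw counts.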

"Do Altmetrics Correlate with Citations? Extensive Comparison of Altmetric Indicators with Citations from a Multidisciplinary Perspective"

Rodrigo Costas, Zohreh Zahedi, and Paul Wouters have self-archived "Do Altmetrics Correlate with Citations? Extensive Comparison of Altmetric Indicators with Citations from a Multidisciplinary Perspective" on arXiv.org.

Here's an excerpt:

An extensive analysis of the presence of different altmetric indicators provided by Altmetric.com across scientific fields is presented, focusing particularly on their relationship with citations. Our results confirm that the presence and density of social media altmetric counts are still very low among scientific publications, with 15%-24% of publications presenting some altmetric activity, concentrated in the most recent publications, although their presence is increasing over time. Publications from the social sciences, humanities, and the medical and life sciences show the highest presence of altmetrics, indicating their potential value and interest for these fields. The analysis of the relationships between altmetrics and citations confirms previous claims of positive but relatively weak correlations, thus supporting the idea that altmetrics do not reflect the same concept of impact as citations. Also, altmetric counts do not always provide better filtering of highly cited publications than journal citation scores. Altmetric scores (particularly mentions in blogs) are able to identify highly cited publications with higher levels of precision than journal citation scores (JCS), but with a lower level of recall. The value of altmetrics as a complementary tool for citation analysis is highlighted, although more research is suggested to disentangle the potential meaning and value of altmetric indicators for research evaluation.
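The precision/recall comparison can be made concrete: treat the papers flagged by an altmetric (e.g., blog mentions) as the "retrieved" set and the actually highly cited papers as the "relevant" set. A minimal sketch with hypothetical paper IDs:

```python
def precision_recall(retrieved, relevant):
    """Precision and recall of a retrieved set against a relevant set."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical IDs: papers flagged by blog mentions vs. truly highly cited papers.
blog_top = ["p1", "p2", "p3", "p4"]
highly_cited = ["p2", "p3", "p5", "p6", "p7", "p8"]
print(precision_recall(blog_top, highly_cited))
```

In this toy case half of the blog-flagged papers are indeed highly cited (precision 0.5), but only a third of the highly cited papers were flagged (recall 0.33), mirroring the paper's finding that blog mentions are a precise but incomplete filter.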
