Ulrich Herb has self-archived "OpenAccess Statistics: Alternative Impact Measures for Open Access Documents? An Examination How to Generate Interoperable Usage Information from Distributed Open Access Services" in E-LIS.
Here's an excerpt:
Publishing and bibliometric indicators are of utmost relevance for scientists and research institutions as the impact or importance of a publication (or even of a scientist or an institution) is mostly regarded to be equivalent to a citation-based indicator, e.g. in form of the Journal Impact Factor or the Hirsch-Index. Both on an individual and an institutional level performance measurement depends strongly on these impact scores. This contribution shows that most common methods to assess the impact of scientific publications often discriminate Open Access publications — and by that reduce the attractiveness of Open Access for scientists. Assuming that the motivation to use Open Access publishing services (e.g. a journal or a repository) would increase if these services would convey some sort of reputation or impact to the scientists, alternative models of impact are discussed. Prevailing research results indicate that alternative metrics based on usage information of electronic documents are suitable to complement or to relativize citation-based indicators. Furthermore an insight into the project OpenAccess-Statistics OA-S is given. OA-S implemented an infrastructure to collect document-related usage information from distributed Open Access Repositories in an aggregator service in order to generate interoperable document access information according to three standards (COUNTER, LogEc and IFABC). The service also guarantees the deduplication of users and identical documents on different servers. In a second phase it is not only planned to implement added services like recommender features, but also to evaluate alternative impact metrics based on usage patterns of electronic documents.
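(A side note on the deduplication mentioned in the abstract: in the COUNTER case this amounts to "double-click filtering", whereby repeated requests for the same document by the same user within a short time window are counted as a single download. Below is a minimal Python sketch of the idea; the 10- and 30-second windows follow the COUNTER Code of Practice, but the log-record layout is invented here for illustration. LogEc and IFABC apply their own, different filtering rules, which is presumably why OA-S computes all three counts in parallel.)

    from datetime import datetime, timedelta

    # COUNTER-style "double-click filtering": repeated requests for the same
    # document by the same user within a short window count as one download.
    # The 10 s (HTML) and 30 s (PDF) windows follow the COUNTER Code of
    # Practice; the (user, doc, time, format) record layout is hypothetical.
    WINDOW = {"html": timedelta(seconds=10), "pdf": timedelta(seconds=30)}

    def counter_downloads(log_entries):
        """Count downloads per document after double-click filtering.
        log_entries: (user_id, doc_id, timestamp, fmt) tuples sorted by time."""
        last_click = {}  # (user, doc, fmt) -> time of most recent request
        counts = {}      # doc -> filtered download count
        for user, doc, ts, fmt in log_entries:
            key = (user, doc, fmt)
            prev = last_click.get(key)
            if prev is None or ts - prev > WINDOW.get(fmt, timedelta(seconds=30)):
                counts[doc] = counts.get(doc, 0) + 1
            last_click[key] = ts  # a rapid run of clicks is counted only once
        return counts

    clicks = [
        ("u1", "doc42", datetime(2010, 1, 1, 12, 0, 0), "pdf"),
        ("u1", "doc42", datetime(2010, 1, 1, 12, 0, 20), "pdf"),  # filtered
        ("u1", "doc42", datetime(2010, 1, 1, 12, 5, 0), "pdf"),   # counted
    ]
    print(counter_downloads(clicks))  # {'doc42': 2}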
NEEDED: MORE OA — NOT ALTERNATIVE IMPACT METRICS FOR OA JOURNALS
An article by Ulrich Herb (2010) [UH] is predicated on one of the oldest misunderstandings about OA: that OA ≡ OA journals (“Gold OA”) and that the obstacle to OA is that OA journals don’t have a high enough impact factor:
UH:
“This contribution shows that most common methods to assess the impact of scientific publications often discriminate Open Access publications – and by that reduce the attractiveness of Open Access for scientists.”
The usual solution proposed for this non-problem is to give OA journals a higher weight in performance evaluation, despite their lower impact factors, in order to encourage OA.
(This is nonsense, and it is not the “solution” proposed by UH. A journal’s weight in performance evaluation needs to be earned, on the basis of the quality and impact of its content, not accorded by fiat in order to encourage OA.)
The “solution” proposed by UH is not to give OA journals a higher a priori weight, but to create new impact measures that will accord them a higher weight.
UH:
“Assuming that the motivation to use Open Access publishing services (e.g. a journal or a repository) would increase if these services would convey some sort of reputation or impact to the scientists, alternative models of impact are discussed.”
New impact measures are always welcome, but they too must earn their weights on the basis of their track records for validity and predictive power.
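(How would a usage-based metric earn its weight? The natural test, the one applied to downloads by Brody, Harnad & Carr (2006), is to ask how well early usage predicts later citations. Here is a minimal sketch of that kind of validation; the per-article numbers are invented for illustration, and any real validation would of course use large article samples.)

    from scipy.stats import spearmanr

    # Invented per-article numbers, for illustration only:
    # downloads in the first six months vs. citations two years later.
    downloads_6mo = [120, 45, 300, 10, 80, 200, 5, 150]
    citations_2yr = [14, 3, 35, 1, 9, 18, 0, 12]

    # How well does the candidate usage metric rank-predict the
    # established citation metric? (Spearman rank correlation.)
    rho, p = spearmanr(downloads_6mo, citations_2yr)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")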
And what is urgently needed by and for research and researchers is not more new impact measures but more OA.
And the way to provide more OA is to provide OA to more articles. This can be done in two ways: not only by publishing articles in OA journals (Gold OA), but also by self-archiving articles published in any journal (whether OA or non-OA) in institutional repositories, thereby making them OA (“Green OA”).
Ulrich Herb seems to have misunderstood this completely (equating OA with Gold OA only). The contradiction is evident in two successive paragraphs:
UH:
“Commonly used citation-based indicators provide some arguments pro Open Access: Scientific documents that can be used free of charge are significantly more often downloaded and cited than Toll Access documents are (Harnad & Brody, 2004; Lawrence, 2001). Moreover the frequency of downloads seems to correlate with the citation counts of scientific documents (Brody, Harnad & Carr, 2006).
“Nevertheless there is lack of tools and indicators to measure the impact of Open Access publications. Especially documents that are self-archived on Open Access Repositories (and not published in an Open Access Journal) are excluded from the relevant databases (WoS, JCR, Scopus, etc.) that are typically used to calculate JIF-scores or the h-index” [emphasis added].
The emphasized passage in the second paragraph (the claim that self-archived documents are excluded from the databases used to calculate citation impact) is completely erroneous, and in direct contradiction with what is stated in the immediately preceding paragraph. For the increased citations generated by making articles in any journal (OA or non-OA) freely accessible online are included in the very databases used to calculate journal impact. Indeed, most of the evidence that OA increases citations comes from comparing the citation counts of articles (in the same journal and issue) that are and are not made OA by their authors. (And these within-journal comparisons are necessarily based on Green OA, not Gold OA.)
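(To see what such a within-journal comparison involves computationally, here is a minimal sketch: pair each journal's OA and non-OA articles and compare their mean citation counts. Every journal name and citation count below is invented for illustration; real studies use very large article samples.)

    # Within-journal comparison: for each journal, compare mean citations of
    # articles made OA by self-archiving against those left non-OA.
    # Journal names and counts are hypothetical.
    articles = [
        # (journal, is_oa, citations)
        ("J. Foo", True, 12), ("J. Foo", False, 7),
        ("J. Foo", True, 9),  ("J. Foo", False, 4),
        ("J. Bar", True, 5),  ("J. Bar", False, 2),
        ("J. Bar", True, 3),  ("J. Bar", False, 1),
    ]

    def oa_citation_advantage(articles):
        """Per-journal ratio of mean OA citations to mean non-OA citations."""
        totals = {}  # journal -> [oa_sum, oa_n, non_sum, non_n]
        for journal, is_oa, cites in articles:
            t = totals.setdefault(journal, [0, 0, 0, 0])
            if is_oa:
                t[0] += cites; t[1] += 1
            else:
                t[2] += cites; t[3] += 1
        # skip journals lacking one of the two groups (or with zero baseline)
        return {j: (t[0] / t[1]) / (t[2] / t[3])
                for j, t in totals.items() if t[1] and t[3] and t[2]}

    for journal, ratio in sorted(oa_citation_advantage(articles).items()):
        print(f"{journal}: OA/non-OA citation ratio = {ratio:.2f}")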
Yes, there are journals (OA and non-OA — mostly non-OA!) that are not (yet) indexed by some of the databases (WoS, JCR, Scopus, etc.); but that is not an OA problem.
Yes, let’s keep enhancing the visibility and harvestability of OA content; but that is not the OA problem: the problem is that most content is not yet OA.
And yes, let’s keep developing rich, new OA metrics; but you can’t develop OA metrics until the content is made OA.
References
Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and Swan, A. (2007) Incentivizing the Open Access Research Web: Publication-Archiving, Data-Archiving and Scientometrics. CTWatch Quarterly 3(3).
Harnad, S. (2008) Validating Research Performance Metrics Against Peer Rankings. Ethics in Science and Environmental Politics 8 (11) doi:10.3354/esep00088 (Special issue: The Use And Misuse Of Bibliometric Indices In Evaluating Scholarly Performance)
Harnad, S. (2009) Open Access Scientometrics and the UK Research Assessment Exercise. Scientometrics 79 (1)
Herb, U. (2010) OpenAccess Statistics: Alternative Impact Measures for Open Access Documents? An Examination How to Generate Interoperable Usage Information from Distributed Open Access Services. In: L’information scientifique et technique dans l’univers numérique. Mesures et usages. L’association des professionnels de l’information et de la documentation (ADBS), pp. 165-178.