2006 PACS Review Use Statistics

Posted in Announcements, E-Journals, Open Access, Scholarly Communication on January 21st, 2007

The Public-Access Computer Systems Review (PACS Review) was a freely available e-journal that I founded in 1989. It allowed authors to retain their copyrights, and it had a liberal copyright policy for noncommercial use. Its last issue was published in 1998.

In 2006, there were 763,228 successful requests for PACS Review files, 2,091 average successful requests per day, 751,264 successful requests for pages, and 2,058 average successful requests for pages per day. (A request is for any type of file; a page request is for a content file, such as an HTML, PDF, or Word file.) These requests came from 41,865 distinct host computers.

The requests came from 134 Internet domains. Leaving aside requests from unresolved numerical addresses, the top 15 domains were: .com (Commercial), .net (Networks), .edu (USA Higher Education), .cz (Czech Republic), .jp (Japan), .ca (Canada), .uk (United Kingdom), .au (Australia), .de (Germany), .nl (Netherlands), .org (Non Profit Making Organizations), .in (India), .my (Malaysia), .it (Italy), and .mx (Mexico). At the bottom were domains such as .ms (Montserrat), .fm (Micronesia), .nu (Niue), .ad (Andorra), and .az (Azerbaijan).

There had previously been approximately 3.5 million successful requests for PACS Review files.

This is the last time that use statistics will be reported for the PACS Review.

Fedora 2.2 Released

Posted in Fedora, Institutional Repositories, Open Access, Open Source Software, Scholarly Communication on January 20th, 2007

The Fedora Project has released version 2.2 of Fedora.

From the announcement:

This is a significant release of Fedora that includes a complete repackaging of the Fedora source and binary distribution so that Fedora can now be installed as a standalone web application (.war) in any web container. This is a first step in positioning Fedora to fit within a standard "enterprise system" environment. A new installer application makes it easy to set up and run Fedora. Fedora now uses Servlet Filters for authentication. To support digital object integrity, the Fedora repository can now be configured to calculate and store checksums for datastream content. This can be done globally, or on selected datastreams. The Fedora API also provides the ability to check content integrity based on checksums. The RDF-based Resource Index has been tuned for better performance. Also, a new high-performing triplestore, backed by Postgres, has been developed that can be plugged into the Resource Index. Fedora contains many other enhancements and bug fixes.
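The checksum mechanism described above follows a familiar pattern. As an illustrative sketch only (not Fedora's actual implementation, and the function names here are hypothetical), computing a checksum at ingest and later re-checking it might look like:

```python
import hashlib

def datastream_checksum(content: bytes, algorithm: str = "md5") -> str:
    """Compute a checksum for datastream content, as a repository
    might do at ingest time and store alongside the datastream."""
    digest = hashlib.new(algorithm)
    digest.update(content)
    return digest.hexdigest()

def check_integrity(content: bytes, stored_checksum: str,
                    algorithm: str = "md5") -> bool:
    """Re-derive the checksum from the current content and compare it
    to the stored value; a mismatch signals corruption or alteration."""
    return datastream_checksum(content, algorithm) == stored_checksum
```

Because the stored checksum is computed once and the verification is a pure recomputation, the same logic works whether it is applied globally or only to selected datastreams.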

ScientificCommons.org: Access to Over 13 Million Digital Documents

Posted in E-Prints, OAI-PMH, Open Access, Scholarly Communication on January 19th, 2007

ScientificCommons.org is an initiative of the Institute for Media and Communications Management at the University of St. Gallen. It indexes both metadata and full text from global digital repositories. It uses OAI-PMH to identify relevant documents. The full-text documents are in PDF, PowerPoint, RTF, Microsoft Word, and Postscript formats. After being retrieved from their original repository, the documents are cached locally at ScientificCommons.org. It has indexed about 13 million documents from over 800 repositories.
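To give a sense of the kind of OAI-PMH processing involved, here is a minimal, hypothetical harvesting sketch (not ScientificCommons.org's actual code): a harvester issues a ListRecords request against a repository's base URL and pulls Dublin Core identifiers and titles out of the XML response.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def list_records_url(base_url: str, metadata_prefix: str = "oai_dc") -> str:
    """Build a ListRecords request URL for an OAI-PMH data provider."""
    return base_url + "?" + urlencode(
        {"verb": "ListRecords", "metadataPrefix": metadata_prefix})

def parse_list_records(xml_text: str):
    """Extract (identifier, title) pairs from a ListRecords response."""
    root = ET.fromstring(xml_text)
    results = []
    for record in root.iter(OAI + "record"):
        identifier = record.findtext(OAI + "header/" + OAI + "identifier")
        title = record.findtext(".//" + DC + "title")
        results.append((identifier, title))
    return results
```

A real harvester would also handle resumption tokens for large result sets and incremental harvesting by datestamp, which the protocol provides for.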

Here are some additional features from the About ScientificCommons.org page:

Identification of authors across institutions and archives: ScientificCommons.org identifies authors and assigns them their scientific publications across various archives. Additionally the social relations between the authors will be extracted and displayed. . . .

Semantic combination of scientific information: ScientificCommons.org structures and combines the scientific data into knowledge areas with ontologies. Lexical and statistical methods are used to identify, extract, and analyze keywords. Based on these processes, ScientificCommons.org classifies the scientific data and uses it, e.g., for navigational and weighting purposes.

Personalization services: ScientificCommons.org offers researchers the possibility to inform themselves about new publications via our RSS Feed service. They can customize the RSS Feed to a special discipline or even to a personalized list of keywords. Furthermore, ScientificCommons.org will provide an upload service. Every researcher can upload his publications directly to ScientificCommons.org and assign already existing publications at ScientificCommons.org to his own researcher profile.

New UC Report: The Promise of Value-based Journal Prices and Negotiation

Posted in Scholarly Communication on January 18th, 2007

The University of California libraries have released The Promise of Value-based Journal Prices and Negotiation: A UC Report and View Forward.

Here is the report’s abstract:

In pursuit of their scholarly communication agenda, the University of California ten-campus libraries have posited and tested the case that a journal’s institutional price can and should be related to its value to the academic enterprise. We developed and tested a set of metrics that comprise "value-based pricing" of scholarly journals. The metrics are the measurable impact of the journal, the transparent measures of production costs, the institutionally-based contributions to the journal, such as editorial labor, and the transaction efficiencies from consortial purchases. Initial modeling and use of the approaches are promising, leading the libraries to employ and further develop the approaches and share their work to date with the larger community.

This excerpt from the press release provides further information:

The report describes a value-based approach that borrows from analysis done by Professors Ted Bergstrom (UC Santa Barbara) and R. Preston McAfee (Caltech) on journal cost-effectiveness (www.journalprices.com). The UC approach also includes suggestions for annual price increases that are tied to production costs; credits for institutionally-based contributions to the journal, such as editorial labor; and credits for business transaction efficiencies from consortial purchases.

Through the report the libraries ask how an explicit method can be established, validated, and communicated for aligning the purchase or license costs of scholarly journals with the value they contribute to the academy and the costs to create and deliver them. In addition to describing the work done to date, the report provides examples of potential cost savings and declares UC’s intention to pursue value-based prices in their negotiations with journal publishers. In addition, the report invites the academic community to work collectively to refine and improve these and other value-based approaches.

The Long Run

Posted in Emerging Technologies, Libraries on January 18th, 2007

Enthusiasm about new technologies is essential to innovation. There needs to be some fire in the belly of change agents or nothing ever changes. Moreover, the new is always more interesting than the old, which creaks with familiarity. Consequently, when an exciting new idea seizes the imagination of innovators and, later, early adopters (using Rogers’ diffusion of innovations jargon), it is only to be expected that the initial rush of enthusiasm can sometimes dim the cold eye of critical analysis.

Let’s pick on Library 2.0 to illustrate the point, and, in particular, librarian-contributed content instead of user-contributed content. It’s an idea that I find quite appealing, but let’s set that aside for the moment.

Overcoming the technical challenges involved, academic library X sets up on-demand blogs and wikis for staff as both outreach and internal communication tools. There is an initial frenzy of activity, and a number of blogs and wikis are established. Subject specialists start blogging. Perhaps the pace is reasonable for most to begin with, although some fall by the wayside quickly, but over time, with a few exceptions, the postings become more erratic and the time between postings increases. It is unclear whether target faculty read the blogs in any great numbers. Internal blogs follow a similar pattern. Some wikis, both internal and external, are quickly populated, but then become frozen by inactivity; others remain blank; others flourish because they serve a vital need.

Is this a story of success, failure, or the grey zone in between?

The point is this. Successful publishing in new media such as blogs and wikis requires that these tools serve a real purpose and that their contributors make a consistent, steady, and never-ending effort. It also requires that the intended audience understand and regularly use the tools and that, until these new communication channels are well-established, the library vigorously promote them because there is a real danger that, if you build it, they will not come.

Some staff will blog their hearts out regardless of external reinforcement, but many will need to have their work acknowledged in some meaningful way, such as at evaluation, promotion, and tenure decision points. Easily understandable feedback about tool use, such as good blog-specific or wiki-specific log analysis, is important as well to give writers the sense that they are being read and to help them tailor their message to their audience.

On the user side, it does little good to say "Here’s my RSS feed" to a faculty member who doesn’t know what RSS is and couldn’t care less. Of course, some will be hip to RSS, but that may not be the majority. If the library wants RSS feeds to become part of a faculty member’s daily workflow, it is going to have to give that faculty member a good reason for it to be so, such as significant, identified RSS feed content in the faculty member’s field. Then, it is going to have to help the faculty member with the RSS transition by pointing out good RSS readers, providing tactful instruction, and offering ongoing assistance.
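Part of demystifying RSS for such a faculty member is showing how little machinery is behind it. A minimal, illustrative RSS 2.0 reader (a sketch, not any particular aggregator's implementation) is just XML parsing:

```python
import xml.etree.ElementTree as ET

def feed_items(rss_xml: str):
    """Return (title, link) pairs for each item in an RSS 2.0 feed;
    this is the raw material an RSS reader presents to its user."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]
```

An aggregator layers polling, deduplication, and a reading interface on top of this, but the feed itself is nothing more than a list of titled links.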

In spite of the feel-good glow of early success, it may be prudent not to declare victory too soon after making the leap into a major new technology. It’s a real accomplishment, but dealing with technical puzzles is often not the hardest part. The world of computers and code is a relatively ordered and predictable one; the world of humans is far more complex and unpredictable.

The real test of a new technology is in the long run: Is the innovation needed, viable, and sustainable? Major new technologies often require significant ongoing organizational commitments and a willingness to measure success and failure with objectivity and to take corrective action as required. For participative technologies such as Library 2.0 and institutional repositories, it requires motivating users as well as staff to make behavioral changes that persist long after the excitement of the new wears off.

DLF/NSDL OAI Best Practices Wiki

Posted in Metadata, OAI-PMH, Open Access on January 17th, 2007

The Digital Library Federation and NSDL OAI and Shareable Metadata Best Practices Working Group’s OAI Best Practices Wiki has a number of resources relevant to the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH) and related metadata issues.

The Tools and Strategies for Using and Enhancing/Extending the OAI Protocol section is of particular interest. It includes information about OAI-PMH data provider and service provider registries, software solutions and packages, and static repositories and gateways; metadata management and added value tools as well as OAI and character validation tools; and using SRU/W, collection description schema, and NSDL safe transforms.

Managing Digitization Activities, SPEC Kit 294

Posted in ARL Libraries, Digital Libraries, Digitization on January 17th, 2007

The Association of Research Libraries has published Managing Digitization Activities, SPEC Kit 294. The table of contents and executive summary are freely available.

Here are some highlights from the announcement:

This survey was distributed to the 123 ARL member libraries in February 2006. Sixty-eight libraries (55%) responded to the survey, of which all but two (97%) reported having engaged in digitization activities. Only one respondent reported having begun digitization activities prior to 1992; five other pioneers followed in 1992. From 1994 through 1998 there was a steady increase in the number of libraries beginning digital initiatives; 30 joined the pioneers at the rate of three to six a year. There was a spike of activity at the turn of the millennium that reached a high in 2000, when nine libraries began digital projects. Subsequently, new start-ups have slowed, with only an additional one to five libraries beginning digitization activities each year.

The primary factor that influenced the start up of digitization activities was the availability of grant funding (39 responses or 59%). Other factors that influenced the commencement of these activities were the addition of new staff with related skills (50%), staff receiving training (44%), the decision to use digitization as a preservation option (42%), and the availability of gift monies (29%). . . . .

Only four libraries reported that their digitization activities are solely ongoing functions; the great majority (60 or 91%) reported that their digitization efforts are a combination of ongoing library functions and discrete, finite projects.

Notre Dame Institutional Digital Repository Phase I Final Report

Posted in DSpace, E-Prints, Electronic Theses and Dissertations (ETDs), Institutional Repositories, Open Access, Scholarly Communication on January 16th, 2007

The University of Notre Dame Libraries have issued a report about their year-long institutional repository pilot project. There is an abbreviated HTML version and a complete PDF version.

From the Executive Summary:

Here is the briefest of summaries regarding what we did, what we learned, and where we think future directions should go:

  1. What we did—In a nutshell we established relationships with a number of content groups across campus: the Kellogg Institute, the Institute for Latino Studies, Art History, Electrical Engineering, Computer Science, Life Science, the Nanovic Institute, the Kaneb Center, the School of Architecture, FTT (Film, Television, and Theater), the Gigot Center for Entrepreneurial Studies, the Institute for Scholarship in the Liberal Arts, the Graduate School, the University Intellectual Property Committee, the Provost’s Office, and General Counsel. Next, we collected content from many of these groups, "cataloged" it, and saved it into three different computer systems: DigiTool, ETD-db, and DSpace. Finally, we aggregated this content into a centralized cache to provide enhanced browsing, searching, and syndication services against the content.
  2. What we learned—We essentially learned four things: 1) metadata matters, 2) preservation now, not later, 3) the IDR requires dedicated people with specific skills, 4) copyright raises the largest number of questions regarding the fulfillment of the goals of the IDR.
  3. Where we are leaning in regards to recommendations—The recommendations take the form of a "Chinese menu" of options, and the options are grouped into "meals." We recommend the IDR continue and include: 1) continuing to do the Electronic Theses & Dissertations, 2) writing and implementing metadata and preservation policies and procedures, 3) taking the Excellent Undergraduate Research to the next level, and 4) continuing to implement DigiTool. There are quite a number of other options, but they may be deemed too expensive to implement.

Blackwell Synergy Based on Literatum Goes Live

Posted in Publishing on January 15th, 2007

Blackwell Publishing has released a new version of Blackwell Synergy, which utilizes Atypon’s Literatum software.

From the press release:

Blackwell Synergy enables its users to search 1 million articles from over 850 leading scholarly journals across the sciences, social sciences, humanities and medicine. The redesign provides easier navigation, faster loading times and improved access to tools for researchers, as well as meeting the latest accessibility standards (ADA section 508 and W3C’s WAI-AA).

Recently, the University of Chicago Press picked Atypon as a technology partner to provide an e-publishing platform for its online journals.

OCLC Openly Informatics Link Evaluator for Firefox

Posted in OCLC, Techie on January 15th, 2007

OCLC Openly Informatics has announced a free link checking plug-in for Firefox called Link Evaluator.

Here is a brief description from the Link Evaluator page:

Link Evaluator is a Firefox extension designed to help users evaluate the availability of online resources linked to from a given Web page. When started, it automatically follows all links on the current page, and assesses the responses of each URL (link). . . .

After each link is checked, it is highlighted with a color based on the relative success of the result: green for fully successful, shades of yellow for partly successful, and red for unsuccessful.

It requires Mozilla Firefox version 1.5 (or later).
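The traffic-light scheme described above can be sketched as a simple classification of HTTP responses. This is an assumption-laden simplification: the actual extension may apply finer-grained checks (hence "shades of yellow"), and the thresholds below are illustrative.

```python
def classify_status(status_code: int) -> str:
    """Map an HTTP status code to a traffic-light color in the spirit
    of Link Evaluator's scheme. The exact mapping is an assumption,
    not the extension's documented behavior."""
    if 200 <= status_code < 300:
        return "green"   # fully successful
    if 300 <= status_code < 400:
        return "yellow"  # partly successful (e.g., a redirect)
    return "red"         # unsuccessful (client or server error)
```

A batch link checker would fetch each URL on the page and feed the resulting status codes through a classifier like this one.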

digitalculturebooks

Posted in Digital Presses, Open Access, Publishing, Scholarly Communication on January 12th, 2007

The University of Michigan Press and the Scholarly Publishing Office of the University of Michigan Library, working together as the Michigan Digital Publishing Initiative, have established digitalculturebooks, which offers free access to digital versions of its published works (print works are fee-based). The imprint focuses on "the social, cultural, and political impact of new media."

The objectives of the imprint are to:

  • develop an open and participatory publishing model that adheres to the highest scholarly standards of review and documentation;
  • study the economics of Open Access publishing;
  • collect data about how reading habits and preferences vary across communities and genres;
  • build community around our content by fostering new modes of collaboration in which the traditional relationship between reader and writer breaks down in creative and productive ways.

Library Journal Academic Newswire notes in its article about digitalculturebooks:

While press officials use the term "open access," the venture is actually more "free access" than open at this stage. Open access typically does not require permission for reuse, only a proper attribution. UM director Phil Pochoda told the LJ Academic Newswire that, while no final decision has been made, the press’s "inclination is to ask authors to request the most restrictive Creative Commons license" for their projects. That license, he noted, requires attribution and would not permit commercial use, such as using it in a subsequent for-sale product, without permission. The Digital Culture Books web site currently reads that "permission must be received for any subsequent distribution."

The imprint’s first publication is The Best of Technology Writing 2006.

(Prior postings about digital presses.)

The Lowdown on Microsoft’s Vista OS

Posted in Digital Culture, Techie on January 11th, 2007

PC Magazine‘s special double issue on Microsoft’s Vista operating system (26, no. 1/2, January 2007) is worth a look. Here are the key articles:

You might also be interested in their Top 20 Wired Colleges piece, which has some surprising results (e.g., Villanova University tops MIT).



DigitalKoans


Copyright © 2005-2017 by Charles W. Bailey, Jr.

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International license.