Petition for Public Access to Publicly Funded Research in the United States

AALL, ALA, ACRL, the Alliance for Taxpayer Access, Public Knowledge, SPARC, and other organizations have initiated the Petition for Public Access to Publicly Funded Research in the United States.

The petition states:

We, the undersigned, believe that broad dissemination of research results is fundamental to the advancement of knowledge. For America’s taxpayers to obtain an optimal return on their investment in science, publicly funded research must be shared as broadly as possible. Yet too often, research results are not available to researchers, scientists, or the members of the public. Today, the Internet and digital technologies give us a powerful means of addressing this problem by removing access barriers and enabling new, expanded, and accelerated uses of research findings.

We believe the US Government can and must act to ensure that all potential users have free and timely access on the Internet to peer-reviewed federal research findings. This will not only benefit the higher education community, but will ultimately magnify the public benefits of research and education by promoting progress, enhancing economic growth, and improving the public welfare.

We support the re-introduction and passage of the Federal Research Public Access Act, which calls for open public access to federally funded research findings within six months of publication in a peer-reviewed journal.

The petition follows a similar effort in the European Union, the Petition for Guaranteed Public Access to Publicly-Funded Research Results, which was signed by over 23,000 individuals and organizations.

UK Council of Research Repositories Established

SHERPA Plus has announced the launch of the UK Council of Research Repositories.

It is described as follows: "UKCoRR will be an independent professional body to allow repository managers to share experiences and discuss issues of common concern. It will give repository managers a group voice in national discussions and policy development independent of projects or temporary initiatives."

Digital Object Prototypes Framework Released

Kostas Saidis has released the Digital Object Prototypes Framework. It is available from the DOPs download page.

Here is an excerpt from the fedora-commons-users announcement:

At a glance, DOPs is a framework for the effective management and manipulation of diverse and heterogeneous digital material, providing repository-independent, type-consistent abstractions of stored digital objects. In DOPs, individual objects are treated as instances of their prototype and, hence, conform to its specifications automatically, regardless of the underlying storage format used to store and encode the objects.

The framework also provides inherent support for collections/sub-collections hierarchies and compound objects, while it allows DL-pertinent services to compose type-specific object behavior effectively. A DO Storage module is also available, which allows one to use the framework atop Fedora (thoroughly tested with Fedora version 2.0).
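To make the prototype/instance idea concrete, here is a minimal, purely illustrative sketch in Python. The class and field names are hypothetical and this is not the actual DOPs API; it only mimics the core claim of the excerpt: a prototype defines a type, and each stored object is wrapped so that callers see a type-consistent view regardless of how the object is encoded in storage.

    # Illustrative sketch only: hypothetical names, not the actual DOPs API.
    # A "prototype" declares what a type of digital object must provide;
    # stored objects are wrapped so that callers always see the same
    # abstraction, whatever the underlying storage encoding happens to be.

    class DigitalObjectPrototype:
        """Type definition that instances must conform to."""

        def __init__(self, name, required_datastreams, required_metadata):
            self.name = name
            self.required_datastreams = required_datastreams   # e.g. ["image", "thumbnail"]
            self.required_metadata = required_metadata          # e.g. ["title", "creator"]

        def instantiate(self, raw_record):
            """Wrap a raw stored record as a type-consistent instance."""
            missing = [f for f in self.required_metadata
                       if f not in raw_record.get("metadata", {})]
            if missing:
                raise ValueError(f"{self.name} instance is missing metadata: {missing}")
            return DigitalObjectInstance(self, raw_record)

    class DigitalObjectInstance:
        """An individual object, always accessed in its prototype's terms."""

        def __init__(self, prototype, raw_record):
            self.prototype = prototype
            self._raw = raw_record

        def metadata(self, field):
            return self._raw["metadata"][field]

        def datastream(self, name):
            # A real storage layer (e.g. a Fedora repository) would resolve this;
            # here it is just a lookup in an in-memory dict.
            return self._raw["datastreams"].get(name)

    # Example: a "photograph" prototype applied to one stored record.
    photograph = DigitalObjectPrototype(
        "photograph",
        required_datastreams=["image", "thumbnail"],
        required_metadata=["title", "creator"],
    )
    record = {
        "metadata": {"title": "Harbour at dusk", "creator": "Unknown"},
        "datastreams": {"image": "obj-001/image.tiff", "thumbnail": "obj-001/thumb.jpg"},
    }
    photo = photograph.instantiate(record)
    print(photo.metadata("title"), photo.datastream("image"))

In DOPs itself the storage layer is pluggable (the DO Storage module targets Fedora), so a prototype-level view like this could be kept stable while the underlying repository changes.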

PRESERV Project Report on Digital Preservation in Institutional Repositories

The JISC PRESERV (Preservation Eprint Services) project has issued a report titled Laying the Foundations for Repository Preservation Services: Final Report from the PRESERV Project.

Here’s an excerpt from the Executive Summary:

The PRESERV project (2005-2007) investigated long-term preservation for institutional repositories (IRs), by identifying preservation services in conjunction with specialists, such as national libraries and archives, and building support for services into popular repository software, in this case EPrints. . . .

PRESERV was able to work with The National Archives, which has produced PRONOM-DROID, the pre-eminent tool for file format identification. Instead of linking PRONOM to individual repositories, we linked it to the widely used Registry of Open Access Repositories (ROAR), through an OAI harvesting service. As a result, format profiles can be found for over 200 repositories listed in ROAR, what we call the PRONOM-ROAR service. . . .

The lubricant to ease the movement of data between the components of the services model is metadata, notably preservation metadata, which informs, describes and records a range of activities concerned with preserving specific digital objects. PRESERV identified a rich set of preservation metadata, based on the current standard in this area, PREMIS, and where this metadata could be generated in our model. . . .

The most important changes to EPrints software as a result of the project were the addition of a history module to record changes to an object and actions performed on an object, and application programs to package and disseminate data for delivery to an external service using either the Metadata Encoding and Transmission Standard (METS) or the MPEG-21 Part 2: Digital Item Declaration Language (DIDL). One change to the EPrints deposit interface is the option for authors to select a licence indicating rights for allowable use by service providers or users, and others. . . .

PRESERV has identified a powerful and flexible framework in which a wide range of preservation services from many providers can potentially be intermediated to many repositories by other types of repository services. It is proposed to develop and test this framework in the next phase of the project.
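The PRONOM-ROAR approach described above, profiling repositories by harvesting them centrally rather than installing software at each site, can be illustrated with a short, hedged sketch. The code below is not the project's software: it simply pages through a repository's OAI-PMH ListRecords responses and tallies apparent formats from file extensions found in dc:identifier values, whereas the real service identifies formats with The National Archives' DROID/PRONOM signatures. The endpoint URL is a placeholder.

    # Hedged sketch: builds a crude format profile for one repository by
    # harvesting its OAI-PMH interface. Formats are guessed from file
    # extensions in dc:identifier values, whereas the real PRONOM-ROAR
    # service uses DROID/PRONOM signatures. Python 3 standard library only.
    from collections import Counter
    from urllib.parse import urlencode
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    def format_profile(base_url, max_pages=5):
        """Tally apparent file formats advertised by one repository."""
        counts, token = Counter(), None
        for _ in range(max_pages):
            params = {"verb": "ListRecords"}
            if token:
                params["resumptionToken"] = token
            else:
                params["metadataPrefix"] = "oai_dc"
            with urlopen(base_url + "?" + urlencode(params)) as response:
                tree = ET.parse(response)
            for ident in tree.iter(DC + "identifier"):
                url = (ident.text or "").lower()
                if "." in url.rsplit("/", 1)[-1]:
                    counts[url.rsplit(".", 1)[-1]] += 1
            node = tree.find(".//" + OAI + "resumptionToken")
            token = node.text if node is not None and node.text else None
            if not token:
                break
        return counts

    # Hypothetical base URL; any OAI-PMH 2.0 endpoint registered in ROAR would do.
    if __name__ == "__main__":
        print(format_profile("http://eprints.example.ac.uk/cgi/oai2"))

A loop like this, run against every repository registered in ROAR, is what allows one central service to produce format profiles for more than 200 repositories without any per-repository installation.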

A Long Road Ahead for Digitization

The New York Times published an article today ("History, Digitized (and Abridged)") that examines the progress of digitization in the US. It doesn’t hold many surprises for those in the know, but it might be useful for orienting non-specialists to some of the challenges involved, especially those who think that everything is already available on the Internet.

It also has some interesting tidbits, including a chart that shows the holdings of different types of materials in the National Archives and how many items have been digitized for each type.

It includes some current cost data from the Library of Congress, quoted below:

At the Library of Congress, for example, despite continuing and ambitious digitization efforts, perhaps only 10 percent of the 132 million objects held will be digitized in the foreseeable future. For one thing, costs are prohibitive. Scanning alone on smaller items ranges from $6 to $9 for a 35-millimeter slide, to $7 to $11 a page for presidential papers, to $12 to $25 for poster-size pieces.

It also discusses the copyright laws that apply to sound materials and their impact on digitization efforts:

When it comes to sound recordings, copyright law can introduce additional complications. Recordings made before 1972 are protected under state rather than federal laws, and under a provision of the 1976 Copyright Act, may be entitled to protection under state law until 2067. Also, an additional copyright restriction often applies to the underlying musical composition.

A study published in 2005 by the Library of Congress and the Council on Library and Information Resources found that some 84 percent of historical sound recordings spanning jazz, blues, gospel, country and classical music in the United States, and made from 1890 to 1964, have become virtually inaccessible.

An interesting, well-written article that’s worth a read.

Source: Hafner, Katie. "History, Digitized (and Abridged)." The New York Times, 11 March 2007, BU 1, 8-9.

Trustworthy Repositories Audit & Certification: Criteria and Checklist Published

The Center for Research Libraries and RLG Programs have published the Trustworthy Repositories Audit & Certification: Criteria and Checklist.

Here’s an excerpt from the press release:

In 2003, RLG and the US National Archives and Records Administration created a joint task force to address digital repository certification. The goal of the RLG-NARA Task Force on Digital Repository Certification was to develop criteria to identify digital repositories capable of reliably storing, migrating, and providing access to digital collections. With partial funding from the NARA Electronic Records Archives Program, the international task force produced a set of certification criteria applicable to a range of digital repositories and archives, from academic institutional preservation repositories to large data archives and from national libraries to third-party digital archiving services. . . .

In 2005, the Andrew W. Mellon Foundation awarded funding to the Center for Research Libraries to further establish the documentation requirements, delineate a process for certification, and establish appropriate methodologies for determining the soundness and sustainability of digital repositories. Under this effort, Robin Dale (RLG Programs) and Bernard F. Reilly (President, Center for Research Libraries) created an audit methodology based largely on the checklist, tested it on several major digital repositories, including the E-Depot at the Koninklijke Bibliotheek in the Netherlands, the Inter-University Consortium for Political and Social Research, and Portico.

Findings and methodologies were shared with those of related working groups in Europe who applied the draft checklist in their own domains: the Digital Curation Centre (U.K.), DigitalPreservationEurope (Continental Europe) and NESTOR (Germany). The report incorporates the sum of knowledge and experience, new ideas, techniques, and tools that resulted from cross-fertilization between the U.S. and European efforts. It also includes a discussion of audit and certification criteria and how they can be considered from an organizational perspective.

UK EThOSnet ETD Project Funded

A UK-wide ETD project called EThOSnet has been funded for a two-year period by JISC and CURL (Consortium of Research Libraries). When the project concludes, the British Library will establish the EThOS service based on the work done by EThOSnet.

An excerpt from the press release is below:

The project builds on earlier exploratory work, also funded by JISC and CURL, which between 2004 and 2006 developed a prototype for the service. Independent evaluation has since given the prototype strong backing and suggested further developments, while a recent consultation resulted in expressions of interest from over 70 HE institutions to participate in the emerging e-theses service.

EThOSnet builds on these firm foundations and through collaboration with the British Library and the HE community will transform access to theses in the UK by providing the full text of theses through a single point of entry. In addition, in tandem with the emerging network of institutional repositories in the UK, it promises to become a central element of the national infrastructure for research.

JISC Report Evaluates CLIR/ARL E-Journal Archiving Report

JISC has just released an evaluation of the 2006 CLIR/ARL report e-Journal Archiving Metes and Bounds: A Survey of the Landscape. The new report, which is by Maggie Jones, is titled Review and Analysis of the CLIR Report E-Journal Archiving Metes and Bounds: A Survey of the Landscape.

Here is an excerpt from the Executive Summary:

Although both legal deposit legislation and institutional repositories are important developments, neither of them can reasonably be expected to provide practical solutions for libraries licensing access to e-journals. In the UK, the archiving clauses in the NESLI licence have provided a measure of security for libraries but in the absence of trusted repositories charged with managing e-journals, these have provided largely theoretical assurance.

There is a pressing requirement for trusted repositories focussed on archiving and preserving e-journals, which are independent of publishers, and which offer services which can safeguard content while sharing costs between libraries and publishers equitably. While the concerns of libraries are much the same as they were when the JISC consultancy on e-journals archiving reported in 2003, there are now a clearer set of options emerging. Over the past few years, a number of promising initiatives have been developed which provide much better prospects for continued access to licensed e-journal content and which offer cost-effective services for libraries and publishers. Twelve of these trusted repositories have been profiled in a recent CLIR survey. Many of them, including Portico, Pub Med Central, CLOCKSS, and LOCKSS are already familiar in the UK.

Despite a rapidly changing landscape, there is nevertheless a powerful momentum, as evidenced in the rapid take-up of two of the services, LOCKSS and Portico. It is also now possible to articulate a set of principles for archiving services, based on practical reality, which can guide decision-making. The CLIR survey provides a valuable catalyst which the forthcoming BL/DPC/JISC E-Journal Archiving and Preservation workshop (27th March 2007) and other mechanisms have the opportunity to take a significant step forward in this crucial area.

Digitization Copyright Wars: Microsoft Blasts Google at AAP

Microsoft Associate General Counsel Thomas Rubin took off the gloves at the Association of American Publishers meeting on Tuesday. The target: Google Book Search. The goal: to contrast Google’s approach to copyright issues associated with digitizing books with Microsoft’s more publisher-friendly approach.

Rubin’s comments included the following:

The stated goal of Google’s Book Search project is to make a copy of every book ever published and bring it within Google’s vast database of indexed content. While Google says that it doesn’t currently intend to place ads next to book search results, Google’s broader business model is straightforward—attract as many users as possible to its site by providing what it considers to be "free" content, then monetize that content by selling ads. I think Pat Schroeder put it best when she said Google has "a hell of a business model—they’re going to take everything you create, for free, and sell advertising around it."

To accomplish its book search goals, Google persuaded several libraries to give it unfettered access to their collections, both copyrighted and public domain works. It also entered into agreements with several publishers to acquire rights to certain of their copyrighted books. Despite such deals, in late 2004 Google basically turned its back on its partners. Concocting a novel "fair use" theory, Google bestowed upon itself the unilateral right to make entire copies of copyrighted books not covered by these publisher agreements without first obtaining the copyright holder’s permission.

Google’s chosen path would no doubt allow it to make more books searchable online more quickly and more cheaply than others, and in the short term this will benefit Google and its users. But the question is, at what long-term cost? In my view, Google has chosen the wrong path for the longer term, because it systematically violates copyright and deprives authors and publishers of an important avenue for monetizing their works. In doing so, it undermines critical incentives to create. . . .

Google defends its actions primarily by arguing that its unauthorized copying and future monetization of your books are protected as fair use. . . .

In essence, Google is saying to you and to other copyright owners: "Trust us—you’re protected. We’ll keep the digital copies secure, we’ll only show snippets, we won’t harm you, we’ll promote you." But Google’s track record of protecting copyrights in other parts of its business is weak at best.

Rubin also discussed Microsoft’s Live Search Academic and Live Search Books in some detail.

Here are some of the more interesting articles and postings about the speech:

Meanwhile, the Bavarian State Library has just joined Google’s library partners, adding about one million books to the project.

Haworth Press Requires Copyright Transfer Prior to Peer Review

Haworth Press now has a policy of requiring a copyright transfer prior to peer review.

For example, the "Instructions to Authors" for the Journal of Interlibrary Loan, Document Delivery & Electronic Reserve states: "Copyright ownership of your manuscript must be transferred officially to The Haworth Press, Inc., before we can begin the peer-review process."

This raises the interesting question of what happens when a paper is rejected: Haworth now owns the copyright, so how can the author submit the rejected paper elsewhere?

Postscript: Haworth has posted a liblicense-l message indicating that it is clarifying its copyright transfer requirements.

Fez 1.3 Released

Christiaan Kortekaas has announced on the fedora-commons-users list that Fez 1.3 is now available from SourceForge.

Here’s a summary of key changes from his message:

  • Primary XSDs for objects based on MODS instead of DC (existing DC objects can still be handled)
  • Download statistics using Apache logs and GeoIP (see the sketch after this list)
  • Object history logging (PREMIS events)
  • Shibboleth support
  • Full-text indexing (PDF only)
  • Import and export of workflows and XSDs
  • Sanity checking to help make sure required external dependencies are working
  • OAI provider that respects FezACML authorisation rules
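As a purely illustrative companion to the download-statistics item above (a hedged sketch, not Fez's actual implementation; the log path and URL pattern are hypothetical), the snippet below counts successful full-text downloads per eprint ID from an Apache combined-format access log. Presumably the GeoIP part of the feature resolves each client IP to a country for geographic breakdowns; that step is marked with a comment here.

    # Illustrative sketch, not Fez code: count successful full-text downloads
    # per eprint from an Apache combined-format access log. The log path and
    # URL pattern are hypothetical; a production version would also map each
    # client IP to a country with a GeoIP database.
    import re
    from collections import Counter

    # e.g. 203.0.113.7 - - [07/Mar/2007:10:15:32 +1000] "GET /eprint/123/fulltext.pdf HTTP/1.1" 200 52341
    LOG_LINE = re.compile(
        r'^(?P<ip>\S+) \S+ \S+ \[(?P<when>[^\]]+)\] "GET (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) '
    )
    DOWNLOAD = re.compile(r"^/eprint/(?P<id>\d+)/.+\.pdf$")   # hypothetical URL layout

    def download_counts(log_path):
        counts = Counter()
        with open(log_path) as log:
            for line in log:
                request = LOG_LINE.match(line)
                if not request or request.group("status") != "200":
                    continue
                download = DOWNLOAD.match(request.group("path"))
                if download:
                    counts[download.group("id")] += 1
                    # A GeoIP lookup on request.group("ip") would go here.
        return counts

    if __name__ == "__main__":
        for eprint_id, hits in download_counts("access.log").most_common(10):
            print(eprint_id, hits)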

For further information on Fez, see the prior post "Fez+Fedora Repository Software Gains Traction in US."

Scholarly Electronic Publishing Weblog Update (3/7/07)

The latest update of the Scholarly Electronic Publishing Weblog (SEPW), which provides information about new scholarly literature and resources related to scholarly electronic publishing (such as books, journal articles, magazine articles, technical reports, and white papers), is now available. Especially interesting are: "Assessment of Self-Archiving in Institutional Repositories: Depositorship and Full-Text Availability"; "Datasets, a Shift in the Currency of Scholarly Communication: Implications for Library Collections and Acquisitions"; "Digital Rights Management and the Process of Fair Use"; "Digitization in Australasia"; "Disruptive Technologies: Taking STM Publishing into the Next Era"; "Electronic Thesis Initiative: Pilot Project of McGill University, Montreal"; "Every Library’s Nightmare? Digital Rights Management and Licensed Scholarly Digital Resources"; "The Ides of February in Europe: The European Commission Plan for Open Access"; "Perspectives on Access to Electronic Journals for Long-Term Preservation"; "Preparing Academic Scholarship for an Open Access World"; and "Shifting from Print to Electronic Journals in ARL University Libraries."

For weekly updates about news articles, Weblog postings, and other resources related to digital culture (e.g., copyright, digital privacy, digital rights management, and Net neutrality), digital libraries, and scholarly electronic publishing, see the latest DigitalKoans Flashback posting.

Rice University Names Head of Its Digital Press

Fred Moody has been chosen to head the reborn-digital Rice University Press. Based in Seattle (where he will remain), Moody is a journalist and author of books such as I Sing the Body Electronic, Seattle and the Demons of Ambition: From Boom to Bust in the Number One City of the Future, and The Visionary Position: The Inside Story of the Digital Dreamers Who Are Making Virtual Reality a Reality. Moody holds an MLS from the University of Michigan.

Below is an excerpt from the Rice News article ("Moody Tapped to Head Rice University Press"):

The press will start out publishing art history books and grow as peer review panels are added. A second imprint at the press—called Long Tail Press—will be added so publishing can be done in partnership with other university presses. It will allow for previously published books to be published again on a digital platform. It will also allow for books that have been accepted at fellow university presses, but haven’t been printed because of cost, to be published.

"My goal is to grow Rice’s reputation for quality first, and to grow the size of the press—in terms both of number of books published and number of disciplines published—almost as fast," Moody said. "The idea is not so much to grow a huge business as to grow the best forum in the world for scholarly research."

(Prior postings about digital presses.)

E-Journal: A Drupal-Based E-Journal Publishing System

Roman Chyla has developed E-Journal, an e-journal management and publishing system based upon the popular open-source Drupal content management system.

Here is a description from the E-Journal site:

This module allows you to create and control own electronic journals in Drupal—you can set up as many journals as you want, add authors and editors. Module gives you issue management and provides list of vocabularies (to browse) and archive of published articles. This module is more sophisticated than epublish.module and was inspired by Open Journal System. Our workflow is not so rigid though and because of the Drupal platform, you can do much more with e-journal than with OJS – potentially ;-).

An example journal that uses E-Journal is Ikaros.

(Prior postings about e-journal management and publishing systems.)

Fez+Fedora Repository Software Gains Traction in US

The February 2007 issue of Sustaining Repositories reports that more US institutions are using or investigating a combination of Fez and Fedora (see the below quote):

Fez programmers at the University of Queensland (UQ) have been gratified by a surge in international interest in the Fez software. Emory University Libraries are building a Fez repository for electronic theses. Indiana University Libraries are also testing Fez+Fedora to see whether to replace their existing DSpace installation. The Colorado Alliance of Research Libraries (http://www.coalliance.org/) is using Fez+Fedora for their Alliance Digital Repository. Also in the US, the National Science Digital Library is using Fez+Fedora for their Materials Science Digital Library (http://matdl.org/repository/index.php).

AAUP Statement on Open Access

The Association of American University Presses has issued a statement on Open Access. Peter Suber and Ben Vershbow have commented on the statement. Also, there was an article about it in Inside Higher Ed ("University Presses Take Their Stand").

Here is an excerpt from that statement (minus footnote numbers):

From the founding of the first American university presses in the late 19th century, the purpose of the university press has always been to assist the university in fulfilling its noble mission "to advance knowledge, and to diffuse it not merely among those who can attend the daily lectures—but far and wide," in the famous words of President Daniel Coit Gilman of the Johns Hopkins University. Universities acknowledged then that for most scholarly works there was insufficient commercial demand to sustain a publishing operation on sales alone, and recognized an obligation to establish and subsidize their own presses in order to serve the mission of universities to share the knowledge they generate.

Knowledge is expensive to produce, and requires—in addition to the scholar’s own work—knowledgeable editorial selection and careful vetting as well as a high level of quality in copyediting, design, production, marketing, and distribution in order to achieve the excellence for which American universities have come to be widely praised. Universities have made substantial investments in their presses, and the staffs who run them are expert at what they do. The system of scholarly communication that these presses support has played a vital role in the spread of knowledge worldwide. Calls for changing this system need to take careful account of the costs of doing so, not just for individual presses but for their parent universities, and for the scholarly societies that also contribute in major ways to the current system.

And, indeed, while proud of their achievements, university presses and scholarly societies have never been averse to change. Rather, being embedded in the culture of higher education that values experimentation and advances in knowledge, presses have themselves been open to new ways of facilitating scholarly communication and have been active participants in the process. Prominent examples from the last decade include Project MUSE, the History E-Book Project, the History Cooperative, California’s AnthroSource and eScholarship Editions, Cambridge Companions Online, Chicago’s online edition of The Founders’ Constitution, Columbia’s International Affairs Online (CIAO) and Gutenberg-e, The New Georgia Encyclopedia, MIT CogNet, Oxford Scholarship Online and Oxford’s recent experiments with open access journals, Virginia’s Rotunda, and Michigan’s new press and library collaboration digitalculturebooks.

The phrase "open access" has come to symbolize the pressure for change in the system, largely in response to the financial burden on academic libraries of maintaining subscriptions to commercially published journals in science, technology, and medicine (STM). Without reform to this system many fear that the results of new research will increasingly be accessible to an ever-shrinking number of the wealthiest universities. Hence the call has arisen for a new publishing model of open access that will ensure the continued ability of universities to disseminate knowledge "far and wide."

The well-known Budapest Open Access Initiative (BOAI), in promoting a solution to the high price of STM journals, defines open access as "permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the Internet itself." In principle, this definition of open access could be applied to all types of scholarly publishing, and calls for widespread use of institutional repositories and for self-archiving by individual scholars in order to promote such open access are by no means limited to just STM journal literature. Although the debate over open access has centered almost exclusively on one sector of publishing, STM journals, there is no reason to limit the discussion to that sector and indeed, given the interconnectedness of knowledge, it is unwise not to explore the implications of open access for all fields of knowledge lest an unfortunate new "digital divide" should arise between fields and between different types of publishing. The recently proposed legislation known as the Federal Research Public Access Act of 2006 (FRPAA) would affect a wide range of research that receives funding from 11 federal agencies, including the National Science Foundation, the Environmental Protection Agency, and the Departments of Energy, Education, and Defense. The American Council of Learned Societies, in its 2006 report on "Cyberinfrastructure for the Humanities and Social Sciences," has advocated such open access for all social science and humanities scholarship. However, there is a wide range of models that can be subsumed under the generic term "open access," with both risks and benefits to the entire system of scholarly communications that are as yet not fully understood.

The member presses of AAUP, which include scholarly societies, recognize that they have an obligation to confront the many challenges—economic, legal, and technological—to the existing system and to participate with all willing partners, both within and outside the university, to strengthen and expand scholarly communications. Many of them, often in collaboration with research libraries, are already experimenting with new approaches, including varieties of open access that seek to balance the mission of scholarly communication with its costs.

Those costs today are covered by a combination of institutional subsidies and sales in the marketplace. On average, AAUP university-based members receive about 10% of their revenue as subsidies from their parent institution, 85% from sales, and 5% from other sources. Therefore the AAUP believes it is important to keep an open mind about what constitutes open access, since some kinds of open access are compatible with a market-based model. The National Academies Press, for instance, makes all of its books available online for free full-text browsing worldwide while offering both downloadable PDFs and print copies for sale.

For the more radical approaches that abandon the market as a viable basis for the recovery of costs in scholarly publishing and instead try to implement a model that has come to be known as the "gift economy" or the "subsidy economy," the AAUP urges that the following points be kept in mind:

1) BOAI-type open access will require large contributions from either the authors or other sources (including foundations and libraries, which pay "member" fees instead of paying for subscriptions). Scholars at less wealthy institutions or those with no institutional affiliations may experience greater difficulty in publishing unless fees are waived or reduced (a process that will increase the burden on other authors, who will have to pay higher fees to offset the waivers). This will be especially true for monographs, the publishing cost for which now runs around $25,000 to $30,000 (for an average monograph of 250 pages with no illustrations) and would still be close to $20,000 to $25,000 if no printing were done or inventory maintained by the publisher. While inequities among users may be resolved by open-access publishing, they may resurface as inequities among authors.

2) Costs for scholarly communication overall will not change radically, but merely be shifted from one sector of the university to another. For university presses and scholarly societies currently, only 17% to 20% of the publishing costs of monographs are spent on manufacturing, so most of their other expenses will still need to be covered. Even so, many end users will prefer to print out what they want to read, especially longer articles and books, using printing devices that are less economical than dedicated printing presses. Moreover, since traditional print publishing will not disappear overnight, there will be the continuing costs of maintaining that part of the system in addition to the new costs of supporting online publishing ventures. Finally, if faculty are asked themselves to become publishers, they will spend more of their time performing tasks for which they are not trained and less on the teaching and research for which they are, resulting in an overall loss in economic efficiency for the university as a whole.

3) Requirements for fully free-to-user open access publishing of journal articles, whether through the journals themselves or by way of open institutional repositories or authors’ self-archiving, will undermine existing well-regarded services like Project MUSE (the electronic database of more than 300 journals in the humanities and social sciences jointly operated by the library and press at Johns Hopkins) that rely on institutional site licensing to be sustained. BOAI-style open access is inherently incompatible with site licensing as a model for journal publishing and archiving.

4) In 2005, university presses recovered 90% of their operating costs, roughly $500 million, from sales. Of that $500 million, sales to libraries account for 15% to 20%, or $75 to $100 million. The rest comes from sales to general and college bookstores, to online retailers, and directly to individual scholars. Under free-to-user open access, universities that operate presses would need to be prepared to decide how much of the cost of maintaining the system they would want to continue bearing and how much they would expect other universities to absorb by providing full or partial faculty subsidies for publication of both journal articles and monographs. Any university opting for full support could expect its costs to rise dramatically. Conversely, if any parent university decided to maintain only its current support, other universities not now supporting the system, or doing so only through small and occasional subsidies for faculty publication, would also see their costs increase. Offsetting these costs would be whatever amounts their libraries would save in journal subscriptions and monograph purchases, but since commercial publishers (and many society publishers) would not have the option of converting to a full "subsidy economy," those amounts would be equivalent to only what libraries currently spend on university press publications.

5) If commercial publishers should decide to stop publishing research under the constrained circumstances envisioned by advocates of free-to-user open access, what happens to the journals abandoned by these publishers? How many of them could universities afford to subsidize through faculty grants? How much could universities with presses increase the output of their presses to accommodate the monographs now published commercially? The answers to these questions could involve significant new capital investments. In addition, the case of scholarly societies under BOAI-style open access is particularly worrying. As non-profit organizations committed to supporting effective scholarly communications and professional standards in their fields, these societies provide a wide range of services to scholars and scholarship, including annual conferences, professional development opportunities, recognition of scholarly excellence, and statistical information on such matters as enrollment and employment in their fields, as well as respected publishing programs. Whether a given society’s publishing activities underwrite other services or must be supported by other revenues, funding for essential professional and scholarly activities would be jeopardized by a mandated shift to free-to-user open access, increasing the financial burdens on individual scholars as both authors and professionals.

For university presses, unlike commercial and society publishers, open access does not necessarily pose a threat to their operation and their pursuit of the mission to "advance knowledge, and to diffuse it. . . far and wide." Presses can exist in a gift economy for at least the most scholarly of their publishing functions if costs are internally reallocated (from library purchases to faculty grants and press subsidies). But presses have increasingly been required by their parent universities to operate in the market economy, and the concern that presses have for the erosion of copyright protection directly reflects this pressure. Any decision to switch from a market to a gift economy requires very careful thought and planning. The AAUP and its member presses welcome the opportunity to collaborate with university administrators, librarians, and faculty in designing new publishing models, mindful that it is important to protect what is most valuable about the existing system, which has served the scholarly community and the general public so well for over a century, while undertaking reforms to make the system work better for everyone in the future.

Digital Information Management Certificate

Here’s a description of the University of Arizona’s new Certificate in Digital Information Management program from their press release. The deadline for scholarship applications and admission to the program starting this summer has just been extended to April 1, 2007.

The University of Arizona School of Information Resources and Library Science and The University of Arizona Office of Continuing Education and Academic Outreach are now accepting applications from students interested in a new post-baccalaureate certificate program in Digital Information Management (DigIn). DigIn will provide hands-on experience and focused instruction for people seeking new careers in or improving their skills and knowledge of digital archives, digital libraries, digital document repositories and other kinds of digital collections.

The explosion of digital information and the growth of on-line digital resources has led to a shortage of individuals with an understanding of the disciplines of libraries, document management and archives who also have the technical knowledge and skills needed to create, manage and support digital information collections. The six-course, 18-credit hour graduate program will provide both new students and working professionals with a balanced mix of content that includes practical applied technology skills along with a foundation in the theory and practice of building and maintaining today’s digital collections. Certificate holders will be well positioned for careers in libraries, archives, local, state and federal government and the private sector.

All coursework is online, so students will not need to take time off work or travel for courses. The program may be completed in 18-30 months and starts each summer with two required courses, Introduction to Applied Technology and Introduction to Digital Collections. The certificate program has been developed in cooperation with The Arizona State Library, Archives and Public Records. Major funding for program development comes from the Institute of Museum and Library Services (IMLS), which has also provided funding for a limited number of scholarships.

For more information and to apply, visit the University of Arizona Office of Continuing Education and Academic Outreach website at:
http://ceao.arizona.edu/dist/sirls_welcome.html.

Freedom and Innovation Revitalizing U.S. Entrepreneurship Act of 2007

Representatives Rick Boucher and John Doolittle have introduced the Freedom and Innovation Revitalizing U.S. Entrepreneurship Act of 2007 (FAIR USE Act) in the House.

The EFF embraced the bill; the RIAA said it would "legalize hacking."

The key sections of the bill are below:

SEC. 2. COPYRIGHT INFRINGEMENT.

(a) STATUTORY DAMAGES ADJUSTMENT.—Section 504(c)(2) of title 17, United States Code, is amended by adding at the end the following: "The court shall remit statutory damages for secondary infringement, except in a case in which the copyright owner sustains the burden of proving, and the court finds, that the act or acts constituting such secondary infringement were done under circumstances in which no reasonable person could have believed such conduct to be lawful."

(b) CODIFICATION OF SUPREME COURT PRECEDENT APPLICABLE TO HARDWARE DEVICES.—Section 501 of title 17, United States Code, is amended by adding at the end the following: "(g) CERTAIN HARDWARE DEVICES.—No person shall be liable for copyright infringement based on the design, manufacture, or distribution of a hardware device that is capable of substantial, commercially significant noninfringing use."

SEC. 3. DMCA AMENDMENTS.

(a) CODIFICATION OF DETERMINATION OF LIBRARIAN OF CONGRESS.—Section 1201(a)(1) of title 17, United States Code, is amended by adding at the end the following new subparagraph: "(F) The prohibition contained in subparagraph (A) shall not apply to a person by reason of that person’s engaging in a noninfringing use of any of the classes of copyrighted works set forth in the determination of the Librarian of Congress in Docket No. RM 2005-11, as published as a final rule by the Copyright Office, Library of Congress, effective November 27, 2006 (71 Fed. Reg. 68472 (Nov. 27, 2006))."

(b) EXTENSION OF DETERMINATIONS OF LIBRARIAN OF CONGRESS.—Section 1201(a)(1) of title 17, United States Code, is amended by adding at the end the following new subparagraph: "(G) The prohibition contained in subparagraph (A) shall not apply to—

"(i) an act of circumvention that is carried out solely for the purpose of making a compilation of portions of audiovisual works in the collection of a library or archives for educational use in a classroom by an instructor; "

"(ii) an act of circumvention that is carried out solely for the purpose of enabling a person to skip past or to avoid commercial or personally objectionable content in an audiovisual work;

"(iii) an act of circumvention that is carried out solely for the purpose of enabling a person to transmit a work over a home or personal network, except that this exemption does not apply to the circumvention of a technological measure that prevents uploading of a work to the Internet for mass, indiscriminate redistribution;

"(iv) an act of circumvention that is carried out solely for the purpose of gaining access to one or more works in the public domain that are included in a compilation consisting primarily of works in the public domain;

"(v) an act of circumvention that is carried out to gain access to a work of substantial public interest solely for purposes of criticism, comment, news reporting, scholarship, or research; or

"(vi) an act of circumvention that is carried out solely for the purpose of enabling a library or archives meeting the requirements of section 18(a)(2), with respect to works included in its collection, to preserve or secure a copy or to replace a copy that is damaged, deteriorating, lost, or stolen."

Wildfire Institutional Repository Software

One of the interesting findings of my brief investigation of open access repository software by country was the heavy use of Wildfire in the Netherlands.

Wildfire was created by Henk Druiven, University of Groningen, and it is used by over 70 repositories. It runs on a PHP, MySQL, and Apache platform.

Here is a brief description from In Between:

Wildfire is the software our library uses for our OAI compatible repositories. It is a flexible system for setting up a large number of repositories that at the same time allows them to be aggregated in groups. A group acts like yet another repository with its own harvest address and user interface.
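The practical consequence of that design is that a group is harvested exactly like a single repository: it answers standard OAI-PMH requests at its own base URL. The short sketch below (the URLs are invented and Wildfire's actual layout may differ) sends an Identify request to two such endpoints and prints the repository name each reports.

    # Minimal sketch (invented URLs; Wildfire's real layout may differ): a
    # group repository answers ordinary OAI-PMH requests, so a harvester
    # treats it exactly like a single repository.
    from urllib.request import urlopen
    import xml.etree.ElementTree as ET

    OAI = "{http://www.openarchives.org/OAI/2.0/}"

    def repository_name(base_url):
        """Return the repositoryName reported by an OAI-PMH Identify response."""
        with urlopen(base_url + "?verb=Identify") as response:
            tree = ET.parse(response)
        node = tree.find(".//" + OAI + "repositoryName")
        return node.text if node is not None else None

    # An individual repository and an aggregated group are queried the same way.
    for url in ("http://repository.example.org/oai",
                "http://repository.example.org/groups/physics/oai"):
        print(url, "->", repository_name(url))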

There are several descriptive documents about Wildfire, but most are not in English.

Nontraditional Professionals in Research Libraries

There is an interesting article by Stanley Wilder in the "Careers" section of the February 23, 2007 issue of The Chronicle of Higher Education ("The New Library Professional").

Using 2005 data, he notes that 23% of the professionals in ARL libraries are in nontraditional positions (e.g., development, human resources, and IT), and 39% of under-35 professionals are in these types of positions. In this under-35 group, 24% of nontraditional professionals earn $54,000 or more. Forty-seven percent of under-35 computer professionals make $50,000 or more. (Needless to say, traditional librarians have not done as well salary-wise, leading to equity concerns.) The number of professionals who do not have an MLS has skyrocketed by 142% since 1985.

I wonder if there are significant differences in this trend by ARL library rank, with it being stronger at larger, more affluent libraries or at libraries in private institutions.

My own experience during the last 20 years or so at a small public ARL library was that it was a constant struggle to get approval for new computer professional positions; to be able to recruit at salaries that, while not truly competitive, were at least not laughable; to upgrade existing positions so that they more adequately reflected job duties and marketplace values; and to retain staff. This was more difficult for non-MLS professionals than for MLS professionals.

More than once I, as an Assistant Dean/Director of Systems, had to take direct responsibility for Web support because of lengthy recruitment difficulties, including one two-year stretch where I did 100% of all Web support work in addition to my normal duties (I also ran the branch libraries for one year of that period). As an Assistant Dean for Digital Library Planning and Development, I had no staff and no prospect of getting any.

At some research libraries, non-MLS professionals may find that they have no career path or a short one. Taking computer professionals as an example, the issue is how far up the hierarchy non-MLS professionals can go before they hit the "must have an accredited MLS" ceiling. Can they become unit heads, department heads, ADs, or Deans/Directors? The answer may vary by library. Another issue is whether, generous salaries aside, nontraditional professionals are treated as second-class citizens in ways other than advancement (e.g., they may not be given the same level of support for professional travel and activities, especially if MLS librarians have faculty or faculty-like status and are adequately supported in their efforts to move up the academic ranks). Given that 39% of under-35 professionals are in nontraditional jobs, these are important issues to address, especially if Boomer librarians manage to retire en masse as some predict. It would not be a pretty sight to have Boomers heading out the door just as younger nontraditional librarians bump their heads on the MLS ceiling and start considering other career options.

This is a liminal period for research libraries, and, to a significant degree, nontraditional staff will determine their future success.

Source: Wilder, Stanley. "The New Library Professional." The Chronicle of Higher Education, 23 February 2007, C1, C4.

Open Access Repository Software Use By Country

Based on data from the OpenDOAR Charts service, here is a snapshot of the open access repository software in use in the top five countries that offer such repositories.

The countries are abbreviated in the table header row as follows: US = United States, DE = Germany, UK = United Kingdom, AU = Australia, and NL = Netherlands. The number in parentheses is the reported number of repositories in that country.

Read the country percentages downward in each column (they do not total to 100% across the rows).

Excluding "unknown" or "other" systems, the highest in-country percentage is shown in boldface.

Software/Country     US (248)   DE (109)   UK (93)   AU (50)   NL (44)
Bepress                 17%        0%         2%        6%        0%
Cocoon                   0%        0%         1%        0%        0%
CONTENTdm                3%        0%         2%        0%        0%
CWIS                     1%        0%         0%        0%        0%
DARE                     0%        0%         0%        0%        2%
Digitool                 0%        0%         1%        0%        0%
DSpace                  18%        4%        22%       14%       14%
eDoc                     0%        2%         0%        0%        0%
ETD-db                   4%        0%         0%        0%        0%
Fedora                   0%        0%         0%        2%        0%
Fez                      0%        0%         0%        2%        0%
GNU EPrints             19%*       8%        46%*      22%*       0%
HTML                     2%        4%         4%        4%        0%
iTor                     0%        0%         0%        0%        5%
Miless                   0%        2%         0%        0%        0%
MyCoRe                   0%        2%         0%        0%        0%
OAICat                   0%        0%         0%        2%        0%
Open Repository          0%        0%         3%        0%        2%
OPUS                     0%       43%*        2%        0%        0%
Other                    6%        7%         2%        2%        0%
PORT                     0%        0%         0%        0%        2%
Unknown                 31%       28%        18%       46%       23%
Wildfire                 0%        0%         0%        0%       52%*

Snapshot Data from OpenDOAR Charts

OpenDOAR has introduced OpenDOAR Charts, a nifty new service that allows users to create and view charts that summarize data from its database of open access repositories.

Here’s what a selection of the default charts shows today. Only double-digit percentage results are discussed.

  • Repositories by continent: Europe is the leader with 49% of repositories. North America places second with 33%.
  • Repositories by country: In light of the above, it is interesting that the US leads the pack with 29% of repositories. Germany (13%) and the UK (11%) follow.
  • Repository software: After the 28% of unknown software, EPrints takes the number two slot (21%), followed by DSpace (19%).
  • Repository types: By far, institutional repositories are the leader at 79%. Disciplinary repositories follow (13%).
  • Content types: ETDs lead (53%), followed by unpublished reports/working papers (48%), preprints/postprints (37%), conference/workshop papers (35%), books/chapters/sections (31%), multimedia/av (20%), postprints only (17%), bibliographic references (16%), special items (15%), and learning objects (13%).

This is a great service; however, I’d suggest that the University of Nottingham consider licensing it under a Creative Commons license so that snapshot charts could be freely used (at least for noncommercial purposes).

Creative Commons Version 3.0 Licenses Released

The Creative Commons has released version 3.0 of its popular licenses.

Here’s an excerpt from the press release that explains the changes:

Separating the “generic” from the US license

As part of Version 3.0, we have spun off the “generic” license to be the CC US license and created a new generic license, now known as the “unported” license. For more information about this change, see this more detailed explanation.

Harmonizing the treatment of moral rights & collecting society royalties

In Version 3.0, we are ensuring that all CC jurisdiction licenses and the CC unported license have consistent, express treatment of the issues of moral rights and collecting society royalties (subject to national differences). For more information about these changes, see this explanation of the moral rights harmonization and this explanation of the collecting society harmonization.

No Endorsement Language

That a person may not misuse the attribution requirement of a CC license to improperly assert or imply an association or relationship with the licensor or author, has been implicit in our licenses from the start. We have now decided to make this explicit in both the Legal Code and the Commons Deed to ensure that — as our licenses continue to grow and attract a large number of more prominent artists and companies — there will be no confusion for either the licensor or licensee about this issue. For a more detailed explanation, see here.

BY-SA — Compatibility Structure Now Included

The CC BY-SA 3.0 licenses will now include the ability for derivatives to be relicensed under a “Creative Commons Compatible License,” which will be listed here. . . . More information about this is provided here.

Clarifications Negotiated With Debian & MIT

Finally, Version 3.0 of the licenses include minor clarifications to the language of the licenses to take account of the concerns of Debian (more details here) and MIT (more details here).

CNI-COPYRIGHT List Moves and Changes Its Name

The CNI-COPYRIGHT mailing list is moving and changing its name.

The list is now called PIJIP-COPYRIGHT, and its e-mail address is PIJIP-COPYRIGHT@roster.wcl.american.edu.

The list’s new home page is:

http://roster.wcl.american.edu/archives/pijip-copyright.html

Peter Jaszi, Professor of Law and Faculty Director of the Program on Information Justice and Intellectual Property at the Washington College of Law, American University, is now in charge of the list.