JISC E-Journal Archive Registry Study

JISC has released "Scoping Study for a Registry of Electronic Journals That Indicates Where They Are Archived."

Here's an excerpt from the "Executive Summary":

The research and especially the interviews have confirmed the assumption behind the project that there is a need for more information, and more easily accessible information, about where e-journals are archived. However, what has also emerged strongly is that this issue cannot be considered in isolation, either from the overall context of relationships within the scholarly communication system, nor from other initiatives being undertaken to improve information flows e.g. in relation to the transfer of journal titles between publishers. . . .

Librarians felt that they were most likely to consult a registry in situations where they were considering taking out or renewing a subscription; considering cancellation of a print subscription in favour of an e-only subscription; contemplating relocating or discarding print holdings. The vast majority of potential users of such a registry would be library staff in university and national libraries, though organisations licensing e-journals on behalf of the library community would also be likely to use the registry to check compliance with licence conditions.

One of the key benefits of a registry is perceived to be the exposure of gaps in archive provision. This was identified by all types of stakeholder: librarians would want to be alerted to risks to any of their holdings; publishers who are making provision would like to see their efforts recognised and pressure placed on publishers who are not making satisfactory arrangements; archive organisations would also benefit as that effect fed through to more demand for their services.

The drawbacks to a registry as a solution to the acknowledged information gap were mainly seen as ones of practicality (keeping the information accurate and up to date), trust (especially whether a national solution is appropriate, and conversely whether an international solution is feasible) and sustainability of the funding model. Other solutions were suggested, mainly involving either WorldCat or ERM vendors such as Serials Solutions. The latter were also suggested as a complementary part of a solution involving, but not limited to, a registry.

VALA 2008 Presentations

Presentations from the VALA 2008 conference are now available.

Michelle McLean has blogged a number of VALA 2008 sessions in Connecting Librarian postings.

National Center for Learning Science and Technology Trust Fund Included in Bill Passed by House

The College Opportunity and Affordability Act, recently passed by the House, included funding for Digital Promise's National Center for Learning Science and Technology Trust Fund. One of Digital Promise's goals is to "Digitize America’s collected memory stored in our nation's universities, libraries, museums and public television archives to make these materials available anytime and anywhere."

Here's an excerpt from the press release:

We are thrilled to report that legislation embracing the Digital Promise proposal to establish the National Center for Learning Science and Technology Trust Fund as a pilot program (we had originally labeled the Center "DO IT," the Digital Opportunity Investment Trust) was passed by the House of Representatives by a wide margin on Thursday evening, February 7.

The College Opportunity and Affordability Act (HR 4137), authorizes the establishment of the Center as an independent nonprofit 501(c)(3) corporation within the Department of Education. Under the legislation, the Center will have its own distinguished nine member board of directors. It will administer a trust fund for precompetitive basic and applied research to help transform education, skills training and lifelong learning for the digital age. It will assess and research prototypes for innovative digital learning and information technologies; support pilot testing and evaluation, encourage their widespread adoption and use, and introduce digital media education programs for parents, teachers, and children to build technology literacy. To carry out its activities the Center will award contracts and grants to colleges and universities, museums, libraries, public broadcasting entities and similar nonprofit organizations and public institutions, as well as to for-profit organizations.

iRODS Version 1.0: Data Grids, Digital Libraries, Persistent Archives, and Real-Time Data Systems

The Data-Intensive Computing Environments (DICE) group at the San Diego Supercomputer Center has released version 1.0 of iRODS (the Integrated Rule-Oriented Data System), an open-source system that can be used to support data grids, digital libraries, persistent archives, and real-time data systems.

Here's an excerpt from the press release:

"iRODS is an innovative data grid system that incorporates and moves beyond ten years of experience in developing the widely used Storage Resource Broker (SRB) technology," said Reagan Moore, director of the DICE group at SDSC. "iRODS equips users to handle the full range of distributed data management needs, from extracting descriptive metadata and managing their data to moving it efficiently, sharing data securely with collaborators, publishing it in digital libraries, and finally archiving data for long-term preservation. . . ."

"You can start using it as a single user who only needs to manage a small stand-alone data collection," said Arcot Rajasekar, who leads the iRODS development team. "The same system lets you grow into a very large federated collaborative system that can span dozens of sites around the world, with hundreds or thousands of users and numerous data collections containing millions of files and petabytes of data—it’s a true full-scale distributed data system." A petabyte is one million gigabytes, about the storage capacity of 10,000 of today’s PCs. . . .

Version 1.0 of iRODS is supported on Linux, Solaris, Macintosh, and AIX platforms, with Windows coming soon. The iRODS Metadata Catalog (iCAT) will run on either the open source PostgreSQL database (which can be installed via the iRODS install package) or Oracle. And iRODS is easy to install—just answer a few questions and the install package automatically sets up the system.

Under the hood, the iRODS architecture stores data on one or more servers, which may be widely separated geographically; keeps track of system and user-defined information describing the data with the iRODS Metadata Catalog (iCAT); and offers users access through clients (currently a command line interface and Web client, with more to come). As directed by iRODS rules, the system can process data where it is stored using applications called "micro-services" executed on the remote server, making possible smaller and more targeted data transfers.
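
Outside the press release, here is a minimal sketch of what day-to-day use of such a data grid can look like from a client script. It assumes a working iRODS client environment (the icommands installed and a session already initialized with iinit); the file name, metadata attribute, and value are placeholders, not anything taken from the announcement.

    # A minimal sketch: driving basic iRODS operations from Python by shelling
    # out to the icommands. Assumes the icommands are on PATH and `iinit` has
    # already been run; "report.pdf" and the metadata attribute/value are
    # placeholders, not part of the iRODS release.
    import subprocess

    def irods(*args):
        """Run an icommand and return its standard output, raising on failure."""
        result = subprocess.run(list(args), capture_output=True, text=True, check=True)
        return result.stdout

    # Upload a local file into the user's current iRODS collection.
    irods("iput", "report.pdf")

    # Attach user-defined metadata to the stored data object; iCAT records it.
    irods("imeta", "add", "-d", "report.pdf", "project", "climate-survey")

    # List the collection to confirm the object is registered.
    print(irods("ils"))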

A Review and Analysis of Academic Publishing Agreements and Open Access Policies

The OAK (Open Access to Knowledge) Law Project has published A Review and Analysis of Academic Publishing Agreements and Open Access Policies.

Here's an excerpt from the "Conclusion and Next Steps":

The review of publishers’ open access policies and practices found that:

  • the majority of publishers did not have a formal open access policy;
  • only four of the total sample of 64 publishers surveyed had a formal open access policy;
  • 62.5% of the publishers were able to provide sufficient information to enable them to be “colour classified” using the SHERPA/RoMEO colour classification system to denote levels of open access;
  • using the SHERPA/RoMEO colour classifications:
    • 25% of the surveyed publishers were “green” (permitting archiving of the pre-print and post-print versions of published articles);
    • 4.7% were “blue” (permitting archiving of the post-print version);
    • 6.25% were “yellow” (permitting archiving of the pre-print version);
    • 26.6% were “white” (archiving not formally supported).

House Passes College Opportunity and Affordability Act with File-Sharing Provision Intact

Despite lobbying by EDUCAUSE and others, the U.S. House of Representatives has passed the College Opportunity and Affordability Act, 354 to 58, with its Section 494 illegal file-sharing provision intact.

Here's the provision:

SEC. 494. CAMPUS-BASED DIGITAL THEFT PREVENTION.

(a) In General— Each eligible institution participating in any program under this title shall to the extent practicable—

(1) make publicly available to their students and employees, the policies and procedures related to the illegal downloading and distribution of copyrighted materials required to be disclosed under section 485(a)(1)(P); and

(2) develop a plan for offering alternatives to illegal downloading or peer-to-peer distribution of intellectual property as well as a plan to explore technology-based deterrents to prevent such illegal activity.

(b) Grants—

(1) PROGRAM AUTHORITY— The Secretary may make grants to institutions of higher education, or consortia of such institutions, and enter into contracts with such institutions, consortia, and other organizations, to develop, implement, operate, improve, and disseminate programs of prevention, education, and cost-effective technological solutions, to reduce and eliminate the illegal downloading and distribution of intellectual property. Such grants or contracts may also be used for the support of a higher education centers that will provide training, technical assistance, evaluation, dissemination, and associated services and assistance to the higher education community as determined by the Secretary and institutions of higher education.

(2) AWARDS— Grants and contracts shall be awarded under paragraph (1) on a competitive basis.

(3) APPLICATIONS— An institution of higher education or a consortium of such institutions that desires to receive a grant or contract under paragraph (1) shall submit an application to the Secretary at such time, in such manner, and containing or accompanied by such information as the Secretary may reasonably require by regulation.

(4) AUTHORIZATION OF APPROPRIATIONS— There are authorized to be appropriated to carry out this subsection such sums as may be necessary for fiscal year 2009 and for each of the 4 succeeding fiscal years.

Read more about it at: "Controversial College Funding Bill Passed—P2P Proviso Intact," "Educause Lobbies Against Piracy Measure in House Bill," "House, Focusing on Cost, Approves Higher Education Act," and "House Approves MPAA-Backed College Antipiracy Rules."

Preservation in the Age of Large-Scale Digitization: A White Paper

The Council on Library and Information Resources has published Preservation in the Age of Large-Scale Digitization: A White Paper by Oya Rieger.

Here's an excerpt from the "Preface":

This paper examines large-scale initiatives to identify issues that will influence the availability and usability, over time, of the digital books that these projects create. As an introduction, the paper describes four key large-scale projects and their digitization strategies. Issues range from the quality of image capture to the commitment and viability of archiving institutions, as well as those institutions' willingness to collaborate. The paper also attempts to foresee the likely impacts of large-scale digitization on book collections. It offers a set of recommendations for rethinking a preservation strategy. It concludes with a plea for collaboration among cultural institutions. No single library can afford to undertake a project on the scale of Google Book Search; it can, however, collaborate with others to address the common challenges that such large projects pose.

Although this paper covers preservation administration, digital preservation, and digital imaging, it does not attempt to present a comprehensive discussion of any of these distinct specialty areas. Deliberately broad in scope, the paper is designed to be of interest to a wide range of stakeholders. These stakeholders include scholars; staff at institutions that are currently providing content for large-scale digital initiatives, are in a position to do so in the future, or are otherwise influenced by the outcomes of such projects; and leaders of foundations and government agencies that support, or have supported, large digitization projects. The paper recommends that Google and Microsoft, as well as other commercial leaders, also be brought into this conversation.

Indiana University Establishes the Institute for Digital Arts and Humanities

Indiana University at Bloomington has established the Institute for Digital Arts and Humanities.

Here's an excerpt from the press release:

The Institute for Digital Arts and Humanities (IDAH) will enable and expand digitally based arts and humanities projects at IU by bringing together scholars, artists, librarians and IT experts. The institute draws on established strengths at the IU Bloomington campus in combining arts and humanities disciplines and information technology, such as the Variations digital music library, the EVIA digital video archive of ethnographic music and dance, 3-D virtual reality work by IU artists and the IU Digital Library Program. . . .

IDAH will make use of the extensive cyberinfrastructure on the Bloomington campus, including massive data storage capacity and advanced visualization systems. The institute will be housed in the new Research Commons, an initiative being implemented by the IU Bloomington Libraries that offers a suite of services in support of faculty research and creative activity. The Research Commons space will be located on three floors in the East Tower of the Wells Library.

IDAH unites faculty from eight IU Bloomington schools with the disciplinary and technical expertise of staff from the Wells Library and University Information Technology Services. UITS will provide information storage and access, hardware, software support and the delivery of various technologies.

IDAH joins the ranks of similar national and international institutes at institutions such as the University of Glasgow, University of Virginia and the University of California Los Angeles. The institute is administered through IU's Office of the Vice Provost for Research and led by Ruth Stone, associate vice provost for arts research and Laura Boulton, professor of ethnomusicology. . . .

Two-year fellowships will be offered for faculty to develop digital projects. IDAH fellows will work in an interdisciplinary environment to enhance their understanding of digital tools, prepare prototypes for major projects, and develop and submit grant proposals for external funding. . . .

During their fellowships, faculty will participate in an ongoing workshop with a team of specialists and other faculty fellows. Following the fellowship period, fellows will be invited to work with the institute, which will assist in hiring and supervising staff for the faculty research projects. IDAH will also serve as a center for collaboration among faculty already pursuing existing projects in expressive culture.

Are Photographs Derivative Works?

Is a photograph of a vodka bottle a derivative work? How about a photograph of a toy? A sculpture? Noted copyright expert William Patry examines cases dealing with these questions in "Photographs and Derivative Works," finding that the courts' interpretations are not always correct.

Here's an excerpt:

Photographs of other objects are not derivative works of those objects. First, a photograph of an object is not "based on" that object: It is a mere depiction of it. Second, even if one were to find that a photograph of an object is based on that "preexisting work" within the meaning of the definition of "derivative work" in Section 101, such a photograph must still "recast, transform, or adapt" the authorship in the preexisting work to be considered a derivative work.

NISO Releases SERU: A Shared Electronic Resource Understanding

The National Information Standards Organization has released SERU: A Shared Electronic Resource Understanding. The document "codifies best practices for the sale of e-resources without license agreements."

Here's an excerpt from the press release:

SERU offers publishers and librarians the opportunity to save both the time and the costs associated with a negotiated and signed license agreement by agreeing to operate within a framework of shared understanding and good faith.

Publication of SERU follows a trial-use period of June through December 2007, during which time librarians and publishers reported—all positively—on their experiences using the draft document. . . .

The SERU Working Group was launched in late 2006 following the recommendation of participants in a meeting exploring opportunities to reduce the use of licensing agreements. The 2006 meeting was sponsored by ARL, NISO, the Association of Learned and Professional Society Publishers (ALPSP), the Scholarly Publishing and Academic Resources Coalition (SPARC), and the Society for Scholarly Publishing (SSP). More information about the SERU Working Group, including FAQs and an electronic mailing list, can be found at http://www.niso.org/committees/seru/.

Danish Court Orders Nordic ISP to Block Access to Torrent Search Engine Pirate Bay

Under a Danish court order, Tele2, a major Nordic ISP, must block its customers' access to the torrent search engine The Pirate Bay. Recently, four people associated with The Pirate Bay were charged with assisting copyright infringement.

Read more about it at "Danish ISP Shuts Access to File-Sharing Pirate Bay," "Pirate Bay Admins Charged with Assisting Copyright Infringement," "Pirate Bay: Big Revenue Claims Fabricated by Prosecutors," "The Pirate Bay Fights Danish ISP Block," and "Pirate Bay Future Uncertain after Operators Busted."

Library Copyright Alliance and 7 Other Organizations Argue against the PRO-IP Act in White Paper

Eight organizations have submitted a white paper critiquing the PRO-IP Act to the U.S. Copyright Office. The organizations are the Library Copyright Alliance, Computer & Communications Industry Association, NetCoalition, Consumer Electronics Association, Public Knowledge, Center for Democracy & Technology, Association of Public Television Stations, and Printing Industries of America.

Here's an excerpt from the "Executive Summary":

Not only is there a complete lack of evidence for the need to modify existing law, the proposed change would cause significant collateral damage across the economy, including, for instance, technology and Internet companies, software developers, telecommunications companies, graphics and printed materials industries, libraries, and consumers. Allowing plaintiffs to disaggregate components of existing works would—

  • Incentivize “copyright trolls” by providing plaintiffs with the leverage to assert significantly larger damage claims and obtain unjustified “nuisance settlements” from innovators not able to tolerate the risk of a ruinous judgment.
  • Stifle innovation by discouraging technologists from using or deploying any new technology or service that could be used to engage in infringing activities by third parties.
  • Create unprecedented risk for licensees of technologies powered by software. Because licensees may be unable or unwilling to obtain meaningful indemnifications from every upstream contributor to a particular product, the proposed change will decrease companies’ willingness to outsource software solutions or use open source software.
  • Chill lawful uses, suppress the development of fair use case law, and exacerbate the orphan works problem.

Read more about it at "Groups Submit Paper Opposing Higher Copyright Damages" and "PRO-IP Act Is Dangerous and Unnecessary, Say Industry Groups."

Scholarly Electronic Publishing Weblog Update (2/6/08)

The latest update of the Scholarly Electronic Publishing Weblog (SEPW) is now available. SEPW provides information about new works related to scholarly electronic publishing, such as books, journal articles, magazine articles, technical reports, and white papers.

Especially interesting are: "Carrots and Sticks: Some Ideas on How to Create a Successful Institutional Repository"; A DRIVER's Guide to European Repositories; The European Repository Landscape: Inventory Study into Present Type and Level of OAI Compliant Digital Repository Activities in the EU; "India, Open Access, the Law of Karma and the Golden Rule"; Investigative Study of Standards for Digital Repositories and Related Services; "The Mandates of January"; "The Preservation of Digital Materials"; "Open Access and Quality"; "Open Access to Electronic Theses and Dissertations"; "Planets: Integrated Services for Digital Preservation"; "PRONOM-ROAR: Adding Format Profiles to a Repository Registry to Inform Preservation Services"; "SPARC: Creating Innovative Models and Environments for Scholarly Research and Communication"; and "Usage Impact Factor: The Effects of Sample Characteristics on Usage-Based Impact Metrics."

SPARC Author Rights Forum Established

The Scholarly Publishing and Academic Resources Coalition has established a new mailing list: the SPARC Author Rights Forum.

Here's an excerpt from the press release:

SPARC (the Scholarly Publishing and Academic Resources Coalition) has introduced a new discussion forum on the topic of author rights. The SPARC Author Rights Forum provides a private and moderated venue for academic librarians to explore copyright and related issues in teaching and research—especially questions arising from the development of digital repositories and recent public access mandates.

The SPARC Author Rights Forum has been established to support educational outreach to authors on issues related to retaining their copyrights. Topics relevant to the list include, but are not limited to:

  • Ensuring copyright compliance with public access policies, including the new National Institutes of Health mandate
  • Rights of faculty under copyright and contract law
  • Availability and use of author addenda
  • Working with publishers to secure agreements to retain needed rights
  • Experiences in developing institutional copyright policies and educational programs

The list will focus primarily on the U.S. and Canadian legal environments, though members of the international community are welcome to join. Educators, researchers, policy makers, librarians, legal counsel, and all who have an interest in responsible author copyright management are encouraged to contribute. The SPARC Author Rights Forum is moderated by Kevin Smith, J.D., Scholarly Communications Officer for Duke University Libraries.

List membership is subject to approval and posts are moderated for appropriate topical content. To request membership in the SPARC Author Rights Forum, send any message to sparc-arforum-feed@arl.org.

Full Scholarships Available for Online Graduate Digital Information Management Certificate Program

Full scholarships are available for students interested in obtaining a graduate certificate in Digital Information Management from the University of Arizona's School of Information Resources and Library Science. Recently, the Library of Congress honored Richard Pearce-Moses, one of the key figures in the development of the program, by naming him as a digital preservation pioneer.

Here's the announcement:

The University of Arizona School of Information Resources and Library Science is pleased to announce that a number of full scholarships are still available in the school's graduate certificate program in Digital Information Management. The program is scheduled to begin a new series of courses starting this summer. Prospects have until April 1, 2008 to apply for one of the openings and available financial aid.

DigIn, as the program is known, provides hands-on experience and focused instruction supporting careers in libraries and archives, cultural heritage institutions and digital collections, information repositories in government and the private sector and similar institutions. The certificate is comprised of six courses covering diverse topics including digital collections, applied technology, technology planning and leadership, policy and ethics, digital preservation and curation, and other subjects relevant to today's digital information environments.

For people just starting in the field or considering career changes, the DigIn certificate program offers an alternative path to graduate studies that helps prepare students for success in traditional graduate programs or the workplace. The certificate also provides a means for working professionals and those who already have advanced graduate degrees in the library and information sciences to broaden their knowledge and skills in today's rapidly evolving digital information landscape.

The program is delivered in a 100% virtual environment and has no residency requirements. Students may choose to complete the certificate in fifteen or twenty-seven months.

The certificate program has been developed in cooperation with the Arizona State Library, Archives and Public Records and the University of Arizona Office of Continuing Education and Academic Outreach. Major funding for program development comes from the federal government's Institute of Museum and Library Services (IMLS), which has also provided funding for a number of scholarships.

Additional details on the program including course descriptions, admissions requirements and application forms may be found on the program website at http://sir.arizona.edu/digin. Or, contact the UA School of Information Resources and Library Science by phone at 520-621-3565 or email at sirls@email.arizona.edu.

Scholarship in the Age of Abundance: Enhancing Historical Research with Text-Mining and Analysis Tools Project

The Center for History and New Media's Scholarship in the Age of Abundance: Enhancing Historical Research with Text-Mining and Analysis Tools project has been awarded a two-year grant from the National Endowment for the Humanities.

Here's an excerpt from "Enhancing Historical Research with Text-Mining and Analysis Tools":

We will first conduct a survey of historians to examine closely their use of digital resources and prospect for particularly helpful uses of digital technology. We will then explore three main areas where text mining might help in the research process: locating documents of interest in the sea of texts online; extracting and synthesizing information from these texts; and analyzing large-scale patterns across these texts. A focus group of historians will be used to assess the efficacy of different methods of text mining and analysis in real-world research situations in order to offer recommendations, and even some tools, for the most promising approaches.
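
As a purely hypothetical illustration of the first of those three areas, locating documents of interest in a large body of texts, the sketch below ranks a toy corpus against a query using TF-IDF weighting and cosine similarity. It is not drawn from the project; the corpus, the query, and the choice of scikit-learn are all assumptions made only for the example.

    # A hypothetical illustration (not part of the CHNM project): ranking a
    # toy corpus of historical documents against a query with TF-IDF and
    # cosine similarity, one common way to locate documents of interest.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    documents = [
        "Minutes of an 1848 abolitionist convention in Rochester.",
        "Shipping manifests from the port of New Orleans, 1820-1860.",
        "Letters between suffrage organizers, 1890-1910.",
    ]
    query = ["abolitionist convention records"]

    vectorizer = TfidfVectorizer(stop_words="english")
    doc_vectors = vectorizer.fit_transform(documents)   # learn vocabulary from the corpus
    query_vector = vectorizer.transform(query)          # project the query into the same space

    # Score each document by cosine similarity to the query, highest first.
    scores = cosine_similarity(query_vector, doc_vectors).ravel()
    for score, doc in sorted(zip(scores, documents), reverse=True):
        print(f"{score:.2f}  {doc}")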

Just Say No: Verizon Won't Filter the Internet

At the recent State of the Net conference, Tom Tauke, Verizon's Executive Vice President, told participants that Verizon did not intend to filter the Internet to enforce copyright compliance.

Here's an excerpt from "Verizon: No Thank You on Copyright Filtering":

He [Tauke] said that it would be 1) a bad business decision "to assume the role of being police on the Internet;" 2) a likely invasion of privacy; and 3) would open the door to requests from others to filter out other objectionable material, like indecency and online gambling.

Read more about it at "Verizon: We Don't Want to Play Copyright Cop on Our Network."

A Dialog with the Mellon Foundation's Don Waters on the Grand Text Auto Open Peer Review Experiment

Previously ("Book to Be Published by MIT Press Undergoing Blog-Based Open Peer Review"), DigitalKoans reported on an open-peer-review experiment on the Grand Text Auto Weblog.

Now, if:book has published a dialog between its staff, the book's author, and Donald J. Waters, the Andrew W. Mellon Foundation's Program Officer for scholarly communications, about the experiment.

Here's an excerpt from the posting:

[Waters] As I understand the explanations, there is a sense in which the experiment is not aimed at "peer review" at all in the sense that peer review assesses the qualities of a work to help the publisher determine whether or not to publish it. What the exposure of the work-in-progress to the community does, besides the extremely useful community-building activity, is provide a mechanism for a function that is now all but lost in scholarly publishing, namely "developmental editing." It is a side benefit of current peer review practice that an author gets some feedback on the work that might improve it, but what really helps an author is close, careful reading by friends who offer substantive criticism and editorial comments. . . . The software that facilitates annotation and the use of the network, as demonstrated in this experiment, promise to extend this informal practice to authors more generally.

Summa: A Federated Search System

Statsbiblioteket is developing Summa, a federated search system.

Birte Christensen-Dalsgaard, Director of Development, discusses Summa and other topics in a new podcast (CNI Podcast: An Interview with Birte Christensen-Dalsgaard, Director of Development at the State and University Library, Denmark).

Here's an excerpt from the podcast abstract:

Summa is an open source system implementing a modular, service-based architecture. It is based on the fundamental idea "free the content from the proprietary library systems," where the discovery layer is separated from the business layer. In doing so, any Internet technology can be used without the limitations traditionally set by proprietary library systems, and there is the flexibility to integrate or to be integrated into other systems. A first version of a Fedora-Summa integration has been developed.

A white paper is available that examines the system in more detail.
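
The central idea in the abstract, a discovery layer that reaches the business layer only through a narrow service interface, can be sketched generically. The classes and records below are hypothetical and do not reflect Summa's actual API; they only illustrate how separating the two layers lets either side be replaced independently.

    # A hypothetical sketch of a discovery layer decoupled from the business
    # layer behind a small service interface; this is not Summa's actual API.
    from abc import ABC, abstractmethod

    class SearchService(ABC):
        """The business layer exposes search as a service and nothing more."""
        @abstractmethod
        def search(self, query: str) -> list[dict]: ...

    class LocalIndexService(SearchService):
        """One possible back end: a trivial in-memory index."""
        def __init__(self, records: list[dict]):
            self.records = records

        def search(self, query: str) -> list[dict]:
            return [r for r in self.records if query.lower() in r["title"].lower()]

    def render_results(service: SearchService, query: str) -> str:
        """The discovery layer depends only on the SearchService interface."""
        hits = service.search(query)
        return "\n".join(r["title"] for r in hits) or "No results."

    catalogue = LocalIndexService([
        {"title": "Digital Preservation Handbook"},
        {"title": "Open Access Primer"},
    ])
    print(render_results(catalogue, "open access"))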

DISC-UK Report on Web 2.0 Data Visualization Tools

JISC has released DISC-UK DataShare: Web 2.0 Data Visualisation Tools: Part 1—Numeric Data.

Here's an excerpt from the "Introduction":

Part 1 of this briefing paper will highlight some examples of new collaborative web services using Web 2.0 technologies which venture into the numeric data visualisation arena. These mashups allow researchers to upload and analyse their own data in ‘open’ and dynamic environments. Broadly speaking the numeric data being referred to could be micro-data (data about the individual), macro-data or country-level data, derived or summary data.

A Major Milestone for the University of Michigan Library: One Million Digitized Books

The University of Michigan Library has digitized and made available one million books from its collection.

Here's an excerpt from "One Million Digitized Books":

One million is a big number, but this is just the beginning. Michigan is on track to digitize its entire collection of over 7.5 million bound volumes by early in the next decade. So far we have only glimpsed the kinds of new and innovative uses that can be made of large bodies of digitized books, and it is thrilling to imagine what will be possible when nearly all the holdings of a leading research library are digitized and searchable from any computer in the world.