EPUB for Archival Preservation

The Koninklijke Bibliotheek has released EPUB for Archival Preservation.

Here's an excerpt:

Over the last few years, the EPUB format has become increasingly popular in the consumer market. A number of publishers have indicated their wish to use EPUB for supplying their electronic publications to the KB. In response to this, the KB's Departments of Collection and Collection Care requested an initial study to investigate the suitability of the format for archival preservation. The main questions were:

  • What are the main characteristics of EPUB?
  • What functionality does EPUB provide, and is this sufficient for representing e.g. content with sophisticated layout and typography requirements?
  • How well is EPUB supported by the software tools used in (pre-)ingest workflows?
  • How suitable is EPUB for archival preservation? What are the main risks?
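Questions like these are easier to ground with the format's basic structure in hand. An EPUB file is a ZIP container whose `mimetype` entry must contain `application/epub+zip` and whose `META-INF/container.xml` points to the package (.opf) document. The sketch below, which assumes only the Python standard library, checks those two structural facts for a given file:

```python
import zipfile
import xml.etree.ElementTree as ET

# Namespace used by META-INF/container.xml in the EPUB OCF container.
OCF_NS = "urn:oasis:names:tc:opendocument:xmlns:container"

def inspect_epub(path):
    """Report basic structural facts about an EPUB file: its mimetype
    entry and the package (.opf) path declared in container.xml."""
    with zipfile.ZipFile(path) as zf:
        # A conforming EPUB carries a 'mimetype' entry with this exact value.
        mimetype = zf.read("mimetype").decode("ascii").strip()
        # container.xml names the root package document.
        root = ET.fromstring(zf.read("META-INF/container.xml"))
        rootfile = root.find(f".//{{{OCF_NS}}}rootfile")
        return {
            "mimetype": mimetype,
            "is_epub": mimetype == "application/epub+zip",
            "package_path": rootfile.get("full-path"),
        }
```

A characterization tool in a pre-ingest workflow would perform checks of this kind (and many more) before accepting a submitted EPUB.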


ESIP "Interagency Data Stewardship/Principles" and "Interagency Data Stewardship/Citations/Provider Guidelines" Approved

The Federation of Earth Science Information Partners has approved its "Interagency Data Stewardship/Principles" and "Interagency Data Stewardship/Citations/Provider Guidelines."

Here's an excerpt from "Data Management and the ESIP Federation" by Ruth Duerr:

Why do I think that this was significant? Simply because it represents the first time that a large and diverse set of US Mission agencies, data centers, research groups, commercial companies, tool developers, and even individuals have come together and agreed that data stewardship is important. They saw it to be important enough to codify into standard practices for data and recognized that data citation is something that needs to become part of the culture of science and that it is past time to make that happen.


COUNTER Code of Practice for Usage Factors, Draft Release 1

COUNTER (Counting Online Usage of Networked Electronic Resources) has released the COUNTER Code of Practice for Usage Factors, Draft Release 1.

Here's an excerpt from the announcement:

The Draft Release 1 of the COUNTER Code of Practice for Usage Factors is one of the most significant outcomes to date of the Usage Factor project, and is an important part of this, the final Stage of the project, which will take Usage Factor forward to full implementation. This Draft Release 1 is based on well-established COUNTER standards, procedures and protocols; it is designed to enable the recording and reporting by publishers of credible, consistent and compatible global Usage Factors for online publications hosted by them (and incorporating usage of these publications on other platforms that are capable of delivering COUNTER-compliant usage statistics). While Release 1 of this Code of Practice focuses on Usage Factors for journals, it is envisaged that its scope will be extended in subsequent Releases to cover other online publications, such as books and reference works.


Audit and Certification of Trustworthy Digital Repositories

The Council of the Consultative Committee for Space Data Systems (CCSDS) has released Audit and Certification of Trustworthy Digital Repositories, which is a recommended practice.

Here's an excerpt:

In 2002, Research Libraries Group (RLG) and Online Computer Library Center (OCLC) jointly published Trusted Digital Repositories: Attributes and Responsibilities (reference [B2]), which further articulated a framework of attributes and responsibilities for trusted, reliable, sustainable digital repositories capable of handling the range of materials held by large and small cultural heritage and research institutions. . . . .

OAIS included a Roadmap for follow-on standards which included 'standard(s) for accreditation of archives'. It was agreed that RLG and the National Archives and Records Administration (NARA) would take this particular topic forward, and the latter published the TRAC (reference [B3]) document, which combined ideas from OAIS (reference [1]) and Trusted Digital Repositories: Attributes and Responsibilities (TDR—reference [B2]).

The current document follows on from TRAC in order to produce an ISO standard.


NISO Releases ESPReSSO: Establishing Suggested Practices Regarding Single Sign-On

The National Information Standards Organization (NISO) has released ESPReSSO: Establishing Suggested Practices Regarding Single Sign-On.

Here's an excerpt from the press release:

Currently a hybrid environment of authentication practices exists, including older methods of userid/password, IP authentication, and/or proxy servers along with newer federated authentication protocols such as Athens and Shibboleth. The ESPReSSO recommended practice identifies changes that can be made immediately to improve the authentication experience for the user, even in a hybrid situation, while encouraging both publishers/service providers and libraries to transition to the newer Security Assertion Markup Language (SAML)-based authentication, such as Shibboleth.


Best Practices for TEI in Libraries: A Guide for Mass Digitization, Automated Workflows, and Promotion of Interoperability with XML Using the TEI

The TEI Special Interest Group on Libraries has released version three of the Best Practices for TEI in Libraries: A Guide for Mass Digitization, Automated Workflows, and Promotion of Interoperability with XML Using the TEI.

Here's an excerpt:

There are many different library text digitization projects, serving a variety of purposes. With this in mind, these Best Practices are meant to be as inclusive as possible by specifying five encoding levels. These levels are meant to allow for a range of practice, from wholly automated text creation and encoding, to encoding that requires expert content knowledge, analysis, and editing. The encoding levels are not strictly cumulative: while higher levels tend to build upon lower levels by including more elements, higher levels are not supersets because some elements used at lower levels are not used at higher levels—often because more specific elements replace generic elements.


NISO Receives Mellon Grant to Support E-Book Annotation Sharing Workshops

NISO has received an Andrew W. Mellon Foundation grant to support two e-book annotation sharing workshops.

Here's an excerpt from the press release:

The National Information Standards Organization (NISO) has been awarded a $48,500 grant from The Andrew W. Mellon Foundation to fund two standards incubation workshops, which it will lead with the Internet Archive, on the topic of E-Book Annotation Sharing and Social Reading. These meetings will be held in conjunction with the Frankfurt Book Fair in Frankfurt, Germany, on October 10, 2011, and the Books In Browsers Meeting in San Francisco, on October 26, 2011. The Mellon Foundation grant will pay for the planning, organization, and direct meeting expenses for the two workshops, for which NISO will conduct the majority of the planning, organization and logistical support.

The two workshops will advance the discussions around system requirements for annotation sharing, including technical challenges of citation location and systems interoperability, and around the development and implementation of a consensus solution for these issues. The objectives of the meetings are to provide input to a NISO-sponsored working group on scope, goals and any initial work the group undertakes; and the advancement of a syntax specification that will be further vetted by a standards working group for how bookmarks and annotations are located and shared in digital books.


"Standards for Web Applications on Mobile: February 2011 Current State and Roadmap"

W3C has released "Standards for Web Applications on Mobile: February 2011 Current State and Roadmap" by Dominique Hazaël-Massieux.

Here's an excerpt:

This document summarizes the various technologies developed in W3C that increase the power of Web applications, and how they apply more specifically to the mobile context, as of February 2011. . . .

The features that these technologies add to the Web platform are organized under the following categories:

  • Graphics
  • Multimedia
  • Forms
  • User interactions
  • Data storage
  • Sensors and hardware integration
  • Network
  • Communication
  • Packaging
  • Performance & Optimization


A Standards-based, Open and Privacy-aware Social Web

The W3C Incubator Group has released A Standards-based, Open and Privacy-aware Social Web.

Here's an excerpt:

The Social Web is a set of relationships that link together people over the Web. The Web is a universal and open space of information where every item of interest can be identified with a URI. While the best known current social networking sites on the Web limit themselves to relationships between people with accounts on a single site, the Social Web should extend across the entire Web. Just as people can call each other no matter which telephone provider they belong to, just as email allows people to send messages to each other irrespective of their e-mail provider, and just as the Web allows links to any website, so the Social Web should allow people to create networks of relationships across the entire Web, while giving people the ability to control their own privacy and data. The standards that enable this should be open and royalty-free. We present a framework for understanding the Social Web and the relevant standards (from both within and outside the W3C) in this report, and conclude by proposing a strategy for making the Social Web a "first-class citizen" of the Web.


NISO Releases Cost of Resource Exchange (CORE) Protocol

NISO has released the Cost of Resource Exchange (CORE) Protocol (NISO RP-10-2010).

Here's an excerpt from the press release:

NISO is pleased to announce the publication of its latest Recommended Practice, CORE: Cost of Resource Exchange Protocol (NISO RP-10-2010). This Recommended Practice defines an XML schema to facilitate the exchange of financial information related to the acquisition of library resources between systems, such as an ILS and an ERMS.

CORE identifies a compact yet useful structure for query and delivery of relevant acquisitions data. "Sharing acquisitions information between systems has always been a difficult problem," said Ted Koppel, Agent Verso (ILS) Product Manager, Auto-Graphics, Inc. and co-chair of the CORE Working Group. "The rise of ERM systems made this problem even more acute. I'm glad that we, through the CORE Recommended Practice, have created a mechanism for data sharing, reuse, and delivery." Co-chair Ed Riding, Catalog Program Manager at the LDS Church History Library, added, "The CORE Recommended Practice provides a solution for libraries attempting to avoid duplicate entry and for systems developers intent on not reinventing the wheel. I look forward to the development of systems that can easily pull cost information from one another and believe CORE can help facilitate that."

CORE was originally intended for publication as a NISO standard. However, following a draft period of trial use that ended March 2010, the CORE Working Group and NISO's Business Information Topic Committee voted to approve the document as a Recommended Practice. This decision was in part based on the lack of uptake during the trial period as a result of recent economic conditions, and was motivated by the high interest in having CORE available for both current and future development as demand for the exchange of cost information increases. Making the CORE protocol available as a Recommended Practice allows ILS and ERM vendors, subscription agents, open-source providers, and other system developers to now implement the XML framework for exchanging cost information between systems. "I am pleased that CORE is now available for systems developers to begin using in order to facilitate the exchange of cost information between systems in a library environment," commented Todd Carpenter, NISO's Managing Director.
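The announcement above describes CORE as an XML schema for passing acquisitions cost data between systems such as an ILS and an ERMS. As a rough illustration only (the element and attribute names below are invented for this sketch and are not taken from the actual CORE schema), a cost-exchange payload assembled in Python might look like:

```python
import xml.etree.ElementTree as ET

def build_cost_message(resource_id, fund_code, amount, currency):
    """Build a hypothetical cost-exchange message. Element names here
    are illustrative; the real CORE schema defines its own vocabulary."""
    msg = ET.Element("costExchange")
    res = ET.SubElement(msg, "resource", id=resource_id)
    cost = ET.SubElement(res, "cost")
    # Which acquisitions fund paid for the resource.
    ET.SubElement(cost, "fund").text = fund_code
    # The paid amount, with its currency recorded as an attribute.
    ET.SubElement(cost, "amount", currency=currency).text = f"{amount:.2f}"
    return ET.tostring(msg, encoding="unicode")
```

The point of such a structure is exactly what the press release emphasizes: an ILS that already holds payment data can serialize it once, and an ERMS can consume it without manual re-entry.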

OPDS Catalog 1.0 Specification

The OPDS Catalog 1.0 specification has been released.

Here's an excerpt from the announcement:

The open ebook community and the Internet Archive are pleased to announce the release of the first production version of the Open Publication Distribution System (OPDS) Catalog format for digital content. OPDS Catalogs are an open standard designed to enable the discovery of digital content from any location, on any device, and for any application. . . .

Based on the widely implemented Atom Syndication Format, OPDS Catalogs have been developed since 2009 by a group of ebook developers, publishers, librarians, and booksellers interested in providing a lightweight, simple, and easy-to-use format for developing catalogs of digital books, magazines, and other content.
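Because OPDS builds on Atom, a catalog is just an Atom feed whose entries carry acquisition links (the `http://opds-spec.org/acquisition` link relation) pointing at the content. The sketch below is a simplified illustration, not the full set of metadata the OPDS 1.0 specification describes:

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"

def make_catalog(title, entries):
    """Build a minimal Atom feed in the style of an OPDS acquisition
    catalog. Each entry dict supplies 'title', 'id', and 'href'."""
    ET.register_namespace("", ATOM)
    feed = ET.Element(f"{{{ATOM}}}feed")
    ET.SubElement(feed, f"{{{ATOM}}}title").text = title
    for e in entries:
        entry = ET.SubElement(feed, f"{{{ATOM}}}entry")
        ET.SubElement(entry, f"{{{ATOM}}}title").text = e["title"]
        ET.SubElement(entry, f"{{{ATOM}}}id").text = e["id"]
        # OPDS marks downloadable content with an acquisition link relation.
        ET.SubElement(entry, f"{{{ATOM}}}link",
                      rel="http://opds-spec.org/acquisition",
                      href=e["href"], type="application/epub+zip")
    return ET.tostring(feed, encoding="unicode")
```

Any Atom-aware client can walk such a feed, which is what makes the format workable across readers, stores, and library catalogs.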

Reference Model for an Open Archival Information System (OAIS) Draft for Review

A near-final draft of the Reference Model for an Open Archival Information System (OAIS) has been made available for error-checking review.

Here's an excerpt:

This document is a technical Recommendation for use in developing a broader consensus on what is required for an archive to provide permanent, or indefinite long-term, preservation of digital information.

This Recommendation establishes a common framework of terms and concepts which comprise an Open Archival Information System (OAIS). It allows existing and future archives to be more meaningfully compared and contrasted. It provides a basis for further standardization within an archival context and it should promote greater vendor awareness of, and support of, archival requirements.


Profile of Todd Carpenter, Managing Director of NISO

The Society for Scholarly Publishing has published a profile of Todd Carpenter, who is the Managing Director of the National Information Standards Organization.

Here's an excerpt:

[SSP] Where do you see scholarly communications heading, and what new directions interest you most?

[Carpenter] I see the following as critical areas that are in most desperate need of attention in our community: discovery, license and ownership questions, and preservation. On the questions of discovery, thanks to Google, we seem to have forgotten all of the advances in organization that libraries have developed over decades in finding information and have turned to rely solely on keyword searching. This works well enough 80% of the time. The problem is that people have become satisfied with the 80% results that Google returns in fractions of a second, not understanding that there may be something critical in that remaining 20%. Incorporating classification structures, ontologies, and improved semantics—all common under different guises in the print world—into search is a critical component to ensuring that ALL relevant content is visible to users. . . .

The directions that interest me most include ebooks and display technology, identification of items, people and content, and copyright. The next transformation of our industry will likely be in how people access digital content—moving away from the desktop to something that more resembles the experience of using a book. Much of this will depend on developments with display technology, digital ink, and battery power. How people interact with content is going to come down to better solutions for identification of people and content. Control of access to content will be driven by advances in identity management. This likely won't come out of the publishing world (more likely banking or government), but will have incredible ramifications on how scholarship and all content is distributed. Finally, sharing and reuse of content is not likely to be contained by the current rules for copyright. Restructuring those rules to acknowledge and allow what most people want to do with content will be a key question worth watching if copyright is to continue to have any respect by end-users of content.


NISO to Form Single Sign-On Authentication Working Group

NISO will form a single sign-on authentication working group.

Here's an excerpt from the press release:

NISO is pleased to announce the approval by the NISO Voting Members of a new work item to focus on perfecting single-sign-on (SSO) authentication to achieve seamless item-level linking in a networked information environment. A new working group will be formed under the auspices of NISO's Discovery to Delivery Topic Committee to create one or more recommended practices that will explore practical solutions for improving the success of SSO authentication technologies and to promote the adoption of one or more of these solutions to make the access improvements a reality.

This work item is the outcome of NISO's new Chair's Initiative, an annual project of the chair of NISO's Board of Directors. NISO's current Chair, Oliver Pesch (Chief Strategist, EBSCO Information Services), has identified single-sign-on authentication as an area that would benefit greatly from study and development within NISO, with a focus on a solution that will allow a content site to know which authentication method to use without special login URLs in order to provide a seamless experience for the user. Possible solutions include providing a generic mechanism for passing the authentication method from site to site; use of cookies to remember the authentication method that was used the last time the site was accessed by that computer; and/or providing a mechanism to discover if the user has an active session for one of the common SSO authentication methods. "By developing recommended practices that will help make the SSO environment work better (smarter)," said Pesch, "libraries and information providers will improve the ability for users to successfully and seamlessly access the content to which they are entitled."

Open Publication Distribution System Draft Released

Bill McCoy, General Manager of ePublishing Business at Adobe Systems, has announced the release of a draft version of the Open Publication Distribution System.

Here's an excerpt from the announcement:

Stanza, the leading iPhone eBook software, includes an excellent online catalog system that enables users to seamlessly acquire free and commercial content from within the application. The Lexcycle team built this system in an open, extensible manner using Atom. Adobe and Lexcycle have been working together on Adobe PDF and EPUB eBook support, and now we are deepening that collaboration in working together, along with the Internet Archive and others, to establish an open architecture enabling widespread discovery, description, and access of book and other published material on the open web. The Open Publication Distribution System (OPDS) is a generalization of the Atom approach used by Stanza's online catalog. I'm grateful to the Lexcycle team as well as my friend and colleague Peter Brantley for their efforts on behalf of open access and interoperability.

Read more about it at “Adobe Teams Up With Stanza to Create Open EBook Catalog Standard.”

Draft Standard for Exchange of Library Acquisitions Data: Cost of Resource Exchange (CORE) Protocol

NISO has released the Cost of Resource Exchange (CORE) Protocol (Z39.95-200x) as a Draft Standard for Trial Use.

Here's an excerpt from the press release:

The CORE draft standard defines an XML schema to facilitate the exchange of financial information related to the acquisition of library resources between systems, such as an ILS and an ERMS. The document was approved on March 31, 2009 by the Business Information Topic Committee, which provides oversight to the CORE Working Group.

The CORE standard is being issued for a one-year trial use period, to run from April 1, 2009 through March 31, 2010. Following the DFSTU phase will be an evaluation and correction period before final publication.

"I am very pleased that CORE is available for trial use after just eight months' time. The CORE Working Group has produced a standard that provides a simple and effective solution to the problem of exchanging cost-related data from one system to another," commented Todd Carpenter, NISO's Managing Director.

Digital Preservation: JHOVE2 Functional Requirements Version 1.3 Released

JHOVE2 Functional Requirements version 1.3 has been released. (Thanks to the File Formats Blog.)

Here's an excerpt from the JHOVE Project Scope:

JHOVE has proven to be a successful tool for format-specific digital object identification, validation, and characterization, and has been integrated into the workflows of most major international preservation institutions and programs. Using an extensible plug-in architecture, JHOVE provides support for a variety of digital formats commonly used to represent audio, image, and textual content.

"Digital Project Staff Survey of JPEG 2000 Implementation in Libraries"

David Lowe and Michael J. Bennett, both of the University of Connecticut Libraries, have made "Digital Project Staff Survey of JPEG 2000 Implementation in Libraries" available in DigitalCommons@UConn.

Here's an excerpt from the abstract:

JPEG 2000 is the product of thorough efforts toward an open standard by experts in the imaging field. With its key components for still images published officially by the ISO/IEC by 2002, it has been solidly stable for several years now, yet its adoption has been considered tenuous enough to cause imaging software developers to question the need for continued support. Digital archiving and preservation professionals must rely on solid standards, so in the fall of 2008 we undertook a survey among implementers (and potential implementers) to capture a snapshot of JPEG 2000’s status, with an eye toward gauging its perception in our community.

The survey results reveal several key areas that JPEG 2000’s user community will need to have addressed in order to further enhance adoption of the standard, including perspectives from cultural institutions that have adopted it already, as well as insights from institutions that do not currently have it in their workflows. Current users are concerned about limited compatible software capabilities with an eye toward needed enhancements. They realize also that there is much room for improvement in the area of educating and informing the cultural heritage community about the advantages of JPEG 2000. A small set of users, in addition, alerts us to serious problems of cross-codec consistency and relate file validation issues that would likely be easily resolved given a modicum of collaborative attention toward standardization.