The ticTOCs Project: Enhancing Table-of-Contents RSS Feeds

The goal of the JISC-funded ticTOCs Project is to greatly enhance access to and re-use of journal table-of-contents RSS feeds.

Here's an excerpt from ticTOCs in a Nutshell:

ticTOCs intends to be a catalyst for change by incorporating existing technology plus Web 2.0 concepts in the smart aggregation, recombination, synthesization, output and reuse of standardised journal Table of Contents (TOC) RSS feeds from numerous fragmented sources (journal publishers). These TOCs, and their content, will be presented in a personalisable and interactive web-based interface that requires little or no understanding, by the user, of the technical or procedural concepts involved. It has been called ticTOCs because in certain instances it will involve the selective ticking of appropriate TOCs, and also because ticTOCs is a memorable name, something which is important in today's online environment.

ticTOCs will incorporate:

  1. A user-friendly web-based, AJAX-enabled TOCosphere for the smart aggregation, personalisation, output and reuse of TOC RSS feeds and contents. It will allow users to discover, select, personalise, display, reuse and export TOCs (to bibliographic software).
  2. Within this TOCosphere there will be a Directory of TOCs to allow easy selection by title, subject, ISSN, and so on.
  3. Re-use of data: this will involve embedding TOCs and combined TOCs in research output showcases, gateways, VREs, websites, etc.
  4. Easy links from a multitude of journal lists to ticTOCs using chicklet subscribe buttons.
  5. Data gathered for analysis presents many possibilities.
  6. Community networking possibilities, within the TOCosphere. . . .
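
For readers who like to see what "smart aggregation" of TOC feeds might look like under the hood, here is a minimal Python sketch. It is not ticTOCs code: it assumes the third-party feedparser library, and the feed URLs are made up purely for illustration.

    import feedparser  # third-party library: pip install feedparser

    # Hypothetical journal TOC feeds; a real ticTOCs user would pick these
    # from the project's Directory of TOCs.
    TOC_FEEDS = {
        "Journal of Example Studies": "http://publisher-a.example.org/toc.rss",
        "Example Review Letters": "http://publisher-b.example.org/current.rss",
    }

    def aggregate_tocs(feeds):
        """Combine the article entries of several TOC RSS feeds into one list."""
        articles = []
        for journal, url in feeds.items():
            parsed = feedparser.parse(url)
            for entry in parsed.entries:
                articles.append({
                    "journal": journal,
                    "title": entry.get("title", ""),
                    "link": entry.get("link", ""),
                })
        return articles

    if __name__ == "__main__":
        for article in aggregate_tocs(TOC_FEEDS):
            print(f"{article['journal']}: {article['title']} <{article['link']}>")

A real aggregator would, of course, add de-duplication, personalisation, and export to bibliographic formats, which is where the ticTOCs work lies.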

The ticTOCs Consortium consists of: the University of Liverpool Library (lead), Heriot-Watt University, CrossRef, ProQuest CSA, Emerald, RefWorks, MIMAS, Cranfield University, Nature Publishing Group, Institute of Physics, SAGE Publishers, Inderscience Publishers, DOAJ (Directory of Open Access Journals), Open J-Gate, and Intute.

Urgent: Send a Message to Congress about the NIH Public Access Policy

Peter Suber has pointed out that ALA has an Action Alert that allows you to just fill in a form to send a message to your Congressional representatives about the NIH Public Access Policy.

Under "Compose Message" in the form, I suggest that you shorten the Subject to "Support the NIH Public Access Policy." As an "Issue Area" you might use "Budget" or "Health." Be sure to fill in your salutation and phone number; they are required to send an e-mail even though the form does not show them as required fields.

I’ve made slight modifications to the talking points and created a Web page so that the talking points can simply be cut and pasted into the "Editable text to" section of the form as the message.

ACRLog Urgent Call for Action about NIH Policy Vote

An urgent call for action has been issued on ACRLog about upcoming House and Senate votes on Labor, Health and Human Services appropriations bills that will determine whether NIH-funded researchers are required to make their final manuscripts publicly accessible within twelve months of publication.

Here's an excerpt from the posting:

We need your help to keep the momentum going. The full House of Representatives and the full Senate will vote on their respective measures this summer. The House is expected to convene on Tuesday, July 17. We’re asking that you contact your US Representative and your US Senators by phone or fax as soon as possible and no later than Monday afternoon. Urge them to maintain the Appropriations Committee language. (Find talking points and contact info for your legislators in the ALA Legislative Action Center.) It is entirely possible that an amendment will be made on the floor of the House to delete the language in the NIH policy.

Want to know more? Listen to an interview with Heather Joseph of SPARC on the ALA Washington Office District Dispatch blog. Find background on the issue along with tips on communicating effectively with your legislators in the last two issues of ACRL’s Legislative Update and at the Alliance for Taxpayer Access website.

Peter Suber has issued a similar call on Open Access News. Here it is in full:

Tell Congress to support an OA mandate at the NIH

Let me take the unusual step of repeating a call to action from yesterday in case it got buried in the avalanche of news. 

The House Appropriations Committee approved language establishing an OA mandate at the NIH.  The full House is scheduled to vote on the appropriations bill containing that language on Tuesday, July 17.

Publishers are lobbying hard to delete this language.  If you are a US citizen and support public access for publicly-funded research, please ask your representative to support this bill, and to oppose any attempt to amend or strike the language.  Contact your representative now, before you forget.

Time is short.  Offices are closed on the weekend, but emails and faxes will go through.  Send an email or fax right now or telephone before Monday afternoon.

Because the Senate Appropriations Committee approved the same language in June, you should contact your Senators with the same message.  But the vote by the full House is in three days, while the vote by the full Senate has not yet been scheduled.

For help in composing your message, see

Then spread the word!

Australian Framework and Action Plan for Digital Heritage Collections

The Collections Council of Australia Ltd. has released Australian Framework and Action Plan for Digital Heritage Collections, Version 0.C3 for comment.

Here's an excerpt from the document:

This is the Collections Council of Australia's plan to prepare an Australian framework for digital heritage collections. It brings together information shared by people working in archives, galleries, libraries and museums at a Summit on Digital Collections held in 2006. It proposes an Action Plan to address issues shared by the Australian collections sector in relation to current and future management of digital heritage collections.

Update on the DSpace Foundation

Michele Kimpton, Executive Director of the DSpace Foundation, gave a talk about the foundation at the DSpace UK & Ireland User Group meeting in early July.

Her PowerPoint presentation is now available.

Source: Lewis, Stuart. "Presentations from Recent DSpace UK & Ireland User Group Meeting," Unilever Centre for Molecular Informatics, Cambridge—Jim Downing, 11 July 2007.

Publisher Mergers: Walter de Gruyter Buys K. G. Saur Verlag

In yet another scholarly publishing company merger, Walter de Gruyter has announced that it has acquired K. G. Saur Verlag, whose publishing programme has also included that of Max Niemeyer Verlag since 2005.

Here’s an excerpt from the press release:

Walter de Gruyter GmbH & Co. KG has with immediate effect acquired the complete publishing programme of K. G. Saur Verlag GmbH, which since 2005 has also included the programme of Max Niemeyer Verlag. Through this acquisition Walter de Gruyter will become the market leader in the subject areas classical studies, philosophy, German studies, linguistics and English and Romance studies, as well as in library sciences and general library reference works.

For an analysis of the effect of publisher mergers on serials prices, see the works of Dr. Mark J. McCabe.

An Ecological Approach to Repository and Service Interactions

UKOLN and JISC CETIS have released An Ecological Approach to Repository and Service Interactions, Draft Version 0.9 for comment.

Here’s an excerpt from the "Not the Executive Summary" section:

This work began with the need to express something of how and why repositories and services interact. As a community we have well understood technical models and architectures that provide mechanisms for interoperability. The actual interactions that occur, however, are not widely understood and knowledge about them is not often shared. This is in part because we tend to share in the abstract through architectures and use cases; articulating interactions or connections requires an engagement with specific details. . . .

Ecology is the study of systems that are complex, dynamic, and full of interacting entities and processes. Although the nature of these interactions and processes may be highly detailed, a higher level view of them is accessible and intuitive. We think that ecology and the ecosystems it studies may offer a useful analogy to inform the task of understanding and articulating the interactions between users, repositories, and services and the information environments in which they take place. This report outlines some concepts from ecology that may be useful and suggests some definitions for a common conversation about the use of this metaphor.

We hope that this report suggests an additional way to conceptualise and analyse interactions and provide a common vocabulary for an ecological approach. It should as a minimum provoke and support some useful discussions about networks and communities.

Curation of Scientific Data: Challenges for Institutions and Their Repositories Podcast

A podcast of Chris Rusbridge’s "Curation of Scientific Data: Challenges for Institutions and their Repositories" presentation at The Adaptable Repository conference is now available. Rusbridge is Director of the Digital Curation Centre in the UK.

The PowerPoint for the presentation is also available.

CLIR Receives Mellon Grant to Study Mass Digitization

According to an O’Reilly Radar posting, the Council on Library and Information Resources has been awarded a grant from the Mellon Foundation to study mass digitization efforts.

Here’s an excerpt from the posting that describes the grant’s objectives:

  1. Assess selected large scale digitization programs by exploring their efficacy and utility for conducting scholarship, in multiple fields or disciplines (humanities, sciences, etc.).
  2. Write and issue a report with findings and recommendations for improving the design of mass digitization projects.
  3. Create a Collegium that can serve in the long-term as an advisory group to mass digitization efforts, helping to assure and obtain the highest possible data quality and utility.
  4. Convene a series of meetings amongst scholars, libraries, publishers, and digitizing organizations to discuss ways of achieving these quality and design improvements.

Lund University Journal Info Database Now Available

Lund University Libraries, creators of the Directory of Open Access Journals, have released a new database called Journal Info, which provides authors with information about 18,000 journals selected from 30 major databases. The National Library of Sweden provides support for Journal Info, which is under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported license.

Here’s an excerpt from the FAQ page:

The purpose [of the service] is to provide an aid for the researcher in the selection of a journal for publication. The publication market has continuously grown more and more complex. It is important to weigh in facts like scope and quality, but more recently also information about reader availability and library cost. The Lund University Libraries have made an attempt to merge all these items into one tool, giving the researcher the power to make informed choices.

Journal Info records provide basic information about the journal (e.g., journal homepage), "reader accessibility" information (e.g., open access status), and quality information (e.g., where it is indexed).

DSpace How-To Guide

Tim Donohue, Scott Phillips, and Dorothea Salo have published DSpace How-To Guide: Tips and Tricks for Managing Common DSpace Chores (Now Serving DSpace 1.4.2 and Manakin 1.1).

This 55-page booklet, which is under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 License, will be a welcome addition to the virtual bookshelves of institutional repository managers struggling with the mysteries of DSpace.

DRAMA Releases Fedora Front-End Beta for Authentication/Full-Text Search

DRAMA (Digital Repository Authorization Middleware Architecture) has released Fiddler, a beta version of its mura Fedora front-end that provides access control, authentication, full-text searching and a variety of other functions. DRAMA is a sub-project of RAMP (Research Activityflow and Middleware Priorities Project).

Here’s an excerpt from the news item that describes Fiddler’s features:

  • Hierarchical access control enforcement: Policies can be applied at the collection level, object level or datastream level. . . .
  • Improved access control interface: One can now view existing access control of a particular user or group for a given datastream, object or collection. . . .
  • User-centric GUI: mura only presents users with operations for which they have permissions.
  • XForms Metadata Input: We employ an XForms engine (Orbeon) for metadata input. XForms allows better user interaction and validation, and supports any XML-based metadata schema (such as MARC or MODS).
  • LDAP Filter for Fedora: The current Fedora LDAP filter (in version 2.2) does not authenticate properly, so we have developed a new LDAP filter to fix this problem.
  • Local authentication for DAR and ASM: In addition to Shibboleth authentication, the DAR and ASM can be configured to use a local authentication source (e.g., via a local LDAP).
  • Generic XACML Vocabulary: XACML policies are now expressed in a generic vocabulary rather than Fedora-specific ones. . . .
  • XACML Optimization: We have optimized the evaluation engine by employing a cache with a user-configurable time-to-live. We have also greatly reduced the time for policy matching with DB XML through the use of bind parameters in our queries.
  • Flexible mapping of Fedora actions to new Apache Axis handlers: Axis is the SOAP engine that Fedora employs to provide its web services. The new flexibility allows new handlers to be easily plugged into Fedora to support new features that follow the same Interceptor pattern as our authorization framework.
  • Version control: mura now supports version control.
  • Full-text search: We enabled full-text search by incorporating the Fedoragsearch package.
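
For the curious, the time-to-live cache mentioned in the XACML Optimization item works roughly like the following Python sketch. It is purely illustrative, not DRAMA's implementation, and all names in it are hypothetical.

    import time

    class DecisionCache:
        """Cache (subject, action, resource) -> decision for a fixed time-to-live."""

        def __init__(self, ttl_seconds=60):
            self.ttl = ttl_seconds
            self._cache = {}

        def get(self, subject, action, resource):
            hit = self._cache.get((subject, action, resource))
            if hit is None:
                return None
            decision, stored_at = hit
            if time.time() - stored_at > self.ttl:
                del self._cache[(subject, action, resource)]  # expired entry
                return None
            return decision

        def put(self, subject, action, resource, decision):
            self._cache[(subject, action, resource)] = (decision, time.time())

    # Usage: consult the cache before invoking the (expensive) policy engine.
    cache = DecisionCache(ttl_seconds=300)
    decision = cache.get("alice", "read", "collection:theses")
    if decision is None:
        decision = "Permit"  # placeholder for a real XACML evaluation call
        cache.put("alice", "read", "collection:theses", decision)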

Remembering Mosaic, the Web Browser That Changed Everything

If you have never had to use a standalone FTP client, a standalone Telnet client, a Gopher client, or a standalone USENET client, it might be hard to imagine what the Internet was like before Mosaic, the Web browser that put the World-Wide Web on the map and transformed the Internet (and the world). Go dig up a copy of The Internet for Everyone: A Guide for Users and Providers out of your library’s stacks, dust it off, and marvel at how far we have come since 1993. You’ll also meet Archie, Veronica, and WAIS, the Googles of their day.

Another way to travel back in time is to read PC Magazine’s 1994 review of NCSA Mosaic for Windows, and, if you really want a history lesson, download Mosaic from the National Center for Supercomputing Applications (yes, it’s still available). Also take a look at the NCSA’s About NCSA Mosaic page.

To finish off your journey to the Internet’s Paleolithic age, check out the Timeline of Web Browsers and Hobbes’ Internet Timeline v8.2.

Of course, if you do remember these seemingly ancient technologies, you can easily imagine how primitive today’s hot technologies, such as Web 2.0, will seem in 14 years, and you may wonder whether future generations will remember them clearly or as a minor footnote in technological history.

Report of the Sustainability Guidelines for Australian Repositories Project (SUGAR)

The Australian Partnership for Sustainable Repositories (APSR) has released Report of the Sustainability Guidelines for Australian Repositories Project (SUGAR).

Here’s an excerpt from the report:

The Sustainability Guidelines for Australian Repositories service (SUGAR) was intended to support people working in tertiary education institutions whose activities do not focus on digital preservation. The target community creates and digitises content for a range of purposes to support learning, teaching and research. While some have access to technical and administrative support, many others may not be aware of what they need to know. The typical SUGAR user may have little interest in discussions surrounding metadata, interoperability or digital preservation, and may simply want to know the essential steps involved in achieving the task at hand.

A key challenge for SUGAR was to provide a suitable level and amount of information to meet the immediate focus of the user and their level of expertise while introducing and encouraging consideration of issues of digital sustainability. SUGAR was also intended to stand alone as an online service unsupported by a helpdesk.

Towards an Open Source Repository and Preservation System

The UNESCO Memory of the World Programme, with the support of the Australian Partnership for Sustainable Repositories, has published Towards an Open Source Repository and Preservation System: Recommendations on the Implementation of an Open Source Digital Archival and Preservation System and on Related Software Development.

Here’s an excerpt from the Executive Summary and Recommendations:

This report defines the requirements for a digital archival and preservation system using standard hardware and describes a set of open source software which could be used to implement it. There are two aspects of this report that distinguish it from other approaches. One is the complete or holistic approach to digital preservation. The report recognises that a functioning preservation system must consider all aspects of a digital repository: Ingest, Access, Administration, Data Management, Preservation Planning and Archival Storage, including storage media and management software. Secondly, the report argues that, for simple digital objects, the solution to digital preservation is relatively well understood, and that what is needed are affordable tools, technology and training in using those systems.

An assumption of the report is that there is no ultimate, permanent storage media, nor will there be in the foreseeable future. It is instead necessary to design systems to manage the inevitable change from system to system. The aim and emphasis in digital preservation is to build sustainable systems rather than permanent carriers. . . .

The way open source communities, providers and distributors achieve their aims provides a model on how a sustainable archival system might work, be sustained, be upgraded and be developed as required. Similarly, many cultural institutions, archives and higher education institutions are participating in the open source software communities to influence the direction of the development of those softwares to their advantage, and ultimately to the advantage of the whole sector.

A fundamental finding of this report is that a simple, sustainable system that provides strategies to manage all the identified functions for digital preservation is necessary. It also finds that for simple discrete digital objects this is nearly possible. This report recommends that UNESCO supports the aggregation and development of an open source archival system, building on, and drawing together existing open source programs.

This report also recommends that UNESCO participate, through its various committees, in open source software development on behalf of the countries, communities, and cultural institutions who would benefit from a simple, yet sustainable, digital archival and preservation system. . . .

ARL’s Library Brown-Bag Lunch Series: Issues in Scholarly Communication

The Association of Research Libraries (ARL) has released a series of discussion guides for academic librarians to use with faculty. The guides are under a Creative Commons Attribution-ShareAlike 3.0 United States license.

Here’s an excerpt from the guides’ web page:

This series of Discussion Leader’s Guides can serve as a starting point for a single discussion or for a series of conversations. Each guide offers prework and discussion questions along with resources that provide further background for the discussion leader of an hour-long session.

Using the discussion guides, library leaders can launch a program quickly without requiring special expertise on the topics. A brown-bag series could be initiated by a library director, a group of staff, or by any staff person with an interest in the scholarly communication system. The only requirements are the willingness to organize the gatherings and facilitate each meeting’s discussion.

The University of Maine and Two Public Libraries Adopt Emory’s Digitization Plan

Library Journal Academic Newswire reports that the University of Maine, the Toronto Public Library, and the Cincinnati Public Library will follow Emory University’s lead and digitize public domain works using Kirtas scanners, with print-on-demand copies made available via BookSurge. (Also see the press release: "BookSurge, an Amazon Group, and Kirtas Collaborate to Preserve and Distribute Historic Archival Books.")

Source: "University of Maine, plus Toronto and Cincinnati Public Libraries Join Emory in Scan Alternative." Library Journal Academic Newswire, 21 June 2007.

Dealing with Data: Roles, Rights, Responsibilities and Relationships

JISC has released its Dealing with Data: Roles, Rights, Responsibilities and Relationships: Consultancy Report, which was written as part of its Digital Repositories Programme’s Data Cluster Consultancy.

Here’s an excerpt from the Executive Summary:

This Report explores the roles, rights, responsibilities and relationships of institutions, data centres and other key stakeholders who work with data. It concentrates primarily on the UK scene with some reference to other relevant experience and opinion, and is framed as "a snapshot" of a relatively fast-moving field. . . .

The Report is largely based on two methodological approaches: a consultation workshop and a number of semi-structured interviews with stakeholder representatives.

It is set within the context of the burgeoning "data deluge" emanating from e-Science applications, increasing momentum behind open access policy drivers for data, and developments to define requirements for a co-ordinated e-infrastructure for the UK. The diversity and complexity of data are acknowledged, and developing typologies are referenced.

Version 68, Scholarly Electronic Publishing Bibliography

Version 68 of the Scholarly Electronic Publishing Bibliography is now available from Digital Scholarship. This selective bibliography presents over 3,040 articles, books, and other printed and electronic sources that are useful in understanding scholarly electronic publishing efforts on the Internet.

The Scholarly Electronic Publishing Bibliography: 2006 Annual Edition is also available from Digital Scholarship. Annual editions of the Scholarly Electronic Publishing Bibliography are PDF files designed for printing.

The bibliography has the following sections (revised sections are in italics):

1 Economic Issues
2 Electronic Books and Texts
2.1 Case Studies and History
2.2 General Works
2.3 Library Issues
3 Electronic Serials
3.1 Case Studies and History
3.2 Critiques
3.3 Electronic Distribution of Printed Journals
3.4 General Works
3.5 Library Issues
3.6 Research
4 General Works
5 Legal Issues
5.1 Intellectual Property Rights
5.2 License Agreements
6 Library Issues
6.1 Cataloging, Identifiers, Linking, and Metadata
6.2 Digital Libraries
6.3 General Works
6.4 Information Integrity and Preservation
7 New Publishing Models
8 Publisher Issues
8.1 Digital Rights Management
9 Repositories, E-Prints, and OAI
Appendix A. Related Bibliographies
Appendix B. About the Author
Appendix C. SEPB Use Statistics

Scholarly Electronic Publishing Resources includes the following sections:

Cataloging, Identifiers, Linking, and Metadata
Digital Libraries
Electronic Books and Texts
Electronic Serials
General Electronic Publishing
Images
Legal
Preservation
Publishers
Repositories, E-Prints, and OAI
SGML and Related Standards

Council of Australian University Librarians ETD Survey Report

The Council of Australian University Librarians has released Australasian Digital Theses Program: Membership Survey 2006.

Here’s an excerpt from the "Key Findings" section:

1. The average percentage of records for digital theses added to ADT is 95% when digital submission is mandatory and 17% when it is not mandatory. . . .

2. 59% of respondents will have mandatory digital submission in place in 2007.

3. With this level of mandatory submission it is predicted that 60% of all theses produced in Australia and New Zealand in 2007 will have a digital copy recorded in ADT. . . .

5. The overwhelming majority of respondents offer a mediated submission service, either only having a mediated service or offering both mediated and self-submission services. When mediated and self-submission are both available, the percentage self-submitted is polarised with some achieving over a 75% self-submission rate.

6. Over half the respondents have a repository already and most are using it to manage digital theses.

7. 87% will have a repository by the end of this year, and the rest are in the initial planning stage.

CIC’s Digitization Contract with Google

Library Journal Academic Newswire has published a must-read article ("Questions Emerge as Terms of the CIC/Google Deal Become Public") about the Committee on Institutional Cooperation’s Google Book Search Library Project contract.

The article includes quotes from Peter Brantley, Digital Library Federation Executive Director, from his "Monetizing Libraries" posting about the contract (another must-read piece).

Here’s an excerpt from Brantley’s posting:

In other words—pretty much, unless Google ceases business operations, or there is a legal ruling or agreement with publishers that expressly permits these institutions (excepting Michigan and Wisconsin which have contracts of precedence) to receive digitized copies of In-Copyright material, it will be held in escrow until such time as it becomes public domain.

That could be a long wait. . . .

In an article early this year in The New Yorker, "Google’s Moon Shot," Jeffrey Toobin discusses possible outcomes of the antagonism this project has generated between Google and publishers. Paramount among them, in his mind, is a settlement. . . .

A settlement between Google and publishers would create a barrier to entry in part because the current litigation would not be resolved through court decision; any new entrant would be faced with the unresolved legal issues and required to re-enter the settlement process on their own terms. That, beyond the costs of mass digitization itself, is likely to deter almost any other actor in the market.

Report on Chemistry Teaching/Research Data and Institutional Repositories

The JISC-funded SPECTRa project has released Project SPECTRa (Submission, Preservation and Exposure of Chemistry Teaching and Research Data): JISC Final Report, March 2007.

Here’s an excerpt from the Executive Summary:

Project SPECTRa’s principal aim was to facilitate the high-volume ingest and subsequent reuse of experimental data via institutional repositories, using the DSpace platform, by developing Open Source software tools which could easily be incorporated within chemists’ workflows. It focussed on three distinct areas of chemistry research—synthetic organic chemistry, crystallography and computational chemistry.

SPECTRa was funded by JISC’s Digital Repositories Programme as a joint project between the libraries and chemistry departments of the University of Cambridge and Imperial College London, in collaboration with the eBank UK project. . . .

Surveys of chemists at Imperial and Cambridge investigated their current use of computers and the Internet and identified specific data needs. The survey’s main conclusions were:

  • Much data is not stored electronically (e.g. lab books, paper copies of spectra)
  • A complex list of data file formats (particularly proprietary binary formats) being used
  • A significant ignorance of digital repositories
  • A requirement for restricted access to deposited experimental data

Distributable software tool development using Open Source code was undertaken to facilitate deposition into a repository, guided by interviews with key researchers. The project has provided tools which allow for the preservation aspects of data reuse. All legacy chemical file formats are converted to the appropriate Chemical Markup Language scheme to enable automatic data validation, metadata creation and long-term preservation needs. . . .

The deposition process adopted the concept of an "embargo repository" allowing unpublished or commercially sensitive material, identified through metadata, to be retained in a closed access environment until the data owner approved its release. . . .

Among the project’s findings were the following:

  • it has integrated the need for long-term management of experimental chemistry data with the maturing technology and organisational capability of digital repositories;
  • scientific data repositories are more complex to build and maintain than are those designed primarily for text-based materials;
  • the specific needs of individual scientific disciplines are best met by discipline-specific tools, though this is a resource-intensive process;
  • institutional repository managers need to understand the working practices of researchers in order to develop repository services that meet their requirements;
  • IPR issues relating to the ownership and reuse of scientific data are complex, and would benefit from authoritative guidance based on UK and EU law.
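
As a footnote on the format-conversion step described in the excerpt (legacy chemical files converted to Chemical Markup Language before deposit), the general idea can be approximated with off-the-shelf tools. The Python sketch below is illustrative only: it assumes the open source Open Babel converter (the obabel command) is installed, uses hypothetical directory names, and is not the SPECTRa tool chain.

    import pathlib
    import subprocess

    def convert_to_cml(source_dir, target_dir):
        """Convert every .mol file in source_dir to a CML file in target_dir."""
        target = pathlib.Path(target_dir)
        target.mkdir(parents=True, exist_ok=True)
        for mol_file in pathlib.Path(source_dir).glob("*.mol"):
            cml_file = target / (mol_file.stem + ".cml")
            # Open Babel infers the input and output formats from the
            # file extensions (.mol -> MDL Molfile, .cml -> CML).
            subprocess.run(["obabel", str(mol_file), "-O", str(cml_file)],
                           check=True)

    # Hypothetical directory names, for illustration only.
    convert_to_cml("legacy_spectra", "cml_output")

A production workflow, as the report suggests, would also validate the resulting CML and generate metadata at deposit time.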