2,900+ Authors for One Article

Philip Davis at The Scholarly Kitchen has commented on "The ATLAS Experiment at the CERN Large Hadron Collider," an article that has over 2,900 authors.

Here's an excerpt:

Either the definition of authorship in high energy physics will need to change, or other methods for evaluating individuals will take dominance over publications. Collectively, this community will help draft new rules.

“Editorial: Google Deal or Rip-Off?”

In "Editorial: Google Deal or Rip-Off?," Francine Fialkoff, Library Journal Editor-in-Chief, takes a hard look at the Google-Association of American Publishers/Authors Guild copyright settlement.

Here's an excerpt:

Clearly, the public had little standing in the negotiations that led to the recent agreement in the class-action lawsuit against Google for scanning books from library shelves. . . . Well, the suit was never about the public interest but about corporate interests, and librarians did not have much power at the bargaining table, no matter how hard those consulted pushed. While there are many provisions in the document that specify what libraries can and can't do and portend greater access, ultimately, it is the restrictions that scream out at us from the miasma of details.

Other perspectives can be found in my recently updated Google Book Search Bibliography, Version 3.

Lessig Moves to Harvard Law

Noted copyright expert Lawrence Lessig has joined the faculty of Harvard Law School and become the faculty director of Harvard’s Edmond J. Safra Foundation Center for Ethics.

Here's an excerpt from the press release:

Lessig—a widely acclaimed expert in constitutional law, cyberlaw, and intellectual property—comes to Harvard from the faculty of Stanford Law School. Prior to joining the Stanford faculty in 2000, he was on the faculty of the University of Chicago Law School and Harvard Law School. . . .

As faculty director of the Center, Lessig will expand on the center’s work to encourage teaching and research about ethical issues in public and professional life. He will also launch a major five-year project examining what happens when public institutions depend on money from sources that may be affected by the work of those institutions—for example, medical research programs that receive funding from pharmaceutical companies whose drugs they review, or academics whose policy analyses are underwritten by special interest groups.

“I am very excited to be returning to Harvard to work on a project of enormous importance to our democracy,” said Lessig. “The chance to extend the work of the Center to focus on the problems of institutional independence is timely and essential. I am eager to work with friends and old colleagues from the Law School and across the University to make this project a success.”

A prolific writer, Lessig is the author of five books: “Remix” (2008), “Code v2” (2007), “Free Culture” (2004), “The Future of Ideas” (2001), and “Code, and Other Laws of Cyberspace” (1999). He has published more than 60 scholarly articles in leading law and technology journals. His work also appears regularly in the popular press, and he was a monthly columnist for Wired Magazine.

New Press to Publish Viral Spiral: How the Commoners Built a Digital Republic of Their Own

The New Press will publish David Bollier's Viral Spiral: How the Commoners Built a Digital Republic of Their Own.

Here's an excerpt from the announcement:

Reporting from the heart of this "free culture" movement, journalist and activist David Bollier provides the first comprehensive history of the attempt by a global brigade of techies, lawyers, artists, musicians, scientists, businesspeople, innovators, and geeks of all stripes to create a digital republic committed to freedom and innovation. Viral Spiral—the term Bollier coins to describe the almost-magical process by which Internet users can come together to build online commons and tools—brilliantly interweaves the disparate strands of this eclectic movement. The story describes major technological developments and pivotal legal struggles, as well as fascinating profiles of hacker Richard Stallman, copyright scholar Lawrence Lessig, and other colorful figures.

New Pew Report: Future of the Internet III

The Pew Internet & American Life Project has released Future of the Internet III.

Here’s an excerpt from the announcement:

Here are the key findings on the survey of experts by the Pew Internet & American Life Project that asked respondents to assess predictions about technology and its roles in the year 2020:

  • The mobile device will be the primary connection tool to the internet for most people in the world in 2020.
  • The transparency of people and organizations will increase, but that will not necessarily yield more personal integrity, social tolerance, or forgiveness.
  • Voice recognition and touch user-interfaces with the internet will be more prevalent and accepted by 2020.
  • Those working to enforce intellectual property law and copyright protection will remain in a continuing arms race with "crackers" who will find ways to copy and share content without payment.
  • The divisions between personal time and work time and between physical and virtual reality will be further erased for everyone who is connected, and the results will be mixed in their impact on basic social relations.
  • Next-generation engineering of the network to improve the current internet architecture is more likely than an effort to rebuild the architecture from scratch.

Book Industry Study Group BookDROP 1.0 Standard Released

The Book Industry Study Group's Digital Standards Committee has released BookDROP 1.0, which is "a standard intended to support the search and discovery of digital book content on the Web."

Here's an excerpt from the standard's description:

It was first published on December 8, 2008, and was developed jointly by the Book Industry Study Group and the Association of American Publishers. BookDROP defines a set of HTTP transactions between a publisher's digital book archive and the websites of the publisher's syndication partners. The overall goal of BookDROP is to encourage the discovery, search, browsing, and distribution of digital book content across the Web while allowing publishers to manage the quality and availability of their content.

Read more about it at "BISG Unveils BookDROP Standard for Digital Book Repositories."

Open Source Archival Software: ICA-AtoM 1.0.4 Beta Released

ICA-AtoM 1.0.4 beta has been released.

Here's an excerpt from the What is ICA-AtoM? page:

ICA-AtoM stands for International Council on Archives—Access to Memory. It is a web-based, open-source application for standards-based archival description in a multi-lingual, multi-repository environment.

ICA-AtoM comprises:

  • HTML pages served to a web browser from a web server. Apache is used in development but ICA-AtoM is also compatible with IIS.
  • A database on a database server. MySQL is used in development but ICA-AtoM uses a database abstraction layer and is therefore also compatible with Postgres, SQLite, SQLServer, Oracle, etc.
  • PHP5 software code that manages requests and responses between the web clients, the application logic and the application content stored in the database.
  • The Symfony web framework that organizes the component parts using object-orientation and best practice web design patterns.
  • The Qubit Open Information Management toolkit, developed by the ICA-AtoM project and customized to make the ICA-AtoM application.
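The database-abstraction point in the list above is a standard pattern: application code talks to a thin interface rather than to a vendor-specific driver, so the backend can be swapped without touching application logic. A minimal Python sketch of the idea (illustrative only; ICA-AtoM itself is written in PHP on Symfony, and these class and table names are mine):

```python
import sqlite3

class Database:
    """Minimal abstraction layer: application code calls fetchall()
    and never touches a vendor-specific driver directly."""

    def __init__(self, conn):
        self.conn = conn

    def fetchall(self, sql, params=()):
        cur = self.conn.cursor()
        cur.execute(sql, params)
        return cur.fetchall()

# Swapping backends means changing only this construction line
# (e.g. to a Postgres or MySQL connection object that exposes the
# same DB-API 2.0 interface).
db = Database(sqlite3.connect(":memory:"))
db.conn.execute("CREATE TABLE fonds (id INTEGER, title TEXT)")
db.conn.execute("INSERT INTO fonds VALUES (1, 'Carver papers')")
print(db.fetchall("SELECT title FROM fonds WHERE id = ?", (1,)))
# prints [('Carver papers',)]
```

In practice a full abstraction layer also normalizes SQL dialect differences, which is what lets a single codebase claim compatibility with MySQL, Postgres, SQLite, and others.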

CiteSeerX and SeerSuite: Harvester + Search Engine + AI

In "CiteSeerX and SeerSuite—Adding to the Semantic Web," Avi Rappoport overviews beta versions of CiteSeerX and its open source, Java-based counterpart, SeerSuite.

Here's an excerpt:

Building on that experience, CiteSeerX is a completely new system, re-architected for scaling and modularity, to handle increasing demands from both researchers and digital library programmatic interfaces. The system uses artificial intelligence, machine learning, support vector machines, and other techniques to recognize and extract metadata for the articles found. It now uses the Lucene search engine and supports standards such as the Open Archives Initiative (OAI), including metadata browsing, and Z39.50. CiteSeerX has a simple but powerful internal structure for documents and citations. If it cannot access a document cited, it creates a virtual document as a place holder, which can then be filled when the document is available.
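The virtual-document mechanism described in the excerpt can be sketched simply: when a cited work has not yet been harvested, a placeholder record keeps the citation graph intact and is filled in once the full text becomes available. A hypothetical Python sketch (the class and field names are mine, not CiteSeerX's):

```python
class Library:
    """Toy citation store illustrating placeholder ('virtual') documents."""

    def __init__(self):
        self.docs = {}  # title -> document record

    def cite(self, title):
        """Return the record for a cited title, creating a virtual
        placeholder if the full text has not been harvested yet."""
        if title not in self.docs:
            self.docs[title] = {"title": title, "virtual": True, "text": None}
        return self.docs[title]

    def ingest(self, title, text):
        """Fill in a placeholder (or add a new record) when the text arrives."""
        record = self.cite(title)
        record.update(virtual=False, text=text)

lib = Library()
placeholder = lib.cite("Support-Vector Networks")  # cited, not yet harvested
print(placeholder["virtual"])   # prints True
lib.ingest("Support-Vector Networks", "full text ...")
print(lib.docs["Support-Vector Networks"]["virtual"])  # prints False
```

The payoff of this design is that citation counts and link structure can be computed over the whole graph, including works the crawler has never seen.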

Open Journal Systems SWORD Plugin

The Australian Partnership for Sustainable Repositories has released a SWORD plugin for Open Journal Systems, which was developed by Scott Yeadon and Leo Monus. The plugin requires "a significant amount of patching to DSpace," and it is recommended that testing be done with Fedora. A new version will be released next year that may eliminate the need for DSpace patching.

Status Report on UC’s Next Generation Melvyl Pilot Based on WorldCat Local

The California Digital Library has released Next Generation Melvyl Pilot: Update to the University Librarians, November 20, 2008, which describes the progress made in testing OCLC's WorldCat Local as a replacement for the existing Melvyl Catalog.

Here's an excerpt:

In the six months that the Next Generation Melvyl Pilot has been live, we have gathered information on the user experience, identified the strengths and remaining challenges of the system, and compared the pilot with UC’s goals as outlined in the 2005 Bibliographic Services Task Force (BSTF) Report. Users value the breadth of the service, the integration of journal articles, and the ease of use. Users find challenging the lack of Request integration, difficulties in emailing and printing, and problems in accessing materials, all of which are on OCLC’s roadmap for improvements in the coming year. The pilot is meeting many of the goals outlined in the BSTF report and OCLC has demonstrated the ability to make rapid improvements to the system.

Based on these data, we believe that the pilot shows sufficient promise that we should transition the project into a pre-production phase, in which both UC and OCLC will engage in the planning and preparation needed to position us for going to production in mid-2009 if we continue to see successful progress.

Larry Carver Named Digital Preservation Pioneer

The National Digital Information Infrastructure and Preservation Program at the Library of Congress has named Larry Carver, retired Director of Library Technologies and Digital Initiatives at the University of California, Santa Barbara, as a digital preservation pioneer.

Here's an excerpt from the UCSB press release:

"We at the UCSB Library are thrilled that Larry Carver has received this important and well-deserved recognition," said Brenda Johnson, university librarian. "His tireless and innovative work in the development of the Map and Imagery Lab and the Alexandria Digital Library has brought international attention to our library and has benefited thousands of scholars, students, and members of the public from around the world. We offer him our heartiest congratulations on being named a Library of Congress ‘Pioneer of Digital Preservation.'" . . .

Carver began his career at the library where he helped build an impressive collection of maps, aerial photography, and satellite imagery that led to the development of the Map and Imagery Laboratory (MIL) in 1979. As the MIL collections grew, Carver felt that geospatial data presented a unique challenge to the library. He believed that coordinate-based collections should be managed differently than book-based collections. But not everyone agreed with him.

"It became apparent that handling traditional geospatial content in a typical library context was just not satisfactory and another means to control that data was important," he said. "It wasn't as easy as it sounds. I was in a very conservative environment, and they were not easily convinced that this was something a library should do."

Carver and others spent years developing an exhaustive set of requirements for building a geospatial information management system. The system had a number of innovative ideas. "We included traditional methods of handling metadata but also wanted to search by location on the Earth's surface," Carver said. "The idea was that if you point to a place on the Earth you could ask the question, 'What information do you have about that space?,' as opposed to a traditional way of having to know ahead of time who wrote about it."

An opportunity to develop that system arrived in 1994 when UCSB received funding from the National Science Foundation for Carver and his team to build the Alexandria Digital Library. "We produced the first operational digital library that was based on our research," Carver said. "Our concentration was to be able to develop a system that could search millions of records with latitude and longitude coordinates and present those results via the Internet."
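At its core, the "point at a place and ask what exists there" model Carver describes is a spatial containment query over coordinate-indexed records. A toy Python sketch of the idea (the sample records and footprints are invented; a production system like Alexandria would use a spatial index such as an R-tree rather than a linear scan):

```python
# Each record carries a geographic footprint as a bounding box:
# (min_lon, min_lat, max_lon, max_lat), in decimal degrees.
records = [
    ("Aerial photo, Santa Barbara 1979", (-119.9, 34.3, -119.6, 34.5)),
    ("Landsat scene, Mojave Desert",     (-117.5, 34.5, -115.0, 36.0)),
]

def covering(lon, lat, recs):
    """Return titles of records whose footprint contains the point."""
    return [title for title, (x0, y0, x1, y1) in recs
            if x0 <= lon <= x1 and y0 <= lat <= y1]

# "Point to a place on the Earth" (near the Santa Barbara campus)
# and ask what information covers that space.
print(covering(-119.7, 34.42, records))
# prints ['Aerial photo, Santa Barbara 1979']
```

The inversion Carver describes is exactly this: the query key is a location rather than an author or title known in advance.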

The basic concepts behind the Alexandria Digital Library have been widely adopted by Google Earth, Wikipedia, and others. Carver couldn't be more delighted.

"I think it's wonderful," Carver said. "We weren't trying to be the only game in town. We were just trying to raise consciousness way back in the early 1980s that this was a viable way of handling geospatial material. This approach lets people interact with data in a realistic way without having a great deal of knowledge about an individual object. It was a new way of dealing with massive amounts of information in an environment that made finding and accessing information much easier."

Read more about it at "Digital Preservation Pioneer: Larry Carver."

Stanford Intellectual Property Litigation Clearinghouse Launched

The Law, Science & Technology Program at Stanford Law School has launched the Stanford Intellectual Property Litigation Clearinghouse.

Here's an excerpt from the press release:

This publicly available, online research tool will enable scholars, policymakers, lawyers, judges, and journalists to review real-time data about IP legal disputes that have been filed across the country, and ultimately to analyze the efficacy of the system that regulates patents, copyrights, trademarks, antitrust, and trade secrets.

The Intellectual Property Litigation Clearinghouse database includes real-time data summaries, industry indices, and trend analysis together with a full-text search engine, providing detailed and timely information that cannot be found elsewhere in the public domain. Stanford Law School, along with its partner organizations that funded the development and provided industry insight, is releasing the IPLC in phased modules, and today’s release, the Patent Litigation Module, includes more than 23,000 cases filed in U.S. district courts since 2000—raw data for every district court patent case and all results (outcomes and opinions).

Intellectual property (IP) is a key driver of the American economy, and IP litigation is big business. By one estimate, the nation’s copyright and patent industries alone contributed almost 20 percent of private industry’s share of the U.S. gross domestic product and were responsible for close to 40 percent of all private industry growth.

What's a Fast Wide Area Network Data Transfer? Now, It's 114 Gigabits per Second

At SuperComputing 2008, an international team headed by California Institute of Technology researchers demonstrated wide area network data transfers between multiple countries that peaked at 114 gigabits per second and sustained a rate of 110 gigabits per second.
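For a sense of scale, a quick back-of-envelope calculation: at a sustained 110 gigabits per second, a full (decimal) terabyte moves in a little over a minute. The sketch below ignores protocol overhead:

```python
rate_bps = 110e9          # sustained rate, bits per second
terabyte_bits = 8e12      # 1 TB = 8 * 10**12 bits (decimal terabyte)

seconds = terabyte_bits / rate_bps
print(round(seconds, 1))  # prints 72.7
```

Put differently, roughly 1.2 petabytes could move over such a link in a day, which is why these records matter to data-intensive fields like high energy physics.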

Read more about it at "High Energy Physics Team Sets New Data-Transfer World Records."

Google Book Search Bibliography, Version 3

The Google Book Search Bibliography, Version 3 is now available.

This bibliography presents selected English-language articles and other works that are useful in understanding Google Book Search. It primarily focuses on the evolution of Google Book Search and the legal, library, and social issues associated with it. Where possible, links are provided to works that are freely available on the Internet, including e-prints in disciplinary archives and institutional repositories. Note that e-prints and published articles may not be identical.

"In Search Of A Standardized Model for Institutional Repository Assessment or How Can We Compare Institutional Repositories?"

Chuck Thomas, Florida Center for Library Automation, and Robert H. McDonald, University of California San Diego, have deposited a postprint of "In Search Of A Standardized Model for Institutional Repository Assessment or How Can We Compare Institutional Repositories?" in eScholarship.

Here's the abstract:

Assessing universities and faculty is a continuous struggle. Academic administrators must labor year after year to gather meaningful statistics for assessment exercises such as periodic institutional accreditations, program reviews, and annual funding requests. It is hard to overstate the difficulty and complexity of compiling such data. The professional literature of higher education administration contains frequent calls over the past several decades for better ways to measure performance in colleges and universities. One way to accomplish this is through the work of research libraries and their use of institutional repositories. Developing a standardized way to assess a university's output through the use of digital repository metrics is one such method to assess and compare separate institutions. This paper looks at several models that could be of use in this process.

Laine Farley Named as Executive Director of the California Digital Library

Laine Farley has been named as the Executive Director of the California Digital Library. Farley has served as the Interim Executive Director since July 2006.

Here's an excerpt from the press release:

"What we needed was not just a great leader for the CDL, but also a strategy for building the next generation of digital libraries," said Daniel Greenstein, UC vice provost for academic information and strategic services. "It was equally clear that the best way forward in envisioning this new world would be to draw upon the creativity, leadership and talent already within UC and the CDL, and to ramp up our planning efforts. Laine's vision and leadership, which she has demonstrated during challenging times, will take the CDL in new and exciting directions."

As part of ongoing planning with the University of California libraries, Farley will work closely with the university librarians on the 10 campuses and others throughout the UC system to ensure that systemwide library services continue to evolve to better support libraries and scholars.

Previously, Farley's roles at the CDL have included positions as director of digital library services and deputy university librarian. In addition, she was the user services coordinator and the coordinator of bibliographic policy and services at the UC Division of Library Automation. She has also been a reference librarian and coordinator of bibliographic instruction at UC Riverside, and head of the humanities department at the Steen Library at Stephen F. Austin State University. Farley holds a B.A. in liberal arts (Plan II) and an M.L.S. from the University of Texas at Austin.