100 Year Archive Requirements Survey

The Storage Networking Industry Association has released the 100 Year Archive Requirements Survey. Access requires registration.

Here's an excerpt from the "Survey Highlights":

  • 80% of respondents declared they have information they must keep over 50 years and 68% of respondents said they must keep it over 100 years. . . .
  • Long-term generally means greater than 10 to 15 years—the period beyond which multiple migrations take place and information is at risk. . .
  • Database information (structured data) was considered to be most at risk of loss. . .
  • Over 40% of respondents are keeping e-Mail records over 10 years. . . .
  • Physical migration is a big problem. Only 30% declared they were doing it correctly at 3-5 year intervals. . . .
  • 60% of respondents say they are ‘highly dissatisfied’ that they will be able to read their retained information in 50 years. . .
  • Help is needed—current practices are too manual, too prone to error, too costly and lack adequate coordination across the organization. . . .

Fedora Commons Website Launches

The Fedora Commons Website has gone live.

Here's an excerpt from the About Fedora Commons page:

Fedora Commons is a non-profit organization providing sustainable technologies to create, manage, publish, share and preserve digital content as a basis for intellectual, organizational, scientific and cultural heritage by bringing two communities together.

Communities of practice that include scholars, artists, educators, Web innovators, publishers, scientists, librarians, archivists, records managers, museum curators or anyone who presents, accesses, or preserves digital content.

Software developers who work on the cutting edge of open source Web and enterprise content technologies to ensure that collaboratively created knowledge is available now and in the future.

Fedora Commons is the home of the unique Fedora open source software, a robust integrated repository-centered platform that enables the storage, access and management of virtually any kind of digital content.

Here's an excerpt from the press release about the Gordon and Betty Moore Foundation grant that helps fund the Fedora Commons:

Fedora Commons today announced the award of a four year, $4.9M grant from the Gordon and Betty Moore Foundation to develop the organizational and technical frameworks necessary to effect revolutionary change in how scientists, scholars, museums, libraries, and educators collaborate to produce, share, and preserve their digital intellectual creations. Fedora Commons is a new non-profit organization that will continue the mission of the Fedora Project, the successful open-source software collaboration between Cornell University and the University of Virginia. The Fedora Project evolved from the Flexible Extensible Digital Object Repository Architecture (Fedora) developed by researchers at Cornell Computing and Information Science.

With this funding, Fedora Commons will foster an open community to support the development and deployment of open source software, which facilitates open collaboration and open access to scholarly, scientific, cultural, and educational materials in digital form. The software platform developed by Fedora Commons with Gordon and Betty Moore Foundation funding will support a networked model of intellectual activity, whereby scientists, scholars, teachers, and students will use the Internet to collaboratively create new ideas, and build on, annotate, and refine the ideas of their colleagues worldwide. With its roots in the Fedora open-source repository system, developed since 2001 with support from the Andrew W. Mellon Foundation, the new software will continue to focus on the integrity and longevity of the intellectual products that underlie this new form of knowledge work. The result will be an open source software platform that both enables collaborative models of information creation and sharing, and provides sustainable repositories to secure the digital materials that constitute our intellectual, scientific, and cultural history.

There's a 20% Chance That You Are a Digital Simulation Living in a Virtual World

Nick Bostrom, Director of the Future of Humanity Institute at Oxford, says in a New York Times article today:

“My gut feeling, and it’s nothing more than that,” he says, “is that there’s a 20 percent chance we’re living in a computer simulation.”

Bostrom thinks so because, barring a future prohibition on creating simulated worlds or disinterest in doing so, our posthuman descendants are almost certain to create simulations of the past. The more simulations that are run, the more likely it is that you are in one.

By some estimates, there will be enough available computing power to create a simulated world by 2050.

However, there could be a recursive problem:

It’s also possible that there would be logistical problems in creating layer upon layer of simulations. There might not be enough computing power to continue the simulation if billions of inhabitants of a virtual world started creating their own virtual worlds with billions of inhabitants apiece.

I wouldn't count on it though.

Source: Tierney, John. "Our Lives, Controlled From Some Guy's Couch." The New York Times, 14 August 2007, D1, D4.

Welcome to the DRM Zone: Case in Point, the Google Video Store

Any videos you have purchased or rented from the Google Video Store will cease to function on August 15, 2007. That's because the Google Video Store is being shut down, and along with it Google's associated DRM system.

Customers will get credits in Google Checkout for what they spent on Google Video Store products, but not cash refunds, meaning that they must buy merchandise available via that service to recoup their losses. Of course, this does not compensate purchasers for the inconvenience of having to replace their videos (assuming that they can).

This fiasco underlines a key problem with DRM: it doesn't just restrict access, it restricts access using proprietary technologies, and, with few exceptions, those technologies cannot be legally circumvented under U.S. law.

Source: Fisher, Ken. "Google Selleth Then Taketh Away, Proving the Need for DRM Circumvention." Ars Technica, 12 August 2007.

Berkeley Electronic Press Acquires Digital Commons IR Software

The Berkeley Electronic Press (bepress) has acquired the Digital Commons institutional repository software from ProQuest. bepress was the original creator of the software.

Here's an excerpt from the press release:

ProQuest and The Berkeley Electronic Press ("bepress") today announced that they have reached an agreement for bepress to purchase ownership of Digital Commons, the world's leading hosted institutional repository solution. Bepress will be adding sales and marketing staff and augmenting its existing customer support and services in addition to the hosting and technology services that it has always provided Digital Commons customers.

Bepress Chairman, Aaron Edlin, said "Institutional Repositories are core to the bepress mission of furthering scholarly communication and thus bepress is excited at the opportunity to build a close relationship with Digital Commons customers. Developing successful and vibrant Institutional Repositories will be bepress's central focus."

TableSeer: Searching and Ranking PDF Table Data

Researchers at Penn State's College of Information Sciences and Technology's Cyber-Infrastructure Lab have developed open source software called TableSeer that can find, extract, search, and rank table data from PDF files. Source code will be available at the project's close.

Here's an extract from the press release:

Tables are an important data resource for researchers. In a search of 10,000 documents from journals and conferences, the researchers found that more than 70 percent of papers in chemistry, biology and computer science included tables. Furthermore, most of those documents had multiple tables.

But while some software can identify and extract tables from text, existing software cannot search for tables across documents. That means scientists and scholars must manually browse documents in order to find tables—a time-consuming and cumbersome process.

TableSeer automates that process and captures data not only within the table but also in tables' titles and footnotes. In addition, it enables column-name-based search so that a user can search for a particular column in a table.

In tests with documents from the Royal Society of Chemistry, TableSeer correctly identified and retrieved 93.5 percent of tables created in text-based formats. . . .

Information on TableSeer can be found in a paper, "TableSeer: Automatic Table Metadata Extraction and Searching in Digital Libraries," by Ying Liu, Kun Bai, Prasenjit Mitra, and C. Lee Giles of the Penn State College of Information Sciences and Technology.
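TableSeer's source code has not yet been released, but the column-name-based search described above can be sketched in a few lines of Python. Everything here (the metadata layout, the field names, and the ranking heuristic) is a hypothetical illustration, not TableSeer's actual design:

```python
# Hypothetical sketch of column-name-based table search, in the spirit of
# TableSeer. The metadata records and scoring are invented for illustration;
# they do not reflect TableSeer's actual implementation.

# Each record stands for one table extracted from a PDF: its caption,
# footnote, and column headers.
TABLES = [
    {"doc": "paper1.pdf", "caption": "Melting points of alkanes",
     "footnote": "Measured at 1 atm.",
     "columns": ["compound", "melting point", "boiling point"]},
    {"doc": "paper2.pdf", "caption": "Runtime comparison",
     "footnote": "",
     "columns": ["algorithm", "runtime", "memory"]},
    {"doc": "paper3.pdf", "caption": "Boiling points of alcohols",
     "footnote": "",
     "columns": ["compound", "boiling point"]},
]

def search_by_column(query, tables=TABLES):
    """Return tables whose column names match the query terms,
    ranked by how many terms appear among the columns."""
    terms = query.lower().split()
    results = []
    for t in tables:
        cols = " ".join(t["columns"]).lower()
        score = sum(1 for term in terms if term in cols)
        if score > 0:
            results.append((score, t))
    # Highest-scoring tables first; Python's sort is stable, so ties
    # keep their original order.
    results.sort(key=lambda pair: pair[0], reverse=True)
    return [t for _, t in results]

hits = search_by_column("boiling point")
```

A real system would index captions and footnotes as well (as the press release notes TableSeer captures both), but the core idea of ranking tables by column-name matches is the same.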

UNIX Ruling: An Open Source Victory

In a blow to the SCO Group, Dale A. Kimball, a judge in the U.S. District Court for the District of Utah, has ruled that Novell owns the disputed copyrights to the UNIX operating system. The judge also ruled that SCO must drop its suits against IBM and Sequent as well as pay Novell part of its licensing fees from Sun and Microsoft.

Here's an excerpt from "Novell Wins Right to Unix, Dismissing SCO":

The ruling is good news for organizations that use open-source software products, said Jim Zemlin, executive director of the Linux Foundation. "From the perspective of someone who is adopting open-source solutions to run in the enterprise, it proves to them that the industry is going to defend the platform, and that when organizations attack it from a legal perspective, that the industry collectively will defend it," he said.

Here's an excerpt from "Judge Says Unix Copyrights Belong to Novell":

"The court's ruling has cut out the core of SCO's case and, as a result, eliminates SCO's threat to the Linux community based upon allegations of copyright infringement of Unix," said Joe LaSala, Novell's senior vice president and general counsel.

Sources: Gohring, Nancy. "Novell Wins Right to Unix, Dismissing SCO." InfoWorld, 10 August 2007; Markoff, John. "Judge Says Unix Copyrights Belong to Novell." The New York Times, 11 August 2007.

Second Life Impacts Real Life and Vice Versa

What happens in Second Life is increasingly influencing real life and vice versa. Here are some recent highlights:

Cornell Joins Google Books Library Project

The Cornell University Library has joined the Google Books Library Project.

Here's an excerpt from the press release:

Google will digitize up to 500,000 works from Cornell University Library and make them available online using Google Book Search. As a result, materials from the library’s exceptional collections will be easily accessible to students, scholars and people worldwide, supporting the library’s long-standing commitment to make its collections broadly available.

“Research libraries today are integral partners in the academic enterprise through their support of research, teaching and learning. They also serve a public good by enhancing access to the works of the world's best minds,” said Interim University Librarian Anne R. Kenney. “As a major research library, Cornell University Library is pleased to join its peer institutions in this partnership with Google. The outcome of this relationship is a significant reduction in the time and effort associated with providing scholarly full-text resources online.”

Materials from Mann Library, one of 20 member libraries that comprise Cornell University Library, will be digitized as part of the agreement. Mann’s collections include some of the following subject areas: biological sciences, natural resources, plant, animal and environmental sciences, applied economics, management and public policy, human development, textiles and apparel, nutrition and food science. . . .

Cornell is the 27th institution to join the Google Book Search Library Project, which digitizes books from major libraries and makes it possible for Internet users to search their collections online. Over the next six years, Cornell will provide Google with public domain and copyrighted holdings from its collections. If a work has no copyright restrictions, the full text will be available for online viewing. For books protected by copyright, users will just get the basic background (such as the book’s title and the author’s name), at most a few lines of text related to their search and information about where they can buy or borrow a book. Cornell University Library will work with Google to choose materials that complement the contributions of the project’s other partners. In addition to making the materials available through its online search service, Google will also provide Cornell with a digital copy of all the materials scanned, which will eventually be incorporated into the university’s own digital library.

An Empirical Study of U.S. Copyright Fair Use Opinions, 1978-2005

Barton Beebe, Associate Professor of Law at the Benjamin N. Cardozo School of Law of Yeshiva University, has released "An Empirical Study of U.S. Copyright Fair Use Opinions, 1978-2005."

Here's an excerpt from the e-print's abstract:

This Article presents the results of the first empirical study of our fair use case law to show that much of our conventional wisdom about that case law is wrong. Working from a data set consisting of all reported federal opinions that made substantial use of the Section 107 four-factor test for fair use through 2005, the Article shows which factors and subfactors actually drive the outcome of the fair use test in practice, how the fair use factors interact, how courts inflect certain individual factors, and the extent to which judges stampede the factor outcomes to conform to the overall test outcome. It also presents empirical evidence of the extent to which lower courts either deliberately ignored or were ignorant of the doctrine of the leading cases, particularly those from the Supreme Court.

Source: Beebe, Barton. "An Empirical Study of U.S. Copyright Fair Use Opinions, 1978-2005." SSRN (2007).

Preserving the Digital Heritage: Principles and Policies

The Netherlands National Commission for UNESCO and the European Commission on Preservation and Access have published Preserving the Digital Heritage: Principles and Policies.

Here's an excerpt from the "Preface":

In November 2005, the Netherlands National Commission for UNESCO, in collaboration with the Koninklijke Bibliotheek (National Library of the Netherlands) and UNESCO’s Information Society Division, organized a conference entitled Preserving the Digital Heritage (The Hague, The Netherlands, 4-5 November 2005). It focused on two important issues: the selection of material to be preserved, and the division of tasks and responsibilities between institutions. This publication contains the four speeches given by the keynote speakers, preceded by a synthesis report of the conference.

Scholarly Electronic Publishing Weblog Update (8/8/07)

The latest update of the Scholarly Electronic Publishing Weblog (SEPW), which provides information about new scholarly literature and resources related to scholarly electronic publishing (such as books, journal articles, magazine articles, technical reports, and white papers), is now available.

Especially interesting are: "Attitudes and Aspirations in a Diverse World: The Project StORe Perspective on Scientific Repositories," "Digital Archive Policies and Trusted Digital Repositories," "The Florida Digital Archive and DAITSS: A Working Preservation Repository Based on Format Migration," "Pathways: Augmenting Interoperability across Scholarly Repositories," A Portal for Doctoral E-Theses in Europe: Lessons Learned from a Demonstrator Project, "Progress toward an OA Mandate at the NIH, One More Time," "Reinventing the Library—How Repositories Are Causing Librarians to Rethink Their Professional Roles," University Publishing in a Digital Age, and "The World Is All Grown Digital. . . . How Shall a Man Persuade Management What to Do in Such Times?"

For weekly updates about news articles, Weblog postings, and other resources related to digital culture (e.g., copyright, digital privacy, digital rights management, and Net neutrality), digital libraries, and scholarly electronic publishing, see the latest DigitalKoans Flashback posting.

BioMed Central Replies to Yale

On Sunday, DigitalKoans reported that Yale had canceled its BioMed Central membership. Today, BioMed Central has replied to the Yale posting about that decision.

Here's an excerpt from the BioMed Central posting:

The main concern expressed in the library's announcement is that the amount payable to cover the cost of publications by Yale researchers in BioMed Central's journals has increased significantly, year on year. Looking at the rapid growth of BioMed Central's journals, it is not difficult to see why that is the case. BioMed Central's success means that more and more researchers (from Yale and elsewhere) are submitting to our journals each year.


An increase in the number of open access articles being submitted and going on to be published does lead to an increase in the total cost of the open access publishing service provided by BioMed Central, but the cost per article published in BioMed Central's journals represents excellent value compared to other publishers.

The Yale library announcement notes that it paid $31,625 to cover the cost of publication in BioMed Central's journals by their authors in 2006, and that the anticipated cost in 2007 will be higher. But to put this into context, according to the Association of Research Library statistics, Yale spent more than $7m on serial subscriptions. Nonetheless, we do recognize that library budgets are very tight and that supporting the rapid growth of open access publishing out of library budgets alone may not be possible. . . .

If library budgets were the only source of funding to cover the cost of open access publication, this would be a significant obstacle. Fortunately, however, there are other sources of funding that are helping to accelerate the transition to open access. . . .

The Wellcome Trust report estimated that on average the cost associated with publishing a peer-reviewed research article is less than $3000, and further estimated that this represented only 1-2% of the typical investment by a funder in carrying out the research that led to the article. It is not surprising therefore, that major biomedical research funders such as NIH and HHMI now encourage open access publication, and are willing to provide financial support for it. BioMed Central's list of biomedical funder open access policies provides further information.

Authors may, of course, pay for articles from their own grant funds, and around half of articles published in BioMed Central journals are indeed paid for in this way. However, relying on authors to pay for the cost of open access publication themselves puts open access journals at a significant disadvantage compared to traditional journals, which are supported centrally through library budgets, and so are often perceived to be 'free' by authors.

That is why BioMed Central introduced its institutional membership scheme, which allows institutions to centrally support the dissemination of open access research in the same way that they centrally support subscription journals, thereby creating a 'level playing field'.

In order to ensure that funding of open access publication is sustainable, we have encouraged institutions to set aside a small fraction of the indirect funding contribution that they receive from funders to create a central open access fund.

Over the last several months, BioMed Central has hosted workshops on the issue of sustainable funding for open access at the UK's Association of Research Managers and Administrators annual conference and at the Medical Library Association's meeting in Philadelphia [see report]. Further such workshops are planned.

In this way, by helping research funders, administrators, VPs of research and librarians to work together to provide sustainable funding channels for open access, we aim to "provide a viable long-term revenue base built upon logical and scalable options", as called for in the statement from Yale's library. . . .

We look forward to working with librarians and research administrators at Yale to develop a solution that will make it as easy as possible for Yale's researchers to continue to publish their open access research articles in BioMed Central's journals.

Legal Aspects of Data Access and Reuse in Collaborative Research

The Open Access to Knowledge Law Project and the Legal Framework for e-Research Project have released Building the Infrastructure for Data Access and Reuse in Collaborative Research: An Analysis of the Legal Context.

Here's an excerpt from the "Executive Summary":

This Report examines the broad legal framework within which research data is generated, managed, disseminated and used. The background to the Report is the growing support for systems that enable research data generated in publicly-funded research projects to be made available for access and use by others in the research community.

The Report provides an overview of the operation of copyright law, contract and confidentiality laws, as well as a range of legislation—privacy, public records and freedom of information legislation, etc—that is of relevance to research data. The Report considers how these legal rules apply to define rights in research data and regulate the generation, management and sharing of data. In any given research project there will be a multitude of different parties with varying interests. . . The Report examines the relationships between these parties and the legal arrangements that must be implemented to ensure that research data is properly and effectively managed, so that it can be accessed and used by other researchers.

Important in the context of collaborative research and open access, the Report describes and explains current practices and attitudes towards data sharing. . . . Often these practices are informed by international and national policies on access and use, formulated by international organisations and conferences, research funders and research bodies. The Report considers these policies at length and canvasses the development of the open access to research data movement.

Finally, the Report encourages researchers and research organisations to adopt proper management and legal frameworks for research data outputs. . . . The Report describes best practice strategies and mechanisms for organising, preserving and enabling access to and reuse of research data, including data management policies and principles, data management plans and data management toolkits. Proposals are made for further work to be undertaken on data access policies, frameworks, strategies and mechanisms.

Web/Web 2.0 Tools

Here’s a list of a few Web/Web 2.0 resources and tools that developers may find useful.

Litman on Lawful Personal Use

Jessica Litman, Professor at the University of Michigan Law School and author of Digital Copyright: Protecting Intellectual Property on the Internet, has written a paper that examines copyright from the point of view of user rights.

Here's an excerpt from the e-print:

This Article seeks to refocus the discussion of users’ and consumers’ rights under copyright, by placing people who make personal use of copyright works at the center of the copyright system. . . .

Limiting myself to personal use, moreover, allows me to evade, for now, many of the interesting questions that arise when readers, listeners, users, and experiencers morph into publishers and distributors. Finally, personal use is a realm where even the most rapacious copyright owners have always agreed that some uses are lawful even though they are neither exempted or privileged in the copyright statute nor recognized as legal by any judicial decision.

In Part II of this Article, I urge that reading, listening, viewing, watching, playing, and using copyrighted works is at the core of the copyright system. . . . In Part III, I revisit copyright cases that have attracted criticism for their stingy construction of copyright owners’ property rights, and suggest that the courts’ narrow reading of copyright rights was motivated, at least in part, by their solicitude for the interests of readers and listeners. . . . In Part IV, I articulate a definition of personal use. Armed with that definition, in Part V, I look at a range of personal uses that are uncontroversially noninfringing under current law. . . . I proceed in Parts VI and VII to offer an alternative analysis of the scope of copyright owners’ rights and the lawfulness of personal uses that might invade them. Finally, in Part VIII, I return to the conventional paradigm of copyright statutory interpretation, under which all unlicensed uses are infringing unless excused.

Source: Litman, Jessica. "Lawful Personal Use." (2007).

ACRL Recommends Next Steps for Supporting NIH Mandate

As reported on DigitalKoans previously, the House passed H. R. 3043, which includes the NIH deposit mandate.

ACRL has some suggestions about follow-up actions that supporters of the mandate can take as the battle moves to the Senate.

Here’s an excerpt from ACRL Legislative Update:

  1. Send a thank you note if your Representative voted yes to pass the House appropriations bill (check the roll call). Your legislators want to hear from you and need to know they did the right thing.
  2. Contact both of your Senators during August. While a phone call, e-mail or fax would work, consider taking advantage of the fact that they are home for the August recess. Make a visit to the local district office or invite your Senators to visit your library. Urge them to maintain the language put forth by the Senate appropriations committee on the NIH public access policy. Find talking points and contact info in the ALA Legislative Action Center.
  3. Ask library advocates in your state to talk to their Senators.
  4. Talk about this issue with leaders on your campus—your government relations office, library advisory committee, faculty senate—to enlist individual and institutional support.

Publisher Author Agreements

According to today's SHERPA/RoMEO statistics, 36% of the 308 included publishers are green ("can archive pre-print and post-print"), 24% are blue ("can archive post-print (i.e. final draft post-refereeing)"), 11% are yellow ("can archive pre-print (i.e. pre-refereeing)"), and 28% are white ("archiving not formally supported"). Looked at another way, 72% of the publishers permit some form of self-archiving.
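(The apparent mismatch, 36 + 24 + 11 = 71 rather than 72, is a rounding effect: the per-category percentages and the combined figure are each rounded separately from the underlying counts. A quick Python check illustrates this; the raw counts below are hypothetical, chosen only to be consistent with the published percentages, and are not SHERPA/RoMEO's actual figures.)

```python
# Hypothetical raw counts out of 308 publishers, consistent with the
# rounded SHERPA/RoMEO percentages quoted above (36/24/11/28). These are
# illustrative numbers, not the database's real counts.
counts = {"green": 112, "blue": 75, "yellow": 35, "white": 86}
total = sum(counts.values())  # 308

# Per-category percentages, rounded as in the text.
pct = {k: round(100 * v / total) for k, v in counts.items()}

# Publishers permitting some form of self-archiving: green + blue + yellow.
permitting = counts["green"] + counts["blue"] + counts["yellow"]
combined = round(100 * permitting / total)

# The rounded categories sum to 71, yet the combined raw share
# (222/308 = 72.1%) rounds to 72 — both figures can be right at once.
```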

These are certainly encouraging statistics, and publishers who permit any form of self-archiving should be applauded; however, leaving aside Creative Commons licenses and author agreements that have been crafted by SPARC and others to promote rights retention, publishers' recently liberalized author agreements still raise issues that librarians and scholars should be aware of.

Looking deeper, there are publisher variations in terms of where e-prints can be self-archived. Typically, this might be some combination of the author's Website, institutional repository or Website, funding agency's server, or disciplinary archive. Some agreements allow deposit on any noncommercial or open access server. Restricting deposit to open access or noncommercial servers is perfectly legitimate in my view; more specific restrictions are, well, too restrictive. The problem arises when the agreement limits the author's deposit options to ones he or she doesn't have, such as only allowing deposit in an institutional repository when the author's institution doesn't have one or only allowing posting on an author's Website when the author doesn't have one.

Another issue is publisher requirements for authors to remove e-prints on publication, to modify e-prints after publication to reflect citation and publisher contact information, to replace e-prints with published versions, or to create their own versions of postprints. Low deposit rates in institutional repositories without institutional mandates suggest that anything that involves extra effort by authors is a deterrent to deposit. The above kinds of publisher requirements are likely to have equally low rates of compliance, resulting in deposited e-prints that do not conform to author agreements. To be effective, such requirements would have to be policed by publishers or digital repositories. Otherwise, they are meaningless and are best deleted from author agreements.

A final issue is retrospective deposit. We can think of the journal literature as an inverted pyramid, with the broad top being currently published articles and the bottom being the first published journal articles. The papers published since the emergence of author agreements that permit self-archiving are a significant resource; however, much of the literature precedes such agreements. The vast majority of these articles are under standard copyright transfer agreements, with publishers holding all rights. Consequently, it is very important that publishers clarify whether their relatively new self-archiving policies can be applied retroactively. Elsevier has done so:

When Elsevier changes its policies to enable greater academic use of journal materials (such as the changes several years ago in our web-posting policies) or to clarify the rights retained by journal authors, Elsevier is prepared to extend those rights retroactively with respect to articles published in journal issues produced prior to the policy change.

Elsevier is pleased to confirm that, unless explicitly noted to the contrary, all policies apply retrospectively to previously published journal content. If, after reviewing the material noted above, you have any questions about such rights, please contact Global Rights.

Unfortunately, many publishers have not clarified this issue. Under these conditions, whether authors can deposit preprints or author-created postprints hinges on whether these works are viewed as being different works from the publisher version, and, hence, owned by the authors. Although some open access advocates believe this to be the case, to my knowledge this has never been decided in a court of law. Michael Carroll, who is a professor at the Villanova University School of Law and a member of the Board of the Creative Commons, has said in an analysis of whether authors can put preprints of articles published using standard author agreements under Creative Commons licenses:

Although technically distinct, the copyrights in the pre-print and the post-print overlap. The important point to understand is that copyright grants the owner the right to control exact duplicates and versions that are "substantially similar" to the copyrighted work. (This is under U.S. law, but most other jurisdictions similarly define the scope of copyright).

A pre-print will normally be substantially similar to the post-print. Therefore, when an author transfers the exclusive rights in the work to a publisher, the author precludes herself from making copies or distributing copies of any substantially similar versions of the work as well.

Much progress has been made in the area of author agreements, but authors must still pay careful attention to the details of agreements, which vary considerably by publisher. The SHERPA/RoMEO Publisher Copyright Policies & Self-Archiving database is a very useful and important tool, and users should actively participate in refining it; however, authors are well advised not to stop at its summary information and to go to the agreement itself (if available). It would be very helpful if a set of standard author agreements that covered the major variations could be developed and put into use by the publishing industry.

Yale Cancels BioMed Central Membership

Except for current submissions, Yale’s Cushing/Whitney Medical and Kline Science Libraries have stopped funding author fees for Yale faculty who publish papers in BioMed Central journals. According to ARL statistics, Yale spent $7,705,342 on serials in 2005-06, which raises the question: If Yale can’t afford to support BioMed Central, what academic library can?

Here’s an excerpt from the Yale posting:

The libraries’ BioMedCentral membership represented an opportunity to test the technical feasibility and the business model of this OA publisher. While the technology proved acceptable, the business model failed to provide a viable long-term revenue base built upon logical and scalable options. Instead, BioMedCentral has asked libraries for larger and larger contributions to subsidize their activities. Starting with 2005, BioMed Central page charges cost the libraries $4,658, comparable to a single biomedicine journal subscription. The cost of page charges for 2006 then jumped to $31,625. The page charges have continued to soar in 2007 with the libraries charged $29,635 through June 2007, with $34,965 in potential additional page charges in submission.

As we deal with unprecedented increases in electronic resources, we have had to make hard choices about which resources to keep. At this point we can no longer afford to support the BioMedCentral model.

(Thanks to Open Access News.)
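To put the excerpt's figures in perspective, the page charges Yale reports can be tallied year by year (a minimal sketch in Python; the 2007 total combines the charges billed through June with the potential additional charges for articles still in submission, and the growth factors are my own arithmetic, not Yale's):

```python
# Yale's reported BioMed Central page charges (USD), per the library's posting.
charges = {
    2005: 4_658,    # comparable to a single biomedicine journal subscription
    2006: 31_625,
}
# 2007: $29,635 billed through June plus $34,965 potentially in submission.
charges[2007] = 29_635 + 34_965

for year in sorted(charges):
    print(f"{year}: ${charges[year]:,}")

# Rough year-over-year growth factors.
print(f"2005 to 2006: {charges[2006] / charges[2005]:.1f}x")
print(f"2006 to 2007: {charges[2007] / charges[2006]:.1f}x")
```

On these numbers, the charges grew nearly sevenfold from 2005 to 2006 and roughly doubled again in 2007, which is the escalation behind Yale's "larger and larger contributions" complaint.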

Are Laser Printers Safe?

A study by Congrong He, Lidia Morawska, and Len Taplin in Environmental Science & Technology titled "Particle Emission Characteristics of Office Printers" raises questions about the safety of laser printers.

Here's an excerpt from a related Environmental Science & Technology news article:

When researchers in Australia discovered that particulate matter levels were five times higher during the workday inside a nonsmoking office building than outside near a freeway, they looked for indoor culprits. After testing more than 50 printers throughout the building, they found that particle emissions varied depending on the type and age of the printer. In one case, standing near a working printer was much like standing next to a cigarette smoker.

Here's an excerpt from a related Scientific American article:

When the researchers investigated emissions from all 62 printers in the entire six-story building, they found that 25 of them (40 percent) were emitting particles. Among those, 17 (27 percent) were "high emitters" (including HP LaserJet and HP Color LaserJet models, and one Toshiba Studio model), which caused the concentration of particles in the surrounding air to jump tenfold when just one page was printed. The majority of particles were ultrafine, or less than 0.1 micrometer in diameter.

"Because these particles are so small, there is a very high probability for these particles to deposit in the deepest alveoli in the lung. . .; from there they can enter the bloodstream," Morawska says. This could cause changes in blood properties that lead to cardiovascular disease, she notes. If the particles contain cancer-causing agents, exposure could also increase the risk of cancer, but Morawska says researchers did not test the chemical composition of the particles. The primary purpose of this study was to determine the concentration of ultrafine particles emitted by laser printers.

More coverage: "HP Dismisses Laser Printer Health Risks," "Is Your Printer Polluting the Air You Breathe?," "Printer Emissions as Bad as Cigarettes?," "Some Top Laser Printers Called Office Polluters," "Study: Laser Printers May Pose Health Risks," and "Warning: Laser Printers Could Be a Health Hazard."

Sources: He, Congrong, Lidia Morawska, and Len Taplin. "Particle Emission Characteristics of Office Printers." Environmental Science & Technology Articles ASAP, 1 August 2007; Lubick, Naomi. "Printer Particle Emissions Add Up." Environmental Science & Technology: Science News, 1 August 2007.

Open Access to Books: The Case of the Open Access Bibliography Updated

Last July, I reported on use of the Open Access Bibliography: Liberating Scholarly Literature with E-Prints and Open Access Journals, which is both a printed book and a freely available e-book. Both versions are under a Creative Commons Attribution-NonCommercial 2.0 License. The prior posting gives a detailed history; the major changes since then have been the conversion of the HTML version to XHTML and the addition of a Google Custom Search Engine.

So, what does cumulative use of the OAB e-book look like slightly over one year after the last posting? Here's a summary:

  • UH PDF: 29,255 (March through May 2005)
  • All Web files on both Digital Scholarship hosts: 192,849 (33,814 uses of the PDF file; June 2005 through July 2007)
  • dLIST PDF: 655 (March 2005 to present)
  • E-LIS PDF: 556 (November 2005 to present)
  • ARL PDF: Not Available

Combined, OAB Web files have been accessed 223,315 times since March 2005.
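As a quick check, the itemized figures above do sum to the combined total (a minimal sketch in Python; the labels are mine, and the ARL PDF count, being unavailable, is simply excluded):

```python
# Cumulative OAB e-book access counts, as itemized above.
counts = {
    "UH PDF (Mar-May 2005)": 29_255,
    "Digital Scholarship hosts (Jun 2005-Jul 2007)": 192_849,
    "dLIST PDF (Mar 2005-present)": 655,
    "E-LIS PDF (Nov 2005-present)": 556,
}

total = sum(counts.values())
print(f"Combined accesses since March 2005: {total:,}")
```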

Pamela Samuelson on Copyright Reform

Pamela Samuelson, a professor in the School of Information and the School of Law at the University of California, Berkeley, has written an interesting paper about copyright reform. While not very hopeful about immediate action, she outlines a number of reasons why such an effort is still worthwhile, maps out the main areas of concern, and offers suggestions about possible reforms.

Here's an extract from the paper:

The 1976 Act has been amended more than twenty times since 1976. As a result of these many amendments to its text, the 1976 Act has become an amalgam of inter- and intra-industry negotiated compromises. As a consequence, it has become a hodgepodge law. Although Congress has occasionally given the Copyright Office rule-making authority, most of the controversial issues have been left for the Congress or the courts to resolve. This has given rise to serious public choice problems with the copyright law and policy making process. The copyright industries have become accustomed to drafting legislation that suits their perceived needs and to having that legislation adopted without careful scrutiny.

The ’76 Act is, moreover, the intellectual work product of a copyright reform process that was initiated in the late 1950’s. This legislation was written without giving serious thought to how it would apply to computers, computer programs, or computer networks. . . .

The ’76 Act was also drafted in an era when it mainly regulated the copyright industries and left alone the acts of ordinary people and non-copyright industries who might interact with copyrighted works. The copyright industries had negotiated many of the fine details of the statute and knew what they meant, even if no one else did. Advances in digital technologies have, among other things, democratized the creation and dissemination of new works of authorship and brought ordinary persons into the copyright realm not only as creators but also as users of others’ works. . . .

Thirty years after enactment of the ’76 Act, with the benefit of considerable experience with computer and other advanced technologies and the rise of amateur creators, it may finally be possible to think through in a more comprehensive way how to adapt copyright to digital networked environments as well as how to maintain its integrity as to existing industry products and services that do not exist outside of the digital realm. If one considers, as I do, that the 1976 Act was the product of 1950/1960’s thinking, then a copyright reform process should be well underway, for copyright revision projects have occurred roughly every 40 years in the U.S. A copyright reform project would, moreover, take years of careful thought, analysis, and drafting, and would then face the daunting challenge of persuading legislators to enact it. Viewed in this light, time’s awasting, and someone should get on with it.

Source: Samuelson, Pamela. "Preliminary Thoughts on Copyright Reform." SSRN, 2007.