Anything? You likely have a disaster plan that addresses digital asset issues. The potential problem with a disaster plan is that it can be grounded in assumptions of relative normalcy: the building burns down, a tornado hits, a lower-category hurricane strikes. It may assume severe damage within a confined area and an unimpaired ability of federal, state, and local agencies (as well as relief organizations) to respond. It may assume that workers are not at the disaster site, that they are relatively unaffected if they are, or that they can evacuate and return with relative ease and speed. It may assume that your offsite tape storage or "hot" backup site is far enough away to be unaffected.
What it probably doesn’t assume is the complete devastation of your city or town; widespread Internet, phone, power, and water outages that could last weeks or months; improbable multiple disasters across a wide region surrounding you; the inability of officials at all levels of government to adequately respond to a quickly deepening crisis; the lack of truly workable evacuation plans; depleted gas supplies for a hundred miles in all directions; your evacuated workers being scattered across a multiple-state area in motels, hotels, and the houses of friends and relatives after trips of 20 to 30 hours in massive traffic jams; your institution’s administration being relocated to a hotel in another city; evacuees ending up in new disaster zones and needing to evacuate yet again; and the possibility of more local post-catastrophe catastrophes in short order.
Here are some thoughts. You may need to have your backups and hot sites in a part of the country that is unlikely to be experiencing a simultaneous catastrophe. This will not be reliable or convenient if physical data transportation is involved. Your latest data could end up in a delivery service depot in your city or town when the event happens. Even if this doesn’t occur, how frequently will you ship out those updates? Daily? Weekly? Another frequency?
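The shipping-frequency question can be made concrete with back-of-the-envelope arithmetic. The sketch below is purely illustrative (the function name and figures are my own, not from any standard): your worst-case data loss is at least one shipping interval, and two intervals if the latest shipment is itself caught in the disaster zone, as in the delivery-depot scenario above.

```python
def worst_case_loss_hours(ship_interval_hours, in_transit_lost=False):
    """Rough worst-case window of unrecoverable data, in hours.

    The newest surviving snapshot is one shipping interval old at best.
    If the most recent shipment was still in transit through the disaster
    area (e.g., sitting in a local delivery depot), the newest *surviving*
    snapshot is the one before it, so the loss window doubles.
    """
    return ship_interval_hours * (2 if in_transit_lost else 1)

# Daily shipments, latest shipment destroyed in a local depot:
daily_worst = worst_case_loss_hours(24, in_transit_lost=True)   # 48 hours lost

# Weekly shipments, even when every shipment arrives safely:
weekly_worst = worst_case_loss_hours(168)                        # a full week lost
```

The point of the arithmetic is simply that shipping frequency puts a hard floor under how much recent work you can lose, independent of how safe the remote site itself is.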
Obviously, a remote hot site is better than just backups. But if hot sites were cheap, we’d all have them.
In terms of backups, how software/hardware-specific are your systems? Will you have to rebuild a complex hardware/software environment to create a live system? Will the components that you need be readily available? Will you have the means to acquire, house, and implement them?
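One modest hedge against the rebuild problem is to keep a machine-readable manifest of the environment alongside the backups, so reconstruction works from documentation rather than memory. The sketch below is a minimal illustration only; the manifest fields, service names, and `verify` helper are all hypothetical, not an established tool.

```python
import json
import platform
import sys

# Hypothetical rebuild manifest: record what the live system depends on,
# and store this file with the offsite backups.
manifest = {
    "os": platform.system(),
    "python": sys.version.split()[0],
    "services": ["postgresql 9.x", "apache 2.x"],   # illustrative stack only
    "restore_order": ["os", "services", "schema", "data"],
}

def verify(rebuilt_env, manifest):
    """Return the manifest keys a rebuilt environment fails to match."""
    return [key for key, value in manifest.items()
            if rebuilt_env.get(key) != value]

# Serialize next to the backups; a mismatch list of [] means the rebuilt
# environment matches what the backups expect.
manifest_json = json.dumps(manifest, indent=2)
mismatches = verify(json.loads(manifest_json), manifest)
```

Even a sketch this small forces the question the paragraph above asks: could someone who is not you, starting from bare hardware, get back to a live system?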
Lots of copies do keep stuff safe, but there have to be lots of copies. Here are two key issues: copyright and will (no doubt there are many more).
You may have a treasure trove of locally produced digital materials, but, if they are under normal copyright arrangements, no one can replicate them. It took considerable resources to create your digital materials. It’s a natural tendency to want to protect them so that they are accessible, but still yours alone. The question to ask yourself is: what do I want to prevent users from doing, now and in the future, with these materials? The Creative Commons licenses offer options that bar commercial and derivative use, but still provide the freedom to replicate licensed data. True, if you allow replication, you will not really be able to have unified use statistics, but, in the final analysis, what’s more important: statistics or digital asset survival? If you allow derivative works, you may find others add value to your work in surprising and novel ways that benefit your users.
However, merely making your digital assets available doesn’t mean that anyone will go to the trouble of replicating or enhancing them. That requires will on the part of others, and they are busy with their own projects. Moreover, they assume that your digital materials will remain available, not disappear forever in the blink of an eye.
It strikes me that digital asset catastrophe planning may call for cooperative effort by libraries, IT centers, and other data-intensive nonprofit organizations. Perhaps, by working jointly, economic and logistical barriers can be overcome and cost-effective solutions can emerge.