Revised NSF Software Development for Cyberinfrastructure Solicitation

The NSF has issued a revised solicitation for Software Development for Cyberinfrastructure (SDCI) grants (NSF 10-508). NSF anticipates that $15,000,000 will be available over a three-year period for 25 to 30 awards. The full proposal deadline is February 26, 2010.

Here's an excerpt:

The FY2010 SDCI solicitation supports the development, deployment, and maintenance of software in the five software focus areas listed above, i.e., software for HPC systems, software for digital data management, software for networking, middleware, and cybersecurity, and specifically focuses on cross-cutting issues of CI software sustainability, manageability and power/energy efficiency in each of these software focus areas. . . .

  1. Software for Digital Data

The Data focus area addresses software that promotes acquisition, transport, discovery, access, analysis, and preservation of very large-scale digital data in support of large-scale applications or data sets transitioning to use by communities other than the ones that originally gathered the data. Examples of such datasets include climatological, ecological, and phenological observational data, sensor systems, spatial visualizations, multi-dimensional datasets correlated with metadata, and so forth.

Specific focus areas in Software for Digital Data for the FY2010 SDCI solicitation include:

  • Documentation/Metadata: Tools for automated/facilitated metadata creation/acquisition, including linking data and metadata to assist in curation efforts; tools to enable the creation and application of ontologies, semantic discovery, assessment, comparison, and integration of new composite ontologies.
  • Security/Protection: Tools for data authentication, tiered/layered access systems for data confidentiality/privacy protection, replication tools to ensure data protection across varied storage systems/strategies, rules-based data security management tools, and assurance tools to test for digital forgery and privacy violations.
  • Data transport/management: Tools to enable acquisition of high-data-rate, high-volume data from varied, distributed data sources (including sensor systems and instruments), while addressing stringent space and data quality constraints; tools to assist in improved low-level management of data and transport to take better advantage of limited bandwidth.
  • Data analytics and visualization: Tools that operate in (near) real time rather than traditional batch mode on possibly streaming data, including in-transit data processing, data integration, and fusion.