NASA Access to Terra Data Fusion

Terra is the flagship of NASA's Earth Observing System. Launched in 1999, Terra's five instruments continue to gather data that enable scientists to address fundamental questions central to the six NASA Earth Science Research Focus Areas. Terra data are among the most popular NASA datasets, serving not only the scientific community, but also governmental, commercial, and educational communities.

The strength of the Terra mission has always been rooted in its five instruments and the ability to fuse their data to obtain higher-quality information for Earth Science than any individual instrument could provide alone. As data volumes grow and the central Earth Science questions shift from process-oriented to climate-oriented, the need for data fusion and for scientists to perform large-scale analytics over long records has never been greater. The challenge is particularly acute for Terra, given its growing data volume (more than 1 petabyte), the storage of different instrument data at different archive centers, the differing file formats and projection systems used for each instrument's data, and cyberinfrastructure that is inadequate for scientists to access and process whole-mission fusion data (including Level 1 data). Sharing newly derived Terra products with the rest of the world poses further challenges.

The ACCESS to Terra Data Fusion Products effort aims to resolve two long-standing problems:

  1. How do we efficiently generate and deliver Terra data fusion products?
  2. How do we facilitate the use of Terra data fusion products by the community in generating new products and knowledge through national computing facilities, and disseminate these new products and knowledge through national data sharing services?


The effort leverages national facilities and services managed by the National Center for Supercomputing Applications (NCSA), specifically the National Petascale Computing Facility, which houses the Blue Waters supercomputer, and the National Data Service (NDS). A key advantage of leveraging Blue Waters and the NDS for access, use, and distribution of Terra data fusion products and science results is that the Terra data and processing are local, while access and sharing are global. This represents a significant community-element addition to NASA's system-of-systems infrastructure. ACCESS to Terra Data Fusion Products will initiate the development, access, and delivery of Level 1B radiance Terra Fusion files for the broader community. Level 1B fusion provides the necessary stepping-stone for developing higher-level products and the framework for other flavors of fusion. Enhancements to our existing open-source codes in the CyberGIS Toolkit, enabling scalable map projections onto any grid for the new Terra Fusion files, will also be delivered.

NIH BD2K KnowEnG

KnowEnG (pronounced "knowing") is a National Institutes of Health-funded initiative that brings together researchers from the University of Illinois and the Mayo Clinic to create a Center of Excellence in Big Data Computing. It is part of the Big Data to Knowledge (BD2K) Initiative that NIH launched in 2012 to tap the wealth of information contained in biomedical Big Data. KnowEnG is one of 11 Centers of Excellence in Big Data Computing funded by NIH in 2014.

...

NIST IN-CORE

The new center will collaborate with NIST to achieve its long-term goal of developing tools that individual communities can use to assess their resilience. This includes evaluating the effectiveness of alternative measures intended to improve performance and minimize post-disaster disruption and recovery time. These tools will improve decision-making so that communities can build a "business case" for the measures they take. The centerpiece of the center's effort will be NIST-CORE, the NIST-Community Resilience Modeling Environment. NIST-CORE will be built on the Ergo software (http://ergo.ncsa.illinois.edu/), developed at NCSA for hazard assessment, response, and planning. Ergo is already used around the world, and according to NCSA's Danny Powell, this collaboration with NIST will further expand the functionality and applications currently available through the software platform. The National Data Service consortium, of which NCSA is a founding member, will also be part of the project, working with NIST-CORE developers and researchers on data publishing.

NIST Materials Data Facility

...