Common Modeling Infrastructure: ESMF to NUOPC to GIP
Cecelia DeLuca, NOAA ESRL/CIRES
May 18, 2010
Outline
Common Modeling Infrastructure
ESMF
  Part 1: Prototype
  Part 2: Clean-Up
  Part 3: NUOPC and Applications
GIP
Summary
Origins
The Common Modeling Infrastructure Working Group (late 1990s):
  Chaired by Steve Zebiak/IRI and Robert Dickinson/GA Tech
  Brought together research and operational groups, several of which had developed institutional frameworks: GEMS at NASA, the Flexible Modeling System at GFDL
  Members were motivated by and participated in reports and papers calling for common infrastructure [1,2,3]
  Experimented with the Kalnay rules for physics interoperability [4]
  Formulated a collective response to a NASA solicitation calling for an Earth System Modeling Framework - ESMF (2001)
ESMF Part 1: The Prototype
First round: Three linked proposals to NASA Earth Science Technology Office (PIs Killeen/NCAR, da Silva/NASA, Marshall/MIT, 2002)
Focused on a layered architecture: ESMF scope included a utility layer (parallel communication, time management, error handling) and a coupling layer, with user code sandwiched in between:
[Layered architecture diagram, top to bottom:]
  ESMF Superstructure: Components Layer (Gridded Components, Coupler Components)
  User Code: Model Layer
  ESMF Infrastructure: Fields and Grids Layer; Low Level Utilities
  External Libraries: BLAS, MPI, NetCDF, …
ESMF Part 1: Goals
Aim for models that are:
  Scalable in complexity: models are built from modular components, and can be nested within larger applications
  Performance-portable: ESMF high-performance communication libraries offer a consistent interface across computer architectures
  Exchangeable: standard component interfaces enable interoperability
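The "standard component interfaces" idea above can be sketched schematically. This is a minimal illustration in Python, not the real ESMF Fortran API: the class and state names are invented, but the pattern is the same one the slides describe, where every model exposes identical initialize/run/finalize entry points so a driver can couple components without knowing their internals.

```python
# Schematic sketch (hypothetical names, NOT the real ESMF API) of the
# standard-interface pattern: each gridded component implements the same
# initialize / run / finalize methods and exchanges data only through
# import and export states.
from abc import ABC, abstractmethod


class GriddedComponent(ABC):
    """Any model (atmosphere, ocean, ...) implements the same three methods."""

    @abstractmethod
    def initialize(self, import_state: dict, export_state: dict) -> None: ...

    @abstractmethod
    def run(self, import_state: dict, export_state: dict) -> None: ...

    @abstractmethod
    def finalize(self) -> None: ...


class ToyAtmosphere(GriddedComponent):
    """An invented toy model used only to exercise the interface."""

    def initialize(self, import_state, export_state):
        export_state["surface_wind"] = 5.0  # m/s, made-up initial value

    def run(self, import_state, export_state):
        # Respond to a sea surface temperature supplied by a coupler.
        sst = import_state.get("sst", 288.0)
        export_state["surface_wind"] = 5.0 + 0.1 * (sst - 288.0)

    def finalize(self):
        pass


def drive(component: GriddedComponent, import_state: dict) -> dict:
    """A driver needs only the standard interface, never model internals."""
    export_state: dict = {}
    component.initialize(import_state, export_state)
    component.run(import_state, export_state)
    component.finalize()
    return export_state


print(drive(ToyAtmosphere(), {"sst": 290.0})["surface_wind"])  # 5.2
```

Because the driver depends only on the abstract interface, swapping one atmosphere for another requires no driver changes, which is the exchangeability goal in the bullet above.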
ESMF Part 1: Successes
ESMF scope and architecture defined [5]
The GEOS-5 atmospheric GCM used ESMF extensively:
  Hierarchical architecture (shown in the original slide figure), with each box a component with standard interfaces
  Many functions filled in by NASA GEMS
ESMF created a network of technical collaborators
ESMF Part 2: The Clean-Up
Second round support came from the DoD Battlespace Environments Institute, NASA Modeling Analysis and Prediction Program, and NOAA NWS
ESMF v3 (start 2005):
  Restructured the development team
  Rewrote central data structures for greater performance and flexibility
ESMF v4 (start 2006):
  Rewrote the grid and regridding software
ESMF Part 2: Successes
Performance, portability, robustness: unit/system test suite regression-tested on 30+ platforms; performance overhead negligible (typically <3%); bug fixes
Capability: multiple modes of coupling; logically rectangular or unstructured grids
Adoption: used in CCSM4, GEOS-5, COAMPS, GFS, NEMS, TIMEGCM, and other codes
ESMF Part 2: Multi-Agency Governance
[Governance diagram: boards and teams, with roles and meeting cadence]
  Executive Board: strategic direction; organizational changes; board appointments
  Interagency Working Group: stakeholder liaison; programmatic assessment and feedback; reports annually
  Review Committee: external review of the working project and executive management
  Core Development Team (operates daily): software project management; software development of ESMF; development of the NUOPC Layer (new); testing and maintenance; distribution and user support
  Joint Specification Team (meets weekly): requirements definition; design, code, and other reviews; external code contributions; collaborative design and beta testing with the core team
  NUOPC Content Standards Committee (new; meets monthly): conventions for physical constants, documentation, metadata, etc.; takes community standards as input and produces proposed NUOPC standards, with potential standardization tasks flowing from joint meetings
  Change Review Board (meets quarterly): development priorities; release review and approval; weighs functionality change requests against the implementation schedule and resource constraints
ESMF Part 3: NUOPC and Apps
National Unified Operational Prediction Capability (NUOPC) aims to develop an operational multi-model ensemble for numerical weather prediction
ESMF serves as a technical foundation for component interoperability
The level of interoperability desired requires greater specification than ESMF alone provides
Solution: create a NUOPC Layer, with areas of activity outlined in a NUOPC Common Model Architecture (CMA) report [6]
New committees: CMA (Chairs Lapenta/NCEP and McCarren/Navy), Content Standards Committee or CSC (Chair Campbell/NRL)
NUOPC Layer
Encode interoperability rules in code and guidance documents:
Code templates, including component and coupler templates, for describing software structure that is not part of ESMF proper
Additional rules (e.g. component sequencing, data access) encoded in ESMF
Content standards, including metadata and physical constants, expressed in schema, code modules, and/or guidance documents
Usage conventions, where rules cannot or should not be encoded in software, outlined in guidance documents
Compliance verification software, to automate checks on component compliance
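The last bullet, compliance verification software, can be illustrated with a toy checker. This is a hypothetical sketch, not the actual NUOPC compliance checker: the required metadata keys and method names are invented stand-ins for whatever the CSC conventions actually specify.

```python
# Hypothetical sketch of automated compliance verification: before a
# component is accepted, a checker confirms it carries the metadata
# attributes and standard entry points that the (invented, illustrative)
# conventions require.
REQUIRED_METADATA = {"ShortName", "LongName", "Version"}  # assumed convention
REQUIRED_METHODS = {"initialize", "run", "finalize"}       # assumed convention


def check_compliance(component) -> list:
    """Return a list of human-readable violations (empty list == compliant)."""
    problems = []
    metadata = getattr(component, "metadata", {})
    for key in sorted(REQUIRED_METADATA - set(metadata)):
        problems.append(f"missing metadata attribute: {key}")
    for name in sorted(REQUIRED_METHODS):
        if not callable(getattr(component, name, None)):
            problems.append(f"missing standard method: {name}")
    return problems


class PartialModel:
    """Deliberately incomplete component, to show the checker's output."""
    metadata = {"ShortName": "toy_atm"}  # LongName and Version omitted

    def initialize(self): pass
    def run(self): pass
    # finalize omitted


print(check_compliance(PartialModel()))
# ['missing metadata attribute: LongName',
#  'missing metadata attribute: Version',
#  'missing standard method: finalize']
```

Running such checks automatically, rather than by manual code review, is what lets compliance scale across many contributing modeling centers.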
Anticipated Results
Standardized implementation of ESMF across NOAA, Navy, and NASA applications
Demonstrably improved level of interoperability, aiming for target level described in the CMA report (Appendix 1)
Reconciliation of NASA/MAPL, NEMS, and Navy ESMF infrastructure, with the resulting NUOPC Layer supported by the ESMF core team
Development Strategy
CMA report:
  Determine the level of interoperability desired
  Recommend general solutions (REC in CMA report)
Post-CMA report:
  CMA determines application milestones
  ESMF Change Review Board prioritizes development tasks in each REC area
For each development task:
  Design of solution and verification strategy for adoption
  Implementation of framework code, tests, and documentation
  Implementation of compliance checks
  Beta release and implementation in application prototypes
  Refinement of code in response to feedback
  Production release and implementation in operations
Application Milestones (est.)
Single-column atm model (single component): April 2010
Coupled atmosphere-ocean (multi-component): April 2011
Ensemble implementation (ensemble): April 2012
Reconciliation Strategy
For each application milestone:
  Compare NASA GEOS-5, NEMS, and NRL (COAMPS and NOGAPS) implementations
  Migrate common, merged functionality into the ESMF or NUOPC software distribution; test and document
  Update prototype application codes, including NOGAPS
  Refine and implement in production code
First Step: Single Column Model
Motivation and approach:
  Define and execute an inter-agency project to exercise the CMA/CSC interoperability standards
  A development tool that benefits all participating modeling centers
  Outcome will serve as a foundation for building the NUOPC Layer
  Next steps will be extending the NUOPC Layer to coupled systems and then ensembles
The Single Column Model (SCM):
  An SCM is a one-dimensional, time-dependent version of a fully three-dimensional modeling system
  A tool generally used for the development of physics code
  Useful for testing new parameterizations
  Computationally efficient
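The slide's point about computational efficiency is easy to see in code: an SCM steps a single vertical profile forward in time under a physics parameterization alone, with no horizontal grid or dynamics. The sketch below is a minimal invented example (the relaxation scheme and all values are assumptions, not any center's actual physics).

```python
# Minimal single-column-model sketch: one vertical temperature profile,
# advanced in time by an invented "parameterization" (Newtonian relaxation
# toward a target profile, dT/dt = -(T - T_target) / tau at each level).
def run_scm(temps, target, tau=3600.0, dt=600.0, nsteps=6):
    """Step a vertical profile `temps` toward `target` for nsteps."""
    temps = list(temps)
    for _ in range(nsteps):
        temps = [t + dt * (tgt - t) / tau for t, tgt in zip(temps, target)]
    return temps


# Three vertical levels, initially 5 K too warm everywhere.
profile = run_scm([293.0, 283.0, 273.0], [288.0, 278.0, 268.0])
print(profile)  # each level has relaxed part of the way toward its target
```

Because the whole state is one short array, a parameterization change can be tested in seconds, which is why SCMs are a natural first target for exercising the interoperability standards.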
NUOPC Layer Task Estimates
Initial Prioritization from CSC
Examples
Priority 1:
  Convention for data ownership
  Convention for use of Clocks
  Determine Component and Field metadata
Priority 2:
  Establish portability requirements and implement
  Implement Component and Coupler templates
  Conventions for the intake of externally calculated interpolation weights
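The last Priority 2 item, intake of externally calculated interpolation weights, reduces to a sparse matrix-vector product: each destination cell is a weighted sum of source cells, dst[i] = sum over j of w[i,j] * src[j], where the (i, j, w) triplets were computed offline. The sketch below illustrates the apply step only; the triplet representation and names are assumptions, not a specific weight-file format.

```python
# Sketch of applying externally calculated regridding weights.
# Weights are (dst_index, src_index, weight) triplets computed offline;
# regridding a field is then a sparse matrix-vector product.
def apply_weights(weights, src, n_dst):
    """Map a source-grid field `src` onto `n_dst` destination cells."""
    dst = [0.0] * n_dst
    for i, j, w in weights:
        dst[i] += w * src[j]
    return dst


# Two destination cells, each a weighted combination of source cells.
weights = [(0, 0, 0.5), (0, 1, 0.5), (1, 1, 0.25), (1, 2, 0.75)]
print(apply_weights(weights, [1.0, 3.0, 5.0], n_dst=2))  # [2.0, 4.5]
```

Standardizing only the intake convention, rather than the weight-generation method, lets centers keep their preferred offline regridding tools while still exchanging fields through a common apply step.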
Risks
Risk: failure to implement acceptable solutions
  Mitigation: maximize communication with and involvement of application groups in development
Risk: failure to adopt framework and conventions in applications
  Mitigations:
    Recognize good-faith involvement (e.g. contacting user support with questions and problems when they occur)
    Reserve resources for implementation in application codes
    Implement automated tests for compliance wherever possible
Beyond ESMF: GIP
Global Interoperability Program (2009)
Focuses on development of infrastructure for a range of application areas in Earth science modeling:
  Climate simulation
  Application of climate information
  Weather and water forecasting
  Training modelers
Focuses on modeling workflows from configuration to data dissemination
GIP: Building Connections
Table 1. Sampling of Tools and Standards
(Rows are infrastructure categories; entries are listed by GIP domain.)

Model Utilities and Coupling:
  Climate Simulation: ESMF, FMS, MCT, OASIS
  Application of Climate Information: OpenMI, web services
  Weather and Water Forecasting: ESMF, OpenMI
  Training Modelers: ad hoc
Metadata Standards:
  Climate Simulation: NetCDF CF, METAFOR CIM
  Application of Climate Information: NetCDF CF, WaterML, ISO TC/211
  Weather and Water Forecasting: GRIB, BUFR
  Training Modelers: ad hoc
Data Formats:
  Climate Simulation: NetCDF
  Application of Climate Information: NetCDF, GIS vector and raster formats (ESRI shapefiles, GeoTIFF, KML, WKB)
  Weather and Water Forecasting: NetCDF, GRIB, BUFR
  Training Modelers: NetCDF
Data Services and Workflows:
  Climate Simulation: Earth System Grid (ESG)
  Application of Climate Information: Hydrologic Information System (HIS)
  Weather and Water Forecasting: NOMADS
  Training Modelers: Purdue Climate Portal
Analysis and Visualization:
  Climate Simulation: CDAT, NCL
  Application of Climate Information: GIS tools
  Weather and Water Forecasting: NCL

Entries are representative, not comprehensive!
GIP: Campaigns
High-priority activities that focus community efforts
Used in GIP to define projects and assess impacts

Table 2. FY11 GIP Campaigns
Climate Simulation:
  Execution of CMIP5 and multi-model ensembles for climate
  Improvements in model sustained performance on extreme-scale computing platforms
  Integrated analysis environments for model output and observational data
Application of Climate Information:
  NOAA Climate Service formation and delivery of climate information
  Linking climate and hydrological systems
Weather and Water Forecasting:
  Execution of NUOPC and multi-model ensembles for weather
  Increasing usability of federal weather and water models
Training Modelers:
  Establishment of summer schools that exercise federal models
  Establishment of courses that introduce climate informatics
  Establishment of workshops for objective analysis of components
GIP: Status
Projects submitted for FY11 include:
  Increasing usability of NCEP forecast models
  Distribution of climate model data in GIS formats
  Examination of the NUOPC Layer in CCSM
  Summer School in Atmospheric Modeling (focus on federal models)
  Core support for ESMF
International involvement, with links to the E.U.-based METAFOR metadata and IS-ENES projects
More at http://gip.noaa.gov
Summary
Common Modeling Infrastructure has been evolving for more than a decade: CMIWG, ESMF, NUOPC, GIP
Capabilities, scope, and adoption are increasing
Science collaborations (e.g. BEI, NUOPC) are starting to build upon interface standards and tools
International networks are emerging
Many successes, but still more to do to improve interoperability!
References
1 Dickinson, R.E., S.E. Zebiak, J.L. Anderson, M.L. Blackmon, C. DeLuca, T.F. Hogan, M. Iredell, M. Ji, R. Rood, M.J. Suarez and K.E. Taylor (2002). How Can We Advance Our Weather and Climate Models as a Community? Bulletin of the American Meteorological Society, Volume 83, Number 3, pp. 431-434.
2 Improving the Effectiveness of U.S. Climate Modeling, National Research Council of the National Academies, National Academies Press, 2001.
3 High-End Climate Science: Development of Modeling and Related Computing Capabilities, Report to the USGCRP from an ad hoc Working Group on Climate Modeling, December, 2000.
4 "Rules for Interchange of Physical Parameterizations", E. Kalnay, M. Kanamitsu, J. Pfaendtner, J. Sela, M. Suarez, J. Stackpole, J. Tuccillo, L. Umscheid and D. Williamson, Bull. Amer. Met. Soc., 70, 620-622, 1989.
5 Hill, C., C. DeLuca, V. Balaji, M. Suarez, and A. da Silva (2004). Architecture of the Earth System Modeling Framework. Computing in Science and Engineering, Volume 6, Number 1, pp. 18-28.
6 Final Report from the National Unified Operational Prediction Capability (NUOPC) Interim Committee on Common Model Architecture (CMA), June, 2009.
All Years

Nightly regression testing and release management (REC 3.1.1, 1 FTE)
Compiler and platform updates (REC 3.1.1, 0.2 FTE)
Functional updates in response to feature requests and bug reports (REC 3.1.1, 3 FTE)
Routine support requests (REC 3.5.1, 0.8 FTE)
Longer-term adoption support (REC 3.7.1, 1 FTE)
Performance evaluation and reporting (REC 3.3.1, 0.5 FTE)
Tutorials and training (REC 3.6.1, 0.5 FTE)
Project administration, including boards and meetings, contracts and finance, staffing, planning, reporting (REC 3.1.1, 1 FTE)
Project operations, including updates to website, repository, trackers and other tools, project metrics, code backup, computer accounts (REC 3.1.1, 6.1.4, 1 FTE)

TOTAL ~ 9 FTE
Year 1 Development
Finalize and implement organizational plan – including reporting, management and staffing for distributed development and support teams, and funding vehicles.
Set up joint website, trackers, lists, and other communication and management infrastructure, initial code distribution infrastructure, and initial repository access and policies.
Prototype the component template and highest level coupler template, document them, and distribute them via the web.
This activity must address aspects of the common physical architecture.
Examine relationship of NUOPC templates to MAPL and develop interoperability plan.
Other code and convention development activities as prioritized by the ESMF Change Review Board.
Year 2 Development
Migrate ESMF code to Subversion
Assess and evolve NUOPC-wide code distribution and repository strategy
Finalize development of the component and highest-level coupler templates and distribute
Prototype diagnostics, postprocessing, and IO templates and distribute
Refine and distribute common physical constants module
Finalize component, field, and grid metadata packages
Develop initial conventions for configuration files, working closely with GFDL, AFWA, etc.
Other code and convention development activities as prioritized by the ESMF Change Review Board
Year 3 Development
Finalize development of diagnostics, postprocessing, and IO templates and distribute
Refine conventions for configuration files
Clean up documentation and prepare training materials
Other code and convention development activities as prioritized by the ESMF Change Review Board