
Page 1: The GEM Computational System and Recent Scientific Results Andrea Donnellan Third International ACES Meeting May 10, 2002 GEM

The GEM Computational System and Recent Scientific Results

Andrea Donnellan
Third International ACES Meeting
May 10, 2002
GEM

Page 2:

Data Volumes from Observations

• GRACE: 50 MB/day onboard, 8 GB/day derived product
• ECHO: 100 GB/day onboard
• SRTM: 12 TB raw data
• ICESat: 1 GB/day onboard, 2 GB/day derived
• SCIGN: 250 MB daily; 7.5 GB/day for real time
• Airborne observations: LIDAR
• VCL: 2 GB/day onboard, 4 GB/day derived
• Hyperspectral imagery: 100 GB/day raw
• Imaging LIDAR: >20 GB/day, >40 GB/day
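To put these per-day rates on a common footing, the short sketch below (plain Python, with the numbers transcribed from this slide) converts them to rough annual volumes. Which rate is counted for each stream is an assumption of the example, and SRTM and airborne LIDAR are left out because no daily rate is quoted; this is a back-of-the-envelope illustration, not a mission data budget.

```python
# Back-of-the-envelope annual volumes from the per-day rates quoted above.
# Derived-product rates are used where the slide gives one; SRTM (a one-time
# 12 TB collection) and airborne LIDAR (no daily rate quoted) are left out.
DAILY_GB = {
    "GRACE (derived)": 8,
    "ECHO (onboard)": 100,
    "ICESat (derived)": 2,
    "SCIGN (real time)": 7.5,
    "VCL (derived)": 4,
    "Hyperspectral imagery (raw)": 100,
    "Imaging LIDAR": 40,
}

for name, gb_per_day in DAILY_GB.items():
    tb_per_year = gb_per_day * 365 / 1000  # GB/day -> TB/year (decimal units)
    print(f"{name:30s} {gb_per_day:6.1f} GB/day  ~{tb_per_year:5.1f} TB/year")

total_tb = sum(DAILY_GB.values()) * 365 / 1000
print(f"Total for these streams: ~{total_tb:.0f} TB/year")
```

Even this subset of streams approaches 100 TB per year, before the 2010-era missions and derived products projected on the following slides are counted.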

Page 3:

Volumes from Models

• Geodynamo model:
  – 1 GB of storage for one model run
  – 2010: 5 TB/run
  – Minimum of 10 runs needed
• General earthquake/lithospheric models:
  – 1 TB/run
  – 2010: 10 PB/run (multiple scales combined, many regions)
• Gravity:
  – 100 GB/run
  – 2010: 2 TB/run
• Mantle convection models:
  – 1 TB/run
  – 2010: 10 PB/run
• Geomagnetic field models:
  – 32 GB/run
  – 2010: 300 GB/run

Page 4:

Where We Will Be in 2010

• Multiple solid earth missions flying
• PetaBytes of data per year gathered in a distributed fashion
• Data analyzed by widely distributed scientists using widely distributed computational resources
• Growing need for integration of information from multiple sources on multiple scales into an integrated analysis

Page 5:

Goal

• World-wide computational systems supporting gathering of 3 PetaBytes of data per year, integrating analysis, visualization, simulation, and interpretation.

Page 6:

Requirements

• Onboard adaptive processing

• High space-to-ground bandwidth of TeraBytes per day per mission

• Data transmission and handling

• Reusable capabilities (framework)

• Data processing (100 Petaflops per mission per year)

Page 7:

Requirements (continued)

• Product storage (National Virtual Solid Earth Science Observatory) using cooperative federated databases

• Distributed computational environment for analysis (interoperable framework, portal)

• Software tools
• Hardware

Page 8:

Hardware (Hierarchical)

• Large central Petaflop computers with TeraBytes of memory

• Single sign-on seamless access
• Distributed computers for decomposable problems
• Cluster computers (e.g., Beowulf for cost performance)
• Heterogeneous computational capabilities (e.g., for storage, visualization, computing)

Page 9:

Software

• Problem Solving Environment
  – Visualization tools
  – Analysis algorithms
  – Data mining
• Framework
  – Supports software integration into multidisciplinary analysis
  – Interoperability between data, software, and computer systems
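The "plug and play composing of programs from algorithmic modules" that later slides call for can be pictured with a deliberately minimal sketch: every module exposes the same call signature, and the framework chains them into an analysis pipeline. The module names, data, and interface below are hypothetical stand-ins, not GEM/SERVO components.

```python
from typing import Callable, Dict, List

# A minimal, hypothetical "framework" interface: every algorithmic module is a
# callable that maps one named data product to another.  Real PSE frameworks
# add metadata, remote execution, and parallel composition on top of this idea.
Module = Callable[[Dict[str, float]], Dict[str, float]]

def detrend(data: Dict[str, float]) -> Dict[str, float]:
    """Toy module: remove the mean from a station displacement field."""
    mean = sum(data.values()) / len(data)
    return {k: v - mean for k, v in data.items()}

def threshold_anomalies(data: Dict[str, float]) -> Dict[str, float]:
    """Toy module: keep only stations whose residual exceeds 1.0 (arbitrary units)."""
    return {k: v for k, v in data.items() if abs(v) > 1.0}

def compose(modules: List[Module]) -> Module:
    """Chain modules so the output of one feeds the next (sequential plug and play)."""
    def pipeline(data: Dict[str, float]) -> Dict[str, float]:
        for module in modules:
            data = module(data)
        return data
    return pipeline

if __name__ == "__main__":
    gps = {"JPLM": 3.5, "CIT1": 0.2, "USC1": -0.4}      # hypothetical residuals, cm
    analysis = compose([detrend, threshold_anomalies])  # swap or reorder modules freely
    print(analysis(gps))
```

Swapping, reordering, or adding modules changes the analysis without touching the modules themselves, which is the interoperability property the framework bullet is after.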

Page 10:

GEM/SERVO Components

• Visualization
• Model and algorithm development
• IT: GRID technologies
• Computational Environments/PSEs
• Data handling/archiving
• Assimilation
• Data mining/pattern recognition
• Data fusion
• High speed networks
• High end computers
• Clusters
• Laptops
• Cycles needed and other infrastructure
• Scalable system

Page 11:

Solid Earth Research Virtual Observatory (SERVO)

[Architecture diagram: a tiered data and computing system. Observations are downlinked to the SERVO archive (Tier 0+1; 1 PB per year data rate in 2010), feeding Tier 1 archives at Goddard, Langley, and Ames, Tier 2 centers with data caches (~TBytes/day), Tier 3 institutes, and Tier 4 workstations and other portals over 100-1000 Mbits/sec links, with 100 TeraFLOPs sustained computing.]

Fully functional problem solving environment:
• Plug and play composing of parallel programs from algorithmic modules
• On-demand downloads of 100 GB in 5 minutes
• 10^6 volume elements rendering in real time
• Program-to-program communication in milliseconds
• Approximately 100 model codes
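The link speeds and download targets in the diagram can be sanity-checked with simple unit conversions. The sketch below works out the sustained network rate implied by "100 GB in 5 minutes" and the average rate behind "1 PB per year", assuming decimal units; it is an illustrative calculation only.

```python
# Rough unit conversions behind the SERVO targets above (decimal units: 1 GB = 1e9 bytes).
SECONDS_PER_YEAR = 365 * 24 * 3600

# "On-demand downloads of 100 GB in 5 minutes"
download_bytes = 100e9
download_seconds = 5 * 60
gbit_per_s = download_bytes * 8 / download_seconds / 1e9
print(f"100 GB in 5 minutes -> {gbit_per_s:.1f} Gbit/s sustained")   # ~2.7 Gbit/s

# "1 PB per year data rate in 2010", spread evenly over the year
pb_per_year_bytes = 1e15
mbit_per_s = pb_per_year_bytes * 8 / SECONDS_PER_YEAR / 1e6
print(f"1 PB per year       -> {mbit_per_s:.0f} Mbit/s average")     # ~250 Mbit/s
```

The on-demand target therefore implies bursts of a few Gbit/s, above the 100-1000 Mbits/sec links shown, while the petabyte-per-year average fits comfortably within them.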

Page 12:

Virtual Observatory Project

[Timeline, 2003-2010, capability increasing over time:]

• Architecture & technology approach
• Decomposition into services with requirements
• Prototype cooperative federated database service integrating 5 datasets of 10 TB each
• Prototype data analysis service
• Prototype modeling service capable of integrating 5 modules
• Prototype visualization service: 1920x1080 pixels at 120 frames per second
• Scaled to 100 sites

• Solid Earth Research Virtual Observatory (SERVO)

• On-demand downloads of 100 GB files from 40 TB datasets within 5 minutes.

• Uniform access to 1000 archive sites with volumes from 1 TB to 1 PB
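One way to read the "cooperative federated database service" milestone above is as a thin layer that sends the same query to many independent archive sites and merges the metadata that comes back. The sketch below illustrates only that pattern; the site names, catalog structure, and query fields are invented for the example and are not SERVO interfaces.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Granule:
    site: str          # which archive holds the data
    dataset: str       # e.g. "InSAR", "GPS"
    start_year: float
    size_gb: float

# Hypothetical per-site catalogs; a real federation would issue remote queries
# and stream metadata back rather than holding lists in memory.
CATALOGS: Dict[str, List[Granule]] = {
    "archive-A": [Granule("archive-A", "GPS", 1994.0, 0.25),
                  Granule("archive-A", "InSAR", 1995.5, 1.2)],
    "archive-B": [Granule("archive-B", "InSAR", 1994.1, 0.9)],
}

def federated_query(dataset: str, years: Tuple[float, float]) -> List[Granule]:
    """Fan the same predicate out to every site and merge the hits, sorted by time."""
    lo, hi = years
    hits: List[Granule] = []
    for catalog in CATALOGS.values():
        hits.extend(g for g in catalog
                    if g.dataset == dataset and lo <= g.start_year <= hi)
    return sorted(hits, key=lambda g: g.start_year)

if __name__ == "__main__":
    for granule in federated_query("InSAR", (1994.0, 1996.0)):
        print(granule)
```

Scaling this pattern to the 1000 archive sites and terabyte-to-petabyte volumes listed above is what makes the federation, caching, and uniform-access requirements hard.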

Page 13:

Problem Solving Environment Project

[Timeline, 2003-2010, capability increasing over time:]

• Isolated platform-dependent code fragments
• Prototype PSE front end (portal) integrating 10 local and remote services
• Extend the PSE to include:
  – a 20-user collaboratory with shared windows
  – seamless access to high-performance computers linking remote processes over Gb data channels
• Integrated visualization service with volumetric rendering
• Fully functional PSE used to develop models as building blocks for simulations
• Program-to-program communication in milliseconds using staging, streaming, and advanced cache replication
• Integrated with SERVO
• Plug and play composing of parallel programs from algorithmic modules
• Plug and play composing of sequential programs from algorithmic modules

Page 14:

Computational Environment

[Timeline, 2003-2010, capability increasing over time:]

• 100s of GigaFLOPs, 40 GB RAM, 1 Gb/s network bandwidth
• ~100 model codes with parallel scaled efficiency of 50%
• ~10^4 PetaFLOPs throughput per subfield per year
• ~100 TeraFLOPs sustained capability per model
• ~10^6 volume elements rendering in real time
• Access to a mixture of platforms, from low-cost clusters (20-100) to supercomputers with massive memory and thousands of processors

Page 15:

The Ventura Basin is Actively Deforming

Yeats 1983

Page 16:

Northridge Example

• Northridge-class simulation: 100,000 unknowns, 4,000 time steps – more than 8 hours on a high-end workstation.
• Southern California system at 0.5 km resolution: roughly 100,000 processor-hours, or 400 hours (17 days) on a dedicated 256-processor machine.
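The wall-clock figure quoted for the Southern California run follows directly from the processor-hour estimate, assuming ideal scaling on the dedicated 256-processor machine; the short check below reproduces it.

```python
# Sanity check of the Southern California estimate quoted above, assuming
# ideal (linear) scaling on the dedicated machine.
processor_hours = 100_000
processors = 256

wall_clock_hours = processor_hours / processors
print(f"{wall_clock_hours:.0f} hours ~= {wall_clock_hours / 24:.0f} days")
# -> 391 hours ~= 16 days, close to the "400 hours (17 days)" quoted on the slide
```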

Page 17:

Steep Gradient Largely Attributable to Low Rigidity Basin Fill

Page 18:

Coseismic Removed from the Interferogram

Postseismic Interferogram

Page 19:

Results from Data Inversion Show Fault Afterslip as Primary Mechanism

Page 20:

Comparison of InSAR and Seismic Anomalies

• A similar anomaly shows up both in the postseismic deformation indicated by GPS and InSAR (Donnellan et al.) and in the seismic anomalies identified using Principal Component Analysis (Rundle and Tiampo).

• The Mojave Desert shows a similar correlation near Barstow and the Blackwater Fault (Rundle and Tiampo; Peltzer).
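For readers unfamiliar with the Principal Component Analysis step referenced here, the sketch below shows the generic computation on a hypothetical seismicity-rate matrix (spatial cells by time windows); it is a minimal illustration of the technique, not the Rundle and Tiampo analysis itself.

```python
import numpy as np

# Hypothetical seismicity-rate matrix: rows are spatial cells, columns are time
# windows.  In the published work this would come from gridded catalog rates.
rng = np.random.default_rng(0)
rates = rng.poisson(lam=3.0, size=(50, 120)).astype(float)

# Standard PCA via the SVD of the mean-removed matrix.
centered = rates - rates.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)
print("variance captured by first 3 modes:", np.round(explained[:3], 3))

# The leading spatial mode (u[:, 0]) highlights cells whose rate variations
# move together; cells with large loadings on low-order modes are candidates
# for anomalous behavior.
leading_mode = u[:, 0]
print("strongest-loading cells:", np.argsort(np.abs(leading_mode))[-5:])
```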

Page 21:

Recent GPS Results

• Similar to pre-seismic velocity field, particularly near the source.

Page 22:

Residuals

Page 23:

Anomalous Motion Related to the Northridge Earthquake Was Observed at JPL

[Time series: residual geodetic longitude at JPL (cm) versus time (years), 1991-1997, with annotated offsets – Landers earthquake, June 28, 1992: -0.4±0.3 cm; Northridge earthquake, January 17, 1994: 1.0±0.2 cm; post-seismic motion: 3.5±0.4 cm. Map inset labels: Sierra Madre Fault, 1 m.]

• JPL is several fault dimensions away from the Northridge rupture.
• The earthquake probably triggered slip on the Sierra Madre Fault in the upper 0.5 km.
• This interpretation is based on additional observations collected near JPL.
• The lateral extent of the anomaly is unknown due to a lack of stations.

Page 24:

California 3D Fault Simulations

• Faults are shown as light lines; the earthquakes at model year 4526 are shown as dark lines.
• Simulations indicate that major events are clustered in time, like the real events.
• Simulations using a realistic heterogeneous earth structure are computationally intensive.

Page 25:

Modeling Faults as Interacting Systems

Southern California Seismicity

Courtesy John Rundle

Space-time Stress Diagram

• Transients likely occur as a result of stress redistribution.
• They are observed on different faults, sometimes a few fault dimensions away.
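To make the idea of stress redistribution between interacting fault segments concrete, the toy model below evolves a few segments with a simple interaction matrix: when a segment fails, part of its stress drop is transferred to its neighbors, which can fail in turn. This is a heavily simplified slider-block-style sketch for illustration, not one of the GEM simulation codes, and every parameter in it is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 8                                    # fault segments
strength = np.full(n, 1.0)               # failure threshold for each segment
stress = rng.uniform(0.0, 0.9, size=n)   # arbitrary initial stress state
loading = 0.01                           # uniform tectonic loading per step

# Interaction matrix: a failing segment drops to zero stress and hands 30% of
# the dropped stress to each nearest neighbor (a crude stand-in for elastic
# stress transfer between segments).
transfer = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            transfer[i, j] = 0.3

for step in range(2000):
    stress += loading
    event_size = 0
    failed = stress >= strength
    while failed.any():                  # redistributed stress can trigger a cascade
        event_size += int(failed.sum())
        dropped = np.where(failed, stress, 0.0)
        stress = np.where(failed, 0.0, stress) + transfer.T @ dropped
        failed = stress >= strength
    if event_size > 1:
        print(f"step {step}: event involving {event_size} segment failures")
```

Multi-segment cascades in the toy are the analogue of the transients and temporally clustered major events described on these slides.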

Page 26:

Conclusions

• 90% of Northridge postseismic motion was aseismic.

• Afterslip on the mainshock rupture plane was responsible for most of the deformation.

• No evidence for lower crustal relaxation playing a major role in postseismic motions.

• Recent deformation is consistent with that observed before the earthquake.

Page 27:

More Conclusions

• High velocity gradient largely attributable to a low rigidity basin.

• Lower crust is a minor player in interseismic and postseismic motion in this region – consistent with a cold lower crust.

• The earthquake probably triggered slip on the Sierra Madre fault.