Supercomputing 2013 slides


Uploaded by: university-college-london

Posted on 26-May-2015


DESCRIPTION

A collection of 4 mini-presentations which I will present in the European Exascale Projects booth at SC13.

TRANSCRIPT

Page 1: Supercomputing 2013 slides

Tools

Nanomaterials

Bloodflow

Multiscale Communities

Page 2: Supercomputing 2013 slides

Tools

Page 3: Supercomputing 2013 slides

Wide area message passing

● Connects applications running on different platforms by establishing communication paths.
● Each path can be hand-tuned for better performance.
● More lightweight than MPI; useful for coupling and parallelizing codes over long distances.
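To make the pattern concrete, here is a minimal sketch of wide area coupling between two codes, written with plain Python TCP sockets rather than the actual MPWide API; the host name, port, chunk size and buffer size are placeholder assumptions.

```python
# Illustration only: the wide-area coupling pattern sketched with plain Python
# sockets, NOT the MPWide API. Host, port and sizes are made-up placeholders.
import socket

PATH_HOST, PATH_PORT = "remote.example.org", 16256   # hypothetical endpoint
CHUNK = 1 << 20                                      # 1 MiB per exchange

def open_path(host, port, sndbuf=4 << 20):
    """Open one long-lived 'path' to the remote code; the socket buffer size
    stands in for the kind of per-path hand-tuning described above."""
    s = socket.create_connection((host, port))
    s.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, sndbuf)
    return s

def exchange(path, outgoing):
    """Send one buffer and receive an equally sized buffer from the peer code."""
    path.sendall(outgoing)
    incoming = bytearray()
    while len(incoming) < len(outgoing):
        chunk = path.recv(CHUNK)
        if not chunk:
            raise ConnectionError("peer closed the path")
        incoming.extend(chunk)
    return bytes(incoming)

if __name__ == "__main__":
    path = open_path(PATH_HOST, PATH_PORT)
    boundary_data = bytes(CHUNK)          # stand-in for one solver's boundary state
    remote_state = exchange(path, boundary_data)
    path.close()
```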

Page 4: Supercomputing 2013 slides

Apps

Cosmological N-body simulation across supercomputers. Here MPWide facilitates the wide area message passing between the machines.

Page 5: Supercomputing 2013 slides

Apps

[Diagram: 1D model coupled to a 3D model on a supercomputer]

Multiscale bloodflow modelling. Here we use MPWide to efficiently exchange data between a desktop in London and a supercomputer in Edinburgh.

Page 6: Supercomputing 2013 slides

● Couples computational models.
● Connects these over wide area networks using MPWide.
● Handles models’ time and space scales as per the Multiscale Modeling and Simulation framework*.
● Supports Java, C, C++ and Fortran.
● Used by 10+ production applications.
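The third bullet refers to the framework's generic submodel execution loop. A rough sketch of that loop follows, with illustrative names that are not the library's actual API:

```python
# Sketch of the generic submodel execution loop from the Multiscale Modeling
# and Simulation framework: f_init, then repeated O_i (observe), S (solve),
# B (boundary), and finally O_f. All names here are illustrative assumptions.
def run_submodel(model, t_end, dt, send, recv):
    state = model.f_init(recv())               # initialise, possibly from a coupled model
    t = 0.0
    while t < t_end:
        send(model.observe_i(state))           # O_i: pass intermediate observations to other models
        state = model.solve(state, dt)         # S: advance one step on this model's own time scale
        state = model.boundary(state, recv())  # B: apply boundary data received from coupled models
        t += dt
    return model.observe_f(state)              # O_f: final observation
```

In this picture the coupling library's job is to schedule such loops for each submodel and to route the send/recv calls, over MPWide when the submodels run on different machines.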

Page 8: Supercomputing 2013 slides

Bloodflow

Page 9: Supercomputing 2013 slides

● Aim: Accurately model cerebrovascular bloodflow with acceptable performance.

● Approach: Integrate a person-specific circulation model with a high-resolution local vasculature model.

Page 10: Supercomputing 2013 slides

● Future applications:
○ Comparison of rheology models.
○ Validation against medical data (ongoing).
○ Look for predictive indicators of aneurysm rupture.

● And eventually predict the outcome of cerebrovascular surgery.

Page 11: Supercomputing 2013 slides

Example visualization

Page 12: Supercomputing 2013 slides

HECToR @ EPCC

Page 13: Supercomputing 2013 slides

SuperMUC @ LRZ

Page 14: Supercomputing 2013 slides

[Diagram: 1D model coupled to a 3D model on a supercomputer]

We couple the 1D Python Navier-Stokes (PyNS) solver to HemeLB to construct a multiscale model.

We use MPWide to efficiently exchange data between a desktop in London and a supercomputer in Edinburgh.
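As an illustration of what such a 1D-3D coupling exchanges each time step, here is a hedged sketch; the method names and the pressure-in/flow-out split are assumptions made for illustration, not the actual PyNS/HemeLB interface.

```python
# Hypothetical per-timestep exchange for a 1D/3D coupling in the spirit of
# PyNS <-> HemeLB: the 1D model supplies pressures at the shared interfaces,
# the 3D model returns flow rates. All names are illustrative placeholders.
def coupled_step(one_d, three_d, interfaces, dt):
    pressures = {i: one_d.pressure_at(i) for i in interfaces}  # 1D -> 3D
    three_d.set_boundary_pressures(pressures)
    three_d.advance(dt)                       # runs remotely, e.g. on the supercomputer
    flows = three_d.boundary_flow_rates()     # 3D -> 1D
    one_d.set_boundary_flows(flows)
    one_d.advance(dt)                         # runs locally, e.g. on the desktop
```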

Page 15: Supercomputing 2013 slides
Page 16: Supercomputing 2013 slides

More?

Groen et al., Interface Focus 3(2), 2013.
Groen et al., Journal of Computational Science 4(5), 2013.
Bernabeu et al., Interface Focus 3(2), 2013.
http://www.slideshare.net/DerekGroen/multiscale-modelling-of-brain-bloodflow

Page 17: Supercomputing 2013 slides

Nanomaterials

Page 18: Supercomputing 2013 slides

Aim: To develop quantitative coarse-grained models of clay-polymer nanocomposites.

We will use these models to:
● Predict the thermodynamically favourable state of the composites.
● Predict their elasticity.

Page 19: Supercomputing 2013 slides

We require:
● Accurate potentials.
● Realistic structures.
● Task farming many MD simulations.
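As a rough illustration of the task-farming requirement above, the sketch below launches many independent MD runs a few at a time; the `md_engine` command and the input-file layout are placeholders, not a specific code used here.

```python
# Minimal task-farming sketch: run many independent MD simulations, a few at a
# time, by launching an external MD engine per parameter set. The command line
# and file names are made-up placeholders.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_md(case_id, input_file):
    """Launch one MD run and wait for it to finish."""
    cmd = ["md_engine", "-in", input_file, "-log", f"case_{case_id}.log"]
    return case_id, subprocess.run(cmd, check=False).returncode

cases = [(i, f"inputs/case_{i}.in") for i in range(64)]   # 64 independent runs
with ThreadPoolExecutor(max_workers=8) as pool:           # at most 8 run concurrently
    for case_id, rc in pool.map(lambda c: run_md(*c), cases):
        print(f"case {case_id} finished with exit code {rc}")
```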

Page 20: Supercomputing 2013 slides

Atomistic representation of a charged clay sheet

Page 21: Supercomputing 2013 slides

Coarse-grained representation of a charged clay sheet

Page 22: Supercomputing 2013 slides


Page 23: Supercomputing 2013 slides
Page 24: Supercomputing 2013 slides

More?

Suter, Groen, Kabalan and Coveney, MRS Proceedings 1470, 2012.
Borgdorff et al., "Multiscale Simulations on distributed European e-Infrastructures", inSiDE, Vol. 10, No. 1, Spring 2012.
Suter, Coveney, Anderson, Greenwell and Cliffe, Energy Environ. Sci., 2011.
More papers soon. :)

Page 25: Supercomputing 2013 slides

Communities

Page 26: Supercomputing 2013 slides
Page 27: Supercomputing 2013 slides

Groen, Zasada, Coveney, accepted by CiSE, 2013.

Page 28: Supercomputing 2013 slides
Page 29: Supercomputing 2013 slides

Groen, Zasada, Coveney, accepted by CiSE, 2013.

Page 30: Supercomputing 2013 slides

Scientific Challenges

Just scratching the surface here:
● Which couplings can deliver useful information?
● What information should we exchange?
● How do we validate and error-check coupled models?
  ○ ...what if they are multi-physics as well?
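One simple example of an error check at a coupling interface, offered purely as an illustration rather than a method from these slides: verify that the flux each submodel reports at a shared interface agrees within a tolerance.

```python
# Illustrative consistency check for a coupled model (an assumption, not the
# authors' validation method): the flux leaving one submodel should match the
# flux entering the other at every coupling interface.
def check_interface_conservation(flux_out, flux_in, rel_tol=1e-3):
    """flux_out/flux_in map interface id -> flux reported by each submodel."""
    for iface in flux_out:
        a, b = flux_out[iface], flux_in[iface]
        scale = max(abs(a), abs(b), 1e-30)
        if abs(a - b) / scale > rel_tol:
            raise ValueError(f"interface {iface}: flux mismatch {a} vs {b}")
```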

Page 31: Supercomputing 2013 slides

Computational Challenges

● Where to simulate the models?
● How do we couple?
● Can we make it fast?
● Can we make it reusable?
● How do we analyze the resulting data?

Page 32: Supercomputing 2013 slides

More?

“Survey of Multiscale and Multiphysics Applications and Communities”
Derek Groen, Stefan Zasada and Peter Coveney
IEEE Computing in Science & Engineering (in press), 2013.
Preprint at: http://arxiv.org/1208.6444

Page 33: Supercomputing 2013 slides

Acknowledgements

Slides made by Derek Groen

Thanks go out to:
James Suter, Rupert Nash, James Hetherington, Peter Coveney, Hywel Carver, Stefan Zasada, Steven Rieder, Simon Portegies Zwart, Chris Kurowski, Alfons Hoekstra, Werner Dubitzky
...and many others!