Quantifying Uncertainty in Simulations of Complex Engineered Systems
Robert D. Moser
Center for Predictive Engineering & Computational Science (PECOS)
Institute for Computational Engineering and Sciences (ICES)
The University of Texas at Austin
September 20, 2011
Special thanks to: Todd Oliver, Onkar Sahni, Gabriel Terejanu
Acknowledgment: This material is based on work supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].
Motivation
PECOS: a DOE PSAAP Center
Develop validation & UQ methodologies for reentry vehicles:
- Multi-physics, multi-scale physical modeling challenges
- Numerous uncertain parameters
- Models are not always reliable (e.g., turbulence)
Prediction
"Prediction is very difficult, especially if it's about the future." (N. Bohr)
- Predicting the behavior of the physical world is central to both science and engineering
- Advances in computer simulation have led to prediction of more complicated physical phenomena
- The complexity of recent simulations . . .
  - makes reliability difficult to assess
  - increases the danger of drawing false conclusions from inaccurate predictions
Imperfect Paths to Knowledge and Predictive Simulation
[Diagram: Physical Realities of the Universe → Observations (observational errors) → Mathematical Theory / Models (modeling errors; linked to observations by Validation) → Computational Models (discretization errors; linked to mathematical models by Verification) → Knowledge → Decision]
Predictive Simulation: the treatment of model and data uncertainties and their propagation through a computational model to produce predictions of quantities of interest with quantified uncertainty.
Quantities of Interest
Simulations have a purpose: to inform a decision-making process
- Quantities are predicted to inform the decision; these are the Quantities of Interest (QoIs)
- Models are not (evaluated as) scientific theories:
  - They involve approximations, empiricisms, guesses . . . (modeling assumptions)
  - They are generally embedded in an accepted theoretical framework (e.g., conservation of mass, momentum & energy)
- Acceptance of a model is conditional on:
  - its purpose
  - the QoIs to be predicted
  - the required accuracy
What are Predictions?
Prediction
The purpose of predictive simulation is to predict QoIs for which measurements are not available (otherwise predictions would not be needed).

Measurements may be unavailable because:
- instruments unavailable
- scenarios of interest inaccessible
- system not yet built
- ethical or legal restrictions
- it's the future
How can we have confidence in the predictions?
Sir Karl Popper
The Principle of Falsification
A hypothesis can be accepted as a legitimate scientific theory if it can possibly be refuted by observational evidence.

A theory can never be validated; it can only be invalidated by (contradictory) experimental evidence.

Corroboration of a theory (survival of many attempts to falsify) does not mean the theory is likely to be true.
Willard V. Quine
The falsification of an individual proposition by observations is not possible, as such observations rely on numerous auxiliary hypotheses.

Only the complete theory (including all auxiliary hypotheses) can be falsified by an experiment.
Rev. Thomas Bayes
$$P(\theta \mid D) = \frac{P(D \mid \theta)\, P(\theta)}{P(D)}$$
Genesis of the Bayesian interpretation of probability & Bayesian statistics
Theories have to be judged in terms of their probabilities in light of the evidence.
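As a minimal numerical illustration of Bayes' rule (a sketch, not part of the talk), consider inferring the bias $\theta$ of a coin from observed flips on a discretized parameter grid; the example, grid resolution, and data counts are all hypothetical:

```python
import numpy as np

# Bayes' rule: P(theta | D) = P(D | theta) P(theta) / P(D)
# Hypothetical example: infer the bias theta of a coin from observed flips.

thetas = np.linspace(0.01, 0.99, 99)    # discretized parameter values
prior = np.ones_like(thetas)
prior /= prior.sum()                    # uniform prior P(theta)

heads, tails = 7, 3                     # observed data D
likelihood = thetas**heads * (1 - thetas)**tails   # P(D | theta)

evidence = np.sum(likelihood * prior)   # P(D), the normalizing constant
posterior = likelihood * prior / evidence          # P(theta | D)

print("posterior mean:", np.sum(thetas * posterior))  # ~ 8/12 = 0.667
```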
Principles of Validation & Uncertainty Quantification
Posing a Validation Process
Validation of physical models, uncertainty models & their calibration:
- Two types of validation question:
  - Are there unanticipated phenomena affecting the system?
  - Do modeling assumptions yield acceptable predictions of the QoIs?
- Challenge the model with validation data (generally not of the QoIs):
  - Use observations that challenge modeling assumptions
  - Use observations that are informative for the QoIs
- Are discrepancies between model & data significant?
  - Can they be explained by plausible errors due to modeling assumptions?
  - If so, what is their impact on the prediction of the QoIs?
- Validation expectations are model dependent:
  - Interpolation models: simple fit to data
  - Physics-based models: formulated from theory, hence extrapolatable
Uncertainty
Need to Treat Uncertainty in these Processes
- Mathematical representation of uncertainty (Bayesian probability)
- Uncertainty models
- Probabilistic calibration & validation processes (Bayesian inference)

Modeling Uncertainty
- Uncertainty in data:
  - instrument error & noise
  - inadequacy of instrument models (à la Quine)
- Uncertainty due to model inadequacy:
  - represents errors introduced by modeling assumptions
  - impact of these errors within the accepted theoretical framework
Stochastic Extension of Physical Models
Physics
- Mathematical representation of physical phenomena of interest
- At macroscale, usually deterministic (e.g., RANS)

Experimental Uncertainty
Model for the uncertainty introduced by imperfections in the observations used to set model parameters (calibration)

Model Uncertainty
Model for the uncertainty introduced by imperfections in the physical model

Prior Information
Any relevant information not encoded in the above models
Model Likelihood
$$p(D \mid \theta) = \int \underbrace{p(D \mid D_{\mathrm{true}}, \theta)}_{\text{experimental uncertainty}}\; \underbrace{p(D_{\mathrm{true}} \mid \theta)}_{\text{prediction model}}\, dD_{\mathrm{true}}$$

$$p(D_{\mathrm{true}} \mid \theta) = \int \underbrace{p(D_{\mathrm{true}} \mid D_{\mathrm{phys}}, \theta)}_{\text{model uncertainty}}\; \underbrace{p(D_{\mathrm{phys}} \mid \theta)}_{\text{physical model}}\, dD_{\mathrm{phys}}$$

- Physics + model uncertainty = prediction model
- Prediction model + experimental uncertainty = likelihood
- Different/further decompositions are possible depending on available information
- Models coupled with a prior form a stochastic model class
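Assuming, purely for illustration, a deterministic physical model with additive Gaussian model-uncertainty and measurement-noise terms, the nested integrals above can be estimated by Monte Carlo. This is a minimal sketch of that evaluation, not the PECOS implementation; the toy model $f(\theta)=\theta^2$ and the two sigma values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def f_phys(theta):
    """Deterministic physics: p(D_phys | theta) = delta(D_phys - f(theta))."""
    return theta**2          # hypothetical scalar model

sigma_model = 0.2            # assumed model-uncertainty scale (additive Gaussian)
sigma_exp = 0.1              # assumed measurement-noise scale (additive Gaussian)

def likelihood(D, theta, n_mc=100_000):
    """Monte Carlo estimate of p(D | theta) via the nested decomposition."""
    # Sample D_true ~ p(D_true | theta): physics output + model-inadequacy term
    d_true = f_phys(theta) + sigma_model * rng.standard_normal(n_mc)
    # Average the experimental-uncertainty density p(D | D_true, theta)
    dens = np.exp(-0.5 * ((D - d_true) / sigma_exp)**2) / (sigma_exp * np.sqrt(2*np.pi))
    return dens.mean()

# With Gaussian terms the exact answer is N(D; f(theta), sigma_model^2 + sigma_exp^2):
D, theta = 0.5, 0.7
s = np.hypot(sigma_model, sigma_exp)
exact = np.exp(-0.5 * ((D - f_phys(theta)) / s)**2) / (s * np.sqrt(2*np.pi))
print(likelihood(D, theta), exact)   # the two values should agree closely
```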
UQ Using Stochastic Model Classes
Processes
- Single Model Class, $\mathcal{M}$:
  - Calibration: $p(\theta \mid D) \propto p(\theta)\, p(D \mid \theta)$
  - Prediction: $p(q \mid D) = \int p(q \mid \theta, D)\, p(\theta \mid D)\, d\theta$
  - Experimental design
- Multiple Model Classes, $\mathbf{M} = \{\mathcal{M}_1, \ldots, \mathcal{M}_N\}$:
  - Calibration, prediction, and experimental design with each model class
  - Model comparison/selection: $P(\mathcal{M}_i \mid D, \mathbf{M}) \propto P(\mathcal{M}_i \mid \mathbf{M})\, p(D \mid \mathcal{M}_i)$
  - Prediction averaging: $p(q \mid D, \mathbf{M}) = \sum_i p(q \mid D, \mathcal{M}_i)\, P(\mathcal{M}_i \mid D, \mathbf{M})$

Software used at PECOS
- DAKOTA: forward propagation
- QUESO: calibration, model comparison (Metropolis-Hastings, DRAM, Adaptive Multi-Level Sampling)
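QUESO itself is a C++ library; as a language-neutral sketch of the calibration step it performs, here is a minimal random-walk Metropolis-Hastings sampler on a toy linear model. The model, noise level, priors, and tuning constants are illustrative assumptions, not the talk's problem:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy calibration problem: data D generated from a line with Gaussian noise.
x = np.linspace(0.0, 1.0, 20)
theta_true = np.array([1.0, 2.0])               # intercept, slope
D = theta_true[0] + theta_true[1] * x + 0.1 * rng.standard_normal(x.size)

def log_prior(theta):
    return 0.0 if np.all(np.abs(theta) < 10) else -np.inf   # flat box prior

def log_like(theta):
    resid = D - (theta[0] + theta[1] * x)
    return -0.5 * np.sum((resid / 0.1)**2)      # known noise level 0.1

def log_post(theta):
    return log_prior(theta) + log_like(theta)

# Random-walk Metropolis-Hastings: samples p(theta | D) ∝ p(theta) p(D | theta)
theta = np.zeros(2)
lp = log_post(theta)
chain = []
for _ in range(20_000):
    prop = theta + 0.05 * rng.standard_normal(2)     # symmetric proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:          # accept with prob min(1, ratio)
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

chain = np.array(chain[5000:])                       # discard burn-in
print("posterior mean:", chain.mean(axis=0))         # close to theta_true
```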
Information Theoretic Interpretation
Rearranging Bayes' formula:

$$P(d \mid \mathcal{M}_1) = \frac{P(\theta \mid \mathcal{M}_1)\, P(d \mid \theta, \mathcal{M}_1)}{P(\theta \mid d, \mathcal{M}_1)}$$

Taking the log and integrating against the posterior (which satisfies $\int P(\theta \mid d, \mathcal{M}_1)\, d\theta = 1$) gives:

$$\underbrace{\ln P(d \mid \mathcal{M}_1)}_{\text{log evidence}} = \underbrace{E\big[\ln P(d \mid \theta, \mathcal{M}_1)\big]}_{\text{how well the model class fits the data}} - \underbrace{E\left[\ln \frac{P(\theta \mid d, \mathcal{M}_1)}{P(\theta \mid \mathcal{M}_1)}\right]}_{\text{how much the model class learns from the data (expected information gain, EIG)}}$$

where $E[\cdot]$ denotes expectation over the posterior $P(\theta \mid d, \mathcal{M}_1)$.
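For a conjugate Gaussian model class the decomposition can be checked numerically. The sketch below assumes a hypothetical $N(0, \tau^2)$ prior and a single Gaussian datum, estimating the fit and EIG terms by sampling the closed-form posterior:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Conjugate Gaussian model class: theta ~ N(0, tau^2), d | theta ~ N(theta, sigma^2)
tau, sigma, d = 1.0, 0.5, 0.8

# Closed-form posterior N(m, v) and log evidence ln N(d; 0, tau^2 + sigma^2)
v = 1.0 / (1.0/tau**2 + 1.0/sigma**2)
m = v * d / sigma**2
log_evidence = norm.logpdf(d, 0.0, np.sqrt(tau**2 + sigma**2))

# Monte Carlo over the posterior for the two terms of the decomposition
th = rng.normal(m, np.sqrt(v), size=1_000_000)
fit = norm.logpdf(d, th, sigma).mean()                 # E[ln P(d | theta, M1)]
eig = (norm.logpdf(th, m, np.sqrt(v))
       - norm.logpdf(th, 0.0, tau)).mean()             # E[ln posterior/prior]

print(log_evidence, fit - eig)   # the two should agree: log evidence = fit - EIG
```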
Summary: Four Stage Bayesian Framework
Stochastic Model Development
Generate an extension of the physical model to enable probabilistic analysis:
- Closure parameters viewed as random variables
- Stochastic representations of model and experimental errors

Calibration
Bayesian update for parameters: $p(\theta \mid D) \propto p(\theta)\, L(\theta; D)$

Prediction
Forward propagation of uncertainty using the stochastic model

Model Comparison
Bayesian update for plausibility: $P(\mathcal{M}_j \mid D, \mathbf{M}) \propto P(\mathcal{M}_j \mid \mathbf{M})\, E(\mathcal{M}_j; D)$, with evidence $E(\mathcal{M}_j; D) = p(D \mid \mathcal{M}_j)$
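A minimal sketch of the model comparison and prediction-averaging stages, using two hypothetical model classes for coin-flip data (a fixed fair coin versus an unknown bias with a uniform prior); all numbers are illustrative:

```python
import numpy as np

# Two hypothetical model classes for n coin flips with k heads (fixed order):
#   M1: fair coin (theta = 0.5, no free parameters)
#   M2: unknown bias theta with a uniform prior on [0, 1]
k, n = 7, 10

# Evidences E(M_j; D) = p(D | M_j)
ev1 = 0.5**n                                        # p(D | M1)
thetas = np.linspace(0.0, 1.0, 2001)
ev2 = (thetas**k * (1 - thetas)**(n - k)).mean()    # grid estimate of p(D | M2)

# Plausibility update: P(M_j | D, M) ∝ P(M_j | M) E(M_j; D), with equal priors
post = np.array([ev1, ev2]) * 0.5
post /= post.sum()

# Prediction averaging for q = "the next flip is heads"
q1 = 0.5                          # under M1
q2 = (k + 1) / (n + 2)            # posterior mean of theta under M2 (Laplace rule)
print("plausibilities:", post, " averaged p(heads):", post @ np.array([q1, q2]))
```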
Data Reduction Modeling
Why is it needed?
- The quantities we wish to measure are very likely not directly measurable in an experiment
- Their values must be inferred from other measurements using a mathematical model
- Also used to estimate/recover uncertainties in legacy experimental data

Impact on Validation and UQ
- All mathematical models must be validated
- The uncertainty of both the measurements and the data reduction model must be incorporated into the final uncertainty quantification of the data
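As a sketch of the issue, suppose (hypothetically) that a heat flux, which cannot be measured directly, is inferred from measured temperatures through Fourier's law; Monte Carlo propagation then carries both the instrument noise and the uncertain reduction-model parameter into the reduced datum. All values below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data reduction: heat flux q is not directly measurable; it is
# inferred from measured temperatures via Fourier's law  q = k * (T1 - T2) / L.
# Uncertainty in the measurements AND in the reduction model's parameter k
# must both be carried into the reduced datum.

n = 100_000
T1 = rng.normal(350.0, 0.5, n)     # measured temperature [K], instrument noise
T2 = rng.normal(300.0, 0.5, n)     # measured temperature [K], instrument noise
k  = rng.normal(15.0, 0.8, n)      # conductivity [W/m/K], uncertain model input
L  = 0.02                          # slab thickness [m], assumed exact

q = k * (T1 - T2) / L              # reduced quantity: heat flux [W/m^2]

print(f"q = {q.mean():.0f} +/- {q.std():.0f} W/m^2")
# The spread of q (dominated here by the uncertainty in k) is the uncertainty
# that must enter any downstream calibration or validation that uses q as data.
```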
Data Reduction Modeling
Traditional calibration schematic:
[Schematic of the traditional calibration process, showing Data, an Inversion Process, the Model, and a Validation Process.]