
Reliability Engineering and System Safety 23 (1988) 269-275

Technical Note

The Subjective Bayesian Approach to PRA: Theoretical and Practical Perspectives

ABSTRACT

The views presented here are in response to questions raised by George Apostolakis (editor of Reliability Engineering and System Safety), concerning the use of subjective judgment in PRA. In order to be useful, of course, subjective judgments must conform to the real world. In this paper, we present our thoughts (some of which have also been stated by other researchers) on how and why we believe that this can be achieved using subjective probability theory.

1 WHAT IS THE PHILOSOPHICAL BASIS FOR YOUR APPROACH TO PROBABILITY AS APPLIED TO SAFETY ASSESSMENTS?

We believe that the Bayesian theory of probability is the most appropriate approach for quantifying uncertainty (which, by definition, is one of the main elements of risk). In addition, we believe that decision analysis is the appropriate theory to guide the interpretation of such uncertainties in making decisions regarding risk and safety.

The subjective theory of probability is sometimes criticized on the grounds that it merely reflects personal opinion, and as such can give results that are not well-founded. However, valid assessments of subjective probabilities are in fact heavily constrained by the demands of coherence. For instance, it has been shown that coherent assessments of subjective probabilities must be consistent with the long-term relative frequencies of repetitive events.1 In fact, we would hypothesize that most people who claim to believe in so-called 'objective' probabilities do not actually believe that the universe is inherently random, with probabilities inherent in particular objects or processes. Rather, we think that they are merely expressing a strong belief in the power and importance of coherence.

2 CONTRAST YOUR APPROACH TO ALTERNATE APPROACHES. WHAT ARE THE STRENGTHS AND WEAKNESSES OF YOUR APPROACH AND ALTERNATE APPROACHES?

We will not attempt here to contrast our approach with all other approaches that have been suggested. Instead, we will focus our comments on two general categories of methods. The first category is classical statistics, and the second includes fuzzy set theory and other alternative approaches to the treatment of uncertainty.

Bayesian vs. classical statistics

The primary advantage of the Bayesian approach over classical statistics when applied to safety assessments is that the Bayesian approach does not break down in the absence of large amounts of data. Past attempts to apply classical statistics to such situations, such as the Maximus method,2 have required substantial adaptation of the classical theory, and still yielded only mixed results. In addition, the use of Bayesian methods facilitates the use of non-statistical information. This can include not only general engineering knowledge or 'expert judgment', but also non-controversial information such as the laws of physics. By contrast, in the classical approach, it can be difficult to incorporate even relatively simple constraints on the possible values of an unknown quantity, since the use of non-statistical information is not explicitly encompassed within the philosophical and technical boundaries of the classical theory.
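To make this point concrete, the following minimal sketch (not taken from the original note; the failure counts, exposure time, and rate bounds are hypothetical) shows how a simple engineering constraint on a failure rate can be folded into a Bayesian update as a truncated prior, something a classical point estimate does not accommodate directly.

```python
import numpy as np

# Hypothetical data (illustration only): 2 failures in 15,000 component-hours.
n_failures, exposure_hours = 2, 15_000.0

# Non-statistical information, here a simple engineering constraint: the
# failure rate is assumed to lie between 1e-6 and 1e-3 per hour.  Encode it
# as a prior that is flat over that range and zero outside it.
rates = np.linspace(1e-6, 1e-3, 2_000)
prior = np.ones_like(rates)

# Poisson likelihood of the observed data at each candidate rate.
likelihood = (rates * exposure_hours) ** n_failures * np.exp(-rates * exposure_hours)

# Bayes' theorem on the discretized range, normalized to sum to one.
posterior = prior * likelihood
posterior /= posterior.sum()

posterior_mean = np.sum(rates * posterior)
print(f"posterior mean failure rate: {posterior_mean:.2e} per hour")
```

Here the constraint simply zeroes out rates outside the assumed range; any genuinely informative prior could be substituted on the same grid.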

Another strength of the Bayesian approach is its internal consistency when viewed from a theoretical perspective. In particular, the Bayesian approach is completely axiomatic, and those axioms are designed to reflect what we mean by the term 'probability' or 'confidence'. By contrast, classical statistics consists of a somewhat ad hoc set of techniques and criteria (e.g. maximum likelihood estimation, unbiased minimum variance estimators, etc.), each of which is applied to a specific class of problems.

In fact, most classical statistical techniques correspond to the results of a Bayesian analysis with a non-informative prior distribution, while some are actually incoherent when viewed from a Bayesian point of view.3 The use of a non-informative prior is certainly not unreasonable (although one may wish to incorporate additional knowledge when it is available). Those classical techniques that are incoherent (i.e. violate the axioms of Bayesian probability theory) are more problematic, however. We are unaware of any discussion explaining which of the axioms are thought to be inapplicable in each situation, and why.
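As a small numerical illustration of the first point (with hypothetical data), the sketch below compares a classical confidence interval for a demand failure probability with the credible interval from a Bayesian analysis using the Jeffreys non-informative prior; for moderate sample sizes the two are nearly identical.

```python
from scipy.stats import beta

# Hypothetical demand data: x failures in n demands, 90% intervals.
x, n, alpha = 3, 100, 0.10

# Classical (Clopper-Pearson) 90% confidence interval.
cp_lower = beta.ppf(alpha / 2, x, n - x + 1) if x > 0 else 0.0
cp_upper = beta.ppf(1 - alpha / 2, x + 1, n - x)

# Bayesian 90% credible interval with the Jeffreys non-informative prior
# Beta(1/2, 1/2); the posterior is Beta(x + 1/2, n - x + 1/2).
b_lower = beta.ppf(alpha / 2, x + 0.5, n - x + 0.5)
b_upper = beta.ppf(1 - alpha / 2, x + 0.5, n - x + 0.5)

print(f"classical 90% interval:          ({cp_lower:.4f}, {cp_upper:.4f})")
print(f"Bayesian (Jeffreys) 90% interval: ({b_lower:.4f}, {b_upper:.4f})")
```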

Despite its theoretical advantages, however, the coherent assessment of subjective probabilities can pose some formidable difficulties in practice. In particular, a truly rigorous Bayesian analysis is time-consuming and difficult, requiring a substantial effort. First, the Bayesian approach requires not only normative expertise (e.g. a good knowledge of probability and statistics), but also substantive expertise; i.e. a thorough knowledge of the particular issue being addressed.4,5 Of course, a classical analysis also requires some guidance by substantive experts, to assist in the selection of an appropriate model. However, the level of substantive expertise required is typically not as great, since the expert is called upon only to formulate a hypothesis or specify the general form of a relationship, not to estimate actual parameter values.

In addition to the need for substantive expertise in assessing subjective probabilities, such assessments are also subject to cognitive biases such as overconfidence (e.g. Refs 6-9). However, techniques have been developed for overcoming some of these problems. For example, Lichtenstein & Fischhoff10 and Koriat et al.11 have suggested the use of calibration training and other probability elicitation techniques. Similarly, Martz & Waller12 have suggested the use of pre-posterior analysis as a way of providing feedback to the expert. Finally, Mosleh & Apostolakis,13 Winkler,14 and others have developed mathematical techniques to correct for suspected biases once the elicitation process is already complete.

All of these approaches require substantial care and effort to implement. This is not a problem when performing detailed assessments of a small number of important quantities. In production work, however, where there may be hundreds of probability distributions to be assessed, it may be difficult to devote the desired level of care and expertise to each individual assessment. Nonetheless, we believe that the difficulty in making proper use of non-statistical information (e.g. expert opinion) is a reasonable price to pay in return for the ability to make probabilistic statements about a large class of events that are not amenable to analysis by the classical theory of statistics.

Probability theory vs. fuzzy set theory

In recent years, fuzzy set theory, confidence factors, and other new approaches to the treatment of uncertainty have been proposed for use in fields such as risk assessment and artificial intelligence (e.g. Refs 15 and 16). These approaches are certainly of theoretical interest, and may have useful applications in many fields (e.g. pattern recognition). However, we suspect that there is little reason to apply them to risk assessment.

Some researchers (e.g. Schmucker17) have claimed that fuzzy set theory is desirable for use in risk assessments due to the vagueness of the information typically available for use in such assessments. However, we believe that probability theory is capable of addressing such problems,18 and that it is the method of choice for two reasons. First, the computational tools that have been developed for performing probabilistic analysis (e.g. Monte Carlo simulation) are highly sophisticated. Second, based on the principle of simplicity (or 'Occam's razor'), we believe that it is inappropriate to proliferate new axioms and theories to account for phenomena that are already well-explained by probability theory. (In simpler language, 'if it ain't broke, don't fix it.')

3 DO YOUR ANSWERS TO THE PRECEDING TWO QUESTIONS HAVE ANY REAL IMPACT ON RISK ASSESSMENT? ANY EXAMPLES?

Here again, we will focus most of our comments on a comparison of the Bayesian and classical approaches. Since most classical techniques are equivalent to a Bayesian analysis with a non-informative prior, they will tend to give similar numerical results in cases with large amounts of data (at least when the prior distribution is not too strong). On the other hand, the choice of methods can have a significant impact when relatively little data is available. In particular, the classical approach can give needlessly broad confidence intervals, since it does not provide a way of incorporating engineering knowledge or common sense into the results. For example, in estimating the failure rate of a component with no recorded failures to date, the most one can get from the classical approach is an upper bound. However, even limited engineering knowledge would generally be sufficient to establish a reasonable lower bound on the failure rate, as well.
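A minimal sketch of this contrast is given below, using hypothetical exposure data and an assumed lognormal prior (median 1e-5 per hour, error factor 10) to stand in for 'limited engineering knowledge'; the classical calculation yields only an upper bound, while the Bayesian posterior supplies both a lower and an upper percentile.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical case (illustration only): zero recorded failures in 5,000 hours.
exposure_hours = 5_000.0

# Classical result: with no failures, only an upper confidence bound is
# available; for a Poisson process the 95% upper bound is -ln(0.05)/T.
classical_upper = -np.log(0.05) / exposure_hours
print(f"classical 95% upper bound: {classical_upper:.1e} per hour")

# Bayesian result with an assumed prior standing in for limited engineering
# knowledge: lognormal, median 1e-5 per hour, error factor 10.
sigma = np.log(10.0) / 1.645
prior_samples = rng.lognormal(np.log(1e-5), sigma, size=200_000)

# Likelihood of observing zero failures, used as importance weights.
weights = np.exp(-prior_samples * exposure_hours)
weights /= weights.sum()

# Weighted posterior mean and 5th/95th percentiles: both bounds, not just one.
order = np.argsort(prior_samples)
sorted_rates, cdf = prior_samples[order], np.cumsum(weights[order])
lower = sorted_rates[np.searchsorted(cdf, 0.05)]
upper = sorted_rates[np.searchsorted(cdf, 0.95)]
posterior_mean = np.sum(weights * prior_samples)

print(f"Bayesian posterior mean: {posterior_mean:.1e} per hour")
print(f"Bayesian 90% interval:   {lower:.1e} to {upper:.1e} per hour")
```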

In addition, the Bayesian approach allows the analyst to account for uncertainties in the data itself. In many cases, it may be unclear whether a particular event (such as degraded operation of a component) should be counted as a failure. In other instances, even the total number of trials over which the data was collected may not be known with much precision. Classical statistics is not well-suited to handling such situations. By contrast, using a Bayesian approach, it is relatively straightforward to incorporate judgments about quantities such as the probability that a particular event would constitute an actual failure.18 In fact, this is the approach adopted in the recent accident precursor studies (e.g. Ref. 19).
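One way such judgments might be propagated is sketched below, with hypothetical event counts and an assumed 30% chance that each ambiguous event was a real failure; this is an illustration only, not the specific method of Refs 18 or 19. The uncertain failure count is sampled, and the resulting conjugate posteriors are mixed.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(1)

# Hypothetical record (illustration only): 1 unambiguous failure plus
# 4 degraded-operation events, each judged to have a 30% chance of
# counting as a real failure, over 20,000 operating hours.
exposure_hours = 20_000.0
clear_failures = 1
ambiguous_events, p_real = 4, 0.30

# Diffuse conjugate Gamma prior on the failure rate (shape 0.5; a near-zero
# rate parameter is used here purely for numerical convenience).
prior_shape, prior_rate = 0.5, 1e-9

# Propagate the classification uncertainty: sample how many ambiguous
# events were real failures, then apply the conjugate Poisson-Gamma update.
n_samples = 100_000
extra_failures = rng.binomial(ambiguous_events, p_real, size=n_samples)
post_shape = prior_shape + clear_failures + extra_failures
post_rate = prior_rate + exposure_hours

# The overall posterior is a mixture over the possible data interpretations.
lam = gamma.rvs(a=post_shape, scale=1.0 / post_rate, size=n_samples, random_state=rng)

print(f"posterior mean failure rate: {lam.mean():.2e} per hour")
print(f"90% interval: {np.quantile(lam, 0.05):.2e} to {np.quantile(lam, 0.95):.2e} per hour")
```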


4 DO THESE ISSUES AFFECT DECISION MAKING AND RISK MANAGEMENT? DO YOU HAVE ANY EXAMPLES WHERE 'WRONG' DECISIONS HAVE BEEN MADE BECAUSE OF MISUNDERSTANDINGS RELATED TO THE CONCEPT OF PROBABILITY?

First, it is important to emphasize the fact that performing point-estimate quantifications instead of full uncertainty analyses can definitely have a major impact on decision-making. It is not hard to find cases where the alternative with the lowest 'average' risk is actually less desirable because of the greater uncertainty associated with it. In addition, in many cases a point-estimate quantification may not even yield a mean value as a result, due to dependence among the input variables.20
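The effect is easy to reproduce. In the hypothetical sketch below, two nominally identical components share a single state-of-knowledge distribution for their failure probability, and the point estimate of the product understates the mean of the output distribution by exactly the variance of that shared distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical inputs (illustration only): two nominally identical components
# whose uncertain failure probability per demand is lognormal with median
# 1e-3 and error factor 5.  Because the uncertainty reflects our state of
# knowledge about a single parameter, the two inputs are fully dependent.
sigma = np.log(5.0) / 1.645
q = rng.lognormal(np.log(1e-3), sigma, size=500_000)

point_estimate = q.mean() ** 2     # plugging the mean value into q1 * q2
true_mean = np.mean(q * q)         # mean of the product, dependence included

print(f"point estimate of system unavailability: {point_estimate:.2e}")
print(f"mean of the uncertainty distribution:    {true_mean:.2e}")
# The difference is Var(q): a point-estimate quantification understates the
# mean risk when the inputs are dependent.
```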

Another point that deserves continued emphasis is the importance of rigor in both probability elicitation and probabilistic analysis. This is particularly true when the available data is extremely limited. In analyses of issues for which abundant data is available (e.g. the risks of tobacco smoking or automobile accidents), most researchers are likely to reach similar conclusions about average risk levels. When relatively little data is available, there is proportionately more room for bias, either intentional (e.g. deliberate whitewashing) or unintentional (e.g. overconfidence). The only protection against such biases in situations with relatively little empirical data is to pay careful attention to the techniques used (both in eliciting judgments and in the subsequent analysis), and to make full use of available empirical and theoretical results on the preferred analysis techniques. For example, it has been shown empirically that aggregating the opinions of multiple experts tends to yield better results than relying on the judgments of a single expert;14,21 failure to take such findings into account could needlessly limit the validity of probabilistic risk assessments.
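For illustration only, the sketch below combines three hypothetical expert judgments by equal-weight linear pooling; this is one simple aggregation scheme, offered as an example rather than as the approach of Refs 14 and 21.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical judgments from three experts about the same failure rate,
# each expressed as a lognormal (median, error factor).
judgments = [(1e-4, 3.0), (3e-4, 10.0), (5e-5, 5.0)]

# Equal-weight linear opinion pooling: the aggregate distribution is the
# mixture of the individual distributions, approximated here by pooling
# an equal number of samples from each expert's distribution.
samples = []
for median, error_factor in judgments:
    sigma = np.log(error_factor) / 1.645
    samples.append(rng.lognormal(np.log(median), sigma, size=100_000))
pooled = np.concatenate(samples)

print(f"aggregate median: {np.median(pooled):.2e}")
print(f"aggregate 90% range: {np.quantile(pooled, 0.05):.2e} to {np.quantile(pooled, 0.95):.2e}")
```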

One specific situation where misunderstandings about the concept of probability have significantly affected the decision-making process is in the formulation of safety goals for nuclear power plants by the US Nuclear Regulatory Commission (NRC). In early discussions of the safety goals (e.g. Ref. 22), it was proposed that the goals be applied to median core melt frequencies. However, a goal based on median values would impose no constraint at all on the average level of risk posed by a nuclear power plant. By contrast, a mean-value goal implicitly trades off the amount by which the goal is exceeded against the likelihood of exceeding the goal, and therefore does a better job of controlling what we mean by risk.23
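A hypothetical numerical example makes the difference plain: if a plant's core melt frequency were lognormally distributed with a median of 1e-4 per year and an error factor of 10 (values assumed here purely for illustration), the plant would meet a median-value goal of 1e-4 per year even though its mean frequency is almost three times higher.

```python
import numpy as np

# Hypothetical plant: core melt frequency lognormal with median 1e-4 per year
# and error factor 10 (95th percentile / median).  Values chosen for illustration.
median, error_factor = 1e-4, 10.0
sigma = np.log(error_factor) / 1.645
mean = median * np.exp(sigma ** 2 / 2)   # mean of a lognormal distribution

print(f"median core melt frequency: {median:.1e} per year")   # meets a 1e-4 median goal
print(f"mean core melt frequency:   {mean:.1e} per year")     # roughly 2.7e-4, nearly 3x higher
# A median-value goal ignores the upper tail; a mean-value goal weights the
# amount by which the goal might be exceeded by the likelihood of exceeding it.
```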

More recently, the NRC has adopted the use of mean-value goals.24 However, there is clearly still some confusion over the correct interpretation of such a goal. For example, a recent NRC staff report25 has stated that mean values are 'distorted' by large input uncertainties. Of course, excessive conservatism in input parameters can result in a distorted mean value. However, if the uncertainties in the input parameters are honestly stated, the mean value of the resulting risk estimate is not 'distorted'; instead, it accurately reflects the impact of those uncertainties on the overall level of risk.

The NRC safety goals have so far seen little application to actual decision-making, so it is difficult to identify specific instances where such misinterpretations have led to a wrong decision. However, it seems likely that the confusion in NRC discussions of the safety goals has also affected other areas of decision-making, especially decisions that hinge on evaluations of risk assessment results.

REFERENCES

1. de Finetti, B., Probability, Induction and Statistics: The Art of Guessing. Wiley, New York, 1972.

2. Handbook for the Calculation of Lower Statistical Confidence Bounds on System Reliability. Maximus, Inc., 1980.

3. Lindley, D. V., Making Decisions. Wiley-Interscience, London, 1971.

4. Winkler, R. L. & Murphy, A. H., 'Good' probability assessors. J. Appl. Meteorology, 7 (1968), 751-8.

5. Mosleh, A., Bier, V. M. & Apostolakis, G., A critique of current practice for the use of expert opinions in probabilistic risk assessment. Reliability Engineering and System Safety, 20 (1988), 63-85.

6. Tversky, A. & Kahneman, D., Judgment under uncertainty: heuristics and biases. Science, 185 (1974), 1124-31.

7. Hogarth, R. M., Cognitive processes and the assessment of subjective probability distributions. J. Amer. Stat. Assoc., 70 (1975), 271-89.

8. Lichtenstein, S., Fischhoff, B. & Phillips, L. D., Calibration of probabilities: the state of the art to 1980. In Judgment under Uncertainty: Heuristics and Biases, ed. D. Kahneman, P. Slovic & A. Tversky. Cambridge University Press, Cambridge, UK, 1982.

9. Winkler, R. L., The assessment of prior distributions in Bayesian analysis. J. Amer. Stat. Assoc., 62 (1967), 776-800.

10. Lichtenstein, S. & Fischhoff, B., Training for calibration. Organizational Behavior and Human Performance, 26 (1980), 149-71.

11. Koriat, A., Lichtenstein, S. & Fischhoff, B., Reasons for confidence. J. Experimental Psychology: Human Learning and Memory, 6 (1980), 107-18.

12. Martz, H. F. & Waller, R. A., Bayesian Reliability Analysis. Wiley, New York, 1982.

13. Mosleh, A. & Apostolakis, G., Models for the use of expert opinions. In Low-Probability/High-Consequence Risk Analysis: Issues, Methods, and Case Studies, ed. R. A. Waller & V. T. Covello. Plenum, New York, 1984.

14. Winkler, R. L., Probabilistic prediction: Some experimental results. J. Amer. Stat. Assoc., 66 (1971), 675-85.


15. Zadeh, L. A., Fuzzy sets as a basis for a theory of possibility. Fuzzy Sets and Systems, 1 (1978), 3-28.

16. Bonissone, P. P. & Tong, R. M., Editorial: Reasoning with uncertainty in expert systems. Int. J. Man-Machine Studies, 22 (1985), 241-50.

17. Schmucker, K. J., Fuzzy Sets, Natural Language Computations, and Risk Analysis. Computer Science Press, Rockville, MD, 1984.

18. Mosleh, A., Hidden sources of uncertainty: Judgment in collection and analysis of data. Nuclear Engineering and Design, 93 (1986), 187-198.

19. Minarick, J. W., Harris, J. D., Austin, P. N., Cletcher, J. W. & Hagen, E. W., Precursors to Potential Severe Core Damage Accidents: 1985, A Status Report. Oak Ridge National Laboratory, US Nuclear Regulatory Commission, Washington, DC, NUREG/CR-4674, 1986.

20. Apostolakis, G. & Kaplan, S., Pitfalls in risk calculations. Reliability Engineering, 2 (1981), 135-45.

21. Stael von Holstein, C.-A. S., Probabilistic forecasting: an experiment related to the stock market. Organizational Behavior and Human Performance, 8 (1972), 139-158.

22. US Nuclear Regulatory Commission, Plan to evaluate the Commission's safety goal policy statement. Washington, DC, January 1983.

23. Bier, V. M., The US Nuclear Regulatory Commission safety goal policy: a critical review. Risk Analysis (in press).

24. US Nuclear Regulatory Commission, Safety goals for the operations of nuclear power plants: Policy statement, Federal Register, 51 (1986), 28044-9.

25. US Nuclear Regulatory Commission, Summary paper on safety goals for the operation of nuclear power plants. Washington, DC, 1986.

Vicki M. Bier & Ali Mosleh

Pickard, Lowe and Garrick, Inc., 2260 University Drive, Newport Beach, California 92660, USA