TRANSCRIPT
GET CONNECTED to LEARN, SHARE, and ADVANCE
Eileen Bjorkman, Independent Aerospace Professional
14 November 2013
USING UNCERTAINTY REDUCTION AS A MEASURE OF VALUE TO OPTIMIZE TEST PROGRAMS
30th Annual International Test and Evaluation Symposium, 13 Nov 2013
OVERVIEW
• Problem
• Uncertainty as a Value Measure
• Example
• Conclusions and Further Work
PROBLEM (1/2)
• No consistent approach in DoD to quantify test value
  – Largely a subjective process
  – Past attempts at using cost and rework have failed
• Lack of quantified test value impacts test efficiency, especially when:
  – Customers expect defensible test results
  – Multiple stakeholders are involved
  – Costs are constrained and schedules are accelerated
PROBLEM (2/2)
• Past attempts to optimize test portfolios failed
  – Prioritization schemes, rework costs, cost savings
  – Could not scale or transition to real problems
• Cost metrics don't capture the true value of testing, which is to reduce uncertainty and risk
• Uncertainty quantification is a well-defined field
• Information theory approaches provide consistent measures of uncertainty
UNCERTAINTY AS A VALUE MEASURE
• Technical Uncertainty Framework
• Shannon's Information Uncertainty
• Test Planning and Portfolio Optimization Process
UNCERTAINTY AS A VALUE MEASURE: TECHNICAL UNCERTAINTY FRAMEWORK
Essential Elements of Uncertainty:
• Two broad types: Knowable Uncertainty and Unknowable Uncertainty (Ambiguity)
• Components of Uncertainty: Aleatory and Epistemic
• Sources of Uncertainty: measurement (input/output), model structure, model selection, prediction error, inference uncertainty

Application to Test and Evaluation:
• Test Goal: reduce uncertainty (knowable) / characterize and reduce uncertainty (ambiguity)
• Type of Model Available: physics-based / none or limited; empirical

Characterization of Uncertainty:
• Uncertainty Reduction
  – Model Using and Updating: using test data to reduce or estimate uncertainty and validate/update the model
  – Model Building: using data to build a model and estimate uncertainty
• Uncertainty Depiction (not an exhaustive list):
  – Probability distribution / summary statistics
  – Confidence, prediction, or tolerance intervals
  – Credible interval (Bayesian)
  – Akaike Information Criterion
  – Deviance Information Criterion
• Test Value / Uncertainty Estimation: measures based on Shannon's information entropy
UNCERTAINTY AS A VALUE MEASURE: SHANNON’S INFORMATION ENTROPY
Entropy: a measure of the uncertainty of a random variable

Discrete: H(X) = -Σ p(x) log p(x)
Continuous (differential entropy): h(X) = -∫ f(x) log f(x) dx

Bernoulli random variable: H(p) = -p log p - (1 - p) log(1 - p)
Normal distribution: h(X) = (1/2) log(2πeσ²)

• For the same variance, the exponential distribution has lower entropy than the normal
• The uniform distribution can have lower variance than a beta distribution, yet the beta can have lower entropy
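These comparisons can be checked numerically. A minimal Python sketch (natural logarithms, so entropy is in nats; the helper names are illustrative, not from the presentation):

```python
import math

def bernoulli_entropy(p):
    """Shannon entropy of a Bernoulli(p) random variable, in nats."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def normal_entropy(sigma):
    """Differential entropy of a Normal distribution with std dev sigma."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def exponential_entropy(rate):
    """Differential entropy of an Exponential(rate) distribution."""
    return 1.0 - math.log(rate)

# Bernoulli entropy peaks at p = 0.5, the point of maximum uncertainty.
assert bernoulli_entropy(0.5) > bernoulli_entropy(0.9)

# Match variances: Exponential(rate) has variance 1/rate^2, so
# rate = 1/sigma gives the same variance as Normal(0, sigma^2).
sigma = 2.0
h_normal = normal_entropy(sigma)
h_exp = exponential_entropy(1.0 / sigma)
assert h_exp < h_normal  # same variance, exponential has lower entropy
```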
UNCERTAINTY AS A VALUE MEASURE: WHY ENTROPY?
• Meets desirable properties for an uncertainty measure:
  – Concavity
  – Global maximum at the uniform distribution
• Easy to calculate for a given probability distribution
• Achieves both positive and negative values
• Provides values in a common set of units
• Lowest variance is not necessarily lowest uncertainty
• Variance reduction is not always a test goal
• Generally a more conservative measure of uncertainty reduction than variance
• Covers a wide range of uncertainties, including modeling
• Can be used with stakeholder preferences to develop utilities
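Two of these properties are easy to demonstrate: differential entropy can be negative, and the uniform distribution maximizes discrete entropy. A small Python sketch (the particular σ values and distributions are arbitrary choices for illustration):

```python
import math

def normal_entropy(sigma):
    """Differential entropy of a Normal distribution with std dev sigma."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

# Differential entropy achieves both signs:
assert normal_entropy(0.1) < 0    # tightly concentrated -> negative entropy
assert normal_entropy(10.0) > 0   # diffuse -> positive entropy

def discrete_entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Global maximum at the uniform distribution:
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
assert discrete_entropy(uniform) > discrete_entropy(skewed)
assert abs(discrete_entropy(uniform) - math.log(4)) < 1e-12  # log(n) at uniform
```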
UNCERTAINTY AS A VALUE MEASURE: PLANNING & OPTIMIZATION PROCESS
1. Identify the decision situation
2. Determine test objectives for each test in the portfolio
3. For each test in the portfolio:
   – Establish baseline uncertainty
   – Identify 2-3 test options
4. Model the problem:
   – Test portfolio/constraints
   – Other uncertainties (e.g., cost)
   – Stakeholder preferences
5. Choose the optimum portfolio: maximize portfolio value or utility within constraints
6. Sensitivity analysis if desired; further analysis if needed

Adapted from Clemen, R. T., & Reilly, T. (2001). Making Hard Decisions with DecisionTools(R) (Second ed.). Pacific Grove, California: Duxbury Thomson Learning.
EXAMPLE: THE NOTIONAL U-100
Lindley (1956) suggested relative uncertainty reduction as the valid measure of information provided by an experiment
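Following Lindley's suggestion, one way to score a candidate test is by the relative reduction in entropy it achieves. A notional Python sketch (the prior and posterior distributions here are invented for illustration, not the U-100 data):

```python
import math

def discrete_entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Notional belief about a performance outcome before and after a test.
prior     = [0.25, 0.25, 0.25, 0.25]   # baseline: maximum uncertainty
posterior = [0.70, 0.20, 0.05, 0.05]   # after the test: much sharper

h_before = discrete_entropy(prior)
h_after = discrete_entropy(posterior)

# Relative uncertainty reduction: the fraction of baseline uncertainty
# the test removes (1.0 would resolve the question completely).
test_value = (h_before - h_after) / h_before
assert 0.0 < test_value < 1.0
```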
EXAMPLE: BASELINE PORTFOLIO OPTIMIZATION
Multiple-Choice Knapsack Problem
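In the multiple-choice knapsack formulation, exactly one option is selected for each test so that total portfolio value is maximized within a cost budget. A brute-force Python sketch for a toy three-test portfolio (all names, values, and costs are invented for illustration):

```python
from itertools import product

# Each test offers mutually exclusive options: (name, value, cost).
# All numbers are notional, not from the U-100 example.
portfolio = {
    "radar":   [("minimal", 0.40, 30.0), ("full", 0.70, 55.0)],
    "engine":  [("ground", 0.35, 20.0), ("flight", 0.80, 60.0)],
    "flutter": [("analysis", 0.20, 10.0), ("flight", 0.60, 45.0)],
}
budget = 100.0

best_value, best_choice = -1.0, None
# Enumerate every combination that picks exactly one option per test.
for combo in product(*portfolio.values()):
    cost = sum(opt[2] for opt in combo)
    value = sum(opt[1] for opt in combo)
    if cost <= budget and value > best_value:
        best_value, best_choice = value, combo

print(best_choice, best_value)
```

Brute force is fine for a toy example; real portfolios need a proper knapsack algorithm (e.g., dynamic programming) because the number of combinations grows exponentially with the number of tests.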
EXAMPLE: OPTIMIZED PORTFOLIO COMPARED TO SME
Portfolio                       Radar Low   Radar Medium   Radar High
Cost                            203.2       213.2          225.2
Optimized Value                 2.461       2.495          2.524
SME Value                       2.270       2.391          2.420
% Difference, Optimized to SME  +8.4%       +4.3%          +4.3%
Optimization process outperforms SME “selections”, particularly when resources are more constrained
Based on actual tests conducted; no radar test was actually conducted
EXAMPLE: SIMULATED LARGE PORTFOLIO
Each portfolio contained:
  – 22 tests with 3 options each
  – 28 tests with 2 options each
Test values and costs were generated using random draws from uniform distributions
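A portfolio of this shape can be generated and optimized quickly. A Python sketch using random draws and a dynamic program over integer costs (the value and cost ranges, seed, and budget slack are illustrative assumptions, not the study's values):

```python
import random

random.seed(1)

# Simulated portfolio: 22 tests with 3 options each and 28 tests with
# 2 options each; values and costs drawn from uniform distributions.
tests = []
for n_opts in [3] * 22 + [2] * 28:
    tests.append([(random.uniform(0.1, 1.0), random.randint(5, 20))
                  for _ in range(n_opts)])

# Budget: cheapest possible portfolio plus some slack, so that a
# feasible selection always exists.
min_cost = sum(min(cost for _, cost in opts) for opts in tests)
budget = min_cost + 100

# Multiple-choice knapsack via dynamic programming over integer cost:
# best[c] = max value achievable at total cost c, picking exactly one
# option from each test processed so far; NEG marks infeasible states.
NEG = float("-inf")
best = [0.0] + [NEG] * budget
for options in tests:
    nxt = [NEG] * (budget + 1)
    for c, v in enumerate(best):
        if v == NEG:
            continue
        for value, cost in options:
            if c + cost <= budget:
                nxt[c + cost] = max(nxt[c + cost], v + value)
    best = nxt

optimal = max(v for v in best if v != NEG)
print(round(optimal, 3))
```

The dynamic program visits roughly (number of tests) x (budget) x (options per test) states, so even portfolios of this size solve in well under a second, consistent with the scalability claim.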
EXAMPLE: LARGE PORTFOLIO COMPARISON TO SME
Method 1: randomly selected test options
Method 2: selected sub-portfolios, allocated resources, and then optimized
• Optimization process outperforms SME “selections” in all cases
• Dividing into sub-portfolios helps simulated SMEs
• Simulated SMEs are inefficient in resource allocation
OVERALL RESULTS
• Optimization was easy to conduct and ran quickly, even for the large portfolio
• Portfolio was robust to sensitivities in cost and the test value measure (not shown in this presentation)
• Initial planning required somewhat more resources than would be expected for an SME-designed test
• Using entropy as the basis for uncertainty reduction may be too conservative for tests where variance reduction is the primary test goal
• Optimized portfolios using entropy values outperform SME portfolios and allocate resources more efficiently
CONCLUSIONS AND FURTHER WORK
• Uncertainty reduction provides a robust measure of value for a test
• Explicitly considering uncertainty reduction during test planning can eliminate tests from consideration
• Additional work:
  – Apply methods to other domains
  – Accommodate multiple stakeholders
  – Further investigation of entropy as a measure
  – Include continuous functions for a wider range of test options
FOR MORE INFORMATION
• Bjorkman, E. A., S. Sarkani, & T. A. Mazzuchi (2013). Test Resource Allocation Using Uncertainty Reduction as a Measure of Test Value. IEEE Transactions on Engineering Management, 60(3), pp. 541-551.
• Bjorkman, E. A., S. Sarkani, & T. A. Mazzuchi (2013). Using Model-Based Systems Engineering as a Framework for Improving Test and Evaluation Activities. Systems Engineering, 16(3), pp. 346-362.
• Bjorkman, E. A., S. Sarkani, & T. A. Mazzuchi (2013). Systems Test Optimization Using Monte Carlo Simulation. ITEA Journal, 34(2), pp. 178-188.