TRANSCRIPT
Advances in Robust Engineering Design
Henry Wynn and Ron Bates, Department of Statistics
Workshop at Matforsk, Ås, Norway, 13-14 May 2004
Design of Experiments – Benefits to Industry
13-14 May 2004 Wynn & Bates, Dept. of Statistics, LSE
Background
• 2 EU-Funded Projects:
– (CE)2 : Computer Experiments for Concurrent Engineering (1997-2000)
– TITOSIM: Time to Market via Statistical Information Management (2001-2004)
What is Robustness?
• Many different definitions
• Many different areas:
– Biological
– Systems theory
– Software design
– Engineering design, reliability, …
• Quick Google web search: 176,000 entries
• 16 different definitions on one website!
Working definitions (Santa Fe Inst.)
• 1. Robustness is the persistence of specified system features in the face of a specified assembly of insults.
• 2. Robustness is the ability of a system to maintain function even with changes in internal structure or external environment.
• 3. Robustness is the ability of a system with a fixed structure to perform multiple functional tasks as needed in a changing environment.
• 4. Robustness is the degree to which a system or component can function correctly in the presence of invalid or conflicting inputs.
• 5. A model is robust if it is true under assumptions different from those used in construction of the model.
• 6. Robustness is the degree to which a system is insensitive to effects that are not considered in the design.
• 7. Robustness signifies insensitivity against small deviations in the assumptions.
• 8. Robust methods of estimation are methods that work well not only under ideal conditions, but also under conditions representing a departure from an assumed distribution or model.
• 9. Robust statistical procedures are designed to reduce the sensitivity of the parameter estimates to failures in the assumption of the model.
Continued…
• 10. Robustness is the ability of software to react appropriately to abnormal circumstances. Software may be correct without being robust.
• 11. Robustness of an analytical procedure is a measure of its ability to remain unaffected by small but deliberate variations in method parameters, and provides an indication of its reliability during normal usage.
• 12. Robustness is a design principle of natural, engineering, or social systems that have been designed or selected for stability.
• 13. The robustness of an initial step is determined by the fraction of acceptable options with which it is compatible out of total number of options.
• 14. A robust solution in an optimization problem is one that has the best performance under its worst case (max-min rule).
• 15. "..instead of a nominal system, we study a family of systems and we say that a certain property (e.g., performance or stability) is robustly satisfied if it is satisfied for all members of the family."
• 16. Robustness is a characteristic of systems with the ability to heal, self-repair, self-regulate, self-assemble, and/or self-replicate.
• 17. The robustness of language (recognition, parsing, etc.) is a measure of the ability of human speakers to communicate despite incomplete information, ambiguity, and the constant element of surprise.
Engineering design paradigms
• Example: Clifton Suspension Bridge
• Creative input vs. mathematical search
– Conceptual Design: creative solutions, e.g. arch, girder, truss or suspension bridge.
– Redesign: design improvement/optimisation, e.g. arrangement of structural elements.
– Routine Design: minor modification, e.g. geometry values for different sizes of structural elements.
A Framework for Redesign
• Define the “Design Space” X
• Write y = f(x), where x ∈ X is a design point and y its performance
• Parameterisation is important
Robustness in Engineering Design
• Based around the notion of “Design Space” and “Performance Space”
[Diagram: the design space (x1, x2) is mapped to the performance space (y1, y2) by design evaluation (modelling / prototyping)]
Adding Noise
• No noise
• Internal noise
• External noise
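The distinction can be sketched in code. This is a minimal, hypothetical illustration (the function `f` and all numbers are invented, not from the slides): internal noise perturbs the design variable itself, while external noise enters through a separate environmental variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical performance function of one design variable x
# and one environmental variable z (illustrative only).
def f(x, z=0.0):
    return (x - 1.0) ** 2 + 0.5 * z * x

x_nominal = 1.2

# No noise: a single deterministic evaluation.
y_no_noise = f(x_nominal)

# Internal noise: the design variable itself varies,
# e.g. a manufacturing tolerance around the nominal value.
x_samples = x_nominal + rng.normal(0.0, 0.05, size=1000)
y_internal = f(x_samples)

# External noise: the design is fixed but the environment varies.
z_samples = rng.normal(0.0, 0.1, size=1000)
y_external = f(x_nominal, z_samples)

print(y_no_noise, y_internal.std(), y_external.std())
```

In both noisy cases a single nominal design induces a whole distribution of responses, which is what the following slides propagate.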
Propagation of variation
• Monte Carlo
– Flexible
– Expensive
• Analytic
– Need to know the function
– Mathematically more complex
– (Usually) restricted to univariate distributions
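A small comparison of the two routes, under assumed inputs (the function and distribution are illustrative): Monte Carlo only needs the ability to evaluate the function, while one common analytic route, the first-order delta method, requires the derivative and a univariate input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Known univariate function and input distribution (illustrative).
def f(x):
    return np.exp(0.5 * x)

def fprime(x):
    return 0.5 * np.exp(0.5 * x)

mu, sigma = 1.0, 0.1

# Monte Carlo: flexible (any f, any input distribution) but needs many runs.
x = rng.normal(mu, sigma, size=200_000)
mc_var = f(x).var()

# Analytic first-order (delta method): cheap, but needs the derivative
# and is only accurate for small input variation.
analytic_var = (fprime(mu) * sigma) ** 2

print(mc_var, analytic_var)
```

For this smooth function and small input spread the two estimates agree closely; for strongly nonlinear functions or large variation the first-order analytic result degrades while Monte Carlo does not.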
Dual Response Methods
• Estimate both the mean μ and variance σ² of a response or key performance indicator (KPI)
• This leads to either:
1. a multi-objective problem, e.g. min(μ, σ²)
2. constrained optimisation, e.g. min(σ²) subject to t1 < μ < t2
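A sketch of the constrained formulation (option 2), with an invented response whose mean and spread both depend on the design variable x; a simple grid search stands in for a real optimiser.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical response: mean 5 + (x-2)^2, spread growing with |x|.
def response(x, n=2000):
    noise = rng.normal(0.0, 0.2 + 0.3 * abs(x), size=n)
    return 5.0 + (x - 2.0) ** 2 + noise

# Constrained dual-response problem: minimise sigma^2 subject to t1 < mu < t2.
t1, t2 = 5.0, 6.0
best_x, best_var = None, np.inf
for x in np.linspace(0.0, 4.0, 81):
    y = response(x)
    mu, var = y.mean(), y.var()
    if t1 < mu < t2 and var < best_var:
        best_x, best_var = x, var

print(best_x, best_var)
```

The search trades the two goals explicitly: designs whose mean falls outside the target band are discarded, and among the rest the least variable one wins.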
Stochastic Responses
• Output distribution type is unknown
• Possibilities:
– Estimate mean & variance (dual response)
– Select another criterion, e.g. % mass in a region
[Figure: response density divided into regions A, B and C, with percentage-mass labels 0%, 5%, 10% and 85%]
Stochastic Simulation (Monte Carlo)
[Diagram: a single evaluation maps a point in the design space (x1, x2) to a point in the performance space (y1, y2)]
Piston Simulator Example
A: Piston Weight (kg)
B: Piston Surface Area (m²)
C: Initial Gas Volume (m³)
D: Spring Coefficient (N/m)
E: Atmospheric Pressure (N/m²)
F: Ambient Temperature (K)
G: Gas Temperature (K)
Noise added to design factors
New bounds for search space
Experiment details
• All 7 design factors are subject to noise
• Minimise both the mean and standard deviation of the cycle-time response
• Evaluate 50 design points in a sub-region of the design space
• For each design point, compute the mean and standard deviation of cycle time from 50 replications
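The experiment can be sketched as a double loop, with an invented stand-in for the piston simulator (the real cycle-time model is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in for the piston simulator: cycle time as a
# function of a 7-dimensional design point x.
def cycle_time(x):
    return 0.5 + 0.1 * np.sum(np.sin(x))

n_designs, n_reps, noise_sd = 50, 50, 0.02

# 50 design points in a (unit-scaled) sub-region of the design space.
designs = rng.uniform(0.0, 1.0, size=(n_designs, 7))

means, stds = [], []
for x in designs:
    # 50 noisy replications per design point: noise on all 7 factors.
    reps = [cycle_time(x + rng.normal(0.0, noise_sd, size=7))
            for _ in range(n_reps)]
    means.append(np.mean(reps))
    stds.append(np.std(reps))

print(len(means), min(stds), max(means))
```

Each design point thus yields one (mean, standard deviation) pair, the raw material for the dual-response search on the following slides.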
Visualisation of search strategy
Design point: 50 replications
Search space: 50 design points
Searching for an improved design
Features of Stochastic Simulation
• Large number of runs required (17,500)
• No errors introduced by modelling
• Design improvement, but not optimisation
• Can accept any type of input noise (e.g. any distribution, multivariate)
• Can be applied to highly nonlinear problems
Statistical Modelling: Emulation
1) Perform a computer experiment on the simulator and replace it with an emulator…
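As a sketch of step 1, the following fits a minimal Gaussian-process-style emulator (RBF kernel, zero mean, plain numpy) to a small computer experiment on an invented one-dimensional "simulator"; a production DACE emulator would also estimate the correlation parameters and a regression trend rather than fixing them.

```python
import numpy as np

# Cheap stand-in for an expensive simulator (one input for clarity;
# the piston study uses seven).
def simulator(x):
    return np.sin(3.0 * x) + 0.5 * x

# 1) Small computer experiment on the simulator.
X = np.linspace(0.0, 2.0, 12)
y = simulator(X)

# 2) Fit a Gaussian-process-style emulator with a fixed RBF kernel.
def rbf(a, b, length=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

K = rbf(X, X) + 1e-6 * np.eye(len(X))   # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulate(x_new):
    # Kernel prediction: weighted combination of training responses.
    return rbf(np.atleast_1d(x_new), X) @ alpha

# The emulator closely reproduces the simulator and is cheap to evaluate.
x_test = np.linspace(0.0, 2.0, 101)
err = np.max(np.abs(emulate(x_test) - simulator(x_test)))
print(err)
```

Once fitted from a handful of expensive runs, `emulate` can be called thousands of times at negligible cost, which is what makes the Monte Carlo steps on the next slides affordable.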
Experimentation using the Emulator
2) Perform a 2nd experiment on emulator and estimate output distribution using Monte Carlo
[Diagram: design factors and noise factors feed the emulator, which produces a response; histograms over the range −1 to 1 illustrate the internal and external noise distributions]
Stochastic Emulation
3) Build 2nd stochastic emulator to estimate stochastic response…
[Diagram: control factors and noise factors feed the emulator; Monte Carlo over the noise (internal or external, shown as histograms on the range −1 to 1) yields a stochastic response, to which the stochastic emulator is fitted]
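A compact sketch of the two-level idea under invented functions: Monte Carlo over the noise factors is run on the cheap first-level emulator at each control setting, and second-level "stochastic emulators" (here simple quadratic fits, standing in for further DACE models) are then fitted to the resulting mean and standard deviation.

```python
import numpy as np

rng = np.random.default_rng(5)

# First-level emulator stand-in: a cheap surrogate already fitted to the
# simulator (here just an analytic function of control x and noise z).
def emulator(x, z):
    return (x - 1.0) ** 2 + (0.5 + 0.4 * x) * z

controls = np.linspace(0.0, 2.0, 21)
mu, sigma = [], []
for x in controls:
    # Monte Carlo over the noise factor, run on the cheap emulator.
    z = rng.normal(0.0, 1.0, size=5000)
    y = emulator(x, z)
    mu.append(y.mean())
    sigma.append(y.std())

# Second-level ("stochastic") emulators: quadratic fits to mu(x), sigma(x).
mu_coef = np.polyfit(controls, mu, 2)
sigma_coef = np.polyfit(controls, sigma, 2)

mu_hat = np.polyval(mu_coef, 1.5)       # predicted mean response at x = 1.5
sigma_hat = np.polyval(sigma_coef, 1.5)
print(mu_hat, sigma_hat)
```

The pay-off is that robustness criteria such as min(μ, σ²) can now be evaluated from the second-level models alone, with no further simulator or even emulator runs.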
Piston Simulator Example
• Initial experiment: 64-run LHS (Latin hypercube sampling) design
• DACE emulator of cycle time fitted
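A Latin hypercube design like the 64-run plan can be generated directly; this is the standard construction (each factor's range is split into 64 strata, each used exactly once), not the specific design of the study.

```python
import numpy as np

rng = np.random.default_rng(6)

def latin_hypercube(n, d, rng):
    """n-run Latin hypercube in [0, 1]^d: each factor's range is cut
    into n equal strata and each stratum is sampled exactly once."""
    # One point per stratum, jittered within the stratum.
    cut = (np.arange(n)[:, None] + rng.random((n, d))) / n
    # Shuffle each column independently to decouple the factors.
    for j in range(d):
        cut[:, j] = rng.permutation(cut[:, j])
    return cut

# 64-run LHS over the 7 piston factors (unit-scaled here).
design = latin_hypercube(64, 7, rng)
print(design.shape)
```

The stratification guarantees good one-dimensional coverage of every factor, which is why LHS is a common choice for the initial computer experiment behind an emulator.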
Stochastic Emulators (μ and σ)
Pareto-optimal design points
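Extracting the Pareto-optimal points from a cloud of (mean, standard deviation) pairs, both to be minimised, can be done with a direct dominance check; the data here are random stand-ins for real design evaluations.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical (mean, std) pairs for 200 candidate designs,
# both objectives to be minimised.
points = rng.random((200, 2))

def pareto_front(pts):
    """Indices of non-dominated points: no other point is <= in both
    objectives and strictly < in at least one."""
    idx = []
    for i, p in enumerate(pts):
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            idx.append(i)
    return np.array(idx)

front = pareto_front(points)
print(len(front))
```

The resulting front is exactly the trade-off curve plotted on slides like this one: every point on it improves one of μ and σ only at the expense of the other.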
Satellite simulation data
• Historical data set
• 999 simulation runs
• Two responses: LOS and T
• Data split into two sets of 96 and 903 points, for modelling and prediction respectively
• Stochastic emulators built with reasonable accuracy
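The modelling/prediction split can be illustrated as follows; the satellite data are not reproduced, so a synthetic 999-run data set and a simple linear emulator stand in for the real responses and the DACE models.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical stand-in for the 999 historical simulation runs.
X = rng.uniform(-1.0, 1.0, size=(999, 6))
true_coef = np.array([1.0, -0.5, 0.3, 0.0, 0.2, -0.1])
y = X @ true_coef + 0.05 * rng.normal(size=999)

# Split as in the study: 96 runs for modelling, 903 held out for prediction.
X_train, X_test = X[:96], X[96:]
y_train, y_test = y[:96], y[96:]

# Fit a simple linear emulator on the 96 modelling runs.
coef, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# Assess prediction accuracy on the 903 held-out runs.
rmse = np.sqrt(np.mean((X_test @ coef - y_test) ** 2))
print(rmse)
```

Holding out the larger set for prediction, as the study does, gives a demanding test of the emulator: accuracy is judged almost entirely on runs it never saw.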
Response “LOS” vs. Factor 6
DACE emulator models
DACE Emulator Prediction
Satellite Study: Pareto Front
Conclusions
• Need flexible methods to describe robustness in design
• Simulations are expensive and therefore experiments need to be carefully designed
• Stochastic Simulation can provide design improvement which may be useful in certain situations
(more specific) Conclusions…
• The two-level emulator approach provides a flexible way of achieving robust designs
• Reduced number of simulations
• Stochastic emulators can be used to estimate any feature of a response distribution
• The method needs to be tested on more complex examples
• Use of simulator gradient information may help when fitting emulators