
Uncertainty Propagation: Introduction

Laura Swiler
Sensitivity Analysis and Uncertainty Quantification
UNM Course 579, Spring 2010
Contact: Laura Swiler, lpswiler@msn.com
Lecture Outline
Uncertainty Propagation:
Output Y is a random variable
Quantities of Interest
Uncertainty Propagation: Methods
Sampling
Reliability Methods
Stochastic Expansion Methods
Response Surface (Meta-model) Methods
Differential Analysis
Involve generation and exploration of mapping from
analysis inputs to analysis results
Analysis input: x = [x_1, x_2, ..., x_nX]
Analysis results: y(x) = [y_1(x), y_2(x), ..., y_nY(x)]
Two Questions
What is the uncertainty in y(x) given the uncertainty in x?
How important are the individual elements of x with
respect to the uncertainty in y(x)?
Sampling-Based Methods for Uncertainty
and Sensitivity Analysis
Uncertainty in Analysis Input
Uncertainty in y derives from uncertainty in x
Assumption: Appropriate value for y obtained if
appropriate value for x used
Problem: Impossible to specify appropriate value for x
unambiguously
Many possible values of x of varying levels of plausibility
Uncertainty with respect to x
Characterized by distributions D_1, D_2, ..., D_nX assigned to
elements x_1, x_2, ..., x_nX of x
Propagation of Uncertainty
Generate sample: x_k, k = 1, 2, ..., nS
Evaluate y: y(x_k), k = 1, 2, ..., nS
Resultant mapping: [x_k, y(x_k)], k = 1, 2, ..., nS
Mapping forms basis for
Uncertainty analysis (distribution functions, output
moments, box plots)
Sensitivity analysis (scatterplots, regression analysis,
variance-based decomposition, ...)
Components of Sampling-Based
Uncertainty/Sensitivity Analysis
Characterization of uncertainty in x (i.e., definition of D_1, D_2, ..., D_nX)
Generation of sample from x (i.e., generation of x_k,
k = 1, 2, ..., nS, consistent with D_1, D_2, ..., D_nX)
Propagation of sample through analysis (i.e., generation of
mapping [x_k, y(x_k)], k = 1, 2, ..., nS)
Presentation of uncertainty analysis results (i.e., approximations
to the distributions of the elements of y obtained from y(x_k),
k = 1, 2, ..., nS)
Determination of sensitivity analysis results (i.e., exploration of
the mapping [x_k, y(x_k)], k = 1, 2, ..., nS)
Sampling Procedures: Random Sampling
Random sample: x_k = [x_1k, x_2k, ..., x_nX,k], k = 1, 2, ..., nR
Sample elements (i.e., the x_k's) from different regions of the sample
space occur in direct relationship to the probability of these
regions
Each sample element is selected independently of all other
sample elements
Random Sampling
Assume certain distributions on the uncertain input values, sample from
those distributions, run the model with the sampled values, and do this
repeatedly to build up a distribution of the outputs.
[Figure: N samples of X, drawn from the input distributions, are run
through the simulation model to give N realizations of Y and output
distributions]

Random Sampling
Sampling is not the most efficient UQ method, but it is easy to
implement and is transparent in terms of tracing sample realizations
through multiple codes for complex UQ studies.
[Figure: N realizations of X, plus additional inputs for Simulations 2
and 3, are propagated through a chain of simulation models (Model 1,
Model 2, Model 3) to give output distributions for multiple measures]
EXAMPLE 1
Cantilever Beam Description
[Figure: cantilever beam of length L with end load P]
Goal: understand how the deflection of the beam varies with
respect to the length, width, and height of the beam as well as
the applied load and elastic modulus of the beam

Variable  Description                           Nominal Value
L         Length                                1 m
W         Width                                 1 cm
H         Height                                2 cm
I         Area Moment of Inertia                (1/12) W H^3
P         Load                                  100 N
E         Elastic Modulus of Aluminum 6061-T6   69 GPa
Sampling Results
[Figure: empirical CDF of displacement (cm), over roughly 0-25 cm]

Variable  Distribution  Distribution Parameters
L         Normal        Mean = 1 m, Std. Dev. = 0.01 m
W         Fixed         1 cm
H         Fixed         2 cm
P         Normal        Mean = 100 N, Std. Dev. = 5 N
E         Normal        Mean = 69 GPa, Std. Dev. = 13.8 GPa
Sampling Results
[Figure: scatterplots of displacement (cm) versus load (N) and versus
modulus of elasticity (Pa)]
Example: Cantilever Beam
Sensitivity Analysis with Gradients
L = Length = 1 m
Width = 1 cm, Height = 2 cm
P = load = 100 N
Material = Aluminum 6061-T6:
E = Elastic Modulus = 69 GPa
Deflection: d = PL^3 / (3EI)
Sensitivity Analysis of deflection (d) vs. P, L, and E

Scaled Sensitivity Coefficients x * (dd/dx):
P * (dd/dP) = 0.0724
L * (dd/dL) = 0.217
E * (dd/dE) = -0.0724
Notes:
1. Gradients typically computed via finite
difference estimates.
2. Be wary of extrapolating trends.
3. No interaction data from this approach,
but still useful.
4. For a follow-on UQ study, maybe I'd
freeze P and E at nominal values, and focus
resources on studying uncertainty in L.
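The scaled coefficients above can be reproduced numerically. A minimal sketch (not from the slides; it uses the beam model and nominal values of Example 1 in SI units, with central finite differences as Note 1 suggests):

```python
def deflection(P, L, E, W=0.01, H=0.02):
    """Tip deflection of a cantilever beam: d = P*L^3 / (3*E*I)."""
    I = W * H**3 / 12.0          # area moment of inertia, (1/12) W H^3
    return P * L**3 / (3.0 * E * I)

# Nominal values from the slide (SI units: N, m, Pa)
nom = {"P": 100.0, "L": 1.0, "E": 69e9}

def scaled_sensitivity(var, h=1e-6):
    """Scaled coefficient x * (dd/dx) via a central finite difference."""
    x = nom[var]
    hi = dict(nom); hi[var] = x * (1 + h)
    lo = dict(nom); lo[var] = x * (1 - h)
    dd_dx = (deflection(**hi) - deflection(**lo)) / (2 * x * h)
    return x * dd_dx

for v in ("P", "L", "E"):
    # slide reports 0.0724, 0.217, -0.0724
    print(v, round(scaled_sensitivity(v), 4))
```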
Example: Cantilever Beam
Sensitivity Analysis with DAKOTA
L = Length = 1 m
Width = 1 cm, Height = 2 cm
P = load = 100 N
Material = Aluminum 6061-T6:
E = Elastic Modulus = 69 GPa
Deflection: d = PL^3 / (3EI)
Sensitivity Analysis of deflection (d) vs. P,
L, and E via random sampling over +/- 5%
bounds around nominal values.
Correlation Analysis Method
1. Generated 20 random samples of L, P, E
within +/-5% bounds.
2. Compute deflection for each random sample.
3. Look at partial correlation results generated
by DAKOTA software.
4. Result: L most important parameter, but all
have about equal impact.
Partial Correlation Table
            Load     Length   Modulus  Deflection
Load        .       -0.1177  -0.0753    0.2624
Length     -0.1177   .        0.2146    0.3251
Modulus    -0.0753   0.2146   .        -0.3088
Deflection  0.2624   0.3251  -0.3088    .
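Partial correlations like those in the table can be computed by correlating residuals after regressing out the remaining inputs. A sketch (not DAKOTA; a hypothetical numpy reimplementation, reusing the beam model with 20 random samples over +/-5% bounds as in the slide):

```python
import numpy as np

rng = np.random.default_rng(0)

def deflection(P, L, E, W=0.01, H=0.02):
    I = W * H**3 / 12.0
    return P * L**3 / (3.0 * E * I)

# 20 random samples within +/-5% of nominal, mirroring the slide's study
n = 20
P = rng.uniform(95.0, 105.0, n)
L = rng.uniform(0.95, 1.05, n)
E = rng.uniform(0.95 * 69e9, 1.05 * 69e9, n)
d = deflection(P, L, E)

def partial_corr(x, y, others):
    """Correlation of x and y after regressing out the other inputs."""
    Z = np.column_stack([np.ones(len(x))] + others)
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

for name, x, others in [("Load", P, [L, E]), ("Length", L, [P, E]),
                        ("Modulus", E, [P, L])]:
    print(name, round(partial_corr(x, d, others), 3))
```

Because the model is nearly linear over such small ranges, the partial correlations come out close to +1 for load and length and close to -1 for modulus; the exact values differ from the slide's (different random samples).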
Analytic Reliability Results
[Figure: CDF (0 to 1) of displacement (cm) from the analytic reliability
method, over roughly 0-16 cm]
Polynomial Chaos Results
[Figure: CDF (0 to 1) of displacement (cm) labeled "PCE - Sampling",
over roughly 0-14 cm]
Epistemic Uncertainty Results
[Figure: cumulative belief and plausibility functions (CBF/CPF) of
displacement (m), showing belief and plausibility curves for Examples
1 and 2]
Second-order Probability

Variable  Epistemic Mean     Distribution
L         [0.98, 1.02] m     Normal(epistemic mean, 0.01) m
P         [90, 110] N        Normal(epistemic mean, 5) N
E         [41.4, 96.6] GPa   Normal(epistemic mean, 13.8) GPa

[Figure: family of 20 CDFs of displacement (cm), with lower and upper
bounds on the CDF enveloping the individual traces]
Overall Comparison
[Figure: CDFs of displacement (cm) from sampling, reliability,
polynomial chaos, belief/plausibility, and second-order probability
(lower and upper bounds) plotted together]
Sensitivity Analysis (1/2)
Correlations (Raw, Partial, Rank)
Identifies monotonic relationships between input and output; graphical
data analysis (e.g., scatterplots) is also useful
Stepwise Regression Analysis
Identifies important variables to add to a regression model to explain the
greatest amount of variability in the output
Variance-based Decomposition
Identifies the fraction of the variability in the output that can be attributed
to an individual variable alone or with interaction effects
Use of meta-models
Perform stepwise analysis or VBD on the surrogate model, not the full
model (reduce computational cost)
Morris One-At-A-Time Sampling
Take large derivative steps (e.g., more than half the domain); average
over different starting points and trajectories to get a main-effects
type analysis
Orthogonal Arrays
What happens to the mean response when you change variable 1 from
setting A to setting B (averaged over the other variables)? A large change
indicates a significant main effect
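As a concrete illustration of variance-based decomposition, the main-effect index S_i = Var(E[Y|X_i]) / Var(Y) can be estimated by binning on X_i. A sketch using a hypothetical two-input test model (not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical test model: Y depends strongly on x1, weakly on x2
def model(x1, x2):
    return x1**2 + 0.1 * x2

n = 100_000
x1 = rng.uniform(-1, 1, n)
x2 = rng.uniform(-1, 1, n)
y = model(x1, x2)

def main_effect(x, y, bins=50):
    """S_i = Var(E[Y | X_i]) / Var(Y), estimated by binning on X_i."""
    idx = np.digitize(x, np.linspace(x.min(), x.max(), bins))
    labels = np.unique(idx)
    cond_means = np.array([y[idx == b].mean() for b in labels])
    counts = np.array([(idx == b).sum() for b in labels])
    var_cond = np.average((cond_means - y.mean())**2, weights=counts)
    return var_cond / y.var()

print(round(main_effect(x1, y), 2), round(main_effect(x2, y), 2))
```

For this model nearly all of the output variance is attributable to x1 alone, which the estimated indices reflect.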
Sensitivity Analysis (2/2)

Stepwise Regression
Step          1        2        3
Constant      15.894   8.656   -11.433
E_scaled      -1.205   -1.205  -1.206
  T-Value     -72.16   -83.2   -87.73
  P-Value     0        0       0
P                      0.0724  0.0725
  T-Value              18.14   19.14
  P-Value              0       0
L                              20.1
  T-Value                      10.59
  P-Value                      0
R-Sq          83.91    87.91   89.13

Simple Correlation Coefficients
       L      P      E      Disp
L      1.00
P      0.00   1.00
E      0.00   0.00   1.00
Disp   0.11   0.20  -0.92   1.00
Uncertainty Propagation: Output Measures
Cumulative Distribution Function
Complementary Cumulative Distribution
Function
Moments: Mean, Variance
Percentiles: Median, 5th and 95th percentiles
Uncertainty Analysis: Scalar Results
Single scalar result: y_k = y(x_k), k = 1, 2, ..., nS
Mean and variance:

  E[y] ~ (1/nS) * sum_{k=1}^{nS} y_k
  V[y] ~ (1/(nS-1)) * sum_{k=1}^{nS} (y_k - E[y])^2
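The sample estimators above can be applied directly to the cantilever-beam study. A sketch (the distribution parameters come from the earlier sampling-results table; numpy and the sample size are choices made here, not part of the slides):

```python
import numpy as np

rng = np.random.default_rng(42)

def deflection(P, L, E, W=0.01, H=0.02):
    """Beam model used as a stand-in simulation (SI units)."""
    I = W * H**3 / 12.0
    return P * L**3 / (3.0 * E * I)

# nS model runs with inputs drawn from the slide's distributions
nS = 10_000
y = deflection(P=rng.normal(100.0, 5.0, nS),
               L=rng.normal(1.0, 0.01, nS),
               E=rng.normal(69e9, 13.8e9, nS))

mean = y.sum() / nS                       # E[y]
var = ((y - mean)**2).sum() / (nS - 1)    # V[y], unbiased estimator
print(mean, var)
```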
Uncertainty Analysis: Scalar Results (cont)
Distribution function
Example Analysis: CDFs and CCDFs
Example Analysis: CDF and Density
Function
Example Analysis: Box Plots
Uncertainty Analysis: Functions
Analysis outcomes are often functions of one or more variables
Uncertain analysis inputs result in many possible values for such
functions
Example analysis: pressure as a function of time
Example Analysis: Pressure at Time t
Example Analysis: Replicated Samples and
Stability
Detour
Quick Overview on OTHER Uncertainty
Propagation methods
Differential Analysis
Calculate the derivative of the output with respect to all
of the inputs
ONLY gives local information
Need to exercise caution if you are taking the derivatives
at the mean of the inputs (for example)
Can be expensive if done repeatedly throughout the
space, using central finite-differencing
  dy/dx_i
Response Surface Methods
Take an initial set of samples over the uncertain variables x,
then construct a response surface or meta-model from those
samples
Regression (e.g. linear, quadratic, rank)
MARS (splines)
Neural networks
Radial Basis Functions
Nonparametric regression
More advanced surrogates
Can be very useful in understanding sensitivities
Need to exercise caution if you sample the surrogate
extensively to perform UQ (inaccuracies, especially at the
boundaries)
Response Surface (Surrogate) Methods
Generate ~ 10D samples
of computer simulation
Create Surrogate Models:
Regression, Neural nets,
Splines (MARS)
Sample Surrogate
to obtain
CDF on output
Use Surrogate in
Sensitivity Analysis
UQ analyses often require thousands of runs, which is not feasible
with an expensive simulation; this motivates surrogates
You need to understand something about the goodness of your
surrogate. There are a variety of diagnostic metrics to help (R^2, mean
absolute error, sum-squared error, cross-validation metrics, etc.)
Often the surrogate is less accurate at bounds or endpoints: use
caution
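A minimal version of this workflow, with a hypothetical one-dimensional "expensive" model and a polynomial surrogate (the slides do not prescribe a surrogate type; this just illustrates the fit, diagnose, then sample pattern):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical expensive model (one input for illustration)
def model(x):
    return np.sin(3 * x) + x**2

# Step 1: small design of ~10 training runs of the "expensive" code
x_train = np.linspace(0.0, 1.0, 10)
y_train = model(x_train)

# Step 2: cheap surrogate - here a cubic polynomial regression
coef = np.polyfit(x_train, y_train, 3)
surrogate = np.poly1d(coef)

# Diagnostics: check goodness of fit before trusting the surrogate
ss_res = np.sum((y_train - surrogate(x_train))**2)
ss_tot = np.sum((y_train - y_train.mean())**2)
r2 = 1.0 - ss_res / ss_tot
print("R^2 =", round(r2, 4))

# Step 3: sample the cheap surrogate many times to build an output CDF
x_new = rng.uniform(0.0, 1.0, 100_000)
y_new = np.sort(surrogate(x_new))
cdf = np.arange(1, len(y_new) + 1) / len(y_new)
```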
Reliability Analysis
Assume that the probability of failure is based on a specific
performance criterion which is a function of random variables, denoted X_i.
The performance function is described by Z:
  Z = g(X_1, X_2, X_3, ..., X_n)
The failure surface or limit state is defined as Z = 0. It is a boundary
between safe and unsafe regions in the parameter space.

  p_f = P(failure) = P(Z < 0)
      = int ... int_{g(x) < 0} f(x_1, x_2, ..., x_n) dx_1 dx_2 ... dx_n
Reliability Analysis
Note that the failure integral has the joint probability density function,
f, for the random variables, and the integration is performed over the
failure region
If the variables are independent, we can replace this with the product
of the individual density functions
In general, this is a multi-dimensional integral and is difficult to
evaluate.
People use approximations. If the limit state is a linear function of the
inputs (or is approximated by one), first-order reliability methods
(FORM) are used.
If the nonlinear limit state is approximated by a second-order
representation, second-order reliability methods (SORM) are used.
  p_f = int ... int_{g(x) < 0} f(x_1, x_2, ..., x_n) dx_1 dx_2 ... dx_n
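The failure integral is exactly what plain Monte Carlo estimates: draw inputs from f, evaluate g, and count the fraction with Z < 0. A sketch with a hypothetical linear limit state and independent standard normal inputs, chosen so the answer can be checked analytically (Z ~ N(3, sqrt(2)), so p_f = Phi(-3/sqrt(2)) ~ 0.017):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical linear limit state: Z = g(X1, X2) = 3 - X1 - X2
n = 1_000_000
X1 = rng.normal(0.0, 1.0, n)
X2 = rng.normal(0.0, 1.0, n)
Z = 3.0 - X1 - X2

# Monte Carlo estimate of the failure integral: fraction with Z < 0
p_f = np.mean(Z < 0)
print(p_f)
```

For rare failures this brute-force approach needs very many samples, which is why the FORM/SORM approximations above are attractive.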
Polynomial Chaos Expansions (PCE)
Represent a stochastic process (the uncertain output f(X)) as a
spectral expansion in terms of suitable orthonormal eigenfunctions
with weights associated with a particular density
The uncertain output f(X) is approximated by a finite-dimensional
series based on unit Gaussian distributions
In the expansion, the H_j terms are Hermite polynomials (multi-
dimensional orthogonal polynomials), the xi are standard normal
random variables, and the coefficients a_j are deterministic but
unknown.
The job of PCE is to determine the coefficients a_j. Then one has an
approximation that can be sampled many times to calculate desired
statistics:

  f(X) ~ R = sum_{j=0}^{P} a_j H_j(xi)
Polynomial Chaos Expansions (PCE)
Conceptually, the propagation of input uncertainty through a model using PCE in a
non-intrusive approach consists of the following steps:
(1) Transform input uncertainties X to unit Gaussian random variables xi
(2) Assume a particular form for the orthogonal polynomials, such as Hermite
(3) Generate many samples of X and xi. These generate a set of linear
equations to solve for the spectral expansion coefficients:

  f(X_i) ~ R_i = sum_{j=0}^{P} a_j H_j(xi_i),  for i = 1, ..., N samples

(4) Once the coefficients a_j are determined, take 1000s of samples of xi and run
them through the spectral expansion equation to obtain an approximation for
f(X), building up a CDF of f(X)
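Steps (1)-(4) can be sketched in one dimension with probabilists' Hermite polynomials; the model f, the expansion order P, and the sample counts below are all hypothetical choices, not from the slides:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(5)

# Hypothetical model, written directly in terms of xi (steps 1-2 done):
# X = exp(0.3*xi) is lognormal when xi is standard normal
def f_of_xi(xi):
    return np.exp(0.3 * xi)

# Step 3: sample xi, evaluate the model, solve least squares for a_j
P = 4                               # expansion order
xi = rng.normal(0.0, 1.0, 200)
H = hermevander(xi, P)              # columns He_0(xi), ..., He_P(xi)
a, *_ = np.linalg.lstsq(H, f_of_xi(xi), rcond=None)

# Step 4: sample the cheap expansion many times to estimate statistics
xi_big = rng.normal(0.0, 1.0, 100_000)
approx = hermevander(xi_big, P) @ a
print(round(approx.mean(), 3))      # exact mean is exp(0.045) ~ 1.046
```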
Epistemic UQ
Second-order probability
Two levels: distributions/intervals on
distribution parameters
Outer level can be epistemic (e.g., interval)
Inner level can be aleatory (probability distributions)
Strong regulatory history (NRC, WIPP).
Dempster-Shafer theory of evidence
Basic probability assignment (interval-based)
Solve opt. problems (currently sampling-based)
to compute belief/plausibility for output intervals
Second-order Probability
[Figure: 50 empirical CDF traces of the response metric from 50 outer-loop
(epistemic) samples, each trace built from 100 inner-loop (aleatory)
samples run through the simulation]
For each outer loop sample of epistemic (interval) variables, run
an inner loop UQ study over aleatory (probability) variables
Envelope of CDF traces represents response epistemic uncertainty
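The nested-loop structure can be sketched directly, reusing the beam model and the epistemic intervals from the second-order probability table (loop sizes 50/100 as in the figure; everything else is an implementation choice):

```python
import numpy as np

rng = np.random.default_rng(11)

def deflection(P, L, E, W=0.01, H=0.02):
    I = W * H**3 / 12.0
    return P * L**3 / (3.0 * E * I)

n_outer, n_inner = 50, 100
cdfs = []
# Outer loop: epistemic interval variables (means of the aleatory distributions)
for _ in range(n_outer):
    mu_L = rng.uniform(0.98, 1.02)          # intervals from the slide's table
    mu_P = rng.uniform(90.0, 110.0)
    mu_E = rng.uniform(41.4e9, 96.6e9)
    # Inner loop: aleatory sampling conditional on the epistemic draw
    y = deflection(P=rng.normal(mu_P, 5.0, n_inner),
                   L=rng.normal(mu_L, 0.01, n_inner),
                   E=rng.normal(mu_E, 13.8e9, n_inner))
    cdfs.append(np.sort(y))                 # one empirical CDF trace

cdfs = np.array(cdfs)                       # 50 traces of 100 points each
lower = cdfs.min(axis=0)                    # crude envelope of the traces,
upper = cdfs.max(axis=0)                    # taken pointwise per quantile
```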
Dempster-Shafer Evidence Theory
Concept of Dempster-Shafer belief structures:
For each uncertain input variable, one specifies a basic probability
assignment (BPA) for each potential interval where this variable may exist.
Intervals may be contiguous, overlapping, or have gaps
Belief is a lower bound on the probability that is consistent with the
evidence
Plausibility is the upper bound on the probability that is consistent
with the evidence
[Figure: Variable 1 and Variable 2 each assigned three intervals with
BPA = 0.5, 0.2, and 0.3]
Epistemic Uncertainty Quantification
Implementation: Look at various combinations of intervals.
In each joint interval box, one needs to find the maximum and
minimum value in that box
Can use sampling or optimization (local or global)
Computationally expensive
[Figure: joint interval cells for Variable 1 (BPAs 0.5, 0.3, 0.2) and
Variable 2 (BPAs 0.1, 0.2, 0.7)]
Original LHS samples are used to generate a surrogate; a million sample
points generated from the surrogate are used to determine the max and
min in each cell to calculate plausibility and belief
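For a monotone model the max and min in each joint cell occur at the corners, which makes the belief/plausibility bookkeeping easy to sketch. The model y = x1 + x2, the intervals, and the output threshold below are hypothetical; only the BPA values are taken from the figure:

```python
from itertools import product

# (interval, BPA) structures; BPAs match the figure's cell masses
cells_x1 = [((0.0, 1.0), 0.5), ((1.0, 2.0), 0.3), ((2.0, 3.0), 0.2)]
cells_x2 = [((0.0, 1.0), 0.1), ((1.0, 2.0), 0.2), ((2.0, 3.0), 0.7)]

def y(x1, x2):
    return x1 + x2            # hypothetical monotone model

threshold = 3.0               # output event of interest: {y <= 3}
belief = 0.0
plaus = 0.0
for ((a1, b1), m1), ((a2, b2), m2) in product(cells_x1, cells_x2):
    # Monotone model: cell extremes at the corners; in general one
    # would use sampling or optimization within each joint cell
    corners = [y(u, v) for u in (a1, b1) for v in (a2, b2)]
    y_min, y_max = min(corners), max(corners)
    m = m1 * m2               # joint BPA of this cell
    if y_max <= threshold:    # cell lies entirely inside the event
        belief += m
    if y_min <= threshold:    # cell intersects the event
        plaus += m

print(round(belief, 3), round(plaus, 3))
```

Belief sums the mass of cells that must satisfy the event; plausibility adds every cell that possibly can, so belief is always the lower bound and plausibility the upper.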
Software tools
DAKOTA
Minitab statistics package
JMP statistics package
Mathematica
Matlab with Statistics Toolbox
R or S+ language
Simlab
Excel add-ins (@Risk, Crystal Ball)
Others
Additional Information
J.C. Helton and F.J. Davis, "Sampling-based methods," in Sensitivity Analysis, A. Saltelli,
C. Chan, and E.M. Scott, Eds. New York: Wiley, 2000, pp. 101-153.
J.C. Helton and F.J. Davis, "Latin hypercube sampling and the propagation of
uncertainty in analyses of complex systems," Reliability Engineering and System Safety,
Vol. 81, pp. 23-69, 2003.
J.C. Helton, J.D. Johnson, C.J. Sallaberry, and C.B. Storlie, "Survey of sampling-based
methods for uncertainty and sensitivity analysis," Reliability Engineering & System
Safety, Vol. 91, Issues 10-11, October-November 2006, pp. 1175-1209.
L.P. Swiler and A.A. Giunta, "Aleatory and Epistemic Uncertainty Quantification for
Engineering Applications," Sandia Technical Report SAND2007-2670C; also in 2007
Proceedings of the Joint Statistical Meetings (JSM).
Measurement Uncertainty (Experimental): the GUM report. B.N. Taylor and C.E. Kuyatt,
"Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement
Results," NIST Technical Note 1297, 1994 Edition,
http://physics.nist.gov/Pubs/guidelines/contents.html