
Incorporating Risk and Uncertainty into the Assessment of Impacts of Global Climate Change on Transportation Systems

H. Christopher Frey, Ph.D.
Professor
Department of Civil, Construction, and Environmental Engineering
North Carolina State University
Raleigh, NC 27695
Prepared for:
2nd Workshop on Impacts of Global Climate Change on Hydraulics and Hydrology and Transportation
Center for Transportation and the Environment
Washington, DC
March 29, 2006


Outline

Risk and uncertainty
Overview of impacts of climate change on transportation systems
Risk assessment methodologies
Uncertainty analysis methodologies
Qualitative assessments
Recommendations

2
Definitions

Risk: Probability and severity of an adverse outcome
Uncertainty: Lack of knowledge regarding the true value of a quantity

3
POSSIBLE IMPACTS OF GLOBAL CLIMATE CHANGE ON TRANSPORTATION SYSTEMS

All modes: highway, rail, air, shipping, pipeline, pedestrian
Passenger and freight
Possible climate impacts (natural processes):
Sea-level rise
Increased frequency and severity of storms
Higher average temperatures (location-specific)

4
Implications of Possible Climate Change (Effects Processes)

Loss of coastal land area
Damage to infrastructure from storms (e.g., winds, flooding)
Damage to infrastructure from temperature extremes (e.g., rail kinks, pavement damage)
Impeded operations and reduced safety
Effects on design, construction, operation, maintenance, repair, and decommissioning

5
METHODOLOGICAL FRAMEWORKS FOR DEALING WITH RISK

Vulnerability or hazard assessment
Exposure assessment
Effects processes
Quantification of risk
Risk management

6
Vulnerability Assessment

Physical, social, political, economic, cultural, and psychological harms to which individuals and modern societies are susceptible (Slovic, 2002).
Identify valuable targets at risk.
Conceptualize the various ways in which they are vulnerable to such hazards by defining various scenarios.
Clearly state the scale and scope of the analysis (e.g., the world, a country, or a specific region), recognizing that the risk assessment process becomes easier as the scope narrows.
Does not include assessment of the likelihood of such an event.
For example, coastal cities are vulnerable to the effects of sea level rise.

7
Paradigm for Human Health Risk Assessment (NRC, 1983)

[Figure: the NRC paradigm linking research (laboratory and field work, extrapolation methods, field measurements and modeling) to the four risk assessment steps (hazard identification, dose-response assessment, exposure assessment, risk characterization) and to risk management (regulatory options, evaluation of options, decisions and actions).]

8
An Alternative View of Human Health Risk Assessment (PCRARM, 1997)

[Figure: a cyclic framework in which problem/context, risks, options, decisions, actions, and evaluation surround ongoing stakeholder collaboration.]

9
Example of a General Risk Assessment Framework (Morgan)

[Figure: natural processes and human activities act on the natural and human environments through a chain of exposure processes (exposure of objects and processes in the natural and human environment to the possibility of change), effects processes (effects on those objects and processes), human perception processes (perceptions of exposures and of effects), and human evaluation processes (costs and benefits).]

10
Risk Analysis and Risk Management

Analysis should be free of policy-motivated assumptions.
Yet, analysis should include scenarios relevant to decision-making.
Some argue that analysts and decision makers should be kept apart to avoid biases in the analysis.
Others argue that they must interact in order to define the assessment objective.
A practical, useful analysis needs to balance both concerns.

11
Realities of Decision-Making

Decision-making regarding responses to the impacts of climate change will involve:
multiple parties;
a local context;
considerations beyond just the science and technology (such as equity, justice, culture, and others); and
implications for potentially large transfers of resources among different societal stakeholders.
Such decision-making may not produce an optimal outcome when viewed from a particular (e.g., national, analytical) perspective.

Based on Morgan (2003)
12
METHODOLOGICAL FRAMEWORKS FOR DEALING WITH UNCERTAINTY

Role of uncertainty in decision making
Scenarios
Models
Model inputs:
Empirically based
Expert judgment-based
Model outputs
Other quantitative approaches
Qualitative approaches

13
Uncertainty and Decision Making

How well do we know these numbers?
What is the precision of the estimates?
Is there a systematic error (bias) in the estimates?
Are the estimates based upon measurements, modeling, or expert judgment?
How significant are differences between two alternatives?
How significant are apparent trends over time?
How effective are proposed control or management strategies?
What is the key source of uncertainty in these numbers?
How can uncertainty be reduced?

14
Implications of Uncertainty in Decision Making

Risk preference:
Risk averse
Risk neutral
Risk seeking
Utility theory
Benefits of quantifying uncertainty: Expected Value of Including Uncertainty
Benefits of reducing uncertainty: Expected Value of Perfect Information (see the sketch below)

15
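To make the Expected Value of Perfect Information concrete, here is a minimal Python sketch. The two adaptation alternatives, the two climate states, and all payoff and probability values are hypothetical placeholders, not figures from this presentation.

```python
# Hypothetical sketch of Expected Value of Perfect Information (EVPI)
# for a two-option adaptation decision; all numbers are illustrative assumptions.
import numpy as np

# States of the world (e.g., low vs. high sea-level rise) and their probabilities
p_state = np.array([0.7, 0.3])

# Net benefit of each alternative under each state
# rows: alternatives (do nothing, reinforce infrastructure); columns: states
payoff = np.array([[0.0, -100.0],
                   [-20.0, -30.0]])

# Decision under uncertainty: pick the alternative with the best expected payoff
expected = payoff @ p_state
best_under_uncertainty = expected.max()

# Decision with perfect information: pick the best alternative in each state,
# then take the expectation over states
best_per_state = payoff.max(axis=0)
with_perfect_info = best_per_state @ p_state

evpi = with_perfect_info - best_under_uncertainty
print(f"Expected payoff under uncertainty: {best_under_uncertainty:.1f}")
print(f"Expected payoff with perfect information: {with_perfect_info:.1f}")
print(f"EVPI (value of resolving uncertainty): {evpi:.1f}")
```

A positive EVPI indicates how much, at most, it would be worth spending to reduce the uncertainty before committing to a decision.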
Framing the Problem: Objectives and Scenarios

Need a well-formulated study objective that is relevant to decision making.
A scenario is a set of structural assumptions about the situation to be analyzed:
spatial and temporal dimensions
specific hazards, exposures, and adverse outcomes
Typical errors: description, aggregation, expert judgment, incompleteness.
Failure to properly specify scenario(s) leads to bias in the analysis, even if all other elements are perfect.

16
Model Uncertainty

A model is a hypothesis regarding how a system works.
Ideally, the model should be tested by comparing its predictions with observations from the real world system, under specified conditions.
This is difficult for unique or future events.
In practice, validation is often incomplete.
Extrapolation.
Other factors: simplifications, aggregation, exclusion, structure, resolution, model boundaries, boundary conditions, and calibration.

17
Examples of Alternative Models

[Figure: plots of system response versus an explanatory variable for alternative model forms: sublinear, linear, superlinear, threshold, and state change.]

18
Model Uncertainty: Climate Change Impacts

Enumeration of a set of plausible or possible alternative models.
Comparison of their predictions, or development of a weighting scheme to combine the predictions of multiple models into one estimate (see the sketch following the diagram below).
It seems inappropriate to increase the complexity of the analysis in situations where less is known (Casman et al., 1999).

19
Model Uncertainty

[Figure: outputs of Model 1, Model 2, and Model 3 are combined using weights w1, w2, and w3 into a weighted combination of model outputs.]

20
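The diagram above can be expressed in a few lines of code. The following Python sketch combines the predictions of three alternative models with a weighting scheme; the model forms and the weights w1, w2, w3 are illustrative assumptions, not models from the presentation.

```python
# Minimal sketch of combining predictions from alternative models with a
# weighting scheme; the models and weights are hypothetical placeholders.
import numpy as np

def model_1(x): return 2.0 * x            # e.g., a linear response
def model_2(x): return 1.5 * x ** 1.2     # e.g., a superlinear response
def model_3(x): return 3.0 * np.sqrt(x)   # e.g., a sublinear response

weights = np.array([0.5, 0.3, 0.2])       # sum to 1; reflect relative credibility

def combined_prediction(x):
    # Weighted combination of the individual model outputs
    preds = np.array([model_1(x), model_2(x), model_3(x)])
    return weights @ preds

print(combined_prediction(4.0))
```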
The Role of Models When Structural Uncertainties are Large

Assessment of climate change impacts involves many component models.
Some are better than others, and they degrade at different rates as one goes farther into the future.
For problem areas in which there is little relevant data, theory, or experience, a simpler order-of-magnitude model may be adequate.
For problem areas in which little is known, very simple bounding analyses may be all that can be justified.
For poorly supported models, it is no longer possible to search for optimal decision strategies. Instead, one can attempt to find feasible or robust strategies.

21
Quantification of Uncertainty in Inputs and Outputs of Models

[Figure: input uncertainties are propagated through a model to produce uncertainty in the model output.]

22
Statistical Methods Based Upon Empirical Data

Frequentist, classical
Statistical inference from sample data
Parametric approaches:
Parameter estimation
Goodness-of-fit
Nonparametric approaches
Mixture distributions
Censored data
Dependencies, correlations, deconvolution
Time series, autocorrelation

23
Statistical Methods Based on Empirical Data

Need a random, representative sample.
Such a sample is not always available when predicting events into the future.

24
Example of an Empirical Data Set Regarding Variability

[Figure: empirical cumulative distribution function of a benzene emission factor (ton/yr/tank), plotted on a logarithmic axis from 0.001 to 1.]

25
Fitted Lognormal Distribution

[Figure: cumulative distribution function of a lognormal distribution fitted to the benzene emission factor data (ton/yr/tank).]

26
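As a hedged illustration of parametric distribution fitting, the Python sketch below fits a lognormal distribution to a sample and checks goodness-of-fit, analogous to the benzene emission factor example; the data here are synthetic placeholders rather than the original measurements.

```python
# Fit a lognormal distribution to an empirical sample and test goodness-of-fit.
# The sample is synthetic, generated only for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
data = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=30)  # synthetic sample

# Fit a two-parameter lognormal (location fixed at zero)
shape, loc, scale = stats.lognorm.fit(data, floc=0)

# Goodness-of-fit check: Kolmogorov-Smirnov test against the fitted distribution
ks_stat, p_value = stats.kstest(data, "lognorm", args=(shape, loc, scale))
print(f"Fitted geometric mean: {scale:.3f}, log-space std dev: {shape:.3f}")
print(f"K-S statistic: {ks_stat:.3f}, p-value: {p_value:.3f}")
```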
Bootstrap Simulation to Quantify Uncertainty

[Figure: the empirical data set and fitted distribution for the benzene emission factor (ton/yr/tank), with bootstrap confidence intervals on the cumulative distribution at the 50, 90, and 95 percent levels.]

27
Results of Bootstrap Simulation: Uncertainty in the Mean

[Figure: cumulative distribution of the uncertainty in the mean benzene emission factor (ton/yr/tank); mean = 0.06, with a 95% probability range of (0.016, 0.18).]
Uncertainty in the mean: -73% to +200%.

28
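The bootstrap procedure behind results like these can be sketched briefly in Python. The sample below is synthetic, so the resulting interval only illustrates the mechanics, not the numbers on this slide.

```python
# Illustrative bootstrap simulation for uncertainty in the mean;
# the sample is a synthetic placeholder, not the original data set.
import numpy as np

rng = np.random.default_rng(seed=2)
data = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=30)  # placeholder sample

n_boot = 5000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()  # resample with replacement
    for _ in range(n_boot)
])

lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"Sample mean: {data.mean():.3f}")
print(f"95% confidence interval for the mean: ({lower:.3f}, {upper:.3f})")
```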


Estimating Uncertainties Based on Expert Judgment

Probability can be used to quantify the state of knowledge (or ignorance) regarding a quantity.
Bayesian methods for statistical inference are based upon sample information (e.g., empirical data, when available) and a prior distribution.
A prior distribution is a quantitative statement of the degree of belief a person has that a particular outcome will occur.
Methods for eliciting subjective probability distributions are intended to produce estimates that accurately reflect the true state of knowledge and that are free of significant cognitive and motivational biases.
Useful when random, representative data, or models, are not available, but when there is some epistemic status upon which to base a judgment.

29
Heuristics and Possible Biases in Expert
Judgment
Heuristics and Biases
Availability
Anchoring and Adjustment
Representativeness
Others (e.g., Motivational, Expert, etc.)
Consider motivational bias when choosing experts
Deal with cognitive heuristics via an appropriate
elicitation protocol

30
An Example of an Elicitation Protocol: Stanford/SRI Protocol

Motivating (Establish Rapport)
Structuring (Identify Variables)
Conditioning (Get Expert to Think About Evidence)
Encoding (Quantify Judgment About Uncertainty)
Verifying (Test the Judgment)

31
Frequently Asked Questions Regarding
Expert Elicitation
How to choose the experts
How many experts are needed
Whether to perform elicitation individually or with
groups of experts
Elicitation of correlated uncertainties
What to do if experts disagree
Whether and how to combine judgments from
multiple experts
What resources are needed for expert elicitation
32
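One common, though not the only, way to combine judgments from multiple experts is to pool the elicited distributions as an equally weighted mixture. The sketch below assumes three hypothetical experts whose distributions are invented for illustration.

```python
# Hedged sketch: pool elicited distributions from several experts as an
# equally weighted mixture and sample from it; the distributions are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=3)

# Each expert's judgment encoded as a sampling function
expert_samplers = [
    lambda n: rng.normal(loc=10.0, scale=2.0, size=n),                 # expert A
    lambda n: rng.normal(loc=12.0, scale=3.0, size=n),                 # expert B
    lambda n: rng.triangular(left=6.0, mode=9.0, right=15.0, size=n),  # expert C
]

n_samples = 10_000
choices = rng.integers(len(expert_samplers), size=n_samples)  # equal weights
samples = np.concatenate([
    sampler(np.count_nonzero(choices == i))
    for i, sampler in enumerate(expert_samplers)
])

print(f"Pooled median: {np.median(samples):.2f}")
print(f"Pooled 90% interval: {np.percentile(samples, [5, 95])}")
```

Whether to pool at all, and with what weights, is itself one of the questions listed above and depends on the degree of disagreement among the experts.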
Propagating Uncertainties Through Models

Analytical solutions: exact but of limited applicability.
Approximate solutions: more broadly applicable, but increase in complexity or error as the model and inputs become more complex (e.g., Taylor series expansion; see the sketch below).
Numerical methods: flexible and popular (e.g., Monte Carlo simulation).

33
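As a sketch of the approximate (Taylor series) approach mentioned above, the following Python code propagates input standard deviations through an assumed model using first-order variance propagation; the model f and the input moments are placeholders, and the inputs are treated as independent.

```python
# Minimal sketch of first-order (Taylor series) uncertainty propagation for
# an assumed model z = f(x, y), with independent inputs (an assumption).
import numpy as np

def f(x, y):
    return x * y ** 2  # placeholder model

x_mean, x_sd = 2.0, 0.2
y_mean, y_sd = 3.0, 0.3

# Numerical partial derivatives at the means (central differences)
h = 1e-6
df_dx = (f(x_mean + h, y_mean) - f(x_mean - h, y_mean)) / (2 * h)
df_dy = (f(x_mean, y_mean + h) - f(x_mean, y_mean - h)) / (2 * h)

# First-order variance propagation: var(z) ~ (df/dx)^2 var(x) + (df/dy)^2 var(y)
z_var = df_dx ** 2 * x_sd ** 2 + df_dy ** 2 * y_sd ** 2
print(f"z at the means: {f(x_mean, y_mean):.2f}, approx std dev: {np.sqrt(z_var):.2f}")
```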
Monte Carlo Simulation and Similar Methods

[Figure: a probability density function f(x) and the corresponding cumulative distribution function F(x) = Pr(X ≤ x), together with the inverse cumulative distribution function F^-1(u).]

Monte Carlo simulation:
Generate a random number u ~ U(0,1).
Calculate x = F^-1(u) for each value of u.

Latin hypercube sampling:
Divide u into N equal intervals.
Select the median of each interval.
Calculate x = F^-1(u) for each interval.
Rank each sample based on U(0,1) (or a restricted pairing technique).

34
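The Monte Carlo and Latin hypercube steps listed above can be implemented directly from the inverse CDF. The sketch below assumes a lognormal input distribution purely for illustration, and a simple shuffle stands in for a restricted pairing technique.

```python
# Sketch of Monte Carlo and Latin hypercube sampling via the inverse CDF;
# the lognormal input distribution is an assumed example.
import numpy as np
from scipy import stats

dist = stats.lognorm(s=1.0, scale=0.05)  # assumed input distribution
n = 100
rng = np.random.default_rng(seed=4)

# Monte Carlo: random u ~ U(0,1), then x = F^-1(u)
u_mc = rng.uniform(size=n)
x_mc = dist.ppf(u_mc)

# Latin hypercube: one median value of u per equal-probability interval,
# then shuffle the values (a simple stand-in for restricted pairing)
u_lhs = (np.arange(n) + 0.5) / n
x_lhs = rng.permutation(dist.ppf(u_lhs))

# LHS typically covers the distribution more evenly for a given sample size
print(f"True mean: {dist.mean():.4f}, MC mean: {x_mc.mean():.4f}, LHS mean: {x_lhs.mean():.4f}")
```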
Sensitivity Analysis: Which Model Inputs Contribute Most to Uncertainty in Output?

Linearized sensitivity coefficients, e.g., s_y,a = ∂z/∂y and s_x,a = ∂z/∂x evaluated at a point (x_a, y_a), and similarly s_y,b and s_x,b at (x_b, y_b).
Statistical methods:
Correlation
Regression
Advanced methods
[Figure: example variance apportionment from Sobol's method: F_TR 25%, interactions 24%, BW 8%, WB 6%, AM 6%, DR 1%, and main effect of other inputs 30%.]

35
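A minimal statistical sensitivity analysis of the kind listed above can be run on Monte Carlo samples. In this sketch the model and the input distributions are assumed for illustration; Spearman rank correlations indicate which inputs contribute most to uncertainty in the output.

```python
# Hedged sketch of a statistical sensitivity analysis: rank correlations between
# Monte Carlo samples of the inputs and the model output.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=5)
n = 2000

# Assumed uncertain inputs
a = rng.normal(1.0, 0.1, size=n)
b = rng.lognormal(0.0, 0.5, size=n)
c = rng.uniform(0.5, 1.5, size=n)

z = a * b ** 2 + 0.1 * c  # placeholder model

for name, x in [("a", a), ("b", b), ("c", c)]:
    rho, _ = stats.spearmanr(x, z)
    print(f"Spearman rank correlation of {name} with output: {rho:+.2f}")
```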
Other Quantitative Methods

Interval methods: provide bounds, but are not very informative.
Fuzzy sets: represent vagueness, rather than uncertainty.

36
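To show why interval methods yield bounds and little more, here is a minimal sketch under the assumption that a hypothetical model is monotonically increasing in each input, so the output bounds come directly from the corner values of the input intervals.

```python
# Simple interval-method sketch: propagate input bounds through a monotonic
# model to obtain output bounds; the model and intervals are assumed.
def travel_cost(delay_hours, value_of_time):
    return delay_hours * value_of_time  # placeholder model, increasing in both inputs

delay_bounds = (0.5, 4.0)        # hours
value_bounds = (10.0, 40.0)      # dollars per hour

# For a model that increases in each input, bounds come from the input corners
lower = travel_cost(delay_bounds[0], value_bounds[0])
upper = travel_cost(delay_bounds[1], value_bounds[1])
print(f"Output bounds: [{lower:.1f}, {upper:.1f}] dollars")
```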
Qualitative Methods

Principles of Rationality
Lines of Reasoning
Weight of Evidence

37
Principles of Rationality

Conceptual clarity: well-defined terminology
Logical consistency: inferences should follow from assumptions and data
Ontological realism: free of scientific error
Epistemological reflection: evidential support
Methodological rigor: use of proven techniques
Practicality
Valuational selection: focus on what matters the most

38
Lines of Reasoning

Direct empirical evidence
Semi-empirical evidence (surrogate data)
Empirical correlations (relationships between known processes and the unknown process of interest)
Theory-based inference (causal mechanisms)
Existential insight (expert judgment)

39
Judgment of Epistemic Status

The result of an analysis of epistemic status is a judgment regarding the quality of each premise or alternative, e.g.:
no basis for using a premise in decision-making;
a partial or high-confidence basis for using a particular premise as the basis for decision making.

40
Weight of Evidence

Legal context: whether the proof for one premise is greater than for another.
Often used when a categorical judgment is needed.
However, it:
tends to be less formal than the analysis of epistemic status, and
is less transparent than properly documented analyses of epistemic status.

41
Qualitative Statements Regarding Uncertainty

Qualitative approaches for describing uncertainty are beset with fundamental problems of ambiguity.
The same words mean:
different things to different people, and
different things to the same person in different contexts.
Based on Wallsten et al. (1986):
"Probable" was associated with quantitative probabilities of approximately 0.5 to 1.0.
"Possible" was associated with probabilities of approximately 0.0 to 1.0.
Qualitative schemes for dealing with uncertainty are typically not useful.

42
CONCLUSIONS - 1

There is growing recognition that climate change has the potential to impact transportation systems.
The available literature on the impacts of climate change on transportation systems appears to be a vulnerability assessment, rather than a risk analysis.

43
CONCLUSIONS - 2

The commitment of large resources should be based on an analysis that is as well-founded and as thorough as necessary or possible.
There are many alternative forms of analysis that differ in their epistemic status, depending on what type of information is available.
Thus, the key question is: what kind of analysis is appropriate here?
It may be possible to seek feasible, and perhaps robust (but not optimal), solutions for dealing with climate change impacts.
Actual decisions will be based on a complex deliberative process, to which analysis is only one input.

44
CONCLUSIONS - 3

There is substantial uncertainty attributable to the structure of scenarios and models.
Given the lack of directly relevant empirical data for making assessments of future impacts, there is a strong need for the use of judgments regarding uncertainty elicited from experts.

45
RECOMMENDATIONS

Vulnerability assessment is only a first step.
Modeling tools should be used to identify feasible and robust solutions.
Assessment should be done iteratively over time.
Expert judgment should be included as a basis for quantifying the likelihood and severity of various outcomes, as well as uncertainties.
Uncertainties should be quantified to the extent possible.
Sensitivity and uncertainty analysis should be used together to identify key knowledge gaps that could be prioritized for additional data collection or research in order to improve confidence in estimates.
In order to focus policy debate and inform decision making, these analyses are highly recommended, despite their limitations.

46
ACKNOWLEDGMENTS

Hyung-Wook Choi, of the Department of Civil, Construction, and Environmental Engineering at NC State, provided assistance with the literature review.
This work was supported by the Center for Transportation and the Environment. However, the author is solely responsible for the content of this material.

47
