Probabilistic Slope Analysis for Practice

H. El-Ramly, N.R. Morgenstern, and D.M. Cruden

Abstract: The impact of uncertainty on the reliability of slope design and performance assessment is often significant.

Conventional slope practice based on the factor of safety cannot explicitly address uncertainty, thus compromising the

adequacy of projections. Probabilistic techniques are rational means to quantify and incorporate uncertainty into slope

analysis and design. A spreadsheet approach for probabilistic slope stability analysis is developed. The methodology is

based on Monte Carlo simulation using the familiar and readily available software, Microsoft Excel 97 and @Risk.

The analysis accounts for the spatial variability of the input variables, the statistical uncertainty due to limited data,

and biases in the empirical factors and correlations used. The approach is simple and can be applied in practice with

little effort beyond that needed in a conventional analysis. The methodology is illustrated by a probabilistic slope

analysis of the dykes of the James Bay hydroelectric project. The results are compared with those obtained using the

first-order second-moment method, and the practical insights gained through the analysis are highlighted. The

deficiencies of a simpler probabilistic analysis are illustrated.

Key words: probabilistic analysis, slope stability, Monte Carlo simulation, spatial variability.

Résumé : L'impact de l'incertitude sur la fiabilité de la conception et de l'évaluation de la performance des talus est souvent significatif. La pratique conventionnelle des talus basée sur le coefficient de sécurité ne peut pas traiter de façon explicite l'incertitude, compromettant ainsi la précision des projections. Les techniques probabilistes constituent des moyens rationnels pour quantifier et incorporer l'incertitude dans la conception et l'analyse des talus. On développe une approche de tableur pour l'analyse de la stabilité des talus. La méthodologie est basée sur la simulation de Monte Carlo au moyen de logiciels familiers et facilement accessibles, Microsoft Excel 97 et @Risk. L'analyse tient compte de la variabilité spatiale des variables d'entrée, de l'incertitude statistique due aux données limitées et aux distorsions des facteurs empiriques et des corrélations utilisés. L'approche est simple et peut être mise en pratique avec peu d'effort en plus de celui requis par une analyse conventionnelle. La méthodologie est illustrée par une analyse probabiliste des talus des digues du projet hydroélectrique de la Baie James. Les résultats sont comparés avec ceux obtenus en utilisant la méthode du second moment de premier ordre, et les éclaircissements pratiques obtenus par cette analyse sont mis en lumière. On illustre les déficiences d'une analyse probabiliste plus simple.

Mots clés : analyse probabiliste, stabilité de talus, simulation de Monte Carlo, variabilité spatiale.

[Traduit par la Rédaction]

El-Ramly et al.

683

Introduction

Slope engineering is perhaps the geotechnical subject most

dominated by uncertainty. Geological anomalies, inherent

spatial variability of soil properties, scarcity of representative

data, changing environmental conditions, unexpected failure

mechanisms, simplifications and approximations adopted in

geotechnical models, and human mistakes in design and construction are all factors contributing to uncertainty. Conventional deterministic slope analysis does not account for

quantified uncertainty in an explicit manner and relies on conservative parameters and designs to deal with uncertain conditions. The impact of such subjective conservatism cannot be

evaluated, and past experience shows that apparently conservative designs are not always safe against failure.

The evaluation of the role of uncertainty necessarily requires the implementation of probability concepts and methods whereby uncertainty can be quantified and incorporated rationally into the design process. Probabilistic slope stability analysis (PSSA) was first introduced into

slope engineering in the 1970s (Alonso 1976; Tang et al.

1976; Harr 1977). Over the last three decades, the concepts

and principles of PSSA have developed and are now well established in the literature. In addition to accounting for uncertainty, PSSA is also a useful approach for estimating

hazard frequency for site-specific quantitative risk analyses,

particularly in the absence of representative empirical data.

The merits of probabilistic analyses have long been noted

(Chowdhury 1984; Whitman 1984; Wolff 1996; Christian

1996). Despite the uncertainties involved in slope problems

and notwithstanding the benefits gained from a PSSA, the

profession has been slow in adopting such techniques. The

reluctance of practicing engineers to apply probabilistic

methods is attributed to four factors.

Received 22 May 2001. Accepted 7 December 2001. Published on the NRC Research Press Web site at http://cgj.nrc.ca on 16 May 2002. DOI: 10.1139/T02-034

H. El-Ramly. AMEC Earth and Environmental, 4810-93 Street, Edmonton, AB T6E 5M4, Canada.

N.R. Morgenstern and D.M. Cruden. Geotechnical and Geoenvironmental Group, Department of Civil and Environmental Engineering, University of Alberta, Edmonton, AB T6G 2G7, Canada.

First, engineers' training in statistics and probability theory is often limited to basic information during their early years of education. Hence,

they are less comfortable dealing with probabilities than

they are with deterministic factors of safety. Second, there is

a common misconception that probabilistic analyses require

significantly more data, time, and effort than deterministic

analyses. Third, few published studies illustrate the implementation and benefits of probabilistic analyses. Lastly, acceptable probabilities of unsatisfactory performance (or

failure probability) are ill-defined, and the link between a

probabilistic assessment and a conventional deterministic assessment is absent. This creates difficulties in comprehending the results of a probabilistic analysis. All of these issues

are addressed in detail in El-Ramly (2001).

This paper addresses the integration of probabilistic methods into geotechnical practice, focusing on the first three of the factors listed in the previous paragraph. First,

an overview of some of the concepts involved in quantifying

uncertainty of soil properties is presented. The paper then

describes the development of a practical methodology for

probabilistic slope analysis based on Monte Carlo simulation. The methodology makes use of familiar and readily

available software, namely Microsoft Excel 97 and @Risk.

The implementation of the methodology is illustrated

through a probabilistic analysis of the stability of the James

Bay dykes. Lastly, the results are compared with those of the

first-order second-moment (FOSM) method and the practical

insights gained through the analysis are highlighted.

Components of statistical analysis

Statistical analysis of soil data comprises two main stages:

data review and statistical inference. Data review is largely

judgmental and encompasses a broad range of issues. First,

the consistency of the data, known as the condition of

stationarity, should be ensured. Inconsistency can arise from

pooling data belonging to different soil types, stress conditions, testing methods, stress history, or patterns of sample

disturbance (Lacasse and Nadim 1996). The histogram is a

convenient tool in this regard. A multimodal histogram is an

indication of inconsistent data, or a nonstationary condition.

Data review also aims at identifying trends in the data, biases in measurements, errors and discrepancies in the results, and outlier data values. Decisions should be made

whether to reject outliers or to accept them as extreme values. Baecher (1987) warned that care should be exercised in

this process to avoid rejecting true, important information.

Another concern is any clustering of data in a zone within

the spatial domain of interest which may render the data unrepresentative.

Statistical inferences are commonly based on a simplified

model (Vanmarcke 1977a; DeGroot and Baecher 1993) which

divides the quantity, xi, at any location, i, into a deterministic

trend component, ti, and a residual component, εi, where

[1]  xi = ti + εi

The trend component is estimated by regression techniques and is usually a function of location. The residual component is a random variable whose probability density function (PDF) has a zero mean and a constant variance, a measure of the dispersion of observations around the trend. The residual component at

location i is spatially correlated with the residual components at surrounding locations.

Uncertainty in trend

Site investigation programs are usually controlled by budget, which limits the number of tests performed. Since the

trend function is estimated from the available sample of observations, it is uncertain. The uncertainty due to the sample

size (number of observations) is known as statistical uncertainty and is typically quantified by considering the regression coefficients of the trend function as variables with

means and non-zero variances, which are inversely proportional to the number of observations, n.

The method of least squares is the most common technique for estimating the means and variances of trend-function coefficients. Neter et al. (1990) discussed estimating the uncertainties of regression coefficients for linear

trends. Li (1991) pointed out some limitations of the method

of least squares. Baecher (1987) recommended that trend

equations be linear, as the higher the order of the trend function, the more uncertain the regression coefficients. Where

the soil properties do not exhibit a clear trend, a constant

equal to the mean of the observations is assumed.
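For a linear trend, the uncertainty of the regression coefficients can be obtained from ordinary least squares as Neter et al. (1990) describe. The sketch below uses hypothetical depth-strength data; all values and variable names are illustrative assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical vane data: depth z (m) vs. undrained strength Su (kPa).
z = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0])
su = np.array([21.0, 24.0, 22.0, 28.0, 30.0, 29.0, 34.0, 36.0])
n = len(z)

# Linear trend t(z) = b0 + b1*z fitted by ordinary least squares.
A = np.column_stack([np.ones(n), z])
coef = np.linalg.lstsq(A, su, rcond=None)[0]
b0, b1 = coef

# Residual variance (unbiased, n - 2 degrees of freedom for a 2-parameter trend).
resid = su - A @ coef
s2 = resid @ resid / (n - 2)

# Covariance matrix of the regression coefficients: s2 * inv(A^T A).
# Its diagonal quantifies the statistical uncertainty of the trend,
# which shrinks as the number of observations n grows.
cov_b = s2 * np.linalg.inv(A.T @ A)
se_b0, se_b1 = np.sqrt(np.diag(cov_b))
print(f"trend: Su = {b0:.2f} + {b1:.2f} z, se(b0) = {se_b0:.2f}, se(b1) = {se_b1:.2f}")
```

Treating b0 and b1 as variables with these means and standard errors, rather than as fixed numbers, is what carries the statistical uncertainty of the trend into the simulation.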

Bias is another source of uncertainty that could affect the

estimated trend. The measured soil property can be consistently overestimated or underestimated at all locations. Factors such as the testing device used, boundary conditions,

soil disturbance, or the models and correlations used to interpret the measurements may contribute to biases. As an example, Bjerrum (1972) noted that the field vane consistently

overestimated the undrained shear strength of highly plastic

clays. An empirical correction factor is used to adjust the biased measurements. The additional uncertainty introduced

by the use of Bjerrum's vane correction factor is, however,

significant. This could also be the case with other empirical

factors.

Spatial variability

Soil composition and properties vary from one location to

another, even within homogeneous layers. The variability is

attributed to factors such as variations in mineralogical composition, conditions during deposition, stress history, and

physical and mechanical decomposition processes (Lacasse

and Nadim 1996). The spatial variability of soil properties is

a major source of uncertainty.

Spatial variability is not a random process, rather it is

controlled by location in space. Statistical parameters such

as the mean and variance are one-point statistical parameters

and cannot capture the features of the spatial structure of the

soil (Fig. 1). The plot in Fig. 1 compares the spatial variability of two artificial sets of data generated using the geostatistics software GSLIB (Deutsch and Journel 1998). Both

sets have similar histograms. The upper plot is highly erratic

and the data are almost uncorrelated, whereas the lower plot

is characterized by high spatial continuity. Additional statistical tools, such as the autocovariance and semivariogram,

are essential to describe spatial variability. The pattern of

soil variability is characterized by the autocorrelation distance (or scale of fluctuation; Vanmarcke 1977a).

© 2002 NRC Canada

Fig. 1. A highly erratic spatial structure (upper right) and a highly continuous structure (lower right), both with similar histograms.

A large autocorrelation distance reflects a highly continuous spatial structure, as in the lower plot of Fig. 1, whereas a short distance reflects

erratic variability (Fig. 1, upper plot). DeGroot (1996) and

Lacasse and Nadim (1996) illustrated the estimation of autocorrelation distance.

Spatial averaging

The performance of a structure is often controlled by the

average soil properties within a zone of influence, rather than

soil properties at discrete locations. Slope failure is more

likely to occur when the average shear strength along the failure surface is insufficient rather than due to the presence of

some local weak pockets. The uncertainty of the average

shear strength along the slip surface, not the point strength, is

therefore a more accurate measure of uncertainty. Baecher

(1987) warned, however, that depending on performance

mode, average properties are not always the controlling factor.

Internal erosion in dams and progressive failure are examples

of cases where extreme values control performance.

The variance of the strength spatially averaged over some

surface is less than the variance of point measurements. As the size of the domain being averaged increases, the variance decreases. For quantities averaging linearly, the amount of variance reduction depends on the autocorrelation distance. The mean, however, remains almost unchanged. The numerical data in Fig. 2 (upper right plot) are discretized based on grids of sizes 1 × 1, 5 × 5, and 10 × 10. For each scheme, the averages within

grid squares are calculated and the histogram of the local averages is plotted as shown in Fig. 2. As the averaging area is

increased, the coefficient of variation of the local averages

drops from 1.55 to 0.28 and the minimum and maximum

values become closer to the mean. If the averaging area is

constant, the amount of variance reduction increases as the

autocorrelation distance decreases, i.e., soil variability becomes more erratic.
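The variance reduction shown in Fig. 2 is easy to reproduce numerically. The sketch below builds a spatially continuous 1D series as a moving average of white noise (an assumed stand-in for the paper's GSLIB-generated field) and compares the variance of local averages over blocks of increasing size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "soil property" profile with spatial continuity, built as a
# moving average of white noise (a modelling assumption for demonstration).
white = rng.normal(0.0, 1.0, 10_000)
field = np.convolve(white, np.ones(10) / 10, mode="valid")

def block_average_variance(x, size):
    """Variance of local averages over non-overlapping blocks of `size` points."""
    m = len(x) // size
    blocks = x[: m * size].reshape(m, size).mean(axis=1)
    return blocks.var()

for size in (1, 5, 10, 50):
    print(size, block_average_variance(field, size))
# The variance of the local averages drops as the averaging window grows,
# while the mean stays essentially unchanged -- the effect shown in Fig. 2.
```

The more erratic the field (shorter autocorrelation distance), the faster the block-average variance falls, which is the trend the text describes.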

Theory of random fields

The theory of random fields (Vanmarcke 1977a, 1977b,

1983) is a common approach for modeling spatial variability

of soil properties. It is also the basis of our probabilistic

slope analysis methodology; to illustrate its rationale, some


Fig. 2. Variance reduction due to spatial averaging over blocks of sizes 1 × 1, 5 × 5, and 10 × 10.

of the basic concepts of the theory of random fields are presented here.

At any location within a soil layer, a soil parameter is an

uncertain quantity, or a random variable, unless it is measured at that particular location. Each random variable is

characterized by a probability distribution and is correlated

with the random variables at adjacent locations. The set of

random variables at all locations within the layer is referred

to as a random field and is characterized by the joint probability distribution of all random variables. A random field is

called stationary (or homogeneous) if the joint probability

distribution that governs the field is invariant when translated over the parameter space. This implies that the cumulative probability distribution function, the mean, and the

variance are constant for any location within the random

field and that the covariance of two random variables depends only on the separation between their locations, not on the absolute locations within the random field.

The point to point variation of a random field is very difficult to obtain in practice and is often of no real interest

(Vanmarcke 1983). Local averages over a spatial or temporal

local domain are of much greater value. For example, the

hourly or daily rainfall is of interest to hydrologists rather

than the instantaneous rate of fall. Similarly, the average

shear strength of a soil over the area of the slip surface is of

more interest to geotechnical engineers than the variation of

strength from point to point within the layer.

Figure 3 shows a one-dimensional (1D) stationary random

field of a variable, x, with a mean, E[x], a variance, σ², and

a cumulative probability distribution function, F(x); the

graph could also portray the variation along a line in the parameter space of a homogeneous k-dimensional random


Fig. 3. A realization of a 1D random field of a variable x with a mean E[x], variance σ², and cumulative probability distribution function F(x), showing local averages over intervals Δz and Δz′.

field. For this type of random field, Vanmarcke (1977a) defined the dimensionless variance function Γ(Δz) as

[2]  Γ(Δz) = σ²Δz / σ²

where σ²Δz is the variance of the local average of the variable, x, over an interval of length Δz. The variance function is a measure of the reduction in the point variance, σ²,

under local averaging. For most commonly used correlation

functions, Vanmarcke (1983) showed that Γ(Δz) can be approximated by

[3]  Γ(Δz) = 1        for Δz ≤ δ
     Γ(Δz) = δ/Δz     for Δz > δ

where δ is the scale of fluctuation. The scale of fluctuation has the same meaning as the autocorrelation distance but differs in numeric value. For the common exponential and Gaussian autocorrelation models, δ is 2.0 and 1.77 times the autocorrelation distance, ro, respectively. The model indicates no variance reduction, Γ(Δz) = 1, due to local averaging up to an averaging interval Δz equal to δ.

The rationale of this approximation is that modeling a random phenomenon at a level of aggregation more detailed

than the way the information about the phenomenon is acquired or processed is impractical and unnecessary

(Vanmarcke 1983). In site investigation programs, the spacing or the time interval between observations is often large.

Characterizing the correlation structure at small intervals becomes unreliable and justifies the approximation by a perfect

autocorrelation.
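The two-branch approximation of eq. [3] translates directly into code. The sketch below is a minimal transcription; the autocorrelation distance value is an arbitrary illustration.

```python
def variance_function(dz, delta):
    """Vanmarcke's approximate variance function, eq. [3]: no variance
    reduction for averaging intervals up to the scale of fluctuation
    delta, and Gamma = delta/dz beyond it."""
    return 1.0 if dz <= delta else delta / dz

# For the exponential autocorrelation model, delta = 2.0 * ro (see text).
ro = 10.0            # autocorrelation distance (m), an assumed value
delta = 2.0 * ro
print(variance_function(5.0, delta))   # 1.0: no reduction below delta
print(variance_function(40.0, delta))  # 0.5: point variance halved
```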

Local averages of the variable x over intervals Δz and Δz′ (Fig. 3), X(Δz) and X(Δz′), are spatially correlated. The correlation coefficient, ρ(XΔz, XΔz′), between X(Δz) and X(Δz′) is given by eq. [4] (Vanmarcke 1983). It is a function of the lengths of the two intervals, the separation, Z0, between them (Fig. 3), and the variance function of the variable being averaged:

[4]  ρ(XΔz, XΔz′) = [Z0²Γ(Z0) − Z1²Γ(Z1) + Z2²Γ(Z2) − Z3²Γ(Z3)] / {2 Δz Δz′ [Γ(Δz) Γ(Δz′)]^0.5}

where Z1 is the distance from the beginning of the first interval to the beginning of the second interval, Z2 is the distance from the beginning of the first interval to the end of the second interval, and Z3 is the distance from the end of the first interval to the end of the second interval.
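Eq. [4] can be evaluated numerically. In this sketch the separation Z0 is taken as the gap from the end of the first interval to the beginning of the second (our reading of Fig. 3), and the eq. [3] approximation supplies the variance function; both choices are assumptions for illustration.

```python
def gamma(z, delta):
    """Approximate variance function of eq. [3]."""
    return 1.0 if z <= delta else delta / z

def rho_local_averages(dz1, dz2, z0, delta):
    """Eq. [4]: correlation coefficient between local averages X(dz1) and
    X(dz2), with z0 the gap between the two intervals (assumed reading)."""
    Z0 = z0                  # end of 1st interval to start of 2nd
    Z1 = z0 + dz1            # start of 1st to start of 2nd
    Z2 = z0 + dz1 + dz2      # start of 1st to end of 2nd
    Z3 = z0 + dz2            # end of 1st to end of 2nd
    num = (Z0**2 * gamma(Z0, delta) - Z1**2 * gamma(Z1, delta)
           + Z2**2 * gamma(Z2, delta) - Z3**2 * gamma(Z3, delta))
    den = 2.0 * dz1 * dz2 * (gamma(dz1, delta) * gamma(dz2, delta)) ** 0.5
    return num / den

# Two adjacent 10 m segments inside one 20 m scale of fluctuation:
print(rho_local_averages(10.0, 10.0, 0.0, 20.0))  # 1.0 (fully correlated)
```

A consequence of the eq. [3] approximation is that local averages over intervals separated by more than δ come out uncorrelated, which is what later makes δ-long segments along the slip surface nearly independent.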

Random testing error

Random testing errors arise from factors related to the

measuring process (other than bias), such as operator error

or a faulty device. They can be directly measured only

through repeated testing of the same specimen by different

operators and devices. This is not practical, as most geotechnical field and laboratory tests disturb the sample, so

random measurement errors cannot be known. In data analysis, measurement error is regarded as a random variable with

a zero mean and a constant variance, referred to as random

error variance. The uncertainty due to testing errors is not a

true variation in the soil property and should be discarded in

evaluating parameter variability. Approximate analytical procedures (Baecher 1987) are used to estimate the random error variance. Jaksa et al. (1997), however, pointed to

important inaccuracies associated with such procedures.

Reliable estimation of random error variance requires data

at very small separation distances (r → 0), not available in

practice. As a result, the distinction between noise, or random error, and real, short-scale variability (over distances

less than data spacing) is not possible. Hence, the analytically computed variance is composed of random error variance and short-scale variability. Jaksa et al. (1997) also

showed that the computed value of the random error variance is sensitive to data spacing. Kulhawy and Trautmann

(1996) commented that quantitative assessment of testing errors of in situ measurements is rare because of the difficulties in separating natural spatial variability from test

uncertainty. So, the reliability of the analytical estimation of

random error variance is in question. Random testing errors

are best addressed through standardization of testing equipment and procedures, proper personnel training, and continuous inspection.

Probabilistic procedures for slope stability analysis vary in assumptions, limitations, and capability to handle complex slope problems. Most procedures, however, fall into one of two categories: approximate methods (the FOSM method, the point estimate method) and

Monte Carlo simulation.

Approximate methods make simplifying assumptions that

often limit their application to specific classes of problems.

For example, some used very simple slope models such as

the ordinary method of slices (Tang et al. 1976; Vanmarcke

1980; Honjo and Kuroda 1991; Bergado et al. 1994). Others

dealt only with frictionless soils (Matsuo and Kuroda 1974;

Vanmarcke 1977b; Bergado et al. 1994). A restriction to a

circular (or cylindrical) slip surface is common in many

studies (Alonso 1976; Vanmarcke 1977b; Yucemen and Al-Homoud 1990; Bergado et al. 1994). The spatial variability

of soil properties and pore-water pressure is often ignored,

assuming perfect autocorrelation (Nguyen and Chowdhury

1984; Wolff and Harr 1987; Duncan 2000).

Approximate methods allow the estimation of the mean

and variance of the factor of safety but do not provide any

information about the shape of the probability density function, so the failure probability can only be obtained by assuming a parametric probability distribution of the factor of

safety (typically normal or log-normal). Estimates of low

probabilities, required for safe structures, are sensitive to the

assumed distribution.
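The sensitivity of low failure probabilities to the assumed distribution shape is easy to demonstrate: with the same mean and standard deviation of the factor of safety, normal and log-normal assumptions give noticeably different values of P(FS < 1). The numbers below are illustrative only.

```python
from math import erf, log, sqrt

# Assumed first two moments of the factor of safety (illustrative values).
mu, sigma = 1.3, 0.15

def p_normal(mu, sigma):
    """P(FS < 1) under a normal assumption."""
    z = (1.0 - mu) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def p_lognormal(mu, sigma):
    """P(FS < 1) under a log-normal assumption with the same mean/variance."""
    s2 = log(1.0 + (sigma / mu) ** 2)   # variance of ln(FS)
    m = log(mu) - 0.5 * s2              # mean of ln(FS)
    z = (0.0 - m) / sqrt(s2)            # P(FS < 1) = P(ln FS < 0)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

print("normal   :", p_normal(mu, sigma))
print("lognormal:", p_lognormal(mu, sigma))
# The two tails differ by nearly a factor of two for these moments.
```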

Few studies used Monte Carlo simulation in slope analysis

when compared to approximate methods. The extensive computational efforts involved in the simulations required researchers to develop their own software to solve slope stability

problems. Computer times needed for the simulations were

also significant. As a result, a Monte Carlo simulation was considered, until recently, to be uneconomic (Tobutt 1982; Priest

and Brown 1983). Studies applying Monte Carlo simulation

also rarely addressed the spatial variability of soil properties

(Major et al. 1978; Tobutt 1982; Nguyen and Chowdhury

1985) because of difficulties in generating random values in

ways that preserved their correlations. Today, rapid developments in software are significantly changing this situation.

The limitations, and sometimes the complexities, of

probabilistic methods combined with the poor training of

most engineers in statistics and probability theory have substantially inhibited the adoption of probabilistic slope stability analysis in practice. To facilitate the integration of

probabilistic methods into practice, the methodologies and

techniques, while being consistent with principles of logic

and mechanics, should be simple, capable of solving real

slope problems, and in formats familiar to engineers.

Probabilistic slope analysis methodology

Outline of the methodology

The probabilistic slope analysis methodology based on

Monte Carlo simulation developed here has the advantages

of being simple and not requiring a comprehensive statistical

and mathematical background. The methodology is spreadsheet based and makes use of the familiar and readily available Microsoft Excel 97 (Microsoft Corporation 1997) and

@Risk (Palisade Corporation 1996) software. Other similar

software, e.g., Lotus 1-2-3 and Crystal Ball (Decisioneering

Inc. 1996), is likely capable of undertaking the task.

The slope model (geometry, stratigraphy, and slip surface) and the selected method of analysis are first

modeled in a Microsoft Excel 97 spreadsheet. Available

data are examined and uncertainties in input parameters are

identified and described statistically by representative probability distributions. Only those parameters whose uncertainties are deemed significant to the analysis need to be treated

as variables. A Monte Carlo simulation is then performed as

illustrated schematically in Fig. 4. @Risk draws at random a

value for each input variable from within the defined probability distributions. These values (input set 1, Fig. 4) are

used to solve the spreadsheet and calculate the corresponding factor of safety. The process is repeated sufficient times,

m, to estimate the statistical distribution of the factor of

safety. Statistical analysis of this distribution allows estimating the mean and variance of the factor of safety and the

probability of the factor of safety being less than one. The

procedure is applicable to any method of limit equilibrium

analysis that can be represented in a spreadsheet.
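The simulation loop of Fig. 4 can also be sketched outside a spreadsheet. The fragment below is a minimal stand-in, not the paper's Excel/@Risk model: it substitutes an infinite-slope expression for the limit equilibrium calculation and assumes illustrative input distributions (not the James Bay data).

```python
import numpy as np

rng = np.random.default_rng(42)
m = 20_000  # number of simulation iterations

# Assumed input distributions (illustrative values only):
c = rng.normal(10.0, 2.0, m)                 # cohesion (kPa)
phi = np.radians(rng.normal(28.0, 3.0, m))   # friction angle (rad)
c = np.clip(c, 0.0, None)                    # truncate at the physical minimum

# Deterministic slope model: an infinite-slope expression stands in here
# for the spreadsheet limit equilibrium calculation of the factor of safety.
gamma_soil, depth, beta = 18.0, 5.0, np.radians(30.0)
fs = (c + gamma_soil * depth * np.cos(beta) ** 2 * np.tan(phi)) / (
    gamma_soil * depth * np.sin(beta) * np.cos(beta)
)

print("mean FS  :", fs.mean())
print("std FS   :", fs.std())
print("P(FS < 1):", (fs < 1.0).mean())
```

Each iteration plays the role of one "input set" in Fig. 4; the histogram of `fs` is the simulated distribution of the factor of safety from which the probability of unsatisfactory performance is read.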

Statistical characterization of input variables

Probability distributions

Different approaches can be adopted to estimate the probability distribution of each input variable. Where there are

adequate amounts of data, the cumulative distribution function (CDF) of the measurements can be used directly in the

simulation process. In geotechnical practice, the observed

CDF (or histogram) may show spikes that would not appear

were more observations available. The CDF is better obtained by resetting the probability associated with each observation to the average of its cumulative probability and

that associated with the next lower observation (Deutsch and

Journel 1998). This procedure smooths unwanted spikes and

allows assigning finite probabilities for values of the input

parameter less than the minimum value measured and more

than the maximum value.
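The smoothing rule attributed to Deutsch and Journel (1998) amounts to assigning the i-th sorted observation the cumulative probability (i − 0.5)/n, the average of i/n and (i − 1)/n. A minimal sketch, with illustrative data:

```python
import numpy as np

def smoothed_cdf_probs(values):
    """Empirical CDF probabilities per the smoothing rule described in the
    text: each sorted observation takes the average of its cumulative
    probability i/n and that of the next lower observation, (i-1)/n,
    i.e. (i - 0.5)/n. This removes spikes and leaves finite probability
    below the minimum and above the maximum observed value."""
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.5) / n
    return x, p

x, p = smoothed_cdf_probs([22.0, 19.5, 25.0, 21.0])
print(x)  # [19.5 21.  22.  25. ]
print(p)  # [0.125 0.375 0.625 0.875]
```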

Where observations are scarce or absent, parametric distributions can be assumed from the literature. Studies have

estimated coefficients of variation and probability density

functions of soil properties (Lumb 1966; Chowdhury 1984;

Harr 1987; Kulhawy et al. 1991; Lacasse and Nadim 1996).

Care should be exercised, however, to ensure that the minimum and maximum values of the selected distribution are

consistent with the physical limits of the parameter being

modeled. For example, shear strength parameters should not

take negative values. If the selected distribution implies negative values, then the distribution is truncated at a practical

minimum threshold. Alternatively, variability can be assumed

to follow a triangular distribution with estimates of minimum,

maximum, and most likely values based on expert opinion.

@Risk built-in functions allow great flexibility in modeling input variables. Parametric and nonparametric distributions (e.g., experimental CDF) can be modeled using @Risk

functions. Desired truncations can be easily imposed on any

distribution either through @Risk or using Microsoft Excel 97 functions.
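Truncation at a physical minimum can be imposed outside @Risk as well; the sketch below uses simple rejection sampling for a normal cohesion truncated at zero (all names and values are illustrative).

```python
import numpy as np

rng = np.random.default_rng(1)

def truncated_normal(mean, std, low, size, rng):
    """Sample a normal distribution truncated below at `low` by rejection;
    a simple stand-in for the truncation @Risk or Excel can impose."""
    out = np.empty(0)
    while out.size < size:
        draw = rng.normal(mean, std, size)
        out = np.concatenate([out, draw[draw >= low]])
    return out[:size]

c = truncated_normal(5.0, 3.0, 0.0, 1000, rng)
print(c.min() >= 0.0)  # True: no negative cohesion values
```

A triangular alternative, when only minimum, most likely, and maximum estimates are available, can be drawn with `rng.triangular(low, mode, high, size)`.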

Modeling parameter uncertainty

The probabilistic slope analysis methodology accounts for

the uncertainty of the input parameters based on the concepts

discussed in previous sections and the following equation:


Fig. 4. Monte Carlo simulation procedure using Microsoft Excel 97 and @Risk software.

[5]  V = B(t + ε)

where V is the value of the input variable adjusted for uncertainty and bias, B is a bias correction factor, t is the trend component, and ε is the residual component. The statistical uncertainty in the trend is estimated by standard statistical

methods (Ang and Tang 1975; Neter et al. 1990). The variance of the residual component is equal to the variance of

the measured data, with no explicit consideration of the uncertainty due to random testing errors for two reasons. Estimates of random error variance are unreliable, as discussed

in an earlier section, so any reduction of the observed variance could be unsafe. In addition, random errors fluctuate in

magnitude above and below zero and are uncorrelated spatially. By taking the spatial average of the input variable over

the whole area of interest, such as the slip surface, positive

and negative random errors at the different locations within

the averaging domain tend to cancel out. As a result, the random error variance associated with the averaged quantity is

largely reduced. It is emphasized, however, that a critical

review of the observations should ensure that appropriate

testing standards and procedures are followed. The bias correction factor is also considered a variable with a mean and

variance evaluated by experience and comparison with field

performance or by observations from a more accurate testing

procedure. It does not have to be a multiplier as in eq. [5]; it

could be an addition [V = B + (t + ε)].
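Eq. [5] translates directly into a simulation statement. The sketch below draws the bias factor, trend, and residual as independent variables with assumed illustrative moments (e.g., a vane-type correction factor with mean 0.8):

```python
import numpy as np

rng = np.random.default_rng(3)
m = 10_000

# Eq. [5]: V = B(t + eps). All numbers are illustrative assumptions.
t_uncertain = rng.normal(30.0, 1.5, m)  # trend value with statistical uncertainty (kPa)
eps = rng.normal(0.0, 5.0, m)           # residual (spatial) variability
B = rng.normal(0.8, 0.05, m)            # bias correction factor, itself a variable

V = B * (t_uncertain + eps)             # corrected input variable, eq. [5]
print("mean V:", V.mean(), "std V:", V.std())
```

One such expression per input variable is all the spreadsheet needs; @Risk then samples `t_uncertain`, `eps`, and `B` each iteration exactly as above.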

The spatial variability of an input variable is represented

by the correlation structure of the residual component, ε, and modeled using the theory of random fields. The spatial variability of ε along the slip surface is approximated by a 1D stationary random field. Vanmarcke's (1983) approximate model (eq. [3]) is adopted for the analysis of the random field. The point to point variability of the residual component along the slip surface is represented by the variability of its local averages over segments of the surface. The portion of the slip surface within the subject layer is divided into segments of length l not exceeding the scale of fluctuation δ. The local average of the residual component over the length of any of these segments, ε(l), is considered a variable. The CDF of the variable ε(l) is the same distribution function as that of the residual component, F(ε), with no variance reduction, since the variance function Γ(l) is equal to one for l ≤ δ. The correlation coefficients between the variables representing the local averages over different segments of the slip surface are estimated using eq. [4]. Choosing the length of segments equal to δ eliminates the correlation coefficients between most of the variables and greatly simplifies the simulation. Figure 5 is a schematic illustration of the model, where ρ is the correlation coefficient between variables.
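The segment scheme of Fig. 5 can be sketched as follows, under the simplifying assumption that δ-long segments are treated as uncorrelated (the simplification the segment-length choice is meant to achieve); all lengths and moments are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed values: slip surface length within the layer and scale of fluctuation.
L = 120.0      # length of slip surface within the layer (m)
delta = 20.0   # scale of fluctuation (m)

# Divide the surface into segments no longer than delta (here, exactly delta).
n_seg = int(np.ceil(L / delta))

# Each segment's local-average residual keeps the full point distribution
# F(eps) (no variance reduction for l <= delta); with l = delta the segment
# averages are treated as uncorrelated, so one independent draw per segment.
sigma = 8.0    # point standard deviation of the residual (kPa), assumed
m = 50_000     # simulation iterations
eps_segments = rng.normal(0.0, sigma, (m, n_seg))

# Average residual over the whole surface, per iteration:
eps_surface = eps_segments.mean(axis=1)
print("point std:", sigma, "surface-average std:", eps_surface.std())
# The surface-average std is close to sigma / sqrt(n_seg): the spatial
# averaging effect that reduces the uncertainty of the factor of safety.
```

Segments shorter than δ would instead require the eq. [4] correlation coefficients, imposed in @Risk as correlations between the corresponding input variables.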

Critical slip surface

For any slope, there is an unlimited number of potential

slip surfaces. The slope may fail, or perform unsatisfactorily,

along any of these surfaces. The total probability of unsatisfactory performance of a slope is, thus, the joint probability

of failure occurring along the admissible slip surfaces. These

surfaces, however, are highly correlated, since they are all

analyzed using the same input variables and the same analytical model. Furthermore, the close proximity of many of

them add an element of spatial correlation. Evaluating the

total probability of unsatisfactory performance of the slope

is a mathematically formidable task.

The strong correlation between slip surfaces, however,

significantly reduces the difference between the total probability of unsatisfactory performance and that of the most

critical surface (Mostyn and Li 1993). Hence, the probability

estimate from the most critical slip surface is considered a

reasonable estimate of slope reliability (Vanmarcke 1977b;

Alonso 1976; Yucemen and Al-Homoud 1990). In this study,

the probability of unsatisfactory performance is obtained by

independently analyzing a number of predetermined slip surfaces. The highest estimated probability value is considered

representative of the total probability of unsatisfactory performance of the slope. The question, however, is which surfaces to consider?

Chowdhury and Tang (1987) and Hassan and Wolff (1999)

indicated that the deterministic critical slip surface is not always the most critical surface in a probabilistic analysis.

© 2002 NRC Canada

Fig. 5. Modeling the spatial variability of an input parameter along the failure surface using Vanmarcke's (1983) approximation of a 1D random field.

Hassan and Wolff (1999) proposed a search algorithm for locating the slip circle with the minimum reliability index. It searches for the slip surface with the longest length within the soil layer contributing most to the uncertainty of the factor of safety. Their algorithm, however,

does not directly account for the spatial variability of soil

properties and pore-water pressure. In locating the probabilistic critical slip surface, variance reduction due to spatial

averaging of soil parameters over the length of the slip surface has to be considered. The reduction depends on the

autocorrelation distance and the length of slip surface, which

is not known beforehand. As a result, an increase in the variance of the factor of safety, due to a longer portion within a

highly uncertain layer, could be offset by a variance reduction due to spatial averaging. Hence, in slopes dominated by

uncertainty due to spatial variability, the slip surface based

on the Hassan and Wolff algorithm may not be the most critical surface in a probabilistic analysis.

An essential part of the probabilistic analysis is, thus, to

consider all slip surfaces that may be hazardous. These may

include the deterministic critical slip surface, the minimum

reliability index surface according to Hassan and Wolff

(1999), noncircular, structurally controlled surfaces, and surfaces through weak or presheared layers.

Spreadsheet modeling

The spreadsheet software Microsoft Excel 97 is used in

modeling the slope geometry, stratigraphy, and slope analysis

method. Using analytical geometry, the equations describing

the ground profile, the boundaries between soil layers, and

the slip surface are first established with reference to a coordinate system. The equations are then modeled in a

Microsoft Excel 97 spreadsheet. The coordinates of the

points of intersection of the different boundaries, for example, where the slip surface meets the borders between layers,

can be obtained within the spreadsheet and used in calculating the slice information (width, coordinates of mid-base

point, total height, and thickness in each soil type) for factor

of safety computations.

Next, the slip surface within each soil layer is divided into

segments of length in addition to a residual segment of

smaller length, similar to the illustration in Fig. 5. The local

averages of a soil parameter over the length of each segment

are considered variables. The spatial variability of the soil

parameter along the slip surface is modeled by the correlations between these variables. The variables are characterized by the CDF of the residual component, F(ε), with no variance reduction, the variance reduction factor being equal to one (eq. [3]). @Risk functions are used to assign

appropriate probability distributions for the spreadsheet cells

containing the input variables. The correlations between

variables are modeled using @Risk functions IndepC and

DepC or by creating a correlation coefficients matrix using the Correlate command. In the simulation process,

correlated variables are sampled so as to preserve input correlation coefficients.

Slope analysis methods, such as the Bishop, Janbu, and

Spencer methods, can be easily modeled in a spreadsheet.

However, more advanced and complex methods such as the

Morgenstern-Price method and three-dimensional (3D)

methods, are, for now, cumbersome to model in a spreadsheet. In this paper, the Bishop method of slices is used. The

spreadsheet model is built by creating a calculation table

similar to that proposed by Bishop (1955) for mechanical

computations. The appropriate soil parameters for each slice

are automatically chosen by a series of nested IF statements that compare the X coordinate of the mid-base point

with the coordinates of the intersections of the slip surface

with soil layers. The Microsoft Excel 97 circular reference feature is used for the iteration process until the difference between the assumed and the calculated factors of safety

is within the user-specified range. Further discussions and an

illustration of the structure of the spreadsheet are presented in

the context of the case study analyzed in a later section.
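As a sketch of what the spreadsheet computes, Bishop's simplified method with its fixed-point iteration (the role played in the paper by Excel's circular-reference feature) can be written as follows. This is an illustrative implementation, not the authors' spreadsheet; the slice data structure and names are assumptions:

```python
import math

def bishop_fs(slices, tol=0.01, fs0=1.0, max_iter=50):
    """Bishop's simplified method of slices.

    Each slice is a dict with width b (m), weight W (kN/m), base
    inclination alpha (rad), cohesion c (kPa), friction angle phi (rad),
    and pore pressure u (kPa) at the slice base.
    """
    fs = fs0
    for _ in range(max_iter):
        num = 0.0
        den = 0.0
        for s in slices:
            m_alpha = math.cos(s["alpha"]) * (
                1.0 + math.tan(s["alpha"]) * math.tan(s["phi"]) / fs)
            num += (s["c"] * s["b"]
                    + (s["W"] - s["u"] * s["b"]) * math.tan(s["phi"])) / m_alpha
            den += s["W"] * math.sin(s["alpha"])
        fs_new = num / den
        if abs(fs_new - fs) < tol:   # same convergence test as the spreadsheet
            return fs_new
        fs = fs_new
    return fs
```

For purely cohesive slices (φ = 0) the expression is independent of FS and converges immediately; with frictional strength the loop typically converges within a few iterations to the 0.01 tolerance used later in the paper.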

Issues in simulation

Monte Carlo simulation requires the generation of random

numbers, which are used in sampling the CDFs of the input

variables. @Risk software (Palisade Corporation 1996) allows two sampling techniques, Monte Carlo sampling (or

random sampling) and Latin Hypercube sampling. The latter

allows sampling the entire CDF curve, including the tails of

the distribution, with fewer iterations, and consequently less

computer time. It is adopted in this study.
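@Risk's internal sampler is not documented in the paper; the sketch below only illustrates the idea behind Latin Hypercube sampling, drawing exactly one value from each of m equal-probability strata of the CDF. The unit-exponential inverse CDF is a stand-in for any input variable's distribution, and the names are assumptions:

```python
import numpy as np

def latin_hypercube(m, inv_cdf, rng):
    # one value per equal-probability stratum: u_i uniform on (i/m, (i+1)/m)
    u = (np.arange(m) + rng.random(m)) / m
    rng.shuffle(u)                     # random ordering (pairing across variables)
    return inv_cdf(u)

rng = np.random.default_rng(31069)     # seed value, as used later in the paper
m = 1000
inv = lambda u: -np.log(1.0 - u)       # inverse CDF of a unit exponential
lhs = latin_hypercube(m, inv, rng)
```

Because every stratum, including both tails, is sampled exactly once, the recreated CDF matches the input CDF with far fewer iterations than plain random sampling.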

The output of a Monte Carlo simulation is sensitive to the

number of iterations, m. When m is large, the random samples

drawn for each input variable are also large and the match between the CDF recreated by sampling and the original input

CDF is more accurate. Hence, the level of noise in the simulation diminishes and the output becomes more stable at the

expense of increasing computer time. The optimum number

of iterations depends on the sizes of the uncertainties in the

input parameters, the correlations between input variables,

and the output parameter being estimated. A practical way to

optimize the simulation process is to repeat the simulation using the same seed value and an increasing number of iterations. A plot of the number of iterations, m, against the

probability of unsatisfactory performance indicates the minimum number of iterations at which the probability value stabilizes.
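The suggested convergence check can be sketched as below. The normal factor-of-safety sample is only a placeholder for one full simulation of m iterations (the moments 1.46 and 0.20 are those reported later for the James Bay case); in a real analysis each call would rerun the slope model with the same seed and a larger m:

```python
import numpy as np

def pu_estimate(m, seed=31069):
    # placeholder for one full simulation of m iterations: the factor of
    # safety is drawn here from a normal distribution as a stand-in
    rng = np.random.default_rng(seed)
    fs = rng.normal(1.46, 0.20, m)
    return float(np.mean(fs < 1.0))

# repeat with the same seed and an increasing number of iterations,
# then plot m against the estimate and look for stabilization
estimates = {m: pu_estimate(m) for m in (1000, 4000, 16000, 64000)}
```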

Interpretation of the output

Probability of unsatisfactory performance

The output of the simulation is the probability density function of the factor of safety from which the mean and standard

deviation can be estimated. The probabilistic safety measure

used in this study is the failure probability. It is the probability of the factor of safety, FS, being less than one, or simply the number of iterations with FS < 1.0 relative to the total

number of iterations, m. The term failure, however, implies

that the collapse of the slope is the event of concern to the designer, which is not necessarily the case. The serviceability of

the slope is as important as slope collapse and requires thorough evaluation and assessment. Serviceability issues include,

among others, slope movement and cracking (without the

slope collapsing), high water seepage, and surface erosion.

The term failure may also imply to clients, particularly nonprofessional clients, that the slope would fail. We recommend

using a different terminology. The U.S. Army Corps of Engineers (1992, 1995) used probability of unsatisfactory performance, Pu, instead of failure probability. The same term,

unsatisfactory performance, is adopted here to address failure

mechanisms. Another safety measure, perhaps probability of

unsatisfactory serviceability, should be considered to address

serviceability criteria. The evaluation of slope serviceability

is, however, beyond the scope of this study.

Because the simulation process uses random sampling of

the input variables, the calculated probability of unsatisfactory performance is also a variable. The probability value

from a single simulation could differ from the true value.

Law and McComas (1986) described relying on the results

of a single simulation run as one of the most common and

potentially dangerous simulation practices. It is, therefore,

essential to repeat the simulation using different seed values

to assess the consistency of the estimates. By running the

simulation many times, the histogram of the probability of

unsatisfactory performance, the mean probability, and the

95% confidence interval around the mean could also be estimated. Using @Risk macro functions, the process of running a number of simulations can be fully automated. A

simple macro file can be designed to run a number of simulations and save the output of each simulation in a separate file.
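The repeated-simulation procedure can be sketched in the same spirit; the normal sample again stands in for a full simulation run, and the seed values are arbitrary:

```python
import numpy as np

pus = []
for seed in range(25):                      # 25 simulations, different seeds
    rng = np.random.default_rng(seed)
    fs = rng.normal(1.46, 0.20, 32000)      # placeholder for one full simulation
    pus.append(np.mean(fs < 1.0))
pus = np.asarray(pus)
mean_pu = pus.mean()
half_width = 1.96 * pus.std(ddof=1) / np.sqrt(len(pus))  # 95% CI half-width
```

The histogram of the 25 values, their mean, and the confidence interval around the mean provide the consistency check that a single run cannot.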

In reflecting on the computed probability of unsatisfactory

performance, three points should be noted. First, the computed probability of unsatisfactory performance does not

address temporal changes in input variables. If pore-water

pressures, as an example, vary with time, the impact of such

variations on stability can only be assessed by undertaking

further analyses using the new pore-pressure distributions.

Second, the computed probability of unsatisfactory performance represents the probability of occurrence of a hazardous

event over the lifetime of the conditions considered in the

analysis, such as loading conditions and environmental conditions. The calculated probability is, thus, associated with a

time frame that varies from one problem to another. If the annual probability of unsatisfactory performance is required, as

is the case in quantitative risk analyses, the reference time

needs to be estimated. For example, if the pore-water pressure

used in the analysis is the result of a rainstorm, the annual

probability of unsatisfactory performance is estimated based

on the return period of the storm. In the case that none of the

input variables is time dependent, the annual probability could

be referenced to the lifetime of the slope.

Third, despite efforts to include all sources of uncertainty,

there is the possibility of undetected uncertainties such as

human mistakes affecting slope performance. The contribution of these unknown uncertainties to the probability of unsatisfactory performance is not considered, so the computed value may underestimate the actual probability of unsatisfactory performance. That is why the comparison of computed probabilities of different designs is believed to be of greater value.

Having estimated the probability of unsatisfactory performance, the next step is to assess whether it is acceptable.

Typically, this is achieved through comparing the computed

values with a probabilistic slope design criterion. Commonly,

a criterion based on the observed frequency of slope failures

and judgement is used. A major drawback to this approach

is that the site- and case-specific features such as slope geometry, site conditions, and sources and levels of uncertainty

of the case histories constituting the database are not addressed. Applying such a global criterion to any slope is a

significant generalization.

The authors are not aware of any well-founded probabilistic criteria for the acceptability of a design. A reliable approach to estimate such criteria is to calibrate the computed

probabilities of unsatisfactory performance of slopes with

their observed field performance. This requires probabilistic

slope stability analyses of cases of failed slopes and of

slopes performing satisfactorily. Through the comparison of

the probabilities associated with the two classes, a probabilistic slope design criterion could be established. This approach has been developed by El-Ramly (2001) and will be

published separately.

Reliability index

The reliability index, β, another common probabilistic safety measure, is given by

[6]   β = (E[FS] − 1)/σ[FS]

where E[FS] and σ[FS] are the mean and standard deviation

of the factor of safety, respectively. The above definition is

accurate if the performance function (factor of safety equation) is linear, which is not the case for slope analysis models. Mostyn and Li (1993) suggested, however, that the

performance functions of slopes are reasonably linear and recommended ignoring the nonlinearity when calculating β. Ang

and Tang (1984) provided a convenient theoretical background

of the meaning and estimation of the reliability index.
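As a worked check with the James Bay simulation results reported later in the paper (E[FS] = 1.46, σ[FS] = 0.20), eq. [6] gives β = (1.46 − 1)/0.20 ≈ 2.3, consistent with the reported value of 2.32 obtained from the unrounded moments.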

Sensitivity analysis

Through the software @Risk, Spearman rank correlation

coefficients between the factor of safety and the input variables can be calculated for a sensitivity analysis. The

Spearman coefficient is a correlation coefficient based on the

rank of the data values within the minimummaximum

range, not the actual values themselves. It varies between −1 and 1. A value of zero indicates no correlation between the input variable and the output, a value of 1 indicates a complete positive correlation, and a value of −1 indicates a complete negative correlation. Spearman correlation coefficients

could be used as measures of the relative contribution of

each input variable to the uncertainty in the factor of safety.

This contribution comprises two elements, the degree of uncertainty of the input parameter and the sensitivity of the

factor of safety to changes in that parameter.
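The coefficient @Risk reports can be reproduced with a few lines. This sketch ranks each sample and takes the Pearson correlation of the ranks; ties are ignored, which is the usual case for continuous simulated samples:

```python
import numpy as np

def spearman(x, y):
    # Spearman coefficient: Pearson correlation of the ranks of the values
    rank = lambda v: np.argsort(np.argsort(v))
    return float(np.corrcoef(rank(x), rank(y))[0, 1])
```

Any monotone relation between an input variable and the factor of safety yields a coefficient of ±1, regardless of how nonlinear the relation is.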

The results of sensitivity analyses are of significant practical value, as they quantify the contributions of the various


sources of uncertainty to the overall design uncertainty. Resources, whether intellectual or physical, can thus be rationally allocated towards reducing the uncertainty of the

variables with large impacts on design. Also, the relative impacts of systematic uncertainty (statistical uncertainty and

bias) and uncertainty due to inherent spatial variability can

be estimated. This information is of interest because systematic uncertainty has a consistent effect at all locations within

the domain of the problem and can have a major impact on

design. Unlike the uncertainty due to inherent spatial variability, systematic uncertainty can be reduced by increasing

the size of the data sample and by avoiding highly uncertain empirical correlations and factors.

Spreadsheet-based probabilistic slope analysis

The application of the probabilistic slope analysis methodology described earlier in the paper is illustrated through the

analysis of the well-documented dykes of the James Bay hydroelectric project in Quebec, Canada. Although the dykes

were not built, the design was the subject of extensive studies

quantifying the sources of uncertainty, analyzing the spatial

variability of soil properties (Ladd et al. 1983; Soulié et al.

1990), and evaluating the stability of the dyke probabilistically

using the FOSM method (Christian et al. 1994).

The project called for the construction of 50 km of earth

dykes in the James Bay area. Various design options were

investigated. Only one design is considered in this study, the

single stage construction of a 12 m high embankment with a

slope angle of 18.4° (3h:1v) and a 56 m wide berm at mid-height. Figure 6 shows the geometry of the embankment and

the underlying stratigraphy.

The embankment is on a clay crust, about 4.0 m thick,

overlying a sensitive marine clay which in turn is underlain

by a lacustrine clay. The marine clay is about 8.0 m thick

and the lacustrine clay is about 6.5 m thick. The undrained

shear strength of both clays, measured by field vane tests at

1.0 m depth intervals, exhibited a large scatter. The mean

undrained shear strength of the marine clay is about 35 kPa

and that of the lacustrine clay is about 31 kPa. The lacustrine clay is underlain by a stiff till.

The uncertainty in soil parameters was quantified by Ladd

et al. (1983) and Christian et al. (1994). They identified the

input parameters whose uncertainties are deemed important

to the stability of the dykes and estimated their statistical parameters. The following probabilistic stability analyses are

based on the conclusions of these studies. Other case studies

demonstrating the complete analysis starting with field and

laboratory data, through the quantification of parameter uncertainty, the probabilistic assessment, and the estimation of

the probability of unsatisfactory performance will be published separately by the present authors.

Eight input parameters are considered variables: the unit

weight and friction angle of embankment material, the thickness of the clay crust, the undrained shear strength and

Bjerrum vane correction factors for the marine and lacustrine clays, and the depth of till layer. Table 1 summarizes

the means and variances of all input variables.

The uncertainties in the unit weight and friction angle of the embankment material are evaluated judgementally (Christian

et al. 1994) to reflect potential variability in fill properties.

The bias in vane measurements is adjusted by Bjerrum's

vane correction factor, which is also considered uncertain

due to the scatter in Bjerrum's database. The uncertainty in

the depth of the till layer, Dtill, is entirely statistical uncertainty due to the limited number of borings. In the absence

of the actual probability distributions of the data, all variables are assumed normal. The spatial variability of all uncertain soil parameters is characterized by an isotropic

autocorrelation distance of about 15 m, which was inferred

from Christian et al. (1994).

As the strength of the lacustrine clay is relatively low, the

depth to the till layer controls the location of the critical slip

circle. The uncertainty in Dtill, thus, introduces uncertainty

in the location of the slip circle. To examine the impact of

this uncertainty, deterministic stability analyses were performed varying Dtill incrementally between 15.5 and 21.5 m

(±3 standard deviations from the mean). The minimum factor of safety varied between 1.69 and 1.30, implying that the uncertainty in

Dtill could have an important impact on the reliability of the

design. All the critical slip circles were tangent to the top of

the till, daylighted within a short distance at the top of the

embankment, and shared almost the same x coordinate for

the centres.

The equations describing the dyke profile and soil layers

with reference to a coordinate system are estimated and

modeled in a Microsoft Excel 97 spreadsheet. The general

layout and structure of the spreadsheet are illustrated in Appendix A. Since the depth of the till is considered a variable,

a different value is sampled for each simulation iteration and

consequently the critical slip circle varies from one iteration

to another. To minimize the computer time, some restrictions

were imposed on the geometry of the slip circles based on

the results of the previous parametric study. The slip circles

are assumed tangent to the till layer, daylight at a fixed point

at the top of the embankment (X1 = 4.9, Y1 = 36.0), and have

a common X coordinate for the centres (X0 = 85.9). The

equation of the slip circle is then added to the spreadsheet as

a function of the Y coordinate of the centre, Y0, and the radius, R; both depend on the sampled value of Dtill. In each

iteration the sampled value of Dtill thus defines only one critical slip circle. Using the principles of analytical geometry,

the intersections of the slip circle and the boundaries between layers and the breakage points in ground profile are

computed.
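The slip-circle geometry can be sketched as follows. The daylight point (4.9, 36.0) and the centre abscissa 85.9 are from the text; the ground-surface elevation Y = 24.0 m is an assumption inferred from the 12 m embankment height, so the numerical values are illustrative:

```python
import math

X1, Y1 = 4.9, 36.0     # fixed daylight point at the top of the embankment
X0 = 85.9              # common X coordinate of the circle centres
Y_GROUND = 24.0        # assumed ground-surface elevation (crest minus 12 m)

def slip_circle(d_till):
    """Centre ordinate Y0 and radius R of the circle through (X1, Y1),
    centred at X = X0, and tangent to the top of the till at depth
    d_till below the ground surface."""
    y_till = Y_GROUND - d_till
    h = Y1 - y_till                    # height of daylight point above the till
    dx = X0 - X1
    r = (h * h + dx * dx) / (2.0 * h)  # from (h - r)^2 + dx^2 = r^2
    y0 = y_till + r                    # tangency: lowest point touches the till
    return y0, r
```

Each sampled value of Dtill therefore defines exactly one candidate circle, as stated above.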

The spatial variabilities of soil parameters are modeled as

1D random fields, assuming exponential autocovariance

functions. The spatial variability of the unit weight of the

embankment is, in fact, a two-dimensional (2D) random

field. Vanmarcke (1983) showed that the process of averaging a 2D random field over a rectangular area with one side

of the area smaller than the scale of fluctuation in the same

direction could be approximated by averaging a 1D random

field in the perpendicular direction. If the cross section of

the embankment is regarded as a rectangle, the variability of

unit weight could be approximated by a 1D random field in

the horizontal direction. This approximation is not likely to

have any effect on the analysis, as the impact of the spatial


Fig. 6. Cross section and stratigraphy of the James Bay dykes, showing the approach adopted to account for spatial variability of soil

properties.

Table 1. Input variables and their statistical parameters (based on Christian et al. 1994).

Input variable | Mean | Variance: inherent variability | Variance: statistical uncertainty
Fill friction angle, φfill (°) | 30.0 | 1.00 | 3.00
Fill unit weight, γfill (kN/m³) | 20.0 | 1.00 | 1.00
Clay crust thickness, tcr (m) | 4.0 | 0.19a | 0.04
Shear strength of marine clay, SuM (kPa) | 34.5 | 66.26 | 0.90
Shear strength of lacustrine clay, SuL (kPa) | 31.2 | 74.82 | 3.00
Depth of till, Dtill (m) | 18.5 | — | 1.00

Bias factors | Mean | Variance
Bjerrum vane correction factor for SuM, μM | 1.00 | 0.006
Bjerrum vane correction factor for SuL, μL | 1.00 | 0.023

aVariance is reduced by 80% to account for spatial averaging based on the assessment of Christian et al. (1994).

variability of the unit weight on stability is reported to be minor (Alonso 1976; Nguyen and Chowdhury 1984). This is attributed to the small spatial variability of the unit weight (a coefficient of variation of only 5% in the James Bay case) and

the insensitivity of stability calculations to variations in unit

weight.

The slip surface within each soil layer is divided into segments of lengths not exceeding δ = 30 m (δ = 2r₀ for exponential autocovariance functions), as illustrated in Fig. 6.

Hence, the average undrained shear strength over the length

of each segment has the same variance as the input data (Table 1) with no reduction. The undrained shear strength of the

lacustrine clay, for example, is modeled by three variables

representing the average strength over three segments of the

slip surface, as illustrated in Fig. A1 in Appendix A. Similarly, the embankment fill is divided into five zones and the

average unit weight within each zone is considered a random

variable. The correlation coefficients between the variables

are estimated using eq. [4], and the correlation coefficient

matrices are shown in Appendix A. Additional variables are

added to the spreadsheet to model statistical uncertainties

and the uncertainties in Bjerrum vane correction factors. In total, the simulation comprises 19 input variables (cells in Fig. A1 in Appendix A) to account for the various components of parameter uncertainty.

Fig. 9. Histogram of the probability of unsatisfactory performance based on the results of 25 simulations.

Using @Risk options, truncation limits are imposed on

the probability distributions of the undrained shear strength

to prevent sampling negative values. Truncation limits at the

mean ± 3σ are also imposed on the probability distributions

of the thickness of the clay crust and the depth of the till

layer. The purpose of these limits is to avoid sampling extreme low or high values that may cause discrepancies in the

sequence of layers.
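One common way to realize such truncation in a hand-rolled Monte Carlo code is rejection sampling; whether @Risk truncates this way internally is not stated, so the sketch below is illustrative:

```python
import numpy as np

def truncated_normal(mean, sd, lo, hi, size, rng):
    # rejection sampling: redraw any value that falls outside [lo, hi]
    out = rng.normal(mean, sd, size)
    bad = (out < lo) | (out > hi)
    while bad.any():
        out[bad] = rng.normal(mean, sd, bad.sum())
        bad = (out < lo) | (out > hi)
    return out
```

Applied to the depth of the till (mean 18.5 m, σ = 1.0 m) with limits at ±3σ, no sampled value can fall outside 15.5 to 21.5 m and so upset the sequence of layers.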

The stability calculations are based on the Bishop method

of slices. The tables used for factor of safety computations are illustrated in Appendix A. For each set of sampled input parameters, the iterative process for factor of safety calculations is repeated until the difference between the factors

of safety in two consecutive calculations is less than 0.01.

Trial simulations indicated that the optimum number of iterations is 32 000 (Fig. 7). Using a seed number of 31 069

(an arbitrary value), the mean factor of safety is estimated to

be 1.46, with a standard deviation of 0.20. The probability of

unsatisfactory performance is 4.70 × 10⁻³. Figure 8 shows

the histogram and the probability distribution function of the

factor of safety. The histogram is slightly right skewed, with

a coefficient of skewness of 0.29. After 25 simulations, the

mean factor of safety is 1.46, with a standard deviation of

0.20 and a coefficient of skewness of 0.30. The mean probability of unsatisfactory performance is 4.70 × 10⁻³, with the 95% confidence interval between 4.50 × 10⁻³ and 4.90 × 10⁻³. Figure 9 shows the histogram of the probability of unsatisfactory

performance. The reliability index is calculated to be 2.32.

A sensitivity analysis (Fig. 10) shows Spearman rank correlation coefficients for all 19 input variables. It is interesting that many of the factors with major contributions to the

uncertainty of the factor of safety are not related to soil

property measurements. For example, Bjerrum's correction

factor for the undrained shear strength of the lacustrine clay,

the statistical uncertainty in the depth of the till, and the statistical uncertainty in the unit weight of the fill (which was

evaluated judgementally) are among the main factors affecting the reliability of the design. The results highlight the

significance of the additional uncertainty that could be introduced by the designer through the use of empirical factors

and subjective estimates of uncertainty. Although subjective

estimates based on experience are acceptable in practice,

quantitative estimates of uncertainty, such as variance, are

not yet reliable. Therefore care should be exercised in making such judgements.

First-order second-moment (FOSM) analysis

The FOSM method is an approximate approach based on

Taylor's series expansion of the performance function g(x₁, x₂, …, xₙ) about the mean values of the input variables. In slope analysis, the performance function

Fig. 10. Sensitivity analysis results. Spearman rank correlation coefficients for all input variables.

could be Bishop's method of slices, which is a function of a number of input variables x₁, x₂, …, xₙ representing soil properties,

pore pressure, and slope geometry. For uncorrelated input

variables, the mean and variance of the factor of safety are

given by eqs. [7] and [8], respectively:

[7]   E[FS] ≈ g(E[x₁], E[x₂], …, E[xₙ])

[8]   σ²[FS] ≈ Σᵢ₌₁ⁿ (∂g/∂xᵢ)² σ²[xᵢ]

where E[·] and σ²[·] denote the mean and variance, respectively. For most geotechnical models, the analytical evaluation of the derivatives ∂g/∂xᵢ is cumbersome. A finite

difference approach is commonly used to approximate the

partial derivatives as follows:

[9]   ∂g/∂xᵢ ≈ (FS₁ − FS₂)/(xᵢ¹ − xᵢ²) = ΔFS/Δxᵢ

where FS₁ and FS₂ are the factors of safety calculated with the input variable xᵢ set equal to xᵢ¹ and xᵢ², respectively, and all other input variables assigned their mean values. Usually, xᵢ¹ and xᵢ² are taken one standard deviation above and below the mean E[xᵢ]. To account for spatial variability and the reduction in the variance of the averaged parameters, the variance of the measured data, σ²[xᵢ], is reduced by a reduction factor f. Vanmarcke (1977a) suggested that the variance reduction factor can be approximated by

[10]   f ≈ 2r₀/L

where L is the length over which the parameter is being averaged.
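Equations [7] to [10] can be collected into a short routine. This is an illustrative sketch, not the authors' calculation: the function and parameter names are assumptions, and g stands for any factor-of-safety model:

```python
def fosm(g, means, sigmas, f=None):
    """First-order second-moment estimate of E[FS] and Var[FS].

    g      : performance function taking a list of input values
    means  : mean of each input variable
    sigmas : standard deviation of each input variable
    f      : optional variance reduction factors (2*r0/L, capped at 1.0)
    """
    f = f or [1.0] * len(means)
    e_fs = g(means)                         # eq. [7]
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sigmas)):
        hi = list(means); hi[i] = m + s
        lo = list(means); lo[i] = m - s
        dg = (g(hi) - g(lo)) / (2.0 * s)    # central finite difference, eq. [9]
        var += dg * dg * f[i] * s * s       # eq. [8] with reduction factor, eq. [10]
    return e_fs, var
```

For a linear performance function the estimate is exact; for slope models it inherits the linearization error discussed above.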

When the stability of James Bay dykes is analyzed

probabilistically using the FOSM method, the mean factor of

safety is estimated to be 1.46. Table 2 summarizes the calculations of the variance of the factor of safety. For computing

the variance reduction factor f, the length L is taken equal to

the length of the slip surface within each layer, except for

the fill unit weight, where L is taken equal to the length of

the embankment in cross section. The standard deviation of

the factor of safety is computed to be 0.19, thus the reliability index is 2.42. To estimate the probability of unsatisfactory performance, a form of the probability density function

of the factor of safety must be assumed. For normal and lognormal probability distributions, the probability of unsatisfactory performance is estimated to be 8.4 × 10⁻³ and 2.5 × 10⁻³, respectively. Table 3 compares the results of the FOSM

method and the spreadsheet-based Monte Carlo simulation.

The FOSM method appears to be a reasonable approach for

estimating the mean and variance of the factor of safety.

However, the uncertainty about the shape of the probability

density function of the factor of safety introduces uncertainties in estimating the probability of unsatisfactory performance, as illustrated in Table 3.

Simplified analysis

The spatial variability of soil properties and pore-water


Table 2. FOSM calculations of the variance of the factor of safety.

Input variable | σ²[x]: inherent variability | σ²[x]: systematic uncertainty | ∂FS/∂x | f ≈ 2r₀/L | Inherent variability (∂FS/∂x)²fσ²[x] | Systematic uncertainty (∂FS/∂x)²σ²[x]
Fill friction angle, φfill (°) | 1.00 | 3.00 | 0.009 | 1.00 | 0.0001 | 0.0003
Fill unit weight, γfill (kN/m³) | 1.00 | 1.00 | 0.061 | 0.24 | 0.0009 | 0.0037
Clay crust thickness, tcr (m) | 0.19a | 0.04 | 0.007 | 1.00 | 0.0000 | 0.0000
Shear strength of marine clay, SuM-1 (kPa) | 66.26 | 7.60 | 0.005 | 1.00 | 0.0019 | 0.0002
Shear strength of marine clay, SuM-2 (kPa) | 66.26 | 7.60 | 0.005 | 1.00 | 0.0019 | 0.0002
Shear strength of lacustrine clay, SuL (kPa) | 74.82 | 24.90 | 0.022 | 0.37 | 0.0129 | 0.0115
Depth of till, Dtill (m) | 0.00 | 1.00 | 0.055 | 0.00 | 0.0000 | 0.0030

Output: σ²[FS] = 0.0180 (inherent) + 0.0190 (systematic) = 0.0370; σ[FS] = 0.1920

aVariance is reduced by 80% to account for spatial averaging based on the assessment of Christian et al. (1994).

Table 3. Summary of the results of the different methods of analysis.

Method of analysis | E[FS] | σ[FS] | Skewness | Pu | β
Monte Carlo simulation | 1.46 | 0.20 | 0.30 | 4.70 × 10⁻³ | 2.32
FOSMa | 1.46 | 0.19 | Not available | 8.40 × 10⁻³; 2.50 × 10⁻³ | 2.42
Simplified analysis | 1.46 | 0.25 | 0.32 | 2.37 × 10⁻² | 1.84

aThe value 8.40 × 10⁻³ is based on the assumption that the probability density function of the factor of safety is normal, and the value 2.50 × 10⁻³ assumes the probability density function of the factor of safety is lognormal.

Table 4. Input variables and statistical parameters for the simplified analysis (based on Christian et al. 1994).

Input variable | Mean | Variance
Fill friction angle, φfill (°) | 30 | 4
Fill unit weight, γfill (kN/m³) | 20 | 2
Clay crust thickness, tcr (m) | 4 | 1
Shear strength of marine clay, SuM (kPa) | 34.5 | 66.26
Shear strength of lacustrine clay, SuL (kPa) | 31.2 | 74.82

Bias factors | Mean | Variance
Bjerrum vane correction factor for SuM, μM | 1.00 | 0.006
Bjerrum vane correction factor for SuL, μL | 1.00 | 0.023

pressure is a major source of parameter uncertainty. However, probabilistic analyses ignoring the issues of spatial

variability and statistical uncertainty and relying only on the

means, variances, and probability distributions of measured

data are not uncommon (e.g., Nguyen and Chowdhury 1984;

Wolff and Harr 1987; Duncan 2000). Such an approach, referred to hereafter as a simplified analysis, can be erroneous

and misleading.

To illustrate the errors incurred by the simplified approach,

the James Bay dykes are reanalyzed probabilistically, ignoring the spatial variability of soil properties. The analysis is

based directly on the probability distributions of the data

with no variance reduction and no considerations of statistical uncertainties. Table 4 summarizes the input variables and

the statistical parameters used in the analysis, based on

Christian et al. (1994). All variables are assumed to be normally distributed.

The deterministic critical slip surface almost coincided

with the surface of minimum reliability index according to Hassan and Wolff (1999). In trial simulations, the latter slip surface yielded a slightly higher

probability of unsatisfactory performance and was subsequently used in the probabilistic assessment. A Microsoft

Excel 97 model and a Monte Carlo simulation using 32 000

iterations and a seed number of 31 069 gave a mean factor

of safety of 1.46, with a standard deviation of 0.25 and a coefficient of skewness of 0.32. The probability of unsatisfactory performance is calculated to be 2.39 × 10⁻². Based on

the results of 25 simulations, the mean probability of unsatisfactory performance is 2.37 × 10⁻², with the 95% confidence interval between 2.34 × 10⁻² and 2.41 × 10⁻². The

reliability index is 1.84. Table 3 compares the results of the simplified analysis with those of the previous analyses. The simplified analysis overestimates the probability of unsatisfactory performance by a factor of 5. This is attributed to the significant reduction in the uncertainty due to soil variability as a result of spatial averaging, which is taken into account in all the analyses but the simplified analysis. In slopes dominated by the spatial variability of soil properties, the simplified analysis could significantly overestimate the probability of unsatisfactory performance.

© 2002 NRC Canada
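The simplified Monte Carlo procedure described above can be sketched in a few lines. The code below is an illustrative stand-in, not the paper's spreadsheet model: the input means and standard deviations and the linear performance function are assumed for illustration (they are not the values of Table 4), and NumPy's generator replaces @Risk's sampler; the seed 31 069 merely echoes the text.

```python
import numpy as np

rng = np.random.default_rng(31069)   # any seed works; 31 069 echoes the text
n_iter = 32_000                      # iteration count used in the paper

# All inputs normal, using the statistics of the raw data directly:
# no variance reduction, no statistical uncertainty -- the "simplified" setup.
# Values below are assumed for illustration only.
mu   = rng.normal(1.0, 0.075, n_iter)   # Bjerrum vane correction factor
Su_L = rng.normal(31.0, 8.7, n_iter)    # lacustrine clay strength, kPa
Su_M = rng.normal(34.5, 8.1, n_iter)    # marine clay strength, kPa

# Hypothetical performance function standing in for the Bishop computation:
# corrected average strength divided by an assumed driving shear stress.
FS = mu * (0.6 * Su_L + 0.4 * Su_M) / 22.0

Pu   = np.mean(FS < 1.0)               # fraction of iterations with FS < 1
beta = (FS.mean() - 1.0) / FS.std()    # reliability index
```

In a run of m iterations, Pu is simply the fraction of iterations with FS < 1; repeating the whole simulation (the paper uses 25 repetitions) yields a confidence interval on Pu.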

Conclusions

Conventional slope design practice addresses uncertainty

only implicitly and in a subjective manner, thus compromising the adequacy of projections. Without proper consideration of uncertainty, the factor of safety alone can give a

misleading sense of safety and is not a sufficient safety indicator. Probabilistic slope stability analysis is a rational

means to incorporate quantified uncertainties into the design

process. An important conclusion of this study is that probabilistic analyses can be applied in practice without an extensive effort beyond that needed in a conventional analysis.

The stated obstacles impeding the adoption of such techniques into geotechnical practice are more apparent than real.

The developed probabilistic spreadsheet approach, based

on Monte Carlo simulation, makes use of Microsoft Excel

97 and @Risk software, which are familiar and readily available to most engineers. The underlying procedures and concepts are simple and transparent, requiring only fundamental

knowledge of statistics and probability theory. At the same

time, the analysis accounts for the spatial variability of the

input variables, the statistical uncertainty due to limited data,

and the bias in the empirical factors and correlations used.

The approach is flexible in handling real slope problems, including various loading conditions, complex stratigraphy, c–φ soils, and circular and noncircular slip surfaces.

Probabilistic slope stability analysis provides practicing

engineers with valuable insights that cannot be reached otherwise. For example, the level of uncertainty in the factor of

safety is quantified through the variance of the factor of

safety and the probability of unsatisfactory performance.

This could have an important impact on decisions about a

design factor of safety. If the reliability of the computed factor of safety is deemed high, the profession may be willing

to adopt lower design factors of safety than usual, provided

that the serviceability of the slope is not compromised. With

increasingly sparse funds, many agencies and organizations prioritize slope repair and maintenance expenditures based on safety levels, as is the case for hydraulic structures.

Comparing the safety of different structures based on the

factor of safety alone is inadequate because the underlying

sources and levels of uncertainty are not addressed. By combining the most likely value of the factor of safety and the

uncertainty in that value, the probability of unsatisfactory

performance and the reliability index provide a sounder basis for the comparison.
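As a quick check of the figures quoted above: if FS is treated as normally distributed, the reliability index is β = (E[FS] − 1)/σ[FS], and a nominal probability of unsatisfactory performance follows from the normal lower tail.

```python
from statistics import NormalDist

mean_FS, sd_FS = 1.46, 0.25           # simplified-analysis results from the text
beta = (mean_FS - 1.0) / sd_FS        # = 1.84, matching the reported index
Pu_normal = NormalDist().cdf(-beta)   # nominal P(FS < 1), about 3.3e-2
```

The normal-tail estimate (about 3.3 × 10⁻²) differs from the simulated value of 2.39 × 10⁻² because the simulated FS distribution is positively skewed (skewness 0.32), which thins the lower tail; this is one reason for computing Pu directly from the simulation rather than from β alone.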

The practical value of quantifying the relative contributions of the various sources of uncertainty to the overall uncertainty of the factor of safety through sensitivity analyses, using the Spearman rank correlation coefficient, cannot be overstated. Such information allows available resources, whether intellectual or physical, to be rationally allocated towards reducing the uncertainties of the variables with the largest impact on design. The sensitivity analyses undertaken

in this study showed that the uncertainty of Bjerrum's vane correction factor is substantial and could have a large impact on the reliability of the design. This warns that the reliability of a design can be controlled by the uncertainty of empirical factors and correlations without the designer even realizing this. Understanding the limitations and, more importantly, the reliability of such factors and correlations prior to using them is essential.
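A sensitivity measure of this kind can be computed directly from the simulation records: rank the sampled values of each input and of the factor of safety, then correlate the ranks. The sketch below uses a hypothetical two-variable performance function (not the paper's) in which the vane correction factor dominates.

```python
import numpy as np

def spearman(x, y):
    # Spearman rank correlation = Pearson correlation of the ranks
    # (no tie handling needed here, since the inputs are continuous)
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(1)
n = 10_000
mu    = rng.normal(1.00, 0.10, n)   # vane correction factor (assumed c.o.v. 10%)
gamma = rng.normal(20.0, 0.50, n)   # fill unit weight, kN/m3 (assumed c.o.v. 2.5%)
FS = mu * 32.0 / (1.5 * gamma)      # illustrative performance function

r_mu, r_gamma = spearman(mu, FS), spearman(gamma, FS)
```

Here the rank correlation for mu comes out far larger in magnitude than that for gamma, flagging the vane correction factor as the variable most worth further investigation.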

It is our view that combining conventional deterministic

slope analysis and probabilistic analysis will be beneficial to

slope engineering practice and will enhance the decision-making process. It is important to note, however, that simplified probabilistic analyses can be erroneous and misleading.

For example, ignoring the spatial variability of soil properties

and assuming perfect autocorrelations as in our simplified

analysis can significantly overestimate the probability of unsatisfactory performance.
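The size of this effect can be illustrated with Vanmarcke's variance function. For an exponential autocorrelation ρ(r) = exp(−r/r0), the variance of the average of a property over a length L equals the point variance multiplied by Γ(L) = 2(r0/L)²(L/r0 − 1 + exp(−L/r0)). The numbers below are assumed for illustration, not taken from the James Bay analysis.

```python
import math

def variance_function(L, r0):
    """Gamma(L): ratio of the variance of the spatial average over length L
    to the point variance, for exponential autocorrelation rho(r) = exp(-r/r0)."""
    if L == 0.0:
        return 1.0           # no averaging, no reduction
    b = L / r0
    return 2.0 / b**2 * (b - 1.0 + math.exp(-b))

# Averaging an undrained strength (assumed point sigma = 8 kPa) over a 25 m
# slip-surface segment, with an assumed 5 m autocorrelation distance:
f = variance_function(25.0, 5.0)   # about 0.32: variance cut to a third
sigma_avg = 8.0 * math.sqrt(f)     # about 4.5 kPa instead of 8 kPa
```

Assuming perfect autocorrelation amounts to setting Γ = 1 for every segment, so a simplified analysis carries the full point variance into the factor of safety; with the assumed numbers above, that is the mechanism behind the factor-of-5 overestimate of Pu.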

Acknowledgements

The authors would like to thank the Natural Sciences and

Engineering Research Council of Canada for providing the

financial support for this research. We are also grateful for

useful discussions with colleagues at the University of Alberta and elsewhere.

References

Alonso, E.E. 1976. Risk analysis of slopes and its application to slopes in Canadian sensitive clays. Géotechnique, 26: 453–472.

Ang, A.H-S., and Tang, W.H. 1975. Probability concepts in engineering planning and design. Vol. I. Basic principles. John Wiley, New York.

Ang, A.H-S., and Tang, W.H. 1984. Probability concepts in engineering planning and design. Vol. II. Decision, risk and reliability. John Wiley, New York.

Baecher, G.B. 1987. Statistical analysis of geotechnical data. Final Report GL-87-1, U.S. Army Corps of Engineers, Waterways Experiment Station, Vicksburg, Miss.

Bergado, D.T., Patron, B.C., Youyongwatana, W., Chai, J.C., and Yudhbir. 1994. Reliability-based analysis of embankment on soft Bangkok clay. Structural Safety, 13: 247–266.

Bishop, A.W. 1955. The use of the slip circle in the stability analysis of slopes. Géotechnique, 5: 7–17.

Bjerrum, L. 1972. Embankments on soft ground. In Proceedings of the American Society of Civil Engineers Specialty Conference on Performance of Earth and Earth-Supported Structures, Purdue University, Lafayette, Ind., June 11–14, Vol. 2, pp. 1–54.

Chowdhury, R.N. 1984. Recent developments in landslide studies: probabilistic methods. In Proceedings of the 4th International Symposium on Landslides, Toronto, Ont., September 16–21. Canadian Geotechnical Society, Vol. 1, pp. 209–228.

Chowdhury, R.N., and Tang, W.H. 1987. Comparison of risk models for slopes. In Proceedings of the 5th International Conference on Applications of Statistics and Probability in Soil and Structural Engineering, Vancouver, B.C., May 25–29, Vol. 2, pp. 863–869.

Christian, J.T. 1996. Reliability methods for stability of existing slopes. In Proceedings of Uncertainty '96. Geotechnical Special Publication 58, Vol. 2, pp. 409–419.

Christian, J.T., Ladd, C.C., and Baecher, G.B. 1994. Reliability and probability in stability analysis. Journal of Geotechnical Engineering, ASCE, 120: 1071–1111.

Decisioneering Inc. 1996. Crystal Ball: Monte Carlo simulation software for spreadsheet risk analysis. Decisioneering Inc., Denver, Colo.


DeGroot, D.J. 1996. Analyzing spatial variability of in situ soil properties. In Proceedings of Uncertainty '96. Geotechnical Special Publication 58, Vol. 1, pp. 210–238.

DeGroot, D.J., and Baecher, G.B. 1993. Estimating autocovariance of in-situ soil properties. Journal of Geotechnical Engineering, ASCE, 119(1): 147–166.

Deutsch, C.V., and Journel, A.G. 1998. GSLIB: geostatistical software library and user's guide. Oxford University Press, New York, 543 pp.

Duncan, J.M. 2000. Factors of safety and reliability in geotechnical engineering. Journal of Geotechnical and Geoenvironmental Engineering, ASCE, 126: 307–316.

El-Ramly, H. 2001. Probabilistic analyses of landslide hazards and risks: bridging theory and practice. Ph.D. thesis, University of Alberta, Edmonton, Alta.

Harr, M.E. 1977. Mechanics of particulate media: a probabilistic approach. McGraw-Hill, New York, 543 pp.

Harr, M.E. 1987. Reliability-based design in civil engineering. McGraw-Hill Inc., New York, 290 pp.

Hassan, A., and Wolff, T. 1999. Search algorithm for minimum reliability index of earth slopes. Journal of Geotechnical and Geoenvironmental Engineering, ASCE, 125: 301–308.

Honjo, Y., and Kuroda, K. 1991. A new look at fluctuating geotechnical data for reliability design. Soils and Foundations, 31: 110–120.

Jaksa, M.B., Brooker, P.I., and Kaggwa, W.S. 1997. Inaccuracies associated with estimating random measurement errors. Journal of Geotechnical and Geoenvironmental Engineering, ASCE, 123: 393–401.

Kulhawy, F.H., and Trautmann, C.H. 1996. Estimation of in-situ test uncertainty. In Proceedings of Uncertainty '96. Geotechnical Special Publication 58, Vol. 1, pp. 269–286.

Kulhawy, F.H., Roth, M.S., and Grigoriu, M.D. 1991. Some statistical evaluation of geotechnical properties. In Proceedings of the 6th International Conference on Applications of Statistics and Probability in Civil Engineering, Mexico City, Vol. 2, pp. 705–712.

Lacasse, S., and Nadim, F. 1996. Uncertainties in characterizing soil properties. In Proceedings of Uncertainty '96. Geotechnical Special Publication 58, Vol. 1, pp. 49–75.

Ladd, C.C., Dascal, O., Law, K.T., Lefebvre, G., Lessard, G., Mesri, G., and Tavenas, F. 1983. Report of the subcommittee on embankment stability, annex II. Committee of Specialists on Sensitive Clays on the NBR Complex, Société d'Énergie de la Baie James, Montréal, Que.

Law, A.M., and McComas, M.G. 1986. Pitfalls in the simulation of manufacturing systems. In Proceedings of the Winter Simulation Conference, Washington, D.C., pp. 539–542.

Li, K.S. 1991. Discussion: probabilistic potentiometric surface mapping. Journal of Geotechnical Engineering, ASCE, 117: 1457–1458.

Lumb, P. 1966. The variability of natural soils. Canadian Geotechnical Journal, 3: 74–97.

Major, G., Ross-Brown, D.M., and Kim, H-S. 1978. A general probabilistic analysis for three-dimensional wedge failures. In Proceedings of the 19th U.S. Symposium on Rock Mechanics, Stateline, Nev., 1–3 May, pp. 45–56.

Matsuo, M., and Kuroda, K. 1974. Probabilistic approach to design of embankments. Soils and Foundations, 14: 1–17.

Microsoft Corporation. 1997. Microsoft Excel 97. Microsoft Corporation, Redmond, Wash.

Mostyn, G.R., and Li, K.S. 1993. Probabilistic slope stability analysis: state-of-play. In Probabilistic methods in geotechnical engineering. Edited by K.S. Li and S.-C.R. Lo. A.A. Balkema, Rotterdam. pp. 89–109.

Neter, J., Wasserman, W., and Kutner, M.H. 1990. Applied linear statistical models. Richard D. Irwin Inc., Boston.

Nguyen, V.U., and Chowdhury, R.N. 1984. Probabilistic study of spoil pile stability in strip coal mines: two techniques compared. International Journal of Rock Mechanics, Mining Science and Geomechanics, 21: 303–312.

Nguyen, V.U., and Chowdhury, R.N. 1985. Simulation for risk analysis with correlated variables. Géotechnique, 35: 47–58.

Palisade Corporation. 1996. @Risk: risk analysis and simulation add-in for Microsoft Excel or Lotus 1-2-3. Palisade Corporation, Newfield, N.Y.

Priest, D.S., and Brown, E.T. 1983. Probabilistic stability analysis of variable rock slopes. Transactions of the Institute of Mining and Metallurgy, 92: A1–A12.

Soulié, M., Montes, P., and Silvestri, V. 1990. Modelling spatial variability of soil parameters. Canadian Geotechnical Journal, 27: 617–630.

Tang, W.H., Yucemen, M.S., and Ang, A.H.S. 1976. Probability-based short-term design of slopes. Canadian Geotechnical Journal, 13: 201–215.

Tobutt, D.C. 1982. Monte Carlo simulation methods for slope stability. Computers and Geosciences, 8: 199–208.

U.S. Army Corps of Engineers. 1992. Reliability assessment of navigation structures. Engineering Technical Letter 1110-2-532, U.S. Army Corps of Engineers, Washington, D.C.

U.S. Army Corps of Engineers. 1995. Introduction to probability and reliability methods for use in geotechnical engineering. Engineering Technical Letter 1110-2-547, U.S. Army Corps of Engineers, Washington, D.C.

Vanmarcke, E.H. 1977a. Probabilistic modeling of soil profiles. Journal of the Geotechnical Engineering Division, ASCE, 103: 1227–1246.

Vanmarcke, E.H. 1977b. Reliability of earth slopes. Journal of the Geotechnical Engineering Division, ASCE, 103: 1247–1265.

Vanmarcke, E.H. 1980. Probabilistic stability analysis of earth slopes. Engineering Geology, 16: 29–50.

Vanmarcke, E.H. 1983. Random fields: analysis and synthesis. MIT Press, Cambridge, Mass.

Whitman, R.V. 1984. Evaluating calculated risk in geotechnical engineering. Journal of Geotechnical Engineering, ASCE, 110: 145–188.

Wolff, T.F. 1996. Probabilistic slope stability in theory and practice. In Proceedings of Uncertainty '96. Geotechnical Special Publication 58, Vol. 2, pp. 419–433.

Wolff, T.F., and Harr, M.E. 1987. Slope design for earth dams. In Proceedings of the 5th International Conference on Applications of Statistics and Probability in Soil and Structural Engineering, Vancouver, B.C., May 25–29, Vol. 2, pp. 725–732.

Yucemen, M.S., and Al-Homoud, A.S. 1990. Probabilistic three-dimensional stability analysis of slopes. Structural Safety, 9: 1–20.

List of symbols

b  slice width
B  bias correction factor
C  cohesion
Dtill  depth of till
E[ ]  mean
f  variance reduction factor
F(x)  cumulative probability distribution function of variable x
FS  factor of safety
g(x1, x2, …, xn)  performance function
h  total height of slice
hcr  height in clay crust
hfill  height in embankment fill
hL  height in lacustrine clay
hM  height in marine clay
l1, l2, l3  segments of slip surface of lengths l1, l2, and l3
L  length over which soil parameters are averaged
m  number of iterations in Monte Carlo simulation
n  number of observations
Pu  probability of unsatisfactory performance
r  separation distance between two variables in the space of a random field
r0  autocorrelation distance
ru  pore-pressure ratio
Su  undrained shear strength
SuL  undrained shear strength of lacustrine clay
SuM  undrained shear strength of marine clay
t  trend component for variable x
tcr  clay crust thickness
ti  trend component at location i
u  pore-water pressure
V  input variable adjusted for statistical uncertainty and bias
W  total weight of slice
x  random variable
xi  value of random variable x at location i
X(Δz)  local average of variable x over a length Δz
z  distance
Z0  separation distance between two intervals Δz, Δz′
Z1  distance from the beginning of the first interval to the beginning of the second interval
Z2  distance from the beginning of the first interval to the end of the second interval
Z3  distance from the end of the first interval to the end of the second interval
α  angle to the horizontal of slice base
β  reliability index
δ  scale of fluctuation
ε  residual component, equal to the difference between x and t
εi  residual component at location i
φ  friction angle
φ′  effective friction angle
φfill  fill friction angle
variable representing soil stratigraphy
γ  bulk unit weight
γfill  fill unit weight
soil parameter
μ  Bjerrum vane correction factor
μL, μM  Bjerrum vane correction factor for SuL and SuM, respectively
ρ  correlation coefficient between two variables
σ[ ]  standard deviation
σ2[ ]  variance
σ2Δz  variance of X(Δz)
Δz  interval of length Δz
Δz′  interval of length Δz′
Γ(Δz)  variance function

Appendix A

This appendix illustrates the general structure of the

spreadsheet model used in the probabilistic analysis of the

James Bay dyke.
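The factor of safety tables of the spreadsheet (Fig. A2) implement a method-of-slices computation. As a minimal sketch of what such a table evaluates, the fragment below codes Bishop's simplified method (Bishop 1955) for a circular slip surface; the three-slice data are invented for illustration, and the paper's model additionally evaluates the fill, crust, and clay terms per stratum.

```python
import math

def bishop_fs(slices, tol=1e-6, max_iter=100):
    """Bishop's simplified method for a circular slip surface.
    Each slice: (b, W, alpha_deg, c, phi_deg, u) -- width, weight, base angle,
    cohesion, friction angle, pore pressure. Iterates because FS appears on
    both sides of the equation."""
    fs = 1.0
    for _ in range(max_iter):
        num = den = 0.0
        for b, W, alpha_deg, c, phi_deg, u in slices:
            alpha = math.radians(alpha_deg)
            tan_phi = math.tan(math.radians(phi_deg))
            m_alpha = math.cos(alpha) * (1.0 + math.tan(alpha) * tan_phi / fs)
            num += (c * b + (W - u * b) * tan_phi) / m_alpha
            den += W * math.sin(alpha)
        fs_new = num / den
        if abs(fs_new - fs) < tol:
            return fs_new
        fs = fs_new
    return fs

# Hypothetical three-slice example in undrained clay (c = Su, phi = 0):
slices = [(2.0,  80.0, 10.0, 30.0, 0.0, 0.0),
          (2.0, 120.0, 25.0, 30.0, 0.0, 0.0),
          (2.0,  90.0, 45.0, 30.0, 0.0, 0.0)]
print(round(bishop_fs(slices), 2))   # prints 1.65
```

With φ = 0 the slice coefficient m_alpha no longer depends on FS, so the iteration converges immediately; in the general c–φ case a few iterations are needed, which is why the spreadsheet tables recompute FS until it stabilizes.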


Fig. A1. Spreadsheet model, section 1: dyke geometry, stratigraphy, and input variables.


Fig. A2. Spreadsheet model, section 2: tables of factor of safety computations. G.S., ground surface; S.S., slip surface.

