From Deterministic to Probabilistic Way of Thinking in Structural Engineering
Prof. Ing. Pavel Marek, DrSc.
Department of Civil Engineering, VŠB TU Ostrava, Czech Republic

Prof. Jacques Brozzetti, Dr.
Ecole Nationale des Ponts et Chaussées, Paris, France

Fall or Rise of the Prestige of Structural Engineering?


One of the speakers at the 1997 Congress of the American Society of Civil Engineers
pointed out a significant fall in the prestige of structural engineering [1]. In the history of
engineering education at U.S. civil engineering faculties, this field used to occupy the highest
echelon; the speaker therefore sought an explanation for this fall. He stated, inter alia, that the
causes might include the preference granted to other specializations, driven by social interests
such as the development of transport networks and environmental protection. In his opinion,
however, the fall might also be connected with the development of computers and structural
design codes: one could hear that the education of structural engineers had gradually
concentrated on the application of sophisticated software, requiring from the structural
engineer merely the entry of adequate input data, while the computer almost immediately
produced the dimensions of the structure, its reliability assessment and the complete
documentation required by the respective standards. In this process the designer need not even
know the details of dimensioning or the substance of the reliability assessment. Is this opinion
justified? Does the development of computer-aided structural design really belong among the
causes of the above-mentioned fall of prestige observed not only in the U.S.A., but in other
countries as well?

Is the Structural Engineer the Creator of Structures or merely an Interpreter of Codes?
Together with the manufacturer and erector, the designer has been, is and will always remain
the creator of the structure. His activities are based on professional knowledge, experience and
cooperation with related professions. Although the computer revolution is providing ever more
powerful instruments facilitating and accelerating his work, these instruments, and however
perfect the standards, can never replace the designer-creator, who is responsible for the
effectiveness and reliability of his work.
The design codes and standards cannot cover all the situations, loading alternatives,
performance conditions, etc. that the designer may encounter. Often he has to decide himself,
on the basis of his own knowledge and experience, in accordance with the "rules of the game"
of reliability assessment. In respect of the safety, serviceability and durability of structures, the
development of codes and standards in the past few decades has resulted in a certain
dampening of the creative role of the designer, who has become merely an interpreter of the
rules and criteria formulated in the standard. The "rules of the game" (i.e., the theoretical
foundations of reliability assessment) of Partial Factor Design (PFD) are presented in the
standards in an excessively simplified form, and the designer is not acquainted with their
substance during his education. Teachers often use the wording "the code states...", thus
avoiding the explanation of problems with which they are often not thoroughly acquainted
themselves. To be accurate, they cannot be thoroughly acquainted with them, as the
commentaries and data explaining fully and consistently the background of the codes, the
various simplifications and the influence of calibration are not available. The consequence of
this development is that the students of civil engineering faculties are educated primarily in the
interpretation of standards and codes and not in the engineering way of thinking and
conceptual work. This fact deserves full attention.
Let us recall the related experience with the introduction of the PFD concept into US
design codes. In the field of steel structures, a standard based on the LRFD (Load and
Resistance Factor Design) method was issued in the U.S.A. in 1986. This method is essentially
an analogy of the PFD method found in the Eurocodes. The code was to replace the standard
based on the deterministic method of Allowable Stress Design (ASD). Although the issue of
the LRFD standard was preceded by an extensive explanatory campaign and training courses
emphasizing the advantages of the LRFD method, today, 15 years after its introduction, it is
applied by merely one quarter of designers [2], while the prevailing majority of designers in the
country of the tallest buildings, the biggest bridges and other unique structures has remained
faithful to the excessively simplified but understandable deterministic ASD method. This
seemingly conservative attitude of designers is usually explained by unsatisfactory teaching of
the LRFD method at universities. However, the principal causes of the reserve shown by
experienced US designers may include their feeling that the LRFD method, developed in the
pre-computer era, was offered to them too late and no longer provides qualitatively new
possibilities for reliability assessment corresponding to the computer era.
From the Slide-Rule Era to the Computer Era in Structural Design
In courses on steel, concrete and timber structures, students of civil engineering
faculties may hear from some of their instructors that, due to the introduction of the Eurocodes,
"nothing much will happen in the field of reliability assessment in the next few decades". This
statement must be challenged. The spread of rapidly improving computers to the desk of every
structural designer has produced profound qualitative and quantitative changes without analogy
in the whole history of the field. The growing computer potential improves the prerequisites
for the "re-engineering" of the whole design process (i.e., its fundamental reassessment and
reworking) to adapt it to entirely new conditions and possibilities. We can follow with
admiration the fast development of software for the analysis and dimensioning of structures
according to standards and for the production of the respective drawings. At the same time, it
must be emphasized that entirely unsatisfactory attention has so far been paid to the
preparation of concepts and corresponding standards based on qualitatively improved methods
of reliability assessment that match the available computer potential.
Since the early Sixties, many national and international specifications for structural design
based on a deterministic concept have been replaced by "semi-probabilistic" Partial Factor
Design (PFD), such as that found in the Eurocodes. The PFD concepts have been developed
using statistics, reliability theory and probability, but without considering the computer
revolution. The assessment format in the codes is somewhat similar to the fully deterministic
scheme applied in earlier codes, except that two partial factors are applied instead of a single
factor. The application of PFD does not require the designer to understand the rules hidden in
the background of the codes. The semi-probabilistic background of the reliability assessment
procedure has been considered by those writing the codes; however, the calibration and the
numerous simplifications introduced in the final format of the codes have affected the concept
in such a way that it is better called "prescriptive" than "semi-probabilistic", see [2]. The
designer's activities are limited to the interpretation of equations, criteria, instructions, factors
and "black boxes" contained in the codes. The reliability check can be conducted using a
calculator, a slide rule or even by hand, while the modern computer serves only as a "fast
calculator". The actual probability of failure and the reserves in bearing capacity cannot be
explicitly evaluated using PFD codes. From the designer's point of view, the application of
PFD in practice is still deterministic. The designer's direct involvement in the assessment
process is not assumed and, therefore, his or her creativity is suppressed.
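As a generic schematic of this two-factor format (an illustration of the typical structure of such a check, not a clause quoted from any particular code), the verification compares an amplified load effect with a reduced resistance:

\[ \gamma_F \, S_k \;\le\; \frac{R_k}{\gamma_M} \]

Here S_k is a characteristic load effect amplified by the partial factor \gamma_F, and R_k is a characteristic resistance reduced by the partial factor \gamma_M; the inequality itself is evaluated deterministically.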
Has the computer potential created the prerequisites for a qualitative improvement of the
partial factors method? The answer can be illustrated by the following analogy: is it sufficient
to attach a highly efficient jet engine to the gondola of a balloon in order to achieve
incomparably higher velocity and efficiency? Clearly, the combination of the balloon and the
jet engine will not create a higher-quality means of transport. Analogously, the partial factors
method, based on numerous limitations and simplifying assumptions, cannot be raised to a
qualitatively higher level of structural reliability assessment by combining it with computer
potential. It can be concluded that the computer revolution opens the door to qualitatively
new, fully probabilistic structural reliability concepts.

Application of Elite Research Results to Structural Design Codes


The results of elite research are usually applied to specific fields (offshore
structures, space programmes) by top-level experts. Conferences, however, lack papers by
research scientists explaining how their concepts could be applied to the standards and codes
used by hundreds of thousands of designers in their everyday work. Who will bring the
message from the elite researchers to the rank-and-file designers? An understandable
explanation of the scientific methods of reliability assessment used in the standards accepted
by structural designers worldwide is a highly challenging task. Without it, however, the results
of elite research remain merely the subject of articles in scientific periodicals, and the designer
remains merely an interpreter of "prescriptive" codes.
Research pays attention to the development of risk engineering, "fuzzy sets" and
other methods, while the designer still lacks a fundamental, understandable and consistent
method for determining the probability of failure. It is therefore necessary to reassess the
"rules of the game" of reliability assessment, beginning with the load definition: the
present-day expression of load in standards and codes by a characteristic value and load
factors must be replaced with a qualitatively higher form enabling the loading history to be
taken into account (such as the "load duration curves" [3]). With reference to bearing capacity,
it is necessary to provide a "reference level" applicable to the computation of the probability
of failure. Reliability must then be expressed by comparing the computed probability of failure
with the target design probability.
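Schematically (a generic formulation consistent with the Pf < Pd check cited later in this paper, not the notation of any specific code), with S denoting the load effect and R the resistance referred to such a reference level, the check reads:

\[ P_f = P\,(R - S < 0) \;<\; P_d \]

where P_f is the computed probability of failure and P_d is the target design probability.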
The awareness of the designer is a necessary prerequisite for the practical
application of the probabilistic concept of reliability assessment. Let us therefore turn our
attention to the education of students at civil engineering faculties and of designers in
postgraduate courses.

Deterministic or Probabilistic Approach in Education?


Let us ask these questions: Is the approach applied in our courses to the solution of technical
problems in structural design deterministic or probabilistic? Are instructors infusing a
deterministic understanding into the "knowledge base" of their students, or is the fact that we
live in a world governed by random variables already accepted and applied in the educational
process? In courses such as Statistics and Probability Models in Structural Engineering, the
common textbooks are based on a "classical" approach to statistics and probability theory [4].
Such an approach is limited to analytical and numerical solutions and does not allow for a
transparent analysis of reliability functions that depend on the interaction of several random
variables. The textbooks mostly remain silent on common real-world problems, such as the
probability of failure of a structural component exposed to variable load combinations, in
which one might consider the contributions of variable yield stress, variable geometrical
properties and random imperfections. In structural design courses, the interpretation and
application of the existing codes are emphasized; however, students use the codes without a
full understanding of the actual reliability assessment rules and of the meaning of the factors
used to express the safety, durability and serviceability of structural components.

Teaching Reliability
Advances in computer technology allow simulation to be used; see, for example, [3]. The
direct Monte Carlo simulation technique, applied to basic problems in statistics and to
structural reliability assessment, has been taught at the graduate and undergraduate levels
since 1989 at San Jose State University, California, and since 1996 at the Department of Civil
Engineering, VŠB TU Ostrava, Czech Republic. The positive response of the students and
their understanding have encouraged the instructors. A team of undergraduate students
developed, for example, a study showing that the PFD method does not lead to a consistent
level of safety (see "Parametric Study: LRFD vs. SBRA", Probabilistic Engineering Mechanics
14, 1999, pp. 109-118). The new generation of civil engineers seems eager to apply advanced
computer technology to the fullest, including the application of simulation techniques to the
analysis of multi-variable problems.
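A minimal classroom-style sketch of the direct Monte Carlo technique is given below (in Python; all distributions, parameter values and the target probability are illustrative assumptions, not data taken from [3] or from the SBRA tools):

# Direct Monte Carlo estimate of the probability of failure of a steel
# tension member. All distributions and parameter values are assumed
# for illustration only.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 1_000_000                           # number of simulation trials

# Random input variables (illustrative choices):
fy = rng.normal(280.0, 20.0, n)         # yield stress [MPa]
area = rng.normal(1000.0, 30.0, n)      # cross-sectional area [mm^2]
dead = rng.normal(80.0, 8.0, n)         # dead-load effect [kN]
live = rng.uniform(0.0, 120.0, n)       # variable-load effect [kN]

# Safety margin per trial: resistance minus load effect.
resistance = fy * area / 1000.0         # axial capacity [kN]
load_effect = dead + live
failures = np.count_nonzero(resistance - load_effect < 0.0)

pf = failures / n                       # estimated probability of failure
pd = 7.0e-5                             # assumed target design probability
print(f"Pf = {pf:.2e}; the check Pf < Pd is "
      f"{'satisfied' if pf < pd else 'not satisfied'}")

By varying the histograms and parameters, students can immediately observe the effect of each random variable on Pf, exactly the kind of multi-variable interaction that the classical analytical approach cannot make transparent.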
TERECO Project
With reference to the improvements expected in the field of structural reliability assessment,
the training of students and designers ranks among the most important tasks. What starting
point should be chosen? A transition to the qualitatively higher probabilistic concepts will
require the designer to change his way of thinking, i.e., to replace his current "deterministic
thought-process" with a probabilistic one. The professional EC committees consider the
training of designers in this respect highly desirable. For this reason, the long-term TERECO
project (TEaching REliability COncepts using simulation techniques, see [5]) has been
sponsored through the Leonardo da Vinci Agency in Brussels. The resulting product of the
project, involving the work of 33 authors from eight European countries and from the U.S., is
the textbook "Probabilistic Assessment of Structures Using Monte Carlo Simulation" [6]. The
book acquaints the reader with the basis of a fully probabilistic structural reliability assessment
concept, using as a tool the transparent SBRA method (Simulation-Based Reliability
Assessment, see the textbook [3] and the home pages www.itam.cas.cz/SBRA and
www.fast.vsb.cz/science/sbra/default.htm). The concept allows for bypassing the
"design-point" approach as well as the load and resistance factors, and leads to the reliability
check expressed by Pf < Pd, comparing the calculated probability of failure Pf with the target
design probability Pd given in the codes. The application of SBRA is explained in the book
using 150 solved examples. On the attached CD-ROM the reader will find the input files and
computational tools enabling the duplication of the examples on a PC, a pilot database of
mechanical properties (expressed by histograms) of selected structural steel grades, selected
histograms (loads, imperfections and more), manuals for the computer programs and selected
presentations of the examples (Microsoft PowerPoint). The book should serve as an aid in
teaching undergraduate and graduate students, introducing them to the strategy of fully
probabilistic reliability assessment of elements, components, members and simple systems
using direct Monte Carlo simulation and modern personal computers.

Summary and Conclusions
The structural engineering profession needs new approaches if we want to provide the best
possible service to society. We have to consider the transition from the deterministic "way of
thinking" to open-minded probabilistic concepts accepting the random character of the
individual variables involved as well as their interaction. Tools such as simulation techniques
and powerful personal computers will contribute to reaching such goals. Students find these
techniques easy to learn, and thus the instructor does not need to take a great deal of
classroom time to explain the theoretical background. Once in the computer lab, students can
explore to their heart's content and gain a fuller understanding of the effect of each parameter
on the variability of the final answer. With this understanding, students are better equipped to
make decisions about the trade-offs that need to be made, for example, between service life
and safety. The simulation technique should be included in the curricula of undergraduate and
graduate students and in the corresponding textbooks to prepare them for the types of
problems they will encounter in the real world, especially for the application of probabilistic
structural reliability assessment concepts in the new generation of codes expected to be
introduced in the near future. Such an approach will make the engineer the creator of the
structure and may bring the prestige of structural engineering back to one of its highest
positions.
References:
[1] Sherman, D. R. (1997). Education "Structural Engineering Practice". Keynote lecture,
ASCE Structures Congress XV, Portland. Unpublished.
[2] Iwankiw, N. (2000). AISC, Chicago. Personal communication.
[3] Marek, P., Guštar, M., and Anagnos, T. (1995). Simulation-Based Reliability Assessment
for Structural Engineers. CRC Press, Boca Raton, Florida, U.S.A.
[4] Anagnos, T., and Marek, P. (1996). Application of Simulation Techniques in Teaching
Reliability Concepts. Proceedings of the Frontiers in Education Conference, Salt Lake City,
October 1996.
[5] Brozzetti, J., Guštar, M., Ivanyi, M., Kowalczyk, R., Marek, P., Vaitkevicius, V., et al.
(1998-2001). TERECO: Teaching Reliability Concepts Using Simulation. Leonardo da Vinci
Programme, European Commission, Project No. CZ/98/1/82502/PI/I.1.1.a/FPI.
[6] Marek, P., Brozzetti, J., and Guštar, M. (eds.) (2001). Probabilistic Assessment of
Structures Using Monte Carlo Simulation: Background, Exercises, Software. ITAM, Academy
of Sciences of the Czech Republic, Prague. ISBN 80-86246-08-6 (in print).