Keith Linard
Senior Lecturer, School of Civil Engineering
University College UNSW (Australian Defence Force Academy)
Email: keithlinard#@#yahoo.co.uk (Remove hashes to email)
SUMMARY
This paper addresses the practical issue of achieving quality assurance in system dynamics
modelling projects. It first provides a brief overview of system dynamics, then suggests criteria
for determining contexts where system dynamics modelling solutions might be appropriate. The
main focus of the paper is the development of a structured methodology for a system dynamics
project. It summarises the key validation tests that should be undertaken in a typical project and
outlines how a consultant (external or in-house) team might be constituted.
Introduction
What is System Dynamics?
System dynamics as a management discipline developed in the 1950s with its origins in
engineering control theory (servo-mechanisms and cybernetics), although underlying systems
concepts have been applied rigorously for the past century across most disciplines.
In a nutshell, System Dynamics is the rigorous study of organisational problems, from a holistic or
systemic perspective, using the principles of feedback, dynamics and simulation.
System Dynamics is a methodology for understanding complex problems where there is dynamic
behaviour (quantities changing over time) and where feedback impacts significantly on system
behaviour. It provides a framework and rules for qualitative description, exploration and analysis
of systems in terms of their processes, information, boundaries and strategies, facilitating
quantitative simulation modelling and analysis for the design of system structure and control.
Computer simulation is central to the system dynamics discipline. Until 1987 the key software tool
available (Dynamo, a Fortran-like language) required skilled programmers and was difficult for line
managers to use without significant support. This inhibited acceptance of the approach.
Powerful graphics software is now available for Macintosh and PC, which allows the modeller to
construct visual and symbolic representations of the system, facilitating both communication of
findings to management and knowledge capture from subject area experts.
Quality Assurance in System Dynamics Modelling
System Behaviour
The essence of systems thinking is that structure influences behaviour or, put another way, when
placed in the same system, different people tend to produce similar results. Indeed, the federal
public service reforms of the 1980s, the Financial Management Improvement Program (FMIP),
were predicated on the assumption that organisational breakdowns and sub-optimal behaviour do
not occur because of management stupidity, but rather are products of the system. The FMIP
reforms focused on modifying those aspects of the system, illustrated in Figure 3, which
encouraged, rewarded or enforced sub-optimal behaviour.
[Figure: diagram showing actions shaped by perceived standards & practices ('culture').]
Figure 3: Systems in the public sector with potential to 'teach' dysfunctional behaviour
Some Terminology
The fundamental concept in systems thinking is feedback. In systems thinking every influence is
both cause and effect and the key to seeing reality systemically is to see circles of influence
(dynamic thinking) instead of straight lines (linear thinking). By tracing these flows of influence
we often see patterns which repeat themselves, making situations better or worse.
Feedback Loops: The two main building blocks of all system representations are reinforcing and
balancing feedback loops. Reinforcing loops generate exponential growth (positive reinforcement)
and collapse (negative reinforcement) and, as a result, are often represented in systems diagrams by
the snowball effect. Balancing processes generate resistance, maintain stability and help achieve
equilibrium. A reinforcing loop will always, sooner or later, come up against a limiting or
balancing effect.
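The interplay of the two loop types can be sketched in a few lines of simulation code. The
following Python fragment is purely illustrative (the growth rate, limit and time step are
invented for the example): growth proportional to the stock reinforces itself until a
capacity-limit term balances it, giving the familiar S-shaped trajectory.

```python
# A minimal sketch (not from the paper): a reinforcing loop coupled with a
# balancing loop produces S-shaped growth. All parameter values are assumptions.

def simulate(periods=60, dt=1.0, stock=10.0, growth_rate=0.15, limit=1000.0):
    trajectory = [stock]
    for _ in range(periods):
        # Reinforcing loop: inflow proportional to the current stock.
        # Balancing loop: the (1 - stock/limit) term resists growth as the
        # stock approaches its limiting capacity.
        inflow = growth_rate * stock * (1.0 - stock / limit)
        stock += inflow * dt
        trajectory.append(stock)
    return trajectory

trajectory = simulate()
# Early behaviour is near-exponential; later the balancing term dominates
# and the stock levels off just below the limit.
print(round(trajectory[0], 1), round(trajectory[-1], 1))
```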
Delays: A delay is an interruption between an action and its consequences. Delayed feedback is
the key cause of the dynamic behaviour in systems.
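The effect of a delay on an otherwise stable loop can likewise be sketched. In this
illustrative Python fragment (all parameter values are assumptions), a goal-seeking balancing
loop acts on a delayed perception of the stock; the delay makes the loop overshoot its goal
and oscillate before settling.

```python
# Illustrative only: a balancing loop that corrects toward a target, but
# based on a first-order delayed perception of the actual stock.

def simulate(periods=120, dt=0.25, target=100.0,
             adjust_time=2.0, delay_time=6.0):
    stock, perceived = 0.0, 0.0
    history = []
    for _ in range(periods):
        # The corrective action uses the *perceived* stock, which lags reality.
        correction = (target - perceived) / adjust_time
        stock += correction * dt
        # First-order delay: perception catches up to the stock gradually.
        perceived += (stock - perceived) / delay_time * dt
        history.append(stock)
    return history

history = simulate()
# The delayed feedback makes the stock overshoot the target and oscillate,
# even though the loop itself is goal-seeking.
```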
Leverage: Sometimes small, well focused actions can produce significant, enduring improvements
- if they are carried out in the right place. This is referred to as leverage. The problem with
leverage is that it is often difficult to find. This is because it is usually not close in space and time
to the symptoms of the problem. As a result, in order to find high-leverage changes it is necessary
to look beyond the symptoms to the underlying system structure.
[Figure: causal loop diagram linking demand, pressure for additional capacity, budget $,
price, marketing of surplus capacity "free" of charge, 'congestion', cash reserves, and
maintenance & expansion of capacity (with a delay). Annotation: "User pays" gives genuine
market signals for demand and provides reserves for expansion.]
3. The problem has a known history which can be described, both qualitatively and quantitatively.
4. The problem exhibits time dynamics (e.g., time delay between some external trigger and the
impact on system demand, between changes in demand patterns and policy response or between
policy response and policy impact.)
5. Previous attempts have been made to address this problem, but results have been less effective
than desired.
But Why System Dynamics? What About Other Approaches?
In some quarters there has been a tendency to set system dynamics modelling in opposition to other
forms of modelling (e.g., econometric modelling). This is a fruitless exercise.
What is important is that the appropriate paradigm, whether it be econometrics, operations research
or system dynamics, is used for a given problem. System dynamics can be misused as easily as
econometrics or any other modelling approach.
The circumstances where system dynamics may be appropriate were noted above. Using it outside
these criteria may be a costly mistake. On the other hand, many dynamic problem situations will
call for a combination of paradigms: time series analysis might be used to identify mathematical
relationships between some parameters, whilst system dynamics modelling might be applied
specifically to the critical feedback relationships.
In respect of preparedness modelling, the lack of convincing progress by other methodologies, the
fact that all the criteria for use of system dynamics are met, and the evidence of the prototypes
developed at ADFA together provide a sound case for its application. A further important reason for the
use of the system dynamics modelling tools is their value in communication with stakeholders. In a
complex area such as this, user confidence is critical.
The work at ADFA, however, combines this with supporting operations research and
statistical tools where appropriate.
ithink/STELLA, whilst probably the most popular package in the educational context, suffers from
two fatal disadvantages. First, it is not fully networkable in a PC environment; secondly, its array
capability is very limited, leading to excessive complexity in modelling and model presentation.
Powersim has demonstrated its ability to handle the complex modelling. Associated software now
permits publishing of models directly on intranets or the internet.
Traditional computer models, whether built in spreadsheets or high level computer languages,
tend to focus on the mathematical correlation, hiding the underlying conceptual framework. This
creates problems in validating the model logic with subject area experts and in communicating to
decision makers the reasons for behavioural patterns. Statistical correlation, of course, is not
synonymous with causality.
[Figure: a traditional spreadsheet model exposes only numbers and cell formulae (e.g. A1*B2,
Sum(x), Sum(y)); the conceptual framework and the relationships between variables remain
hidden.]
System dynamics modelling starts with the conceptual framework, particularly the feedback
relationships. The logic is mapped diagrammatically, facilitating communication with both
subject area experts and decision makers. The logic map automatically generates the structure of
the underlying mathematics. The tools facilitate deeper-level learning.
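The mathematics such a tool generates from a stock-and-flow map amounts to simple difference
equations: each stock accumulates its inflows less its outflows each time step. A hypothetical
workforce example in Python (the staffing target, hiring delay and quit fraction are invented
for illustration):

```python
# Sketch of the difference equation behind a stock-and-flow map:
# stock(t + dt) = stock(t) + dt * (inflows - outflows). All values assumed.

DT = 0.5
TARGET_STAFF = 120.0
HIRING_DELAY = 4.0    # time to fill a vacancy (assumption)
QUIT_FRACTION = 0.05  # fraction of staff leaving per time unit (assumption)

staff = 80.0          # the stock
for _ in range(int(40 / DT)):
    hiring = (TARGET_STAFF - staff) / HIRING_DELAY  # balancing inflow
    quitting = QUIT_FRACTION * staff                # outflow
    staff += DT * (hiring - quitting)

# Staff settles where hiring balances quitting (100.0 with these values).
print(round(staff, 1))
```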
The diagram below illustrates the tools typically used in the discipline as one moves from
qualitative conceptualisation of the problem to development of rigorous forensic models and
management decision tools.
Stage 2: Problem Conceptualisation
Tools include: text & graphs; wire diagrams; causal loop diagrams; influence diagrams;
concept mapping; SSM rich pictures; hexagons; cognitive mapping; social surveys; statistical
data; past review reports.
Key tasks: State problem contexts, symptoms and patterns of behaviour over time, and past
solutions. Identify basic organisation structures; core business processes; optimisation
objective functions (outcome performance measures); patterns of resource behaviour over time;
system boundaries; and the time horizon of the study. Identify feedback relationships: key
resource states, key resource flows, key delays and key interrelationships. Restate the
problem, e.g. using the SSM VOCATE framework (Vision, Owners, Clients, Actors, Transformation
processes, Environment).
Client confirmation: Confirm understanding of the business with the client; confirm
understanding of the problem with the client; confirm organisation performance measures with
the client.

Stage 3: Model Formulation (initial prototype(s)): MAP - MODEL - SIMULATE - VALIDATE - REITERATE
Tools include: system dynamics software; output graphs & tables from the system dynamics
model(s).
Key tasks: Build a high-level system map: a basic single-dimension stock-flow model of the key
business processes, typically 20 - 40 variables, covering key stocks (resources) & flows; key
delays; key auxiliaries; key targets / goals / performance indicator(s); and key information
or material feedbacks. Run the simulation and validate.
Client confirmation: Confirm basic logical structure and model functioning with the client;
confirm key variables; confirm business rules.

Stage 4: Model Development (detailed prototype): MAP - MODEL - SIMULATE - VALIDATE - REITERATE
Tools include: system dynamics software.
Key tasks: Iteratively elaborate the model, challenging system boundaries; stocks, flows and
converters; and complexity / simplicity in representing business rules. Introduce
multi-dimensional arrays where applicable. Identify & build key policy levers & reports:
variables under the control of decision makers, and output reports of relevance to decision
makers.
Confirmation: Confirm basic structure and logic with subject area experts; confirm key
variables with subject area experts; confirm business rules with subject area experts.

Stage 5: Model Validation

Stage 6: Model Handover: installation & training.

Stage 7: Model in Use: experience in use of the model identifies the need for fine-tuning.
Significant iteration can be expected between stages 3 and 5. Periodically, issues settled in stage 2
may need to be revisited, especially the key performance indicators.
The actions to be undertaken in stages 2 to 5 are addressed in detail by Richardson and Pugh
(1980), Wolstenholme et al. (1993) and Roberts et al. (1994).
Structure Verification Tests: Because the foundation for model behavior is the model's structure,
the first test in validating a model is whether the structure of the model matches the structure of the
system being modelled. Every element of the model should have a real-world counterpart, and
every important factor in the real system should be reflected in the model. Although this may seem
like a simple, obvious test, it may not be so. For example, descriptions of how all of the structural
parts of real systems are tied together rarely exist. More often than not, such descriptions must be
based on the concepts, or mental models, of people familiar with the system. Further, important
parts of some systems may lie unrecognized prior to modelling.
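One practical aid to this test, sketched below in Python with invented names, is to audit the
model's element list against a glossary of real-world factors compiled with people familiar
with the system: model elements without a real-world counterpart, and real factors without a
model element, both warrant attention.

```python
# Illustrative structure-verification aid (all names are invented): compare
# the model's elements against a glossary of real-world factors.

model_elements = {"staff", "vacancies", "hiring_rate", "quit_rate", "backlog"}
real_world_glossary = {"staff", "vacancies", "hiring_rate", "quit_rate",
                       "backlog", "overtime"}   # from subject area experts

# Elements with no real-world counterpart (should normally be empty).
unmapped_model_parts = model_elements - real_world_glossary
# Real-system factors not yet reflected in the model (candidates to add).
missing_from_model = real_world_glossary - model_elements

print(sorted(unmapped_model_parts), sorted(missing_from_model))
```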
Parameter Verification Tests: Parameter values in a model often may be tested in a straightforward
manner, e.g., against historical data. However, even in technical defence systems, the available
data may be suspect. For example, equipment maintenance logs may reflect the total time the
equipment has been in the workshop, not the actual time it was being worked on. Also, in many
instances, there may be variables that are not usually quantified, but that are perceived to be critical
to the system being modelled. These elements must be included in the model. If productivity, for
example, is an important element in assessing optimal workforce structure, it must be included in
the model, and its relationship to other pertinent parts of the system must be specified
quantitatively. Clearly if the model output is strongly sensitive to such qualitative inputs, further
rigorous analysis of those inputs may be required.
Boundary Adequacy Test: Model boundaries must match the purpose for which the model is
designed, if the model is to be used with confidence; that is, the model must include all of the
important factors affecting the behavior of interest. In practice, boundaries tend to shift as the
developers' and users' understanding of a problem evolves with the model's development. As model
purpose shifts, changes in the model's boundaries may be required. In many problems, a simple
model with limited boundaries may be expanded, or disaggregated, from time to time, as the model
is used to address problems in greater detail.
Extreme Conditions Test: Tests under extreme conditions, that is, applying very high or very low
values to variables, often expose structural faults or inadequacies and incomplete or erroneous
parameter relationships. The criterion is whether the pattern of system behaviour or the values of
dependent variables remain valid (e.g. do not go negative if negativity is theoretically impossible)
or at least plausible. The ability of a model to function properly under extreme conditions
contributes to its utility as a policy evaluation tool as well as to user confidence.
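Such a test can be automated. The sketch below (an illustrative Python harness; the inventory
model and all parameter values are invented) drives a simple model with extreme demand values
and checks that the stock never goes negative:

```python
# Illustrative extreme-conditions harness. The min() guard limits shipments
# to what is physically on hand; without it the stock could go negative
# under an extreme demand surge.

def simulate_inventory(demand, periods=50, dt=1.0):
    inventory, restock = 100.0, 5.0
    trajectory = []
    for _ in range(periods):
        shipments = min(demand, inventory / dt)   # cannot ship more than held
        inventory += dt * (restock - shipments)
        trajectory.append(inventory)
    return trajectory

for extreme_demand in (0.0, 10.0, 1e6):           # very low through absurdly high
    levels = simulate_inventory(extreme_demand)
    assert min(levels) >= 0.0, extreme_demand     # stock must stay non-negative
```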
Dimensional Consistency Test: This important structural test simply checks that the dimensions or
units of the left-hand side of each equation are identical with those of the right-hand side.
Errors in dimensional consistency can easily creep into model equations during model development
and, subsequently, during revisions.
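A minimal version of such a check can be sketched in code. In this illustrative Python
fragment (the variable names and units are invented), units are represented as maps from
dimension to exponent; an equation passes only if both sides reduce to the same units:

```python
# Toy dimensional-consistency check: multiplying quantities adds their
# unit exponents; an equation is consistent if both sides match.

from collections import Counter

def mul(*units):
    total = Counter()
    for u in units:
        total.update(u)                     # update() adds exponents
    return {dim: exp for dim, exp in total.items() if exp != 0}

PEOPLE = {"people": 1}
PER_MONTH = {"month": -1}
PEOPLE_PER_MONTH = {"people": 1, "month": -1}

# hiring_rate [people/month] = vacancies [people] * fill_fraction [1/month]
assert mul(PEOPLE, PER_MONTH) == PEOPLE_PER_MONTH

# A deliberate error is caught: people * people is not people/month.
assert mul(PEOPLE, PEOPLE) != PEOPLE_PER_MONTH
```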
Behavior Reproduction Test: The tests relating to model behavior are less technical and, for many
users, more convincing than the structural tests. Foremost among these tests is the comparison of
model behavior with the behavior of the system being modeled. A model whose behavior has little,
or nothing, in common with that of the system of interest generates little, or no, confidence. Where
historical time series data are available, the model must be capable of producing similar data. In
this test, it is again important to keep in mind the purpose of the model, including the time span
and the aspects of behavior that are of interest. Where historical data are very poor or nonexistent, the test
may be one of reasonableness.
Behavior Anomaly Test: When model behavior does not replicate the behavior of the real system,
model structure, parameter values, boundaries, or similar factors must be considered suspect.
Something may have been omitted, improperly specified, or assigned incorrect values. Whether
due to faults in the model or in the real system, the resolution of the discrepancies found through
the anomalous behavior test bolsters confidence and validity.
Behavior Sensitivity Test: Most, but certainly not all, systems are stable. Small, reasonable
changes in a model's parameter values should normally not produce radical behavior changes.
Typically such changes are introduced using the STEP, RAMP, PULSE or SINEWAVE functions of the
software. If the model's behavior is not seriously affected by plausible parameter variations,
confidence in the model is increased. On the other hand, dynamic simulation models are often used
to search for parameters that can effect behavior changes. The criterion in the sensitivity test is that
any sensitivity exhibited by the model should not only be plausible, but also consistent with
observed, or likely, behavior in the real system.
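A sensitivity sweep of this kind can be sketched as follows (illustrative Python; the model,
the STEP input and the parameter range are all invented for the example). Each run applies a
STEP change to the goal and varies the adjustment-time parameter over a plausible range; the
behaviour pattern, a monotonic approach to the new goal, should persist across runs:

```python
# Illustrative behaviour-sensitivity sweep over an assumed parameter.

def simulate(adjust_time, periods=80, dt=0.5):
    target, stock = 100.0, 100.0
    history = []
    for step in range(periods):
        if step * dt >= 10.0:          # STEP test input: the goal jumps mid-run
            target = 150.0
        # Goal-seeking (balancing) structure: close the gap over adjust_time.
        stock += dt * (target - stock) / adjust_time
        history.append(stock)
    return history

for adjust_time in (2.0, 4.0, 8.0):    # plausible variations of the parameter
    run = simulate(adjust_time)
    # The pattern is the same in every run: monotonic approach to the goal.
    assert all(b >= a for a, b in zip(run, run[1:]))
    assert 140.0 < run[-1] < 150.0     # degree differs, kind does not
```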
Behavior Prediction Test: A fundamental use of dynamic simulation models is predicting how a
system would behave if various policies of interest were implemented. Dynamic simulation models
offer significant advantages when used in this role; they provide a consistent basis for the
predictions. This basis is a consolidation of judgment, experience, and intuition that has been tested
against historical evidence. Confidence in the model is reinforced if the model not only replicates
long-term historical behavior, but also replicates the behaviour in existing systems where policy
changes have been implemented.
Extreme Policy Test: Similar to extreme conditions test, these tests introduce radical policies into
the model to see if the behavior of the model is consistent with what would be expected under these
conditions. The criterion again is whether the pattern of system behaviour or values of dependent
variables remain valid or plausible.
Statistical Character Test: The behaviour reproduction test (q.v.) is somewhat qualitative. In some
circumstances it may prove desirable to test behaviour reproduction rigorously. This may be done
by modifying the model structure to produce formal statistics on key model output variables, which
can then be compared with real data.
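For example (a sketch with invented data), the simulated and historical series can be compared
using summary statistics such as the root-mean-square error and Theil's U1 inequality
coefficient, which ranges from 0 (perfect fit) to 1 (worst possible fit):

```python
# Illustrative fit statistics; the two series below are invented.

import math

historical = [100, 108, 118, 131, 140, 152, 161, 170]
simulated  = [100, 106, 117, 129, 143, 153, 163, 168]

n = len(historical)
rmse = math.sqrt(sum((s - h) ** 2 for s, h in zip(simulated, historical)) / n)

# Theil's U1 coefficient: RMSE scaled by the magnitudes of the two series,
# so 0 indicates a perfect fit and 1 the worst possible fit.
den = (math.sqrt(sum(s * s for s in simulated) / n)
       + math.sqrt(sum(h * h for h in historical) / n))
theil_u = rmse / den

print(round(rmse, 2), round(theil_u, 4))
```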
Principal system dynamics analyst
Role: Implements the detailed system dynamics modelling methodology (stages 2 to 6 in Figure
6): problem conceptualisation; model formulation; model development; model validation; model
handover. Liaises directly with the client's subject area experts to confirm that the various
stages of the model development are valid.
Required expertise: Technical expert in the 5 stages of model development listed in Figure 6;
technical expert in the use of the modelling software; technical expertise in the use of
ancillary software (spreadsheets, statistical packages, database systems etc.); significant
domain knowledge of the area being modelled.

Assistant system dynamics analyst
Role: Undertakes less complex aspects of the modelling, including background research;
documentation of all model parameters and equations; development of user help screens;
development of the program menu and navigation tools; development of user documentation;
development of the draft project report.
Required expertise: Intermediate level of expertise in the 5 stages of model development
listed in Figure 6; intermediate level of expertise in the use of the modelling software;
basic domain knowledge of the area being modelled.