
Quality Assurance in System Dynamics Modelling

Keith Linard
Senior Lecturer, School of Civil Engineering
University College UNSW (Australian Defence Force Academy)
Email: keithlinard#@#yahoo.co.uk (Remove hashes to email)

SUMMARY
This paper addresses the practical issue of achieving quality assurance in system dynamics
modelling projects. It first provides a brief overview of system dynamics, then suggests criteria
for determining contexts where system dynamics modelling solutions might be appropriate. The
main focus of the paper is the development of a structured methodology for a system dynamics
project. It summarises the key validation tests that should be undertaken in a typical project and
outlines how a consultant (external or in-house) team might be constituted.

Introduction
What is System Dynamics
System dynamics as a management discipline developed in the 1950s with its origins in
engineering control theory (servo-mechanisms and cybernetics), although underlying systems
concepts have been applied rigorously for the past century across most disciplines.
In a nutshell, System Dynamics is the rigorous study of organisational problems, from a holistic or
systemic perspective, using the principles of feedback, dynamics and simulation.
System Dynamics is a methodology for understanding complex problems where there is dynamic
behaviour (quantities changing over time) and where feedback impacts significantly on system
behaviour. It provides a framework and rules for qualitative description, exploration and analysis
of systems in terms of their processes, information, boundaries and strategies, facilitating
quantitative simulation modelling and analysis for the design of system structure and control.
Computer simulation is central to the system dynamics discipline. Until 1987 the key software tool
available (Dynamo, a Fortran-like language) required skilled programmers and was difficult for line
managers to use without significant support. This inhibited acceptance of the approach.
Powerful graphics software is now available for Macintosh and PC, which allows the modeller to
construct visual and symbolic representation of the system, facilitating both communication of
findings to management and knowledge capture from subject area experts.


Figure 1: Logical Relationships Developed Graphically

Obviously, models of complex problems require complex mathematics. Models of problems involving change over time and feedback require the solving of multiple differential equations.
This new generation of graphically oriented software (e.g. the Powersim software used in these
simulations) automatically generates the structure of the nth order differential equations necessary
for solving complex feedback problems, cutting development time dramatically and reducing the
likelihood of errors. Mathematical knowledge is still critical in fleshing out the interrelationships
between parameters, and it is still possible to build erroneous equations.
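The difference equations such tools generate can be illustrated with a deliberately tiny sketch. This is written in plain Python rather than any package's syntax, and the stock, flows and rates are hypothetical: a single stock integrated by Euler's method, the first-order scheme underlying these simulation engines.

```python
# Minimal sketch of the integration a stock-flow tool generates automatically.
# One stock (staff), one inflow (hiring), one outflow (attrition);
# all names and rates are illustrative, not taken from any real model.

def simulate(initial_stock=100.0, hire_rate=12.0, attrition_frac=0.10,
             dt=0.25, horizon=10.0):
    """Euler integration: stock(t+dt) = stock(t) + dt * (inflow - outflow)."""
    stock = initial_stock
    trajectory = [stock]
    for _ in range(int(horizon / dt)):
        inflow = hire_rate                   # constant hiring flow
        outflow = attrition_frac * stock     # outflow depends on the stock: feedback
        stock += dt * (inflow - outflow)
        trajectory.append(stock)
    return trajectory

traj = simulate()
# The stock approaches the equilibrium hire_rate / attrition_frac = 120.
```

Even this one-stock example is a first-order differential equation; the graphical packages build the same structure, equation by equation, from the diagram.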


Figure 2: Structure of Equations Automatically Generated from Logic Map

System Behaviour
The essence of systems thinking is that structure influences behaviour or, put another way, when
placed in the same system, different people tend to produce similar results. Indeed, the federal
public service reforms of the 1980s, the Financial Management Improvement Program (FMIP),
were predicated on the assumption that organisational breakdowns and sub-optimal behaviour
occur not because of management stupidity, but because they are products
of the system. The FMIP reforms focused on modifying those aspects of the system, illustrated
in Figure 3, which encouraged, rewarded or enforced sub-optimal behaviour.

[Diagram: three influences acting on ACTIONS - the legal, regulatory & budgetary environment; perceived standards & practices ('culture'); and management systems & organisational architecture.]

Figure 3: Systems in the public sector with potential to 'teach' dysfunctional behaviour

Some Terminology
The fundamental concept in systems thinking is feedback. In systems thinking every influence is
both cause and effect and the key to seeing reality systemically is to see circles of influence
(dynamic thinking) instead of straight lines (linear thinking). By tracing these flows of influence
we often see patterns which repeat themselves, making situations better or worse.
Feedback Loops: The two main building blocks of all system representations are reinforcing and
balancing feedback loops. Reinforcing loops generate exponential growth (positive reinforcement)
and collapse (negative reinforcement) and, as a result, are often represented in systems diagrams by
the snowball effect. Balancing processes generate resistance, maintain stability and help achieve
equilibrium. A reinforcing loop will always, sooner or later, come up against a limiting or
balancing effect.
Delays: A delay is an interruption between an action and its consequences. Delayed feedback is
the key cause of the dynamic behaviour in systems.
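A toy simulation (illustrative only; all names and coefficients are invented) makes these building blocks concrete: a reinforcing loop compounds exponentially, a balancing loop closes a gap to a goal, and a delay in the balancing loop produces overshoot, the signature of delayed feedback.

```python
def reinforcing(x0=1.0, gain=0.1, steps=50):
    """Reinforcing loop: each step adds a fraction of the current level."""
    x = x0
    for _ in range(steps):
        x += gain * x                 # growth proportional to level -> exponential
    return x

def balancing(x0=0.0, goal=100.0, adjust=0.3, delay_steps=0, steps=60):
    """Balancing loop: correct a fraction of the gap to the goal each step.
    With delay_steps > 0 the correction acts on stale information,
    which is what makes delayed feedback overshoot."""
    history = [x0] * (delay_steps + 1)
    x = x0
    for _ in range(steps):
        perceived = history[-(delay_steps + 1)]   # delayed observation of x
        x += adjust * (goal - perceived)
        history.append(x)
    return history
```

With no delay the balancing loop approaches its goal smoothly; with a delay of a few steps it overshoots the goal before settling, exactly the pattern the text describes.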
Leverage: Sometimes small, well focused actions can produce significant, enduring improvements
- if they are carried out in the right place. This is referred to as leverage. The problem with
leverage is that it is often difficult to find. This is because it is usually not close in space and time
to the symptoms of the problem. As a result, in order to find high-leverage changes it is necessary to identify underlying structures in the system. Computer simulation is a key to identification of leverage points.
As noted above, these concepts formed the foundation of the FMIP management reforms of the
1980s. Figure 4 illustrates, using very simplified causal loop diagrams, the rationale for moving to
user pays for inter-departmental services.

[Causal loop diagram: the left-hand loop links demand, 'congestion', pressure for budget $, $ for additional capacity, surplus capacity and marketing of surplus capacity "free" of charge; the right-hand loop introduces price, linking demand, cash reserves and maintenance & expansion of capacity, with a delay. Annotation: "User pays" gives genuine market signals for demand and provides reserves for expansion.]

Figure 4: Causal Loop Diagram of 'User Pays' Dynamics


The left hand loop depicts the generic situation then extant across the public service in relation both
to the internal provision of free management services, such as typing, statistical services etc, and
to inter-departmental free services such as Commonwealth cars, the Foreign Affairs international
communication system, building services etc. The left-hand loop shows self-reinforcing behaviour
in relation to these free services. Being free, more of the service is demanded than is really
warranted. This leads to congestion, creating an argument on the part of the supply organisation for
more resources. The increase in resources leads to temporary over capacity, so the agency markets
its improved level of service, leading to increased demand . . . and so on. A second dimension to
this loop, not shown, is that the service provider, being in a monopoly position, could largely
dictate the level and quality of service given.
An understanding of the feedback interrelationships pointed to possible leverage points. First,
sheeting home to the consumer the fact that services cost would (arguably) lead to better
assessment of the quantum of such services really required. Secondly, allowing the recipient to
retain the revenue from user charges would permit more appropriate determination of investment
priorities.
The user pays solution, right hand loop, introduced price as the lever which leads to two balancing
loops.
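The qualitative difference the price lever makes can be caricatured in a few lines of code. This is a deliberately crude sketch of the Figure 4 loops with invented coefficients, not a model of any actual service:

```python
def service_demand(price_per_unit=0.0, steps=40):
    """Toy version of the Figure 4 loops. Congestion creates pressure for
    capacity; marketing of surplus capacity and underlying growth reinforce
    demand; price (when non-zero) is the balancing lever.
    All coefficients are invented for illustration."""
    demand, capacity = 100.0, 100.0
    for _ in range(steps):
        congestion = max(0.0, demand - capacity)
        capacity += 0.5 * congestion                       # budget pressure adds capacity
        surplus = max(0.0, capacity - demand)
        demand += 0.2 * surplus                            # marketing of surplus capacity
        demand -= 0.05 * price_per_unit * demand / 100.0   # price damps demand (balancing)
        demand += 2.0                                      # growth in demand for a free good
    return demand

free = service_demand(price_per_unit=0.0)
user_pays = service_demand(price_per_unit=50.0)
# Demand under user pays settles near its starting level;
# demand for the free service grows without limit.
```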

Contexts Where System Dynamics Modelling Solutions Might Be Appropriate


System dynamics is not appropriate to every problem context. System dynamics may be appropriate
if the problem has the following broad characteristics:

1. The issue is important to organisational objectives.

2. The problem is of an on-going nature rather than a one-time event.

3. The problem has a known history which can be described, both qualitatively and quantitatively.


4. The problem exhibits time dynamics (e.g., time delay between some external trigger and the
impact on system demand, between changes in demand patterns and policy response or between
policy response and policy impact.)

5. Previous attempts have been made to address this problem, but results have been less effective
than desired.
But Why System Dynamics? What About Other Approaches?
In some quarters there has been a tendency to set system dynamics modelling in opposition to other
forms of modelling (e.g., econometric modelling). This is a fruitless exercise.
What is important is that the appropriate paradigm, whether it be econometrics, operations research
or system dynamics, is used for a given problem. System dynamics can be misused as easily as
econometrics or any other modelling approach.
The circumstances where system dynamics may be appropriate were noted above. Using it outside
these criteria may be a costly mistake. On the other hand, many dynamic problem situations will
call for a combination of paradigms: time series analysis might be used to identify mathematical
relationships between some parameters, whilst system dynamics modelling might be applied
simply to the critical feedback relationships.
In respect of preparedness modelling, the lack of convincing progress by other methodologies, the
fact that all the criteria for use of system dynamics are met, and the evidence of the prototypes
developed at ADFA provide a sound case for its application. A further important reason for the
use of the system dynamics modelling tools is their value in communication with stakeholders. In a
complex area such as this, user confidence is critical.
The work at ADFA, however, combines system dynamics with supporting operations research and
statistical tools where appropriate.

Choice of Simulation Software


There are four key graphically oriented system dynamics software packages available in the
marketplace, all of which are in use at ADFA:
Powersim (Norway)
Ithink / STELLA (US)
Vensim (US)
Cosmos /Cosmic (UK).
The Directorate of Army Research & Analysis evaluated the first three of these packages in 1993
and found that Powersim was significantly more useful, recommending that Army standardise on its
use. Developments since that time reinforce this conclusion.
The different packages have advantages and disadvantages reflecting the particular niches they aim
at. The following brief comments relate to their applicability to the Federal budget policy
environment. For a more detailed review reference to the DARA evaluation is suggested.
Cosmos / Cosmic and Vensim are much more geared to background technical use. The fact that
Army's powerful MRU (manpower required in uniform) model, built in Vensim, has fallen into
total disuse within two years of its creator leaving is a reflection of its lack of user-friendliness.


Ithink / STELLA, whilst probably the most popular package in the educational context, suffers from
two fatal disadvantages. First, it is not fully networkable in a PC environment; secondly, its array
capability is very limited, leading to excessive complexity in modelling and model presentation.
Powersim has demonstrated its ability to handle such complex modelling. Associated software now
permits publishing of models directly on intranets or the internet.

Traditional computer models, whether built in spreadsheets or high level computer languages,
tend to focus on the mathematical correlation, hiding the underlying conceptual framework. This
creates problems in validating the model logic with subject area experts and in communicating to
decision makers the reason for behavioural patterns. Statistical correlation, of course, is not
synonymous with causality.

[Diagram: a spreadsheet grid of numbers and formulae whose conceptual framework is obscure.]

Spreadsheet models:
- assume uni-directional linear causality
- emphasise numerical inputs and outputs
- logical relationships between numbers are hidden and difficult to follow
- conceptual framework is obscure

System Dynamics Modelling starts with the conceptual framework, particularly the feedback
relationships. The logic is mapped diagrammatically, facilitating communication with both
subject area experts and decision makers. The logic map automatically generates the structure of
the underlying mathematics. The tools facilitate deeper level learning.

System Dynamics models:
- address delayed feedback causality
- emphasise meaning and relationships
- conceptual framework is 'mapped'
- logic is developed & displayed diagrammatically
- numbers are kept in background & are readily called upon

Figure 5: "Spreadsheet" versus System Dynamics Models


System Dynamics is More than Software Simulation


Over the past three decades System Dynamics has been applied to such areas as project
management, business development, government policy analysis, environmental change, economic
development, and military strategic and tactical analysis. System dynamics models have been
applied in multi-million and multi-billion dollar court cases, where they have been developed to
withstand scrutiny from hostile expert witnesses. Especially since the mid 1980s there has been a corresponding growth
in the sophistication of tools and methodologies being developed and applied including computer
simulation tools, soft systems methodology, causal loop diagramming, chaos theory, statistical
analysis and interactive learning environments.


The diagram below illustrates the tools typically used in the discipline as one moves from
qualitative conceptualisation of the problem to development of rigorous forensic models and
management decision tools.

Figure 6: A palette of systems thinking tools


System Dynamics Modelling Development Methodology


System Dynamics and Traditional IT Methodologies
System dynamics modelling differs dramatically from traditional computer application
development, and hence the diverse computer business systems frameworks (including standards
such as AS 3563 (Software Quality Management System) and proprietary methodologies such as
SSADM, JSD, IBM Business Systems Planning, James Martin Information Engineering etc) are
not directly applicable, although elements of these certainly have application.
System dynamics modelling generally takes place in a consulting environment where the client
recognises that there is a problem and the purpose of the modelling is to assist development of an
understanding of the problem context. Accordingly, scope definition, model specification and
model building tend to be an iterative or cyclical process, with mutual learning between the
modeller and the client.
As yet no SDM development methodology has gained widespread acceptance. The following are
the key texts which address the area to some extent:
Checkland, P. and J. Scholes: Soft Systems Methodology in Action. Wiley, Chichester,
1990.
Wolstenholme, E.: System Enquiry - A System Dynamics Approach. Wiley, Chichester,
1990.
Wolstenholme, E., S. Henderson & A. Gavine: The Evaluation of Management
Information Systems - A Dynamic and Holistic Approach. Wiley, Chichester, 1993.
Richardson, G. and A. Pugh: Introduction to System Dynamics Modelling with DYNAMO.
Productivity Press, Portland, 1981.
Roberts, N. et al: Introduction to Computer Simulation - A System Dynamics Modelling
Approach. Productivity Press, Portland, 1994.
In the absence of any formal methodology, the following framework has been developed through
the writer's experience as an amalgam of traditional IS planning methodologies, the Lotus
Accelerated Value Method, the Soft Systems Methodology and approaches suggested by the
system dynamics community. It is consistent with the corporate IS strategic planning guidelines
developed by the Federal Department of Finance. The process iterates through seven broad stages:

Figure 7: Seven Step System Dynamics Modelling Methodology

Stage 1: Project Planning
Model focus: outcome objectives for the modelling project; project scoping - deliverables, timeframe, budget, skills required, risk assessment, team specification.
Client focus: confirm scope and deliverables with client; clarify client's understanding of system dynamics; seek realistic expectations from modelling.
Tools include: text & flow charts; CPM & GANTT charts; budget templates; risk templates.


Stage 2: Problem Conceptualisation
Model focus: state problem contexts, symptoms and patterns of behaviour over time, and past solutions. Identify basic organisation structures - core business processes; optimisation objective functions (outcome performance measures); patterns of resource behaviour over time; system boundaries; time horizon of study. Identify feedback relationships - key resource states; key resource flows; key delays; key interrelationships. Restate problem, e.g. using the SSM VOCATE framework (Vision, Owners, Clients, Actors, Transformation processes, Environment).
Client focus: confirm understanding of business with client; confirm understanding of the problem with the client; confirm organisation performance measures with the client.
Tools include: text & graphs; wire diagrams; causal loop diagrams; influence diagrams; concept mapping; SSM rich pictures; hexagons; cognitive mapping; social surveys; statistical data; past review reports.

Stage 3: Model Formulation - Initial Prototype(s)
MAP - MODEL - SIMULATE - VALIDATE - REITERATE
Model focus: high level system map - a basic single-dimension stock-flow model of key business processes; 20 - 40 variables; key stocks (resources) & flows; key delays; key auxiliaries; key targets / goals / performance indicator(s); key information or material feedbacks. Run simulation - validate. Where there are a variety of core processes operating in the organisation, each of these may need to be developed as an independent sub-model.
Client focus: confirm basic logical structure and model functioning with client; confirm key variables; confirm business rules.
Tools include: system dynamics software; output graphs & tables from the system dynamics model(s).


Stage 4: Model Development - Detailed Prototype
MAP - MODEL - SIMULATE - VALIDATE - REITERATE
Model focus: iteratively elaborate the model, challenging system boundaries; stocks, flows and converters; and the complexity / simplicity with which business rules are represented. Introduce multi-dimensional arrays where applicable. Identify & build key policy levers & reports - variables under control of decision makers; output reports of relevance to decision makers.
Client focus: confirm basic structure and logic with subject area experts; confirm key variables with subject area experts; confirm business rules with subject area experts.
Tools include: system dynamics software.

Stage 5: Model Validation - Quality Assurance
Model focus: undertake the validation and verification tests outlined in Figure 8; iteratively revise the model.
Client focus: confirm model outputs with subject area experts; independent testing.

Stage 6: Model Handover
Model focus: installation & training. Client focus: installation & training.

Stage 7: Model in Use
Experience in use of the model identifies the need for fine-tuning.

Significant iteration can be expected between stages 3 - 5. Periodically, issues settled in stage 2
may need to be revisited, especially the key performance indicators.
Details of the actions to be undertaken in stages 2 to 5 are addressed in some detail by Richardson
and Pugh (1981), Wolstenholme et al (1993) and Roberts et al (1994).


Validation and Verification of System Dynamics Models


Need for Validation
All models are a simplification of the real world. To that extent all models are wrong. What is
important is that the models are useful, that they capture the key elements of real world
behaviour, so that the patterns of behaviour, the direction of change and the magnitude of change
predicted by the model under specified conditions broadly mimic the expected real world
behaviour. Achieving these, however, is not sufficient. There is an additional, critical, dimension
to usefulness. The clients or users of the model must have confidence in the model.
Validation and verification testing are designed to ascertain whether a model is useful or merely
wrong. Procedures for testing or validating system dynamics models are discussed in a wide
range of literature, including:
Barlas, Y., 1989. Multiple Tests for Validation of System Dynamics Type of Simulation
Models. European Journal of Operational Research, Vol. 42, No. 1, pp. 59-87.
Barlas, Y., 1994. Model Validation in System Dynamics. Proceedings of the 1994
International System Dynamics Conference, Methodological Issues Vol. 1, pp. 1-10,
Stirling, Scotland.
Barlas, Y., 1996. Formal Aspects of Model Validity and Validation in System Dynamics.
System Dynamics Review, Vol. 12, No. 3, pp. 183-210.
Bell, J.A. and P.M. Senge, 1980. Methods for Enhancing Refutability in System
Dynamics Modelling. TIMS Studies in the Management Sciences, 14 (1), pp. 61-73.
Forrester, J.W. and P. Senge, 1980. Tests for Building Confidence in System Dynamics
Models. In A. Legasto et al. (eds.), TIMS Studies in the Management Sciences (System
Dynamics), North-Holland, The Netherlands, pp. 209-228.
Graham, A.K., 1980. Parameter Estimation in System Dynamics Modelling. In J. Randers
(ed.), Elements of the System Dynamics Method, pp. 143-161. Productivity Press,
Cambridge, MA.
Grcic, B. and A. Munitic, 1997. System Dynamics Approach to Validation. Proceedings of
the 1997 International System Dynamics Conference, Istanbul, Turkey.
Peterson, D. and R. Eberlein, 1994. Reality Check: A Bridge Between Systems Thinking
and System Dynamics. System Dynamics Review, Vol. 10, Nos. 2/3.
Richardson, G.P. and A. Pugh, 1981. Introduction to System Dynamics Modelling with
DYNAMO. MIT Press, Cambridge, MA.
Sargent, R.G., 1980. Verification and Validation of Simulation Models. In System
Dynamics, North-Holland Publishing Company, Amsterdam.
Sterman, J.D., 1988. A Skeptic's Guide to Computer Models. In L. Grant et al., Foresight
and National Decisions, University Press of America, Boston.
Tank-Nielsen, C., 1980. Sensitivity Analysis in System Dynamics. In J. Randers (ed.),
Elements of the System Dynamics Method. Productivity Press, Connecticut.
Figure 8 summarises from the above literature the key tests which would seem to be relevant to
assuring the client that the preparedness models are useful. These tests are elaborated below. Not
all these tests are relevant to every situation. However, at least the first six tests should normally be
applied, and the others as appropriate to the particular situation.


Figure 8: Validation and Verification Tests of Model Adequacy

Structure verification: the model structure is consistent with relevant descriptive knowledge of the system.
Parameter verification: the parameters are consistent with relevant descriptive (and, where available, numerical) knowledge of the system.
Boundary adequacy: all important concepts for addressing the policy problem are endogenous to (included in) the model.
Extreme conditions: each equation makes sense, even when inputs take on extreme values.
Dimensional consistency: all equations are dimensionally consistent.
Behaviour reproduction: the model generates the behaviour modes, phasing, frequencies and other characteristics of the behaviour of the real system.
Behaviour anomaly: anomalous behaviour arises if a key assumption of the model is changed or deleted, not under standard parameter values.
Behaviour sensitivity: the model behaviour is appropriately sensitive to plausible variations in input parameters.
Behaviour prediction: the model plausibly describes the results of a new policy.
Extreme policy: the model behaves properly when subjected to extreme policies or test inputs.
Statistical character: the model output has the same statistical character as the output of the real system.

Structure Verification Tests: Because the foundation for model behaviour is the model's structure,
the first test in validating a model is whether the structure of the model matches the structure of the
system being modelled. Every element of the model should have a real-world counterpart, and
every important factor in the real system should be reflected in the model. Although this may seem
like a simple, obvious test, it may not be so. For example, descriptions of how all of the structural
parts of real systems are tied together rarely exist. More often than not, such descriptions must be
based on the concepts, or mental models, of people familiar with the system. Further, important
parts of some systems may lie unrecognized prior to modelling.

Parameter Verification Tests: Parameter values in a model often may be tested in a straightforward
manner, e.g., against historical data. However, even in technical defence systems, the available
data may be suspect. For example, equipment maintenance logs may reflect the total time the
equipment has been in the workshop, not the actual time it was being worked on. Also, in many
instances, there may be variables that are not usually quantified, but that are perceived to be critical
to the system being modelled. These elements must be included in the model. If productivity, for
example, is an important element in assessing optimal workforce structure, it must be included in
the model, and its relationship to other pertinent parts of the system must be specified
quantitatively. Clearly if the model output is strongly sensitive to such qualitative inputs, further
rigorous analysis of those inputs may be required.


Boundary Adequacy Test: Model boundaries must match the purpose for which the model is
designed, if the model is to be used with confidence; that is, the model must include all of the
important factors affecting the behaviour of interest. In practice, boundaries tend to shift as the
developers' and users' understanding of a problem evolves with the model's development. As model
purpose shifts, changes in the model's boundaries may be required. In many problems, a simple
model with limited boundaries may be expanded, or disaggregated, from time to time, as the model
is used to address problems in greater detail.
Extreme Conditions Test: Tests under extreme conditions, that is, applying very high or very low
values to variables, often expose structural faults or inadequacies and incomplete or erroneous
parameter relationships. The criterion is whether the pattern of system behaviour or the values of
dependent variables remain valid (e.g. do not go negative if negativity is theoretically impossible)
or remain plausible. The ability of a model to function properly under extreme conditions
contributes to its utility as a policy evaluation tool, as well as to user confidence.
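Such a test is easy to automate. The sketch below is entirely hypothetical: a toy inventory stock is driven with extreme demand inputs, and the test asserts that the stock can never go negative:

```python
def run_inventory(demand_per_step, initial=50.0, reorder=10.0, steps=30):
    """Toy inventory stock (illustrative names and rates). Shipments are
    capped at what is on hand, so the stock cannot be driven negative
    even by extreme demand."""
    stock = initial
    trace = []
    for _ in range(steps):
        shipments = min(demand_per_step, stock)   # physical constraint on the outflow
        stock += reorder - shipments
        trace.append(stock)
    return trace

# Extreme-conditions test: zero demand and absurdly high demand.
for extreme_demand in (0.0, 1e9):
    assert all(s >= 0.0 for s in run_inventory(extreme_demand)), \
        "stock went negative under extreme input"
```

Without the `min(...)` constraint the same model would ship more than it holds and the extreme run would fail, which is precisely the kind of structural fault this test exposes.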
Dimensional Consistency Test: Dimensional consistency testing of model equations is an important
structural test which simply checks that the dimensions or units on the left hand side of an equation
are identical with those on the right hand side. Errors in dimensional consistency can easily creep
into model equations during model development and, subsequently, during revisions.
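A crude dimensional check can even be scripted. The sketch below is purely illustrative (it is not a feature of any of the packages discussed): units are represented as dictionaries of dimension exponents, and the two sides of a rate equation are compared:

```python
# Represent a unit as {dimension: exponent}; multiplication adds exponents,
# division subtracts them. Purely illustrative.

def mul(u, v):
    out = dict(u)
    for dim, exp in v.items():
        out[dim] = out.get(dim, 0) + exp
    return {d: e for d, e in out.items() if e != 0}

def div(u, v):
    return mul(u, {d: -e for d, e in v.items()})

people = {"person": 1}
week = {"week": 1}
people_per_week = div(people, week)

# Hypothetical equation: attrition (person/week) = staff (person) * frac_rate (1/week)
staff_units = people
frac_rate_units = div({}, week)           # a fractional rate per week
lhs, rhs = people_per_week, mul(staff_units, frac_rate_units)
assert lhs == rhs, "equation is not dimensionally consistent"
```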
Behaviour Reproduction Test: The tests relating to model behaviour are less technical and, for many
users, more convincing than the structural tests. Foremost among these tests is the comparison of
model behaviour with the behaviour of the system being modelled. A model whose behaviour has little,
or nothing, in common with that of the system of interest generates little, or no, confidence. Where
historical time series data are available, the model must be capable of producing similar data. In
this test, it is again important to keep in mind the purpose of the model, including the time span of
the areas of behaviour that are of interest. Where historical data are very poor or nonexistent, the test
may be one of reasonableness.
Behaviour Anomaly Test: When model behaviour does not replicate the behaviour of the real system,
model structure, parameter values, boundaries, or similar factors must be considered suspect.
Something may have been omitted, improperly specified, or assigned incorrect values. Whether
due to faults in the model or in the real system, the resolution of the discrepancies found through
the behaviour anomaly test bolsters confidence and validity.
Behaviour Sensitivity Test: Most, but certainly not all, systems are stable. Small, reasonable
changes in a model's parameter values should normally not produce radical behaviour changes.
Typically these changes are introduced using the STEP, RAMP, PULSE or SINEWAVE functions of the
software. If the model's behaviour is not seriously affected by plausible parameter variations,
confidence in the model is increased. On the other hand, dynamic simulation models are often used
to search for parameters that can effect behaviour changes. The criterion in the sensitivity test is that
any sensitivity exhibited by the model should be not only plausible, but also consistent with
observed, or likely, behaviour in the real system.
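The standard test inputs are simple functions of time and are easy to reproduce outside any particular package. The definitions below are generic sketches, not Powersim's exact signatures, as is the basic sensitivity check of perturbing a parameter and comparing trajectories:

```python
def step(t, height=1.0, start=5.0):
    """0 before start, height after: a sudden sustained change."""
    return height if t >= start else 0.0

def ramp(t, slope=1.0, start=5.0):
    """Linear increase beginning at start."""
    return slope * (t - start) if t >= start else 0.0

def pulse(t, height=1.0, start=5.0, width=1.0):
    """A burst of given height lasting width time units."""
    return height if start <= t < start + width else 0.0

# Sensitivity run on a hypothetical goal-seeking model: perturb a
# parameter by 10%, re-run, and compare the two trajectories.
def respond(adjust, steps=40, goal=100.0):
    x, out = 0.0, []
    for t in range(steps):
        x += adjust * (goal + step(t, 20.0, 25) - x)   # goal shifts upward at t = 25
        out.append(x)
    return out

base, perturbed = respond(0.30), respond(0.33)
max_gap = max(abs(a - b) for a, b in zip(base, perturbed))
# A plausible parameter change should not produce radical divergence.
```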
Behaviour Prediction Test: A fundamental use of dynamic simulation models is predicting how a
system would behave if various policies of interest were implemented. Dynamic simulation models
offer a significant advantage when used in this role: they provide a consistent basis for the
predictions. This basis is a consolidation of judgment, experience and intuition that has been tested
against historical evidence. Confidence in the model is reinforced if the model not only replicates
long-term historical behaviour, but also replicates the behaviour of existing systems where policy
changes have been implemented.
Extreme Policy Test: Similar to the extreme conditions test, these tests introduce radical policies into
the model to see if the behaviour of the model is consistent with what would be expected under these
conditions. The criterion again is whether the pattern of system behaviour or the values of dependent
variables remain valid or plausible.
Statistical Character Test: The behaviour reproduction test (q.v.) is somewhat qualitative. In some
circumstances it may prove desirable to test behaviour reproduction rigorously. This may be done
by modifying the model structure to produce formal statistics on key model output variables which can
be compared with real data.
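At its simplest, this means comparing summary statistics of the simulated and observed series. A minimal sketch with invented data and an arbitrary acceptance criterion:

```python
import statistics

def summary(series):
    """Summary statistics used to compare model output with real data."""
    return statistics.mean(series), statistics.stdev(series)

observed = [102, 98, 105, 97, 101, 99, 104, 96]     # invented 'real' data
simulated = [100, 101, 99, 103, 98, 100, 102, 97]   # invented model output

(m_obs, s_obs), (m_sim, s_sim) = summary(observed), summary(simulated)

# Crude acceptance criterion (illustrative only): means within 5%,
# spreads within a factor of two.
assert abs(m_obs - m_sim) / m_obs < 0.05
assert 0.5 < s_sim / s_obs < 2.0
```

In practice more formal comparisons (e.g. spectral or autocorrelation measures) may be warranted; the point is simply that the comparison can be made quantitative.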

Consultant Team Role/Function Descriptions


How a consultant approaches and staffs a project will vary according to their particular mode of
operation and the particular staff available, their mix of skills and their competing duties. The
following is indicative only to assist in time / cost estimates.

Figure 9: Role / Function Descriptions

Consultant Project Manager
Responsibilities: responsible for contract administration; ensures in-budget, on-time, on-quality delivery of contracted items; plans the program, allocates and directs staff and other resources to accomplish tasks, and maintains control over the program; tracks project status and works to identify and resolve road blocks; ensures client satisfaction; owns and builds long-term client relationship; provides independent review and assessment to ensure that proper methods are followed - work conforms to contract, risks are discussed and mitigations achieved, action plans and targets are communicated to the team, results are communicated to the client.
Skills / experience / background required: proven project management expertise in defence contracts; proven expertise in managing IS projects; basic understanding of system dynamics modelling; basic understanding of the modelling domain (i.e. preparedness).

Principal system dynamics analyst
Responsibilities: implements the detailed system dynamics modelling methodology (stages 2 to 6 in Figure 7) - problem conceptualisation, model formulation, model development, model validation, model handover; liaises directly with the client's subject area experts to confirm that the various stages of the model development are valid.
Skills / experience / background required: technical expert in the five stages of model development listed in Figure 7; technical expert in the use of the modelling software; technical expertise in the use of ancillary software (spreadsheets, statistical packages, database systems etc); significant domain knowledge of the area being modelled.

Assistant system dynamics analyst
Responsibilities: undertakes less complex aspects of the modelling, including background research; documentation of all model parameters and equations; development of user help screens; development of program menu and navigation tools; development of user documentation; development of the draft project report.
Skills / experience / background required: intermediate level of expertise in the five stages of model development listed in Figure 7; intermediate level of expertise in the use of the modelling software; basic domain knowledge of the area being modelled.

_____________________
