
Lars Strandén, Claus Trebbien-Nielsen

Nordtest project 1551-01:

Standards for Software in


Safety-Related Applications

Abstract
Within the Nordtest project 1551-01 an evaluation has been carried out concerning the
similarities and dissimilarities for a number of safety-related software standards. The
evaluation and comparison of the safety-related standards have been carried out by two
Nordic parties, DELTA (Denmark) and SP (Sweden).

Six different standards have been selected within the scope of the project:
• EN 954 – Safety of machinery - Safety-related parts of control systems
• IEC 61508 – Functional safety of electrical / electronic / programmable electronic
safety-related systems
• IEC 61713 – Software dependability through the software lifecycle processes
• RTCA/DO-178B – Software Considerations in Airborne Systems and Equipment
Certification
• IEC 60601-1-4 – Medical Electrical Equipment – Part 1: General Requirements
for Safety – 4. Collateral Standard: Programmable Electrical Medical Systems
• EN 50128 – Railway Applications: Software for Railway Control and Protection
Systems
These standards are well known and generally accepted. For comparing standards a
methodology has been defined that considers a standard from different views. These are
used for evaluating and categorising the standards one at a time. For processes the SPICE
(ISO/IEC TR 15504) definition has been used. An aspect that is especially considered is
testing. For testing, the related information of the standards has been analysed and
compared separately.

The result of this project is twofold: first, the analysis of the selected standards, and
second, an extensible methodology that can also be applied to other standards. The
general conclusion of this work is that standards vary to a high extent (apart from
covering different application areas) in scope, detail, language, strictness, guidance,
and generality. But we can also see that several standards have many aspects in common
and that the application-specific parts of standards do not rule out the possibility of using
standards in other application areas.

Key words: Safety, Safety-related Control Systems, Validation, Test, SPICE, Safety-
related software standards, Standard comparison, Standard evaluation, Safety of
machinery, NORDTEST

Contents
Abstract 2
Contents 3
Preface 5
Summary 7

1 Introduction 9

2 Definitions 10

3 Scope 11
3.1 Standards for evaluation 11
3.2 Interpreting standards 12
3.3 View of standards 13
3.4 Evaluation 14

4 Standard view contents 15


4.1 Summary 15
4.2 Mappings 15
4.3 Extracted version 20
4.4 Metrics 21
4.5 Judgement 22

5 Single standard evaluation 23


5.1 EN 954 23
5.2 IEC 61508 27
5.3 IEC 61713 35
5.4 RTCA/DO-178B 39
5.5 IEC 60601-1-4 43
5.6 EN 50128 47

6 The testing perspective 53


6.1 General 53
6.2 EN 954 53
6.3 IEC 61508 54
6.4 IEC 61713 56
6.5 RTCA/DO-178B 57
6.6 IEC 60601-1-4 58
6.7 EN 50128 58

7 Summarized and referenced standards 60


7.1 DEF STAN 00-55 60
7.2 DEF STAN 00-56 60
7.3 ENV 1954 61
7.4 EN 50126, EN 50129 61
7.5 IEC 300-3-9 61
7.6 IEC 60880 61
7.7 IEC 61511 62
7.8 IEC 62061 62
7.9 ISO 9000 62
7.10 ISO 12207 63
7.11 ISO 13849-1 63
7.12 MIL-STD-882D 63

8 Conclusions 64
8.1 Methodology 64
8.2 Feasibility of SPICE 64
8.3 The test perspective 65
8.4 Could standards be compared? 65
8.5 How to use the results 66

9 References 67

10 Annex 1: Project 69
10.1 Overall objectives 69
10.2 Work organisation 69
10.3 Presentation of results 69
10.4 Data of participating partners 69

11 Annex 2: Extracted version of EN 954-1 71

12 Annex 3: Extracted version of EN 954-2 75

13 Annex 4: Extracted version of IEC 61508-1 77

14 Annex 5: Extracted version of IEC 61508-2 85

15 Annex 6: Extracted version of IEC 61508-3 90

16 Annex 7: Extracted version of IEC 61713 94

17 Annex 8: Extracted version of RTCA/DO-178B 101

18 Annex 9: Extracted version of IEC 60601-1-4 111

19 Annex 10: Extracted version of EN 50128 115



Preface
This work was made possible by financial support from NORDTEST (Nordtest project
1551-01). The work has been carried out by Claus Trebbien-Nielsen DELTA (Denmark)
and Lars Strandén SP (Sweden).

The project and the preliminary results have been presented at RTiS (Real Time in
Sweden) conference in Halmstad on August 22, 2001 and at the international testing
conference EuroSTAR in Stockholm on November 21, 2001.

This work will support Nordic industry, especially SME (Small and Medium sized
Enterprises), in finding new markets where they are qualified to work. Based on their
current experience, the survey will help them to find similarities between the standards
they are used to work with and standards for other products. The result of the project will
also support Nordic industry in its conformity assessment.

In Annexes 2 – 10 extracted versions of the evaluated standards are given. These have
been produced by starting from the original text and then removing examples, guidelines,
details and motivations and further by focusing on aspects related to processes, artefacts
and referenced standards. The ratio of the extracted information to the original is 1:10 or
higher. Thus, it is very important to emphasize that the extracted version cannot be used
as a substitute for the original standard. Instead the original standards can be bought:
• from IEC, start at www.iec.ch
• from ISO, start at www.iso.ch
• or from national standardisation organisations (in Sweden SIS, start at
www.sis.se)
or downloaded for free where applicable.

For SPICE (ISO/IEC TR 15504) summaries of processes are given. Again these are not
enough and cannot be used as a substitute for the original.

The chapter Introduction gives an introduction to why it is difficult to compare standards
directly and why assistance is of value for SME.

The chapter Definitions briefly explains some words and expressions that are used in this
report, apart from those defined in the standards themselves.

The chapter Scope defines the standards to be evaluated and the principal methodology
for evaluation.

The chapter Standard view contents describes in detail the methodology used including
definition of SPICE (ISO/IEC TR 15504) processes.

The chapter Single standard evaluation gives the results of the work for each evaluated
standard.

The chapter The testing perspective gives the evaluation results concerning testing, i.e.
during verification and validation.

The chapter Summarized and referenced standards lists, with summaries, a number of
significant safety-related standards that were not evaluated in this work.

The chapter Conclusions summarises and evaluates the results of this work.

Annex 1 describes the project aspects of this work.

Annexes 2-10 give the extracted versions of each evaluated standard. These have been
produced by starting from the original text and then removing examples, guidelines,
details and motivations and further by focusing on aspects related to processes, artefacts
and referenced standards.

Different readers are interested in different aspects and guidance is given below. In any
case the chapters Introduction, Definitions, Scope and Conclusions should be read.
• If you are interested in the methodology of how to compare standards, study
Standard view contents and at least one standard of Single standard evaluation
and of Annexes 2-10.
• If you are interested in understanding a specific standard, study the corresponding
sub chapter in Single standard evaluation, The testing perspective and the
corresponding annex within Annexes 2-10.
• If you are interested in testing, study The testing perspective.
• If you are interested in the contents of a specific standard (related to processes,
artefacts and referenced standards) study the corresponding annex within Annexes
2-10.

The results of this work will significantly ease studying the original standard as a whole.

Summary
The overall purpose of the project was to ease the understanding of standards for safety-
related applications and to transfer a judgement of the standards to the reader without
him/her having to study them in detail. This will be a help especially for SME where
limited resources are available for standard surveys. The report was written jointly by SP
and DELTA and exists in two forms: a Word document (this one) and an HTML version
available on the Internet.

Six standards have been selected, from different application areas, with different degrees
of strictness, different levels of detail and different scopes. The standards are:
• EN 954 – Safety of machinery - Safety-related parts of control systems
• IEC 61508 – Functional safety of electrical / electronic / programmable electronic
safety-related systems
• IEC 61713 – Software dependability through the software lifecycle processes
• RTCA/DO-178B – Software Considerations in Airborne Systems and Equipment
Certification
• IEC 60601-1-4 – Medical Electrical Equipment – Part 1: General Requirements
for Safety – 4. Collateral Standard: Programmable Electrical Medical Systems
• EN 50128 – Railway Applications : Software for Railway Control and Protection
Systems
There is no intention of being complete. Instead the standards are chosen to reflect
different types but with the common basic issue: safety-related software.

Generally we can say that analysing and comparing bodies of text written by humans for
humans is not a strict science and thus not easy. As a consequence, we cannot have a single
way of handling text. Instead it is necessary to use both a more formal approach, i.e. based
on some kind of objective measure, and an informal approach, i.e. some kind of evaluation
and/or judgement. Thus a methodology has to be defined, and in this work this is done by
defining a number of views, i.e. different ways of looking at standards:

• SUMMARY – a short form of the standard


• MAPPINGS – to analyse how the standard maps to a predefined set of processes
(according to definitions made in SPICE i.e. ISO/IEC TR 15504), artefacts and
referenced standards
• EXTRACTED VERSION – a condensed version where the information relevant
to mappings has been extracted (see Annexes 2-10)
• METRICS – a measure describing the character of the standard
• JUDGEMENT – a final conclusion

For mapping to processes, SPICE is used. SPICE defines a superset of the processes
defined in ISO 12207 and can be included under ISO 9000. The SPICE processes are
general and relatively complete, and they are used as a reference for comparing and
evaluating the process requirements of the evaluated standards.

The extracted versions are available in the annexes. They have been created by starting
from the original standard texts and removing motivations, details, examples, guidelines
etc., concentrating the text so that only information relevant to processes, artefacts
and referenced standards remains. The original chapter numbering has been kept, so it is
easy to compare the extract with the original text.

For comparing different standards the views listed above are used, but in this work the
standards are also evaluated according to a test perspective, i.e. to what extent and with
which quality verification and validation are handled in the standards. Since testing
is a single aspect, these considerations have been gathered in one chapter, The testing
perspective.

The following general conclusions from this work apply.


• There seem to be more software standards available than necessary. Several of
these have many aspects in common, and even the application-oriented standards
do not contain much information that really is specific to the application area.
• It is not easy to evaluate text that is assumed to be unambiguous but is not. The
choice of words, examples, guidelines etc. affects the reader’s opinion both directly
and indirectly.
• SPICE processes seem to be relatively complete and thus SPICE can be used as a
process framework.
• The simple metric defined (counting artefacts, processes, and referenced
standards) gives a relevant impression of the standard but should not be trusted too
much (as with any other metric). The metric shows a naïve view and is not
enough for decisions.
• The methodology defined here could easily be expanded in order to include new
standards, define new mappings, define new metrics etc.

Some short comments on each standard are given below.


• EN 954 has nothing to say about software development. It should be used for
system requirements.
• IEC 61508 is the most complete (general) standard and could be used for
developing application area standards.
• IEC 61713 gives guidelines for increasing dependability (according to its
definition) and is not strict.
• RTCA/DO-178B is written as a guideline but interpreted as requirements.
• IEC 60601-1-4 is dedicated to medical equipment (collateral standard).
• EN 50128 is one of a set of standards and can be used generally even if defined
for railway applications.

1 Introduction
There are many standards and EC directives that specify safety requirements and system
behaviour at fault. There are also many standards for safety-related applications in use
today that specify how the development of software and systems should be carried out in
order to adhere to the requirements of specific industrial areas. For example, railway,
avionics and medical systems are covered by a number of standards to ensure their
reliability and safety. However, the requirements in these standards are not coordinated,
and differences apply between different sectors of industry. For example, the standards
differ in scope and level of detail, and thus it is difficult to interpret and identify
similarities and differences across the standards, e.g. in connection with testing. Further,
since products are continuously getting more complex, more than one standard may be
applicable, and so the requirements of different standards can be identical to,
complement, exclude or even contradict each other.

Small and Medium sized Enterprises (SME) may have difficulty finding the
requirements applicable to their products. It would therefore be valuable to be able to
evaluate a standard on its own and in relation to other standards. However, it is not trivial
to define a procedure for this. In its most generic sense, the comparison is to
evaluate and compare two bodies of text containing partly different information, with
different strictness, and seen from different views. The written text itself (i.e. the
implementation) contributes complicating aspects. Some examples are listed below.

• The text is written for humans and cannot be formalized into a version leaving no
space for interpretations.
• The use of must, shall, should etc makes it difficult to isolate and compare
requirements. Further, within a standard both should and shall can exist for the
same item.
• The standards have different principle views e.g. if handling of documentation is
a separate process or if it is a sub process of several processes.
• The meaning of process, activity, procedure and when they can be applied.
• The standards have different scopes e.g. one can concern the contents of a
product and another the way to produce it.

From the above we can generally say that we cannot have a single way of analysing and
comparing standards. Instead, it is necessary to use both a more formal approach i.e.
based on some kind of objective measure and an informal approach i.e. some kind of
evaluation and/or judgement. This will be further described below.

For the evaluated standards we can summarise the most important contents aspects as:
• development processes and procedures, qualities and the contents of them
• safety functions and the qualities of them
• functional classification schemes e.g. regarding criticality
• references to other standards that shall be fulfilled
• guidelines of how to use the standard itself
• rationale concerning the standard itself, including relations to other standards

Apart from the selected standards, the process standard SPICE is used as a means for
evaluation. SPICE is a superset of ISO 12207 and can be included under ISO 9000. The
SPICE processes are used as reference to compare and evaluate the process requirements
of the different standards. However the SPICE capability levels are not used since they
are generally not significant when analysing the evaluated standards of this work.

2 Definitions
Most of the standards evaluated have their own terminology and definitions. Since they
can differ, and if in doubt, it is necessary to check the considered standard. Here only
definitions are given that are specific to this report.

Artefact Something produced. In general documents (including source code).

COTS Commercial Off the Shelf

Dependence The amount of independence required between different persons, groups and
organisations, e.g. an assessor should not also be the software developer.

Extract A concentrated form of the standard showing only specific aspects of
interest.

HW Hardware

Mapping set Mapping specific aspects of a standard to a predefined set. Artefacts,
Processes, and Referenced standards sets exist.

Metric A value representing a quality with some degree of significance.

SIL Safety Integrity Level. The definition depends on the standard under
consideration.

SME Small and Medium sized Enterprises.

SPICE The standard ISO/IEC TR 15504, defining a set of about 40 processes and
also capability levels for how to apply them.

SW Software

3 Scope
3.1 Standards for evaluation
The six specially selected standards are listed below with a short description. However,
there exist many other standards specifying the development of software and systems.
The six standards have been chosen because they are among the most widely used
standards for safety-related programmable electronic systems. The standards also cover
different application areas and differ in level of detail, strictness and focus
(e.g. concerning processes, functional contents etc.). Note that there is no attempt at
completeness, i.e. the set of analysed and referenced standards is intended to be
representative, but others exist as well.

3.1.1 EN 954
Safety of machinery – Safety-related parts of control systems
Part 1: General principles for design
Part 2: Validation

The European standard EN 954 “Safety of machinery – Safety-related parts of control
systems” gives little guidance on software, but specifies the validation process. The
system behaviour at fault is categorised into 5 categories.

3.1.2 IEC 61508


Functional safety of electrical/electronic/programmable electronic safety-related systems
Part 1: General requirements
Part 2: Requirements for electrical/electronic/programmable electronic safety-related
systems
Part 3: Software requirements

The international standard IEC 61508 “Functional safety of electrical/electronic/
programmable electronic safety-related systems” is intended to be a basic safety standard
applicable to all kinds of industry. It covers the complete safety life cycle and would
need interpretation to develop sector-specific standards. The safety classification is
divided into 4 Safety Integrity Levels (SIL), depending on the result of the risk analysis.
Methods for development at a certain SIL are then recommended. For this work parts 1-3
are of interest (parts 4-7 also exist).

3.1.3 IEC 61713


Software dependability through the software lifecycle processes – Application guide

The international standard IEC 61713 “Software dependability through the software
lifecycle processes – Application guide” was published in June 2000, and has not yet
been extensively adopted in industry. The acquisition process, the supply process, the
development process, the operation process and the maintenance process are described
with respect to dependability. No division into levels of risk or criticality is made.

3.1.4 RTCA/DO-178B
Software Considerations in Airborne Systems and Equipment Certification
The aviation standard RTCA/DO-178B ”Software Considerations in Airborne Systems
and Equipment Certification” is the most important standard for certification of software
used in commercial aircraft in the US and in the EC. Five different software levels (A, B,
C, D and E) are specified.

3.1.5 IEC 60601-1-4


Medical Electrical Equipment – Part 1: General Requirements for Safety – 4. Collateral
Standard: Programmable Electrical Medical Systems

Medical systems are subject to standard IEC 60601-1-4 “Medical Electrical Equipment
Part 1: General Requirements for Safety – 4. Collateral Standard: Programmable
Electrical Medical Systems”. The risk management process is connected to the software
development process.

3.1.6 EN 50128
Railway Applications: Software for Railway Control and Protection Systems

The European standard EN 50128 “Railway Applications: Software for Railway Control
and Protection Systems” is used in railway industry all over Europe. Five software
integrity levels (SIL) are specified.

3.2 Interpreting standards


Even though standards are often assumed to be clear and unambiguous they are generally
not. Words like “may”, “should”, “can”, “must” and “shall”, in many cases leave room
for interpretation. Further, combinations of them can lead to difficulties. For example, if a
requirement is stated as “shall” in an introduction chapter and later on as “should”, how is
this inconsistency resolved (see EN 954-1 chapters 6.1 and 6.3 about categories)?

RTCA/DO-178B explicitly states that “must” and “shall” have been removed in order to
allow alternative methods. Also included in many standards are examples, suggestions
and guidelines. In IEC 61508 there exist parts of the standard (however not included in
this work) with hundreds of pages of guidelines. In EN 50128, classification is
made as Recommended and Highly Recommended; how rigorous is this? Even if
processes and documents are recommended, how should their quality be related to
the standard?

Apart from difficulties with word interpretation and clarity, there are differences
in content that make comparisons difficult. For example, EN 954-1 is mainly
focused on functional contents rather than processes, while RTCA/DO-178B is a
process-oriented standard.

We can thus list a number of aspects that make standard comparison troublesome.
• Interpretation of words. Intentionally or unintentionally vague. Different
interpretation by different standards and persons.
• Inconsistent use of words, sentences at different places in the standard.
• Standards contain different scopes.
• Different levels of detail, e.g. EN 954-1 only mentions a verification plan while
RTCA/DO-178B describes it in detail.

• Different use of processes, e.g. whether documentation is a sub-process of others
or a process of its own.
• Different emphasis, e.g. if a two-page guideline describes a “should”
requirement, it can seem so important that it is interpreted as a “shall”
requirement.
• References to other standards. In many cases the amount and use is not clearly
stated.

Thus there is no single useful formal way for the analysis.

3.3 View of standards


Since standards are not directly comparable (as described above) we cannot hope for a
single method of evaluation. Instead we have to approach the problem from different
views and put the results together in order to create judgements. Below is a figure
showing the views that are applied in this work. Reading and understanding the whole
standard is of course the ultimate view but will not simplify matters which is the main
concern in this work.

[Figure: the views applied to a STANDARD — SUMMARY, EXTRACT, MAPPINGS,
METRICS and JUDGEMENT — each of which can be seen from a General perspective
and a Test perspective.]

As shown in the figure, an orthogonal aspect is the area of interest (or perspective), in the
figure the General and Test perspectives, which can be applied to each view. However,
its significance varies. Since the Test perspective is a subset of the General one, the
General perspective is handled first (for all views), and the Test perspective can then be
produced by extracting the relevant information.

3.3.1 Summary
A standard can be summarized in plain English expressing the opinion of the reviewer of
what is important. If the reviewer is experienced the summary will give the essence and
the character of the standard.

3.3.2 Extracted version


By removing unnecessary information (according to some criteria) from a standard, it is
possible to get a more concentrated view, which can be read in a shorter time and makes
it easier to gain a good understanding. In this work this is done in connection with the
mappings, as discussed in more detail below.

3.3.3 Mappings
In this case certain aspects of the analysed standard are mapped to specific sets. Three
sets are defined in this work: Processes, Artefacts and Referenced standards. The sets will
be defined and discussed in detail below. The quality of the mappings gives a
characterisation of the analysed standard. For example, the process set is defined using
SPICE processes and the analysed standard normally specifies a subset of the processes
defined in SPICE.

3.3.4 Metrics
Metrics can be defined and used for comparisons; however, simple values are very coarse,
and it is important to be careful when drawing conclusions from them. Other metrics
could be defined, e.g. related to the sentences and words used, but in this work the metrics
are related to the mappings, as described below.
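As an illustration only, the simple counting metric of this kind might be sketched as
follows; the class, the standard name and the sample entries are hypothetical and not
taken from the evaluation:

```python
from dataclasses import dataclass, field

@dataclass
class StandardMapping:
    """Hypothetical container for one evaluated standard's mapping results."""
    name: str
    processes: set = field(default_factory=set)   # mapped SPICE processes
    artefacts: set = field(default_factory=set)   # mapped artefacts
    referenced: set = field(default_factory=set)  # normatively referenced standards

    def metric(self):
        """The naive metric: counts of mapped processes, artefacts and references."""
        return (len(self.processes), len(self.artefacts), len(self.referenced))

# Invented sample data for illustration only
example = StandardMapping("Standard X",
                          processes={"ENG.1", "SUP.2"},
                          artefacts={"Verification plan"})
print(example.metric())  # (2, 1, 0)
```

As stressed above, such counts give only a coarse, naïve view and should not be used on
their own for decisions.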

3.3.5 Judgement
A judgement can be given in plain English expressing the opinion of the evaluator(s).
Typical aspects can be applicability, strengths, weaknesses, how easy it is to use,
consequences of using it and general quality of the standard.

3.4 Evaluation
The evaluation takes place by first studying each of the six standards and creating the
views, as defined above, for each evaluated standard. The extracted version is especially
important since it gives the basis for some of the others. The status after this phase is:
• each of the six standards has been evaluated separately
• all views are complete seen from a General perspective
• the mapping sets Processes, Artefacts and Referenced standards are complete

The next phase is comparing standards in order to study similarities. Comparisons can be
made by comparing the different views described above. For the Mappings (when not
including qualities) and Metrics views, comparisons can be made by simply comparing
subset contents and metric values. For the other views it is necessary to compare
concentrated information in plain English.

By extracting information related to test from the two phases it is possible to get an
evaluation seen from a Test perspective.
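For the Mappings (without qualities) and Metrics views, the comparison described above
reduces to simple set operations. A minimal sketch follows; the SPICE identifiers are real,
but their assignment to the two fictitious standards is purely illustrative:

```python
# Hypothetical mapped SPICE process sets for two standards
std_a = {"ENG.1", "SUP.2", "SUP.4", "MAN.1"}
std_b = {"ENG.1", "SUP.4", "CUS.3"}

common = std_a & std_b   # processes both standards address
only_a = std_a - std_b   # processes covered by A but not by B
only_b = std_b - std_a   # processes covered by B but not by A

print(sorted(common))  # ['ENG.1', 'SUP.4']
print(sorted(only_a))  # ['MAN.1', 'SUP.2']
```

Comparing the qualities attached to each mapping, on the other hand, remains an informal,
plain-English exercise.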

4 Standard view contents


4.1 Summary
The summary of the standard should be made short and contain the following aspects:
• main purpose and use
• focus / type of standard
• defined criticality levels (if applicable)
• special aspects that affect the contents and structure of the standard

4.2 Mappings
4.2.1 Framework
Mappings are used for making information more explicit and for preparing standard
comparisons. As described above, three mapping sets are defined: Processes, Artefacts
and Referenced standards. Since SPICE (see [1]) is defined as an overall standard for
software development, SPICE is the starting point for the Processes mapping set. If
necessary it is extended with more processes during the evaluation of the standards. For
the other two mapping sets the situation is different: initially these sets are empty, and
they are then built up successively with each analysed standard. For all three mapping
sets it can be assumed (which of course has to be verified) that the sets are relatively
stable and complete after the six standards have been evaluated. Thus few modifications
should occur when more standards are analysed.

The mapping sets are described below.


• Processes: A corresponding description according to SPICE. A quality can be
attached to specify how well a process is mapped to a SPICE process.
• Artefacts: In principle just a description of the title of the artefact. The reason is
that different standards vary so much in detail that the artefact information is
moved to the quality that describes the relation between the evaluated standard
and the Artefact mapping set.
• Referenced standards: A reference to an included standard that is necessary for
the completeness of the standard under evaluation. How a standard shall be
referenced depends on the same aspects as the evaluation in general e.g. is there a
“shall”, “should” or “may” condition? The kind of strictness and other aspects are
moved to the qualities that describe the relation between the evaluated standard
and the Referenced standard mapping set.

The picture below shows the structure.

[Figure: the standard to evaluate is mapped, each mapping with an attached quality, into
the three mapping sets: Processes (containing processes), Artefacts (containing products)
and Referenced standards (containing standards).]
Since an arrow only shows that a relation exists i.e. there is an item of the evaluated
standard that can be mapped to a mapping set, further information is needed. As
described above, for this a quality is attached to the arrow and for each mapped item it
contains the corresponding information. The mapping may include different properties as
shown below.
• Extent; how much can be mapped.
• Properties; what are the properties of the item in the standard compared to what is
defined in the mapping set.
• Conditions; if the mapping is always valid or if it varies.
• Strictness; if “shall”, “should” or “may” is used.

Thus, qualities are generally expressed in plain English and have to be compared
informally. If just looking at the mappings without qualities it is possible to get a more
simplified view. This will be discussed under Metrics below. However, we can note some
consequences directly. Without the quality aspects it can be difficult to see:
• the safety-related aspects such as models for criticality, safety functions etc
• the application type
• the functional information

Even if this looks like a disadvantage, it is an advantage when trying to find similarities
between standards, since too much detail can blur general conclusions.

From the Referenced standards mapping set we see the recursive nature of this analysis. If
necessary a referenced standard can be included for evaluation if it can be seen as part of
the total standard scope.

In summary, the mapping sets framework:
• is easily extendable both for new standards and for new mapping sets
• makes it possible to trade simplicity versus information quality
• is useful for comparisons

4.2.2 Quality specification decision


For the evaluation made in this report the qualities listed below are used.
• For Artefacts: Required or recommended.
• For Processes: SPICE process that can be considered a superset of the process
under consideration.
• For Referenced standards: Those that are explicitly referenced as normative.
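A minimal sketch of how a mapping with its attached quality might be represented,
following the decisions above; all names and example entries are hypothetical
illustrations, not data from the evaluation:

```python
from dataclasses import dataclass

@dataclass
class Mapping:
    """One arrow from the evaluated standard into a mapping set, with its quality."""
    target_set: str  # "Processes", "Artefacts" or "Referenced standards"
    item: str        # e.g. a SPICE process or an artefact title
    quality: str     # plain-English quality: extent, conditions, strictness, ...

# Invented examples for a fictitious standard
mappings = [
    Mapping("Processes", "SUP.4 Verification", "covered by a superset SPICE process"),
    Mapping("Artefacts", "Software verification plan", "required ('shall')"),
    Mapping("Referenced standards", "ISO 9000", "explicitly referenced as normative"),
]

# Dropping the qualities yields the simplified view used for the metrics
simplified = {(m.target_set, m.item) for m in mappings}
print(len(simplified))  # 3
```

The quality strings remain free-form text, reflecting the point made above that qualities
have to be compared informally.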

4.2.3 SPICE
SPICE defines a software development reference model, and the list below shows all
processes defined by SPICE. Note the existence of support processes, which act
independently of the others. SPICE also defines a capability reference model, but it is not
used in this evaluation (it reflects the maturity of a company, is used more for
assessments, and is not directly connected to software development).

SPICE contains five groups of processes.


• Customer-Supplier (CUS) - launching and landing projects to everybody’s
satisfaction.
• Engineering (ENG) - The product’s journey from ideas to final operation.
• Support (SUP) - Giving evidence of compliance and managing artefacts.
• Management (MAN) - Knowing what to do and make sure it happens.
• Organisation (ORG) - Setting the scene right, knowing it objectively and being
able to change.

Below is a list of processes for each group. The definitions and descriptions are
quotations from reference [1]. The information is not enough for applied work and cannot
be used as a substitute for the original.

CUS - Customer Supplier process category


• CUS.1 Acquisition process: The purpose of the Acquisition process is to obtain
the product and/or service that satisfies the need expressed by the customer. The
process begins with the identification of a customer need and ends with the
acceptance of the product and/or service needed by the customer.
o CUS.1.1 Acquisition preparation process: The purpose of the Acquisition
preparation process is to establish the needs and goals of the acquisition.
o CUS.1.2 Supplier selection process: The purpose of the Supplier
selection process is to choose the organization that will be responsible for
the implementation of the project identified in CUS.1.1.
o CUS.1.3 Supplier monitoring process: The purpose of the Supplier
monitoring process is to monitor the supplier's activities during the
development of the software product and/or service.
o CUS.1.4 Customer acceptance process: The purpose of the Customer
acceptance process is to approve the supplier's deliverable when all
acceptance conditions are satisfied.
• CUS.2 Supply process: The purpose of the Supply process is to provide software
to the customer that meets the agreed requirements.
• CUS.3 Requirements elicitation process: The purpose of the Requirements
elicitation process is to gather, process, and track evolving customer needs and
requirements throughout the life of the software product and/or service so as to
establish a requirements baseline that serves as the basis for defining the needed
software work products.
• CUS.4 Operation process: The purpose of the Operation process is to operate the
software product in its intended environment and to provide support to the
customers of the software product.
o CUS.4.1 Operational use process: The purpose of the Operational use
process is to ensure the correct and efficient operation of the software
product for the duration of its intended usage and in its installed
environment.
o CUS.4.2 Customer support process: The purpose of the Customer
support process is to establish and maintain an acceptable level of service
to the customer to support effective use of the software product.
Assistance and consultation to the customer is provided as requested to
support the operation of the software product.

ENG - Engineering process category


• ENG.1 Development process: The purpose of the Development process is to
transform a set of requirements into a functional software product or software-
based system that meets the customer’s stated needs.
o ENG.1.1 System requirements analysis and design process: The purpose
of the System requirements analysis and design process is to establish the
system requirements (functional and non-functional) and architecture,
identifying which system requirements should be allocated to which
elements of the system and to which releases.
o ENG.1.2 Software requirements analysis process: The purpose of the
Software requirements analysis process is to establish the requirements of
the software components of the system.
o ENG.1.3 Software design process: The purpose of the Software design
process is to define a design for the software that implements the
requirements and can be tested against them.
o ENG.1.4 Software construction process: The purpose of the Software
construction process is to produce executable software units and to verify
that they properly reflect the software design.
o ENG.1.5 Software integration process: The purpose of the Software
integration process is to combine the software units, producing integrated
software items and to verify that the integrated software units properly
reflect the software design.
o ENG.1.6 Software testing process: The purpose of the Software testing
process is to test the integrated software producing a product that will
satisfy the software requirements.
o ENG.1.7 System integration and testing process: The purpose of the
System integration and testing process is to integrate the software
component with other components, such as manual operations or
hardware, producing a complete system that will satisfy the customers’
expectations expressed in the system requirements. The resources
allocated to system integration should include someone familiar with the
software component.
• ENG.2 System and software maintenance process: The purpose of the System
and software maintenance process is to manage modification, migration and
retirement of system components (such as hardware, software, manual operations
and network if any) in response to customer requests. The origin of requests
might be a discovered problem or the need for improvement or adaptation. The
objective is to modify and/or retire existing systems and/or software while
preserving the integrity of organizational operations.

SUP - Support process category


• SUP.1 Documentation process: The purpose of the Documentation process is to
develop and maintain documents that record information produced by a process
or activity.
• SUP.2 Configuration management process: The purpose of the Configuration
management process is to establish and maintain the integrity of all the work
products of a process or project.
• SUP.3 Quality assurance process: The purpose of the Quality assurance process
is to provide assurance that work products and processes of a process or project
comply with their specified requirements and adhere to their established plans.
• SUP.4 Verification process: The purpose of the Verification process is to confirm
that each software work product and/or service of a process or project properly
reflects the specified requirements.
• SUP.5 Validation process: The purpose of the Validation process is to confirm
that the requirements for a specific intended use of the software work product are
fulfilled.
• SUP.6 Joint review process: The purpose of the Joint review process is to
maintain a common understanding with the customer of the progress against the
objectives of the contract and what should be done to help ensure development of
a product that satisfies the customer. Joint reviews are at both project
management and technical levels and are held throughout the life of the project.
• SUP.7 Audit Process: The purpose of the Audit process is to independently
determine compliance of selected products and processes with the requirements,
plans and contract, as appropriate.
• SUP.8 Problem resolution process: The purpose of the Problem resolution
process is to ensure that all discovered problems are analysed and resolved and
that trends are recognized.

MAN - Management process category


• MAN.1 Management process: The purpose of the Management process is to
organize, monitor, and control the initiation and performance of any processes or
functions within the organization to achieve their goals and the business goals of
the organization in an effective manner.
• MAN.2 Project management process: The purpose of the Project management
process is to identify, establish, coordinate and monitor activities, tasks and
resources necessary for a project to produce a product and/or service meeting the
requirements.
• MAN.3 Quality management process: The purpose of the Quality management
process is to monitor the quality of the project's products and/or services and to
ensure that they satisfy the customer. The process involves establishing a focus
on monitoring the quality of product and process at both the project and
organizational level.
• MAN.4 Risk management process: The purpose of the Risk management process
is to identify and mitigate the project risks continuously throughout the life-cycle
of a project. The process involves establishing a focus on monitoring of risks at
both the project and organizational levels.

ORG - Organization process category


• ORG.1 Organizational alignment process: The purpose of the Organizational
alignment process is to ensure that the individuals in the organization share a
common vision, culture and understanding of the business goals to empower
them to function effectively. Although business re-engineering and Total Quality
Management have a much broader scope than that of software process, software
process improvement occurs in a business context and, to be successful, must
address business goals.
• ORG.2 Improvement process: The purpose of the Improvement process is to
establish, assess, measure, control and improve a software life cycle process.
o ORG.2.1 Process establishment process: The purpose of the Process
establishment process is to establish a suite of organizational processes
for all software life cycle processes as they apply to its business
activities.
o ORG.2.2 Process assessment process: The purpose of the Process
assessment process is to determine the extent to which the organization's
standard software processes contribute to the achievement of its business
goals and to help the organization focus on the need for continuous
process improvement.
o ORG.2.3 Process improvement process: The purpose of the Process
improvement process is to continually improve the effectiveness and
efficiency of the processes used by the organization in line with the
business need.
• ORG.3 Human resource management process: The purpose of the Human
resource management process is to provide the organization and projects with
individuals who possess skills and knowledge to perform their roles effectively
and to work together as a cohesive group.
• ORG.4 Infrastructure process: The purpose of the Infrastructure process is to
maintain a stable and reliable infrastructure that is needed to support the
performance of any other process. The infrastructure may include hardware,
software, methods, tools, techniques, standards, and facilities for development,
operation, or maintenance.
• ORG.5 Measurement Process: The purpose of the Measurement process is to
collect and analyze data relating to the products developed and processes
implemented within the organizational unit, to support effective management of
the processes, and to objectively demonstrate the quality of the products.
• ORG.6 Reuse process: The purpose of the Reuse process is to promote and
facilitate the reuse of new and existing software work products from an
organizational and product/project perspective.

4.3 Extracted version


When analysing a single standard (by extracting information) the connection as defined
by the evaluated standard is maintained between processes, artefacts and referenced
standards. The extracted versions are available in the annexes. The original chapter
numbering has been kept in order to compare with the original text in an easy way. The
size of the extracted versions is one tenth or less of the original standards. The principles
for extraction are given below for each mapping set.

4.3.1 Processes
For evaluating a single standard with respect to processes the following information is
extracted:
• strictness (shall or should)
• description of processes and activities (sub-processes) but no details; only
enough information is given to make the meaning understandable
• if and in what way processes and/or activities co-operate

• if and to what extent processes and activities are affected by classifications

The following information is NOT extracted:


• guidelines, suggestions and examples
• how the processes and activities are performed
• motivations for statements
• definitions

4.3.2 Artefacts
For evaluating a single standard with respect to artefacts the following information is
extracted:
• strictness (shall or should)
• description but no details; only enough information is given to make the
meaning understandable
• description of safety critical contents (functions) but no details
• if and in what way artefacts co-operate
• if and to what extent artefacts and their contents are affected by classifications

The following information is NOT extracted:


• guidelines, suggestions and examples
• how the artefacts are generated
• the quality of the artefacts
• motivations for statements
• definitions

4.3.3 Referenced standards


For evaluating a single standard with respect to referenced standards the following
information is extracted:
• strictness (shall), i.e. only standards referenced as normative

4.4 Metrics
Metric values used in this report are represented by a Kiviat diagram. Three axes exist
where A denotes the number of artefacts, S is the number of referenced (normative)
standards and P is the number of processes. Each axis is normalized according to the
maximum number counted for the six evaluated standards. The example below shows a
standard that is extensive and does not reference many others.

[Figure: example Kiviat diagram with axes Artefacts (A), Referenced standards (S) and Processes (P); here A and P are large while S is small.]

There is in general an inverse relation between referenced standards and
artefacts/processes. It is not likely that a standard that references many others also has a
lot to state concerning artefacts and processes. Using the metrics we can generally say:
• if a standard is small or big i.e. if a standard has little or much influence.
• if a standard is a “stand-alone” standard or requires support from other standards.
• if focus is on artefacts or processes or both.

However, it is important to keep in mind that simple values cannot represent complex
descriptions but metrics can be valuable for giving hints.

4.5 Judgement
Judgement is a kind of conclusion and concerns e.g. applicability, strengths, weaknesses,
how easy the standard is to use, consequences of using it and general qualities of the
standard. The input to this view is the other views and aspects directly from the standard.

5 Single standard evaluation


5.1 EN 954
5.1.1 Summary
The European standard EN 954 “Safety of machinery: Safety-related parts of control
systems” gives general principles for the design, guidance and assessment of control
systems and complies with the EU Machinery Directive. The standard can also be used
as input for more product-specific standards (Type-C). The standard may later be
replaced by IEC 62061.

5.1.1.1 EN 954 –1

This part of the standard considers safety functions, defined as functions that enable the
system to achieve a safe state in the event of a fault. A safety function could be a separate
function added to increase safety. The system behaviour at fault is classified into five
categories. These are based on resistance to faults and do not consider probability values.
The choice of category depends on the application and also the part of system under
consideration. The selected category can be used for assessment. The standard is general
and covers e.g. electrical, hydraulic, pneumatic and mechanical systems. The standard
refers to several other (normative) standards.

When using the standard it is necessary to check to what extent the risk can be reduced
for each applied safety-related part e.g. for an emergency stop.

For handling safety a five step process is defined:


1. Hazard analysis and risk assessment
2. Decide measures for risk reduction by control means
3. Specify safety requirements for the safety-related parts of the control system
4. Design
5. Validation
The process is iterative i.e. when gaining knowledge at a later step it may be necessary to
go back and update.

A list of safety functions with extra requirements and relation to standards is given,
containing e.g. Stop function, Emergency stop function, Manual reset, Start and restart,
Local control function, Muting, Manual suspension of safety functions, Unexpected
start-up, Indications and alarms, Response time, Safety-related parameters, Man-machine
interface, and Fluctuations, loss and restoration of power sources. For all, references and
additional requirements are listed (where applicable).

Categories, as mentioned above, are defined according to fault resistance. At the lowest
level the safety function can be lost at fault, at the highest level accumulated faults could
be handled. Each category is described and guidance and examples are given.

Validation of safety-related parts of the system is specified by a validation plan.
Validation can be made by analysis and testing. It is important that both normal and
foreseeable abnormal cases are considered.

The following annexes are included:


• questionnaire for the design process
• guidance for the selection of categories

• list of some significant faults and failures for various technologies


• relationship between safety, reliability and availability for machinery
• bibliography

5.1.1.2 EN 954 –2

This part of the standard is defined exclusively for validation, but on its own it is not
sufficient for programmable systems. It complements part 1 by giving the conditions for
validation. The purpose is to validate the safety-related parts and their associated
category. Requirements on documentation are given according to category.

The following annexes are included:


• validation tools for mechanical systems
• validation tools for pneumatic systems
• validation tools for hydraulic systems
• validation tools for electrical systems
Each annex is structured the same way: Introduction, List of basic safety principles, List
of well tried safety principles, List of well tried components, Fault lists and fault
exclusions.

5.1.2 Extracted version


See Annex 2 and 3.

5.1.3 Mappings
The explicitly defined artefacts are listed below.
• Validation plan
• Validation report
• User information
• Test plan
• Test records
• Fault list

The explicitly defined processes are mapped to SPICE processes according to the table
below.

Processes in standard SPICE processes


CUS.1 Acquisition
CUS.1.1 Acquisition preparation
CUS.1.2 Supplier selection
CUS.1.3 Supplier Monitoring
CUS.1.4 Customer Acceptance
CUS.2 Supply
CUS.3 Requirements Elicitation
CUS.4 Operation
CUS.4.1 Operational use
CUS.4.2 Customer support
- Process for selection and design of safety measures → ENG.1 Development
ENG.1.1 System requirements analysis and design
ENG.1.2 Software requirements analysis
ENG.1.3 Software design
ENG.1.4 Software construction
ENG.1.5 Software integration
ENG.1.6 Software testing
ENG.1.7 System integration and testing
ENG.2 System and software maintenance
SUP Support process category
SUP.1 Documentation
SUP.2 Configuration management
SUP.3 Quality assurance
SUP.4 Verification
- Validation → SUP.5 Validation
SUP.6 Joint review
SUP.7 Audit
SUP.8 Problem resolution
MAN.1 Management
MAN.2 Project management
MAN.3 Quality Management
MAN.4 Risk Management
ORG.1 Organizational alignment
ORG.2 Improvement process
ORG.2.1 Process establishment
ORG.2.2 Process assessment
ORG.2.3 Process improvement
ORG.3 Human resource management
ORG.4 Infrastructure
ORG.5 Measurement
ORG.6 Reuse

The normative referenced standards are:


• EN 292-1
• EN 292-2:1991/A1
• EN 418
• EN 457
• EN 614-1

• EN 842
• EN 981
• EN 982
• EN 983
• prEN 999
• EN 1037
• EN 1050
• EN 60204-1
• EN 60447
• EN 60529
• EN 60721-3-0
• IEC 50 (191):1990

5.1.4 Metrics
The Kiviat diagram is shown below. Three axes exist where A denotes the number of
artefacts, S is the number of referenced (normative) standards and P is the number of
processes. Each axis is normalized according to the maximum number counted for the six
evaluated standards.

5.1.5 Judgement
This standard starts from a functional requirement point of view and from the EU
machinery directive perspective. The safety-related aspects come from requirements
concerning avoidance of human harm (also e.g. ergonomic aspects are included) and
damage to machinery. The standard lists many safety functions that could be considered
for development and assessment and thus the standard serves as a reference. The
classification of categories makes it possible to classify and assess the system without
considering the implementation. On the other hand, since e.g. probabilities are not
defined, the development cannot directly be controlled by category, so further
requirements are necessary. The standard's requirements are complemented with many
guidelines, and the annexes contain e.g. lists of safety principles and fault lists.

Different types of techniques are described, e.g. pneumatic and hydraulic
implementations, so programmable systems are only one of several ways to implement
safety functions. Thus there is not much information concerning software development.
For example, Annex C describes hardware and mechanical fault types but not software
faults.

From the mappings we can see, as expected, that mapping to SPICE is not natural. Just a
few processes are mentioned and their artefacts are expressed in general terms. Many
standards are referenced, which is one sign that the standard has to be complemented
with other standards. For assessments the standard as such is more suitable, e.g. when
verifying claimed categories for an application.

The standard is easy to read (it is not extensive).

5.2 IEC 61508


5.2.1 Summary
The international standard IEC 61508 “Functional safety of electrical/electronic/
programmable electronic safety-related systems” is intended to be a basic safety standard
applicable to all kinds of industry. It is a general (generic) standard covering the complete
safety life cycle, and would need interpretation for developing sector specific standards (a
major objective of this standard). The included parts in this work are 1, 2 and 3. There are
also parts of the standard not included in this work:
• 4: Definitions and abbreviations
• 5: Examples of methods for the determination of safety integrity levels
• 6: Guidelines on the application of parts 2 and 3
• 7: Overview of techniques and measures

The figure below shows the relation between the analysed parts.

[Figure: Part 1 (overall system) encompasses Part 2 (programmable system, including hardware) and Part 3 (software).]

5.2.1.1 IEC 61508-1: General requirements

This part handles the overall system issues and in principle only programmable systems,
however, it is pointed out that safety could rely on other kinds of systems as well e.g.
mechanical. The main purpose is to handle systems where safety functions are carried out
by programmable electronic systems. For this type of systems an overall safety life cycle
is defined.

Four Safety Integrity Levels (SIL 1-4), based on risk, are defined according to probability
of failure. Two different groups of values exist depending on the rate of demands (low
and high/continuous demand). For low demand the SIL limits range from 10^-1 to 10^-5
average probability of failure on demand, and for high/continuous demand from 10^-5 to
10^-9 dangerous failures per hour.
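As a minimal illustration of these bands, the helper below maps a low-demand average probability of failure on demand (PFDavg) to the SIL it meets. The band boundaries follow the figures quoted above; the function itself is our illustration, not part of the standard.

```python
# Sketch of the low-demand SIL bands quoted above (IEC 61508-1).
# Each band is [lower, upper) in average probability of failure on demand.

def sil_low_demand(pfd_avg):
    """Return the SIL met by a given PFDavg in low-demand mode, or 0."""
    bands = [
        (4, 1e-5, 1e-4),
        (3, 1e-4, 1e-3),
        (2, 1e-3, 1e-2),
        (1, 1e-2, 1e-1),
    ]
    for sil, lower, upper in bands:
        if lower <= pfd_avg < upper:
            return sil
    return 0  # outside the tabulated range: no SIL claim

print(sil_low_demand(5e-3))  # → 2
```

An analogous table with per-hour failure rates (10^-5 down to 10^-9) would cover the high/continuous demand mode.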

An important aspect is how to produce adequate documentation, and several requirements
and guidelines are given. For functional safety management, guidelines are listed e.g.
concerning persons, organization, way of work etc.

The overall safety life cycle is the basis of the standard. The life cycle consists of 16
phases and contains the flow of actions, however, it is stressed that iterations between
them are normal.

The following phases are defined: Concept, Overall scope definition, Hazard and risk
analysis, Overall safety requirements, Safety requirements allocation, Overall operation
and maintenance planning, Overall safety validation planning, Overall installation and
commissioning planning, Programmable system realisation, Other technology realisation
(outside the scope of this standard), External risk reduction facilities (outside the scope of
this standard), Overall installation and commissioning, Overall safety validation, Overall
operation maintenance and repair, Overall modification and retrofit, and
Decommissioning or disposal. Programmable system realisation is further specified by
including hardware and software aspects.

For each phase description is given in short form (table) and the following is specified:
• objectives
• scope
• input
• output
Verification, Management of functional safety and Functional safety assessment are not
given in short form but in expanded form (separate chapters).

An important phase is the Hazard and risk analysis. The analysis shall include fault
conditions and foreseeable misuse. This includes also human factors and abnormal or
infrequent modes of operation. Since Safety Integrity Levels are based on probabilities
the likelihood of hazardous events has to be calculated.

The phase Safety requirements allocation is important and the reasons are listed below.
• Safety functions are allocated to the system.
• Safety Integrity Level is specified for each safety function. Note that this is made
from a system/requirement point of view without considering software and
hardware aspects.
Guidelines are given for how to handle SIL and combinations of safety functions.

For realisation of programmable systems Part 2 and Part 3 are referenced.

Verification takes place at all phases and the means are reviews, analyses and tests.

Functional safety assessments also take place at all phases and are made in order to verify
the achieved safety level. Personnel competence and independence (for person,
department or organization) are described and for independence HR (Highly
Recommended) and NR (Not Recommended) are used and related to Safety Integrity
Level and to consequences.

The annexes contain:


• an example of documentation structure for overall, programmable system and
software safety lifecycles
• discussion of competence of persons with respect to a number of different aspects
e.g. Safety Integrity Level

5.2.1.2 IEC 61508-2: Requirements for electrical/electronic/programmable electronic
safety-related systems

This part is a refinement of Part 1 and applicable for systems that contain at least one
programmable sub system. Thus it is necessary to study Part 1 first (there are also direct
references to Part 1). The system should be based on a decomposition into sub systems.

The programmable system safety life cycle (Programmable system realisation phase)
consists of 6 sub phases: Safety requirement specification (concerning functions and
integrity levels), Safety validation planning, Design and development, Integration,
Installation commissioning operation maintenance, and Safety validation.

For each phase of the safety lifecycle described here a description is given in short form
(table) and the following is specified:
• objectives
• scope
• input
• output
The life cycle also includes Modification, Verification, and Functional safety assessment
aspects (the same as in Part 1).

More detailed requirements are stated concerning e.g. performance, modes of operation,
hardware/software interaction, probability of critical hardware faults, fault avoidance,
handling of faults, and independence between safety and non-safety functions.

Safety Integrity Level (SIL) is split up for hardware and software. For hardware, tables
show highest possible SIL according to chosen fault tolerance and safe failure fraction i.e.
the fraction of faults that leads to a non-dangerous situation. Further, the tables are
specified for two categories A and B defined according to available information
concerning the constituent components. Guidelines are given for how to decide maximum
hardware SIL depending e.g. on hardware architecture.

Guidelines are given for how to compute probabilities for random hardware faults.
Aspects included are e.g. diagnostic test interval, hardware architecture etc. Also handling
of fault avoidance, control of systematic faults (e.g. software, operator faults etc), and
fault detection is described.

A number of implementation aspects are listed e.g. rates of failure, diagnostic coverage,
hardware fault tolerance, safe failure fraction, highest possible SIL, and proven in use or
not. Data communication is handled separately.

Within this part references are given to Part 3 concerning software but programmable
systems and hardware aspects are handled in this part. This part also includes hardware
architecture, sensors and actuators.

The annexes contain a number of techniques and measures as listed below.


• Hardware techniques and measures for diagnostic tests and maximum diagnostic
coverage possible for them (control of random faults). These are mapped to
components such as memory, sensors, CPU, bus etc. For each component
requirements are given for different safe failure fractions. Reference is given to
Part 7.
• Techniques and measures to control systematic faults caused by hardware and
software design, environmental aspects, and operation. These are related to SIL
with qualifiers for importance (Highly recommended, Recommended, Not
recommended, No opinion) and for effectiveness (Low, Medium, High,
Mandatory). Guidance for effectiveness is given. Reference is given to Part 3 and
7.
• Techniques and measures to avoid systematic faults caused during requirement
specification, design and development, integration, operation and maintenance,
and safety validation. These are related to SIL with qualifiers for importance
(Highly recommended, Recommended, Not recommended, No opinion) and for
effectiveness (Low, Medium, High, Mandatory). Guidance for effectiveness is
given. Reference is given to Part 3 and 7.
• Guidelines are given for calculation of diagnostic coverage and safe failure
fraction.

5.2.1.3 IEC 61508-3: Software requirements

This part is a refinement of Part 1 and 2 and focuses on software for a programmable
system. Thus it is necessary to study Part 1 and 2 first (there are also direct references to
Part 1 and 2). Partitioning should be made into software modules.

Management of functional safety in Part 1 is refined to Software quality management
system where further aspects are added such as change control, how to achieve SIL, and
how to maintain configuration items.

The software safety life cycle (Programmable system realisation phase) consists of 6 sub
phases: Software safety requirement specification (concerning functions and integrity
levels), Software safety validation planning, Software design and development,
Programmable electronics integration, Software operation and modification, and Software
safety validation.

Software design and development are further specified in: Architecture, Support tools and
programming languages, Detailed design and development, Detailed code
implementation, Software module testing, and Software integration testing.

For each phase of the software safety lifecycle described here a description is given in
short form (table) and the following is specified:
• objectives
• scope
• input
• output
The lifecycle also includes Software modification, Software verification, and Software
functional safety assessment aspects (including the same aspects as in Part 1). Direct
reference is made to V-model for development, however, iterations are necessary at
changes.

An important aspect here is to match hardware and software architecture and
implementation.

The annexes contain a number of techniques and measures as listed below.


• Techniques and measures for Requirement specification, Architecture design,
Support tools and programming languages, Detailed design, Module testing and
integration, Integration, Safety validation, Modification, Verification, and
Functional safety assessment. These are related to SIL with qualifiers for
recommendation (Highly recommended, Recommended, Not recommended, No
opinion). Reference is given to Part 7.
• Detailed techniques and measures for the following aspects: Design and coding
standards, Dynamic analysis and testing, Functional and black-box testing,
Failure analysis, Modelling, Performance testing, Semi-formal methods, Static
analysis, and Modular approach. These are related to SIL with qualifiers for
recommendation (Highly recommended, Recommended, Not recommended, No
opinion). Reference is given to Part 7.

5.2.2 Extracted version


See Annex 4, 5 and 6.

5.2.3 Mappings
The explicitly defined artefacts are listed below.
• Description (overall concept)
• Description (overall scope definition)
• Description (hazard and risk analysis)
• Specification (overall safety requirements)
• Description (safety requirements allocation)
• Plan (overall operation and maintenance)
• Plan (overall safety validation)
• Plan (overall installation)
• Plan (overall commissioning)
• Realisation of E/E/PE safety-related systems
• Report (overall installation)
• Report (overall commissioning)
• Report (overall safety validation)
• Log (overall operation and maintenance)
• Request (overall modification)
• Report (overall modification and retrofit impact analysis)
• Log (overall modification and retrofit)
• Report (overall decommissioning or disposal impact analysis)
• Plan (overall decommissioning or disposal)
• Log (overall decommissioning or disposal)
• Plan (safety)
• Plan (verification)
• Report (verification)
• Plan (functional safety assessment)
• Report (functional safety assessment)
• Specification (E/E/PES safety requirements)
• Plan (E/E/PES safety validation)
• Description (E/E/PES architecture design)
• Specification (programmable electronic integration tests)
• Specification (integration tests of programmable electronic and non-programmable
electronic hardware)
• Description (hardware architecture design)
• Specification (hardware architecture integration tests)
• Specification (hardware modules design)
• Specifications (hardware modules test)
• Hardware modules
• Report (hardware modules test)
• Report (programmable electronic and software integration test)
• Report (programmable electronic and other hardware integration test)
• Instruction (user)
• Instruction (operation and maintenance)
• Report (E/E/PES safety validation)
• Instruction (E/E/PES modification procedures)
• Request (E/E/PES modification)
• Report (E/E/PES modification impact analysis)
• Log (E/E/PES modification)
• Plan (E/E/PES safety)
• Plan (E/E/PES verification)
• Report (E/E/PES verification)
• Plan (E/E/PES functional safety assessment)
• Report (E/E/PES functional safety assessment)
• Specification (software safety requirements)
• Plan (software safety validation)
• Description (software architecture design)
• Specification (software architecture integration tests)
• Specification (programmable electronic and software integration tests)
• Instruction (development tools and coding manual)
• Description (software system design)
• Specification (software system integration tests)
• Specification (software module design)
• Specification (software module tests)
• List (source code)
• Report (software module test)
• Report (code review)
• Report (software module test)
• Report (software module integration test)
• Report (software system integration test)
• Report (software architecture integration test)
• Report (programmable electronic and software integration test)
• Instruction (user)
• Instruction (operation and maintenance)
• Report (software safety validation)
• Instruction (software modification procedures)
• Request (software modification)
• Report (software modification impact analysis)
• Log (software modification)
• Plan (software safety)
• Plan (software verification)
• Report (software verification)
• Plan (software functional safety assessment)
• Report (software functional safety assessment)

The explicitly defined processes are mapped to SPICE processes according to the table
below.

Processes in standard SPICE processes


CUS.1 Acquisition
- Concept CUS.1.1 Acquisition preparation
CUS.1.2 Supplier selection
CUS.1.3 Supplier Monitoring
CUS.1.4 Customer Acceptance
- Overall installation and CUS.2 Supply
commissioning planning
- Overall installation and
commissioning
CUS.3 Requirements Elicitation
- Decommissioning or disposal CUS.4 Operation
- Overall operation and maintenance CUS.4.1 Operational use
planning
- Overall operation, maintenance and
repair
- Overall modification and retrofit
- E/E/PES operation, and
maintenance procedures
- E/E/PES modification
- SW operation and modification
procedures
CUS.4.2 Customer support
- Realisation: E/E/PES and other ENG.1 Development
means
- Overall scope definition ENG.1.1 System requirements analysis and
- Hazard and risk analysis design
- Overall safety requirements
- Safety requirements allocation
- E/E/PES safety requirements
specification
- E/E/PES design and development
- SW safety requirements ENG.1.2 Software requirements analysis
specification
- SW design and development 1(5)
- SW design and development 2(5) ENG.1.3 Software design
- SW design and development 3(5) ENG.1.4 Software construction
- SW design and development 4(5) ENG.1.5 Software integration
- SW design and development 5(5) ENG.1.6 Software testing
- E/E/PES integration ENG.1.7 System integration and testing
- Programmable electronics
integration (HW and SW)
- SW modification ENG.2 System and software maintenance
SUP Support process category
SUP.1 Documentation
SUP.2 Configuration management
SUP.3 Quality assurance
- Verification SUP.4 Verification
- E/E/PES verification
- SW verification
- Overall safety validation planning SUP.5 Validation
- Overall safety validation
- Functional safety assessment
- E/E/PES safety validation planning
- E/E/PES safety validation
- E/E/PES functional safety
assessment
- SW safety validation planning
- SW safety validation
- SW functional safety assessment
SUP.6 Joint review
SUP.7 Audit
SUP.8 Problem resolution
MAN.1 Management
MAN.2 Project management
MAN.3 Quality Management
MAN.4 Risk Management
ORG.1 Organizational alignment
ORG.2 Improvement process
ORG.2.1 Process establishment
ORG.2.2 Process assessment
ORG.2.3 Process improvement
ORG.3 Human resource management
ORG.4 Infrastructure
ORG.5 Measurement
ORG.6 Reuse
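A mapping table like the one above can also be checked mechanically, e.g. by counting how many SPICE processes the standard covers. A sketch, assuming the mapping is captured as a dictionary (entries abbreviated from the table above):

```python
# Sketch: measure SPICE coverage of a standard from its process mapping.
# The dictionary abbreviates the IEC 61508 mapping table; an empty list
# means no process in the standard maps to that SPICE process.
mapping = {
    "CUS.1 Acquisition": ["Concept"],
    "CUS.2 Supply": ["Overall installation and commissioning planning"],
    "CUS.3 Requirements Elicitation": [],
    "SUP.4 Verification": ["Verification", "E/E/PES verification", "SW verification"],
    "ORG.6 Reuse": [],
}

covered = [p for p, sources in mapping.items() if sources]
coverage = len(covered) / len(mapping)
print(f"{len(covered)}/{len(mapping)} SPICE processes covered")  # -> 3/5 SPICE processes covered
```

The same structure can be reused for each evaluated standard, giving a comparable coverage figure per standard.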

The normative referenced standards are:


• ISO/IEC Guide 51,
• IEC Guide 104
• IEC 60050(371)
• IEC 60300-3-2
• IEC 61000-1-1
• IEC 61000-2-5
• IEEE 352

5.2.4 Metrics
The Kiviat diagram is shown below. There are three axes, where A denotes the number of
artefacts, S the number of referenced (normative) standards, and P the number of
processes. Each axis is normalized to the maximum number counted over the six
evaluated standards.
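The normalization can be sketched as follows; the counts used here are illustrative placeholders, not the values measured in this project:

```python
# Sketch of the axis normalization used for the Kiviat diagrams.
# Counts are illustrative placeholders (only two of the six standards shown).
counts = {  # standard -> (artefacts A, referenced standards S, processes P)
    "IEC 61508": (52, 7, 25),
    "IEC 61713": (25, 6, 17),
}
max_a = max(a for a, s, p in counts.values())
max_s = max(s for a, s, p in counts.values())
max_p = max(p for a, s, p in counts.values())

def axes(standard):
    """Return the normalized (A, S, P) axis values, each in [0, 1]."""
    a, s, p = counts[standard]
    return (a / max_a, s / max_s, p / max_p)

print(axes("IEC 61713"))
```

A standard that holds the maximum on some axis thus scores 1.0 there, which is why the largest standard dominates the diagram.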


5.2.5 Judgement
This is a large, application-independent standard (Parts 1-3 comprise 180 pages; Parts 4-7
also exist) for system and software development, defined to be a general (generic) standard
that can be used for creating more specific standards. One example is EN 50128, and there
are others. This standard is generally accepted and widely used.

The reason it is so large is that many details and a lot of guidance are included. The size
as such could be a problem, but the clear structure between the first three parts improves
the situation, i.e. there is a clear top-down procedure defined. The structure is also similar
at the top level (including means other than programmable systems), at the programmable
system level (including hardware) and at the software level. Also important is the
functional safety assessment that occurs at each level. At all three levels the top-down
approach is visible, e.g. when allocating safety requirements, decomposing into subsystems
and software modules, etc.

For each lifecycle description (one in each part) there is a figure (with boxes for the
phases) and an associated table with lifecycle phase, objectives, scope, requirement sub
clause, inputs and outputs. This makes the scope easier to grasp and connections between
the parts are more obvious. Each box of the figure is described thoroughly in separate
chapters.

The SIL levels are defined in Part 1 as probabilities decided in advance by looking at the
role of the developed application. Thus there is a quantity that is mapped to, and guides, the
implementation. Look-up tables (showing e.g. techniques, measures, and documentation)
are easy to use once the SIL levels have been decided.
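As an illustration of this quantitative basis, the following sketch maps a target average probability of failure on demand (low-demand mode) to a SIL. The bands are the commonly cited ones from IEC 61508-1; consult the standard for the normative tables (high-demand mode uses failures per hour instead):

```python
# Sketch of the SIL-from-probability idea in IEC 61508-1. The bands below
# are the commonly cited low-demand-mode PFDavg ranges; the standard's
# Part 1 tables are normative.
def sil_low_demand(pfd_avg: float) -> int:
    """Map an average probability of failure on demand to a SIL (0 = below SIL 1)."""
    if 1e-5 <= pfd_avg < 1e-4:
        return 4
    if 1e-4 <= pfd_avg < 1e-3:
        return 3
    if 1e-3 <= pfd_avg < 1e-2:
        return 2
    if 1e-2 <= pfd_avg < 1e-1:
        return 1
    return 0  # outside the defined SIL bands

print(sil_low_demand(5e-4))  # -> 3
```

Note the direction of the mapping: the lower the tolerable failure probability, the higher the SIL, and the stricter the techniques and measures the look-up tables then demand.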

Parameterised software is not specifically addressed, nor are subcontractor roles or COTS.

The standard lists many processes and many artefacts (the maximum for both among the
six evaluated standards). The mapping to SPICE is not completely obvious. The number of
referenced standards (not so few) does not reflect the fact that this is a “stand alone”
standard.

The language is direct and clear (it tells you what to do) and the content is not difficult to
grasp. For the extracted version it was relatively easy to separate the guidance from the
rest.

5.3 IEC 61713


5.3.1 Summary
The international standard IEC 61713 “Software dependability through the software
lifecycle processes – Application guide” was published in June 2000, and has not yet
been extensively adopted in industry. The purpose is to give guidelines for achieving
software dependability through the software lifecycle processes. Thus there is no “shall” in
the text.

Here dependability means availability performance and the relevant factors for it are:
reliability performance, maintainability performance, and maintenance support
performance. Thus safety critical aspects are not the major concern and so no division
into levels of risk or criticality is made. This standard is a support to IEC 60300-3-6. The
basis of this standard is the software life cycle processes defined in ISO/IEC 12207 and
thus the definition is close to SPICE.

Primary software lifecycle processes (i.e. the acquisition process, the supply process, the
development process, the operation process, and the maintenance process) are mainly
considered and described. Support and organization processes are only briefly commented
on. The following roles are used: Acquirer, Supplier, Developer, Operator, and
Maintainer.

For primary software life-cycle processes, activities that influence dependability are listed
and guidelines are given.
• For acquisition process: Specification of dependability requirements, Selection of
supplier, Preparation of contracts, Supplier monitoring, Acceptance and
completion.
• For supply process: Initiation, Preparation of response, Contract, Planning,
Execution and control, Review and evaluation, Delivery and completion.
• For development process: Process implementation, System requirement analysis,
System architectural design, Software requirements analysis, Software
architectural design, Software detailed design, Software coding and testing,
Software integration, Software qualification testing, System integration, System
qualification testing, Software installation, Software acceptance support.
• For operation process: Process implementation, Operational testing, System
operation, User support.
• For maintenance process: Process implementation, Problem and modification
analysis, Modification implementation, Maintenance review/acceptance,
Migration, Software retirement.

For supporting processes (documentation, configuration management, quality assurance,
verification, validation, joint review, audit, and problem resolution) and organizational
processes (management, infrastructure, improvement, and training) short motivations are
given for their importance for increasing dependability.

The annexes contain:


• mapping between ISO/IEC 12207 processes and IEC 60300-2 elements and tasks
• activities for different roles in different processes

5.3.2 Extracted version


See Annex 7.

5.3.3 Mappings
The explicitly defined artefacts are listed below.
• Contract
• Project management plan
• System engineering management plan-SEMP
• Software development plan-SDP
• Work breakdown structure (WBS)
• Software QA plan
• Reliability program plan
• Software FMECA
• Software safety plan
• Computer security plan
• Subcontractor management plan
• Verification and validation plan
• Risk management plan
• Training of personnel – (training plan)
• Migration plan
• Retirement plan
• User documentation changes report
• Problem report
• Operational procedures changes record
• Verification results
• SW acceptance report
• User documents
• System qualification report
• Installation documentation
• Operation manual

The explicitly defined processes are mapped to SPICE processes according to the table
below.

Processes in standard SPICE processes


- Acquisition process CUS.1 Acquisition
CUS.1.1 Acquisition preparation
CUS.1.2 Supplier selection
CUS.1.3 Supplier Monitoring
CUS.1.4 Customer Acceptance
- Supply process CUS.2 Supply
CUS.3 Requirements Elicitation
- Operation process CUS.4 Operation
CUS.4.1 Operational use
CUS.4.2 Customer support
- Development process ENG.1 Development
ENG.1.1 System requirements analysis and
design
ENG.1.2 Software requirements analysis
ENG.1.3 Software design
ENG.1.4 Software construction
ENG.1.5 Software integration
ENG.1.6 Software testing
ENG.1.7 System integration and testing
- Maintenance process ENG.2 System and software maintenance
SUP Support process category
- Documentation process SUP.1 Documentation
- Configuration management process SUP.2 Configuration management
- Quality assurance process SUP.3 Quality assurance
- Verification process SUP.4 Verification
- Validation process SUP.5 Validation
- Joint review process SUP.6 Joint review
- Audit process SUP.7 Audit
- Problem resolution process SUP.8 Problem resolution
- Management process MAN.1 Management
MAN.2 Project management
MAN.3 Quality Management
MAN.4 Risk Management
ORG.1 Organizational alignment
- Improvement process ORG.2 Improvement process
ORG.2.1 Process establishment
ORG.2.2 Process assessment
ORG.2.3 Process improvement
- Training processes ORG.3 Human resource management
- Infrastructure process ORG.4 Infrastructure
ORG.5 Measurement
ORG.6 Reuse

The normative referenced standards are:


• IEC 60050(191)
• IEC 60300-2
• IEC 60300-3-6
• IEC 61160,
• ISO/IEC 12207
• ISO 8402

5.3.4 Metrics
The Kiviat diagram is shown below. There are three axes, where A denotes the number of
artefacts, S the number of referenced (normative) standards, and P the number of
processes. Each axis is normalized to the maximum number counted over the six
evaluated standards.

5.3.5 Judgement
This application-independent standard does not directly relate to safety, but by extending
the meaning of dependability to include safety the difference from the other standards
decreases. The procedure when developing this standard was to look at each activity and
give guidelines on what could be improved with regard to dependability.

As seen above this standard maps directly to SPICE processes (as expected).

A significant aspect of this standard is the discussion of roles and these concern
acquisition, supply, development, operation, and maintenance.

This standard supports IEC 60300-3-6 by adding dependability aspects for all life cycle
activities. Thus this standard is not “stand-alone”, even though it does not reference many
other standards.

Since safety-related aspects are not considered directly there is no crit icality
classification.

Background information is clearly separated from the guidance. The language is clear and
the content is not difficult to grasp.

5.4 RTCA/DO-178B
5.4.1 Summary
The aviation standard RTCA/DO-178B ”Software Considerations in Airborne Systems
and Equipment Certification” is the most important standard for certification of software
used in commercial aircraft in the US and in the EC. In this standard, the Annex is
normative and Appendix is informative.

The purpose is to give guidelines for software development for airborne systems. Thus
there is no “shall” or “must” in the text. The focus is on software life cycle processes;
definition of objectives, how to achieve them, and how to document that the objectives
have been fulfilled. Also the relation between system life cycle processes and software
life cycle processes is specified.

The following aspects are not included:


• operation
• organization
• applicant-supplier roles
• personnel qualification

Five different software levels (A, B, C, D and E) are specified based on consequences
ranging from catastrophic failure to no effect on safety issues. The levels are initially set
by the system safety assessment process. Guidelines are given for handling of safety
aspects using software levels. No software failure rates are considered. Aspects on user
modifiable software, option-selectable software, COTS, and field-loadable software are
included.
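The level assignment can be sketched as a direct mapping from the failure condition category established by the system safety assessment process:

```python
# Sketch: DO-178B assigns a software level from the severity of the failure
# condition the software can contribute to, as determined by the system
# safety assessment process.
LEVEL_BY_FAILURE_CONDITION = {
    "Catastrophic": "A",
    "Hazardous/Severe-Major": "B",
    "Major": "C",
    "Minor": "D",
    "No effect": "E",
}

def software_level(failure_condition: str) -> str:
    """Return the DO-178B software level for a failure condition category."""
    return LEVEL_BY_FAILURE_CONDITION[failure_condition]

print(software_level("Major"))  # -> C
```

Since no software failure rates are considered, the mapping is purely consequence-driven: the level depends only on the worst failure condition the software can contribute to.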

Three types of software life cycle processes are defined: software planning process,
software development processes (requirement, design, coding, and integration processes)
and integral processes (verification, configuration management, quality assurance and
certification liaison processes). The integral processes are performed concurrently with
software development processes. For example a development process can require that
certain integral processes are performed. The processes may be iterative. Guidance on
how to handle processes is given. For each process the objectives, the activities and
sometimes further aspects e.g. documents are defined. The only specific airborne related
process is the certification liaison processes.

Requirements are split into three types: high-level, low-level, and derived. The latter
are requirements that are not directly traceable to high-level requirements, e.g.
implementation requirements.

A relatively large part of the standard concerns verification (analysis, tests, etc.).

Documentation (called software life cycle data) is gathered in a separate chapter. For each
document a list of included aspects is given. Some specific airborne related documents
exist.

In a separate chapter (additional considerations), tool qualification, alternative methods,
software reliability models, and product service history are considered.

The contents of the annexes are described below.


• For each of software planning process, software development processes
(requirement, design, coding, and integration processes) and integral processes
(verification, configuration management, quality assurance and certification
liaison processes) tables are given containing:
- objectives and outputs according to software level (A to D, level E is not
relevant)
- whether independence is required, i.e. whether an independent testing
person/group/organization is necessary
• Glossary

5.4.2 Extracted version



See Annex 8.

5.4.3 Mappings
The explicitly defined artefacts are listed below.
• Control data (such as minutes, memoranda)
• Plan for Software Aspects of Certification
• Software Development Plan
• Software Verification Plan
• Software Configuration Management Plan.
• Software Quality Assurance Plan
• Software Requirements Standards
• Software Design Standards
• Software Code Standards
• Software Requirements Data
• Design Description
• Source Code
• Executable Object Code
• Software Verification Cases and Procedures
• Software Verification Results
• Software Life Cycle Environment Configuration Index (SECI)
• Software Configuration Index (SCI)
• Problem Reports
• Software Configuration Management Records
• Software Quality Assurance Records.
• Software Accomplishment Summary
• Tool Qualification Plan
• Tool Operational Requirements
• Tool Accomplishment Summary
• User installation guides
• User manuals.
• Product service history

The explicitly defined processes are mapped to SPICE processes according to the table
below.

Processes in standard SPICE processes


CUS.1 Acquisition
CUS.1.1 Acquisition preparation
CUS.1.2 Supplier selection
CUS.1.3 Supplier Monitoring
CUS.1.4 Customer Acceptance
CUS.2 Supply
CUS.3 Requirements Elicitation
CUS.4 Operation
CUS.4.1 Operational use
CUS.4.2 Customer support
ENG.1 Development
- System design process ENG.1.1 System requirements analysis and
design
- Software requirements process ENG.1.2 Software requirements analysis
- Software design process ENG.1.3 Software design
- Software coding process ENG.1.4 Software construction
ENG.1.5 Software integration
ENG.1.6 Software testing
- Integration process ENG.1.7 System integration and testing
ENG.2 System and software maintenance
SUP Support process category
SUP.1 Documentation
- Software configuration SUP.2 Configuration management
management process
- Software quality assurance process SUP.3 Quality assurance
- Software verification process SUP.4 Verification
- System safety assessment process SUP.5 Validation
- Certification liaison process
SUP.6 Joint review
SUP.7 Audit
SUP.8 Problem resolution
MAN.1 Management
- Software planning process MAN.2 Project management
- Tool qualification process
MAN.3 Quality Management
MAN.4 Risk Management
ORG.1 Organizational alignment
ORG.2 Improvement process
ORG.2.1 Process establishment
ORG.2.2 Process assessment
ORG.2.3 Process improvement
ORG.3 Human resource management
ORG.4 Infrastructure
ORG.5 Measurement
ORG.6 Reuse

The normative referenced standards are:


• None.

5.4.4 Metrics
The Kiviat diagram is shown below. There are three axes, where A denotes the number of
artefacts, S the number of referenced (normative) standards, and P the number of
processes. Each axis is normalized to the maximum number counted over the six
evaluated standards.

5.4.5 Judgement
This application-specific standard is directed towards avionics, but this is noticeable in
only a few separate chapters. For example, certification and probably also the emphasis on
user modifiable software are application specific. Thus this standard can actually be used
quite generally. Even though this standard points out that it contains only guidelines
(must and shall are removed from the standard, lists are not necessarily complete,
examples are just examples etc.) it seems that the impact of the standard is stronger. One
reason is that the standard is widely used and generally accepted.

One peculiar thing is that there are no normative referenced standards. Even though the
purpose is to be “stand alone” not everything is stated in the standard. For example, it
covers most aspects of software development, but not organization aspects.

Parameterised software is described.

The layout of the standard is according to processes but since all documents are listed in a
separate chapter one could alternatively start from this aspect instead.

The standard is not that easy to grasp directly. One reason is its reference-like character.
In a way, a reader has to know the standard before being able to read it, i.e. in practice it
has to be read several times. Another reason is that it does not clearly point out what to
do; instead it describes different qualities (as a guideline would do). This also makes it
difficult to generate the extracted version (the most difficult one of the six evaluated
standards).

Since the categories are based only on software criticality, more facts are needed to guide
development; for example, software that is highly critical but extremely unlikely to fail
must be handled in some other way.

That the standard is safety-related is shown by the addressed application type, i.e.
aircraft, and by the definition of software levels.

5.5 IEC 60601-1-4



5.5.1 Summary
Medical systems are subject to standard IEC 60601-1-4 “Medical Electrical Equipment –
Part 1: General Requirements for Safety – 4. Collateral Standard: Programmable
Electrical Medical Systems”. This is a collateral standard to IEC 60601-1. This standard
assumes that a process is followed.

The contents concern requirement specification, architecture, implementation and design,
modifications, verification and validation, marking and accompanying documentation.
Aspects not covered are hardware manufacturing, software replication, installation and
commissioning, operation and maintenance, and decommissioning.

The standard specifies what to do but not how to do it. Thus it does not consider software
or hardware aspects, i.e. implementation issues. The standard is focused on risk handling.
The list below shows the considered aspects.
• Documentation.
• Risk management plan.
• Development lifecycle. One shall be selected, but there is no guidance on which
to choose.
• Risk management process. It includes risk analysis and risk control.
• Qualification of personnel.
• Requirement specification.
• Architecture.
• Design and implementation.
• Verification.
• Validation. Independence aspects are addressed.
• Modification.
• Assessment.
Iterations are expected.

A fundamental aspect is the Risk management file where all safety-related information is
placed. The contents are shown in a figure together with a more detailed view of the Risk
management summary (which is included in the Risk management file).

The annexes contain:


• informative aspects concerning terminology, rationale, risk concepts (with
guidelines, risk management process example), development life-cycle (with
guidelines, V-model, suggested documentation), example structures.

5.5.2 Extracted version


See Annex 9.

5.5.3 Mappings
The explicitly defined artefacts are listed below.
• Instructions for use
• Risk management file
• Quality records
• Risk management summary
• RISK management plan
• Verification plan
• Validation plan
• PEMS requirement specification
• Subsystem (e.g. PESS) requirement specification
• PEMS architecture specification
• PESS architecture specification
• Subsystem design specification
• Subsystem test specification
• Assessment report

The explicitly defined processes are mapped to SPICE processes according to the table
below.

Processes in standard SPICE processes


CUS.1 Acquisition
CUS.1.1 Acquisition preparation
CUS.1.2 Supplier selection
CUS.1.3 Supplier Monitoring
CUS.1.4 Customer Acceptance
CUS.2 Supply
CUS.3 Requirements Elicitation
CUS.4 Operation
CUS.4.1 Operational use
CUS.4.2 Customer support
- Development process ENG.1 Development
ENG.1.1 System requirements analysis and
design
ENG.1.2 Software requirements analysis
ENG.1.3 Software design
ENG.1.4 Software construction
ENG.1.5 Software integration
ENG.1.6 Software testing
ENG.1.7 System integration and testing
ENG.2 System and software maintenance
SUP Support process category
SUP.1 Documentation
SUP.2 Configuration management
SUP.3 Quality assurance
- Verification SUP.4 Verification
- RISK management process SUP.5 Validation
- Validation
SUP.6 Joint review
SUP.7 Audit
SUP.8 Problem resolution
MAN.1 Management
MAN.2 Project management
MAN.3 Quality Management
MAN.4 Risk Management
ORG.1 Organizational alignment
ORG.2 Improvement process
ORG.2.1 Process establishment
ORG.2.2 Process assessment
ORG.2.3 Process improvement
ORG.3 Human resource management
ORG.4 Infrastructure
ORG.5 Measurement
ORG.6 Reuse

The normative referenced standards are:


• IEC 60601-1,
• IEC 60601-1-1
• IEC 60788
• ISO 9000-3
• ISO 9001

5.5.4 Metrics
The Kiviat diagram is shown below. There are three axes, where A denotes the number of
artefacts, S the number of referenced (normative) standards, and P the number of
processes. Each axis is normalized to the maximum number counted over the six
evaluated standards.

5.5.5 Judgement
This is a collateral standard. It is completely focused on what to do and not on how it shall
be done. Some (but not much) guidance is given in annexes. The layout is close to a
checklist and very easy to read. Actually it is very close to the extracted version, i.e.
focused on processes, artefacts and referenced standards, and it is the easiest to extract
of the six evaluated standards.

Development is the focus; thus installation, operation, maintenance, commissioning, and
decommissioning are not included.

From the mappings we can see that this is a “small” standard that relies on others
concerning processes and artefacts. It does not map well to SPICE since processes are not
the main issue. Instead it is a specific standard for risk handling within medical
applications.

5.6 EN 50128
5.6.1 Summary
The European standard EN 50128 “Railway Applications: Software for Railway Control
and Protection Systems” is used in the railway industry all over Europe. This standard is
one of a group of railway standards whose combined purpose is to be complete within their
scope (for railway applications, EN 50126 and EN 50129 should also be studied). Even if
not referenced, this standard is based on IEC 61508. This standard is primarily focused on
new software development.

Five software integrity levels (SIL) are specified, 0-4 (4 being the most critical and 0
non-safety-related software), but the choice of SIL according to risk is not considered. The system
level is decided before applying this standard and documented in System safety
requirements specification together with the safety functions allocated to software. Here
the software SIL is handled and guidelines are included.

This standard is focused on methods for achieving safe systems and considers only
aspects directly related to software. The corresponding functional steps are: Definition of
requirements, Design development and testing, Hardware integration, Validation, and
Maintenance. Iterations are possible. The activities Verification, Assessment, and Quality
assurance run in parallel with the development.

Different roles are addressed: Supplier, Assessor, Designer, Developer, Verifier, and
Validator.

This standard gives objectives and requirements for several aspects.


• Personnel and responsibilities. For example, training, independence of roles
related to SIL etc are stated.
• Lifecycle issues and documentation. For example, a lifecycle model shall be
selected, i.e. it is not specified by the standard (however, a recommended
development lifecycle model based on the V-model, including documents, is
given); tasks shall be defined specifying input, output, and activities.
• Systems configured by application data. Input and output documents are also
specified.

The following functional steps are defined. For each step, input and output documents are
specified.
• Software requirement specification
• Software architecture
• Software design and implementation
• Software/hardware integration
• Software validation
• Software maintenance

The following tasks run in parallel with the development. For each task, input and output
documents are also specified.
• Software verification and testing
• Software assessment
• Software quality assurance

A document cross reference table is given showing the relation between document
(created/used), phase and associated clause in this standard.

The annexes contain various aspects related to SIL.


• Criteria for documentation according to Lifecycle issues and documentation,
related to SIL with qualifiers (Mandatory, Highly recommended, Recommended,
Not recommended, No opinion).
• Clauses to be assessed, related to SIL with qualifiers (Mandatory, Highly
recommended, Recommended, Not recommended, No opinion).
• Techniques and measures for Software requirement specification, Software
architecture, Software design and implementation, Verification and testing,
Software/hardware integration, Software validation, Software assessment,
Software quality assurance, and Software maintenance. These are related to SIL
with qualifiers for recommendation (Highly recommended, Recommended, Not
recommended, No opinion). For each technique there is a reference to Annex B
(Bibliography of techniques) either directly or via another more detailed table.
• Bibliography of techniques, a list of 69 techniques with references for increasing
safety (44 pages).

If a technique or measure qualified as HR (Highly Recommended) for the chosen SIL is
not used, this must be motivated.
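This rule lends itself to a simple completeness check. A sketch, with illustrative technique names and selections (not taken from the standard's tables):

```python
# Sketch: check that every Highly Recommended (HR) technique for the chosen
# SIL is either applied or has a documented motivation, as EN 50128 requires.
# Technique names and selections below are illustrative.
def unmotivated_deviations(hr_techniques, applied, motivations):
    """Return HR techniques that are neither applied nor motivated."""
    return [t for t in hr_techniques if t not in applied and t not in motivations]

hr = ["Static analysis", "Dynamic analysis", "Modular approach"]
applied = {"Static analysis", "Modular approach"}
motivations = {"Dynamic analysis": "Replaced by exhaustive formal verification"}

print(unmotivated_deviations(hr, applied, motivations))  # -> []
```

An empty result means the project either applies or motivates every HR technique; any remaining entries are the deviations an assessor would flag.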

5.6.2 Extracted version


See Annex 10.

5.6.3 Mappings
The explicitly defined artefacts are listed below.
• System Requirements Specification
• System Safety Requirements Specification
• System Architecture Description
• System Safety Plan
• SW Quality Assurance Plan
• SW Configuration Management Plan
• SW Verification Plan
• SW Integration Test Plan
• SW/HW Integration Test Plan
• SW Validation Plan
• SW Maintenance Plan
• Data Preparation Plan
• Data Test Plan
• SW Requirements Specification
• Application Requirements Specification
• SW Requirements Test Specification
• SW Requirements Verification
• SW Architecture Specification
• SW Design Specification
• SW Arch. and Design Verification Report
• SW Module Design Specification
• SW Module Test Specification
• SW Module Verification Report
• SW Source Code
• SW Source Code Verification Report
• SW Module Test Report
• SW Integration Test Report
• Data Test Report
• SW/HW Integration Test Report
• SW Validation Report
• SW Assessment Report
• SW Change Records
• SW Maintenance Records
• SW coding standards
• SW Requirements Verification Report

The explicitly defined processes are mapped to SPICE processes according to the table
below.

Processes in standard                         SPICE processes

(none)                                        CUS.1 Acquisition
(none)                                        CUS.1.1 Acquisition preparation
(none)                                        CUS.1.2 Supplier selection
(none)                                        CUS.1.3 Supplier Monitoring
(none)                                        CUS.1.4 Customer Acceptance
(none)                                        CUS.2 Supply
(none)                                        CUS.3 Requirements Elicitation
(none)                                        CUS.4 Operation
(none)                                        CUS.4.1 Operational use
(none)                                        CUS.4.2 Customer support
(none)                                        ENG.1 Development
(none)                                        ENG.1.1 System requirements analysis and design
Software requirements 1(2)                    ENG.1.2 Software requirements analysis
Software architecture,                        ENG.1.3 Software design
Software design and implementation 1(4)
Software design and implementation 2(4)       ENG.1.4 Software construction
(none)                                        ENG.1.5 Software integration
Software design and implementation 3(4)       ENG.1.6 Software testing
Software/Hardware integration                 ENG.1.7 System integration and testing
Software maintenance                          ENG.2 System and software maintenance
(none)                                        SUP Support process category
Lifecycle issues and documentation 1(2)       SUP.1 Documentation
(none)                                        SUP.2 Configuration management
Software quality assurance                    SUP.3 Quality assurance
Software requirements 2(2),                   SUP.4 Verification
Software verification and testing
Software assessment,                          SUP.5 Validation
Software validation
(none)                                        SUP.6 Joint review
(none)                                        SUP.7 Audit
(none)                                        SUP.8 Problem resolution
(none)                                        MAN.1 Management
Lifecycle issues and documentation 2(2),      MAN.2 Project management
Software design and implementation 4(4)
(none)                                        MAN.3 Quality Management
(none)                                        MAN.4 Risk Management
(none)                                        ORG.1 Organizational alignment
(none)                                        ORG.2 Improvement process
(none)                                        ORG.2.1 Process establishment
(none)                                        ORG.2.2 Process assessment
(none)                                        ORG.2.3 Process improvement
Personnel and responsibilities                ORG.3 Human resource management
(none)                                        ORG.4 Infrastructure
(none)                                        ORG.5 Measurement
(none)                                        ORG.6 Reuse
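For tooling purposes, such a process mapping can be transcribed into a simple lookup table. The sketch below is our own and transcribes only a few rows of the table above:

```python
# Partial transcription of the mapping from standard processes to SPICE
# (ISO/IEC TR 15504) processes; only some rows are shown.
EN50128_TO_SPICE = {
    "Software requirements 1(2)": "ENG.1.2 Software requirements analysis",
    "Software architecture": "ENG.1.3 Software design",
    "Software design and implementation 2(4)": "ENG.1.4 Software construction",
    "Software/Hardware integration": "ENG.1.7 System integration and testing",
    "Software quality assurance": "SUP.3 Quality assurance",
    "Personnel and responsibilities": "ORG.3 Human resource management",
}

def spice_counterpart(process):
    """Look up the SPICE process; None if no counterpart is transcribed."""
    return EN50128_TO_SPICE.get(process)
```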

The normative referenced standards are:


• EN 50126
• EN 50129
• EN 50159-1
• EN 50159-2
• EN ISO 9001
• EN ISO 9000-3

5.6.4 Metrics
The Kiviat diagram is shown below. There are three axes, where A denotes the number of
artefacts, S the number of referenced (normative) standards and P the number of
processes. Each axis is normalized according to the maximum number counted over the six
evaluated standards.
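The normalization can be sketched as follows; the standard names and counts are placeholders, not the figures of this report:

```python
# Per-standard counts for the three axes: A = artefacts, S = referenced
# (normative) standards, P = processes. Values are illustrative only.
counts = {
    "Standard X": {"A": 34, "S": 6, "P": 16},
    "Standard Y": {"A": 17, "S": 3, "P": 8},
}

axes = ("A", "S", "P")
maxima = {ax: max(c[ax] for c in counts.values()) for ax in axes}

# Each axis value is divided by the maximum over all evaluated standards,
# so every normalized value lies in [0, 1] and the maximum maps to 1.
normalized = {name: {ax: c[ax] / maxima[ax] for ax in axes}
              for name, c in counts.items()}
```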

5.6.5 Judgement
For a complete overview EN 50126 and EN 50129 should also be considered. The focus
for this standard is on new development of software.

The clauses of this standard focus in some cases on what to produce, e.g. one chapter is
“Software requirements specification”, and in other cases on processes, e.g. “Software
verification and testing”. “Software requirements specification” contains both functional
and test specifications. As a consequence it is not clear what the difference is between
processes, activities, procedures, tasks, phases, etc.; in some places an area of interest is
described instead. Some chapters mix aspects that would be better handled separately,
e.g. the chapter “Lifecycle issues and documentation” should have been split in two. No
lifecycle model is specified, but since two examples are given these are assumed to be
starting points. Taken together this is somewhat confusing, which can actually be seen in
the mapping to SPICE processes.

Otherwise the language is clear and the content is easy to grasp. Each phase is described
by objectives, input, output and requirements, which makes it easy to navigate. There is a
document cross-reference table that is useful for an overview and connects phases,
documents, clauses, and the use and definition phases of documents.

The SIL classification is general and not quantitative.



Parameterisation and COTS are included. All phases concerning parameterisation are
gathered in a separate chapter, i.e. as a parallel development line, which can be practical
since it is a specific feature.

Different roles are included, e.g. assessor, validator and designer, and in many cases the
requirements of the standard start from a specific role.

The large number of listed techniques, even if not completely up to date, gives a valuable
starting point for finding suitable methods in actual development.

Not much is railway specific, and thus in principle the standard could be used for any
application.

6 The testing perspective


6.1 General
The testing perspective is in this report expanded to the following activities:
• verification
• validation
Both can include documentation, processes, analysis, testing, etc. Thus testing is a
means for verification and validation.

Note that when a number of aspects are listed below, the list is not necessarily complete;
instead, the most important aspects and those of special software interest are shown.

6.2 EN 954
6.2.1.1 EN 954-1

Validation is made for the specified safety functions, their realisation, and the categories.
As a result, redesign may be necessary, i.e. an iterative procedure is assumed. Validation
is made using tests and analyses according to the validation plan (which contains the
validation strategy). Only validation of safety-related parts is considered in the standard.
Activities and decisions shall be documented, and test procedures and their results shall
be documented in a validation report.

Validation analysis can be made using e.g. fault lists, fault tree analysis, failure mode and
effect analysis, criticality analysis, checklists etc.

Validation testing includes functional testing of the safety functions. It is important that
both normal and foreseeable abnormal cases are considered. Validation testing shall also
verify the categories. Suggested methods are: analysis based on circuit diagrams, practical
tests on circuits, and simulation. Tests related to environmental parameters, e.g.
temperature and vibration, shall also be made.

6.2.1.2 EN 954-2

Analysis should be started as early as possible, in parallel with the design process.
Mechanical, pneumatic, hydraulic and electrical technologies are included. The standard
points out that, in many cases, validation of programmable electronic systems is not
completely covered by the standard.

An overview of the validation process is given. Details of the validation plan are listed
and a list of documents for validation is given. The emphasis is on hardware, but general
software aspects are also included. Documentation requirements relative to the categories
are listed.

Guidance is given for analysis. If analysis is not sufficient, testing is necessary. For tests,
a test plan shall be made with accompanying test records. Guidance is given on how to
carry out testing.

Fault lists shall be made defining the faults to be considered (some might be excluded).

For each category, guidance is presented for validation according to the definition of the
categories. Guidelines for validation of environmental and maintenance requirements are
also stated.

In annexes, for each of the different technologies, a description of validation tools
(interpreted as methods/principles) is presented. This includes:
• list of basic safety principles
• list of well-tried safety principles
• list of well-tried components
• fault lists and fault exclusions

6.3 IEC 61508


6.3.1.1 IEC 61508-1: General requirements

The following activities related to test are specified:


• overall safety validation planning (specific phase in overall safety lifecycle)
• overall safety validation (specific phase in overall safety lifecycle)
• verification (applied at several phases in overall safety lifecycle)

The objective of the Overall safety validation planning is to create an overall validation
plan. This standard specifies a number of aspects that shall be included such as: who will
validate and when, list of relevant modes of operation, specification of the safety-related
system, validation strategy, confirmation means including means for safety function and
safety integrity level confirmation, environment specification, policies and procedures for
evaluation.

The objective of Overall safety validation is to validate that the system fulfils overall
safety function requirements and overall safety integrity levels requirements. The
validation is made according to the overall validation plan taking into account safety
requirements allocation. Results shall be documented including safety functions being
validated (by test or by analysis), tools and equipment, results with judgement, and
configuration identification. The resulting action (e.g. a change request) shall also be
documented.

Verification is used for verifying (by review, analysis and/or test) that outputs of any
lifecycle phase fulfil objectives and requirements for that phase. For each such phase
there shall be a corresponding plan that shall be used for verification. The plan shall
contain criteria, techniques, and tools to be used. Results with judgement shall be
documented.

6.3.1.2 IEC 61508-2: Requirements for electrical/electronic/programmable electronic
safety-related systems

The following activities related to test are specified:


• programmable system safety validation planning (specific phase in
programmable system safety lifecycle)
• programmable system safety validation (specific phase in programmable system
safety lifecycle)
• verification (applied at several phases in programmable system safety lifecycle)

The objective of the programmable system safety validation planning is to plan the
validation. The planning shall define the steps for making it possible to show that the
system fulfils safety requirements. The following aspects should be considered: safety
requirements, the steps to be applied at validation, required environment, evaluation
procedures and policies.

The objective of programmable system safety validation is to validate that the system
fulfils safety function requirements and safety integrity levels requirements. The
validation is made according to the validation plan and each safety function shall be
validated by test and/or analysis. For each safety function the following shall be
documented: tools, equipment and results with judgement. The resulting action (e.g. a
change request) shall also be documented. Techniques and measures related to SIL for
fault avoidance during validation are given in Annex B.

Verification is used for verifying that outputs of a phase are correct. For each such phase
there shall be a corresponding plan that shall be used for verification. The plan shall
contain criteria, techniques, and tools to be used. The plan shall consider: strategies and
techniques, test equipment, relevant activities, and evaluation of results. In each design
and development phase safety function and integrity level requirements shall be met. Test
cases, results with judgements shall be documented. It shall also be verified that safety
requirements are consistent with those defined at the overall level (see Part 1) and with
tests and documentation. The same consistency checks shall be made for design and
development verification.

6.3.1.3 IEC 61508-3: Software requirements

The V-model is shown defining the role of software validation and testing. The annexes
give recommendations for software module testing and integration, HW/SW integration,
software safety validation, and software verification. For some of these more detailed
information is given as well.

The following activities related to test are specified:


• software safety validation planning (specific phase in software safety lifecycle)
• software module testing
• software integration testing
• software safety validation (specific phase in software safety lifecycle)
• verification (applied at several phases in software safety lifecycle)

The objective of the software safety validation planning is to develop a plan for software
safety validation. The planning shall define the steps for making it possible to show that
the software fulfils safety requirements. The following aspects should be considered: who
will validate and when, list of relevant modes of operation, specification of the safety-
related software, validation strategy, confirmation means including means for safety
function and safety integrity level confirmation, safety requirements, required
environment, evaluation procedures and policies. An assessor shall review the validation
planning scope and contents (according to SIL).

Each software module shall be tested during design and results shall be documented.

Software integration testing shall be specified during design and performed at integration.
The specification shall contain: tests to be performed, test environment, test criteria, and
procedures for corrective actions. Results with judgements shall be documented. At
changes, an impact analysis shall be made in order to determine the necessary amount of
reverification.

The objective of software safety validation is to validate that the system fulfils software
safety requirements according to SIL. The validation is made according to the software
validation plan. For each safety function the following shall be documented: tools,
equipment, and results with judgement. The resulting action (e.g. a change request) shall
also be documented. Testing is the main validation method. Software tools for validation
shall be qualified.

Verification is used for verifying that outputs of a software phase are correct. For each
such phase there shall be a corresponding plan that shall be used for verification. The plan
shall contain criteria, techniques, and tools to be used. The plan shall consider: strategies
and techniques, verification tools, evaluation of results, corrective actions. Results with
judgement shall be documented.

Each software safety phase and its output should be verified according to specific criteria.
The list below shows the verification activities that shall be performed.
• Software safety requirements
• Software architecture
• Software system design
• Software module design
• Code
• Data
• Software module testing
• Software integration testing
• Programmable electronics integration testing
• Software safety requirements testing

It shall also be verified that safety requirements are consistent with those defined at the
higher level (see Part 2) and with software validation planning.

For software architecture the following should be considered: if the software architecture
fulfils requirements, if tests are adequate, and if partitioning is adequate. The same
aspects apply to software system design and software module design.

Code verification shall be made statically.

For data, verification details are given that shall be verified e.g. data structures.

6.4 IEC 61713


For software coding and testing the following should be considered:
• well documented and established testing procedures
• corrective action procedure
• tests for specific dependability requirements

For software qualification testing the following should be considered:


• qualification testing for specific dependability requirements
• test coverage
• technical reviews, audits, and change control reviews to ensure relevant tests

For system qualification testing the following should be considered:


• qualification testing for specific dependability requirements
• test coverage

For each case guidelines and motivations are given. A general description of verification
is given at the end of the standard.

6.5 RTCA/DO-178B
Software verification is an integral process, i.e. it can be applied at several places. Apart
from recognizing the interface between system and software, system verification aspects
are not included in the standard.

A software verification plan shall be made defining how verification shall be made. Test
planning and verification activities should be defined in the software planning process. A
significant part of the standard is devoted to verification thus pointing out the importance
of it.

Verification includes review, analysis and test. Five tables in the annex show objectives
and outputs related to software level (A-D) for verification. The outputs from verification
are stored in Software Verification Cases and Procedures and Software Verification
Results.

The following items are verified by reviews and analysis (for each a number of objectives
is listed).
• High-level requirements
• Low-level requirements
• Software architecture
• Source code
• Outputs of integration process i.e. verifying linking and loading data, memory
map.
• Test cases, procedures and results, i.e. that test development and performance
were accurate and complete.

For testing, a diagram shows the testing process. Three types are defined:
Hardware/software integration testing, Software integration testing (integrating software
components), and Low-level testing (implementation). Requirement-based testing is
emphasized and two categories exist: normal range tests and robustness (abnormal) range
tests. Guidelines are given for requirement-based testing methods concerning
Hardware/software integration testing, Software integration testing, and Low-level
testing.
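The two categories can be illustrated in miniature. The unit under test below (a set-point limiter) and its tests are our own hypothetical example, not taken from the standard:

```python
# Hypothetical unit under test: a set-point limiter.
def limit_setpoint(value, lo=0.0, hi=100.0):
    if value != value:  # NaN is rejected rather than clamped
        raise ValueError("invalid set-point")
    return max(lo, min(hi, value))

# Normal range tests: valid inputs, including the boundaries.
assert limit_setpoint(42.0) == 42.0
assert limit_setpoint(0.0) == 0.0
assert limit_setpoint(100.0) == 100.0

# Robustness (abnormal range) tests: out-of-range and invalid inputs.
assert limit_setpoint(1e9) == 100.0
assert limit_setpoint(-273.15) == 0.0
try:
    limit_setpoint(float("nan"))
    raised = False
except ValueError:
    raised = True
assert raised
```

The point of the split is that normal range tests show the requirements are met for expected inputs, while robustness tests show defined behaviour for inputs that should never occur.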

Test coverage analysis is a two-step process.


1. Requirement-based coverage analysis. To evaluate the quality of the requirement-
based implementation testing.
2. Structural coverage analysis. To determine code structures which have not been
exercised by requirement-based tests. Guidance is added.
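As a miniature illustration of step 2, the sketch below traces which lines of a small function are executed by a set of requirement-based test cases and reports the rest as a coverage gap. This is only our own illustration of the idea, not DO-178B-qualified coverage tooling:

```python
import dis
import sys

def lines_executed(func, cases):
    """Record which source lines of func are executed by the test cases."""
    hit = set()
    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is func.__code__:
            hit.add(frame.f_lineno)
        return tracer
    sys.settrace(tracer)
    try:
        for args in cases:
            func(*args)
    finally:
        sys.settrace(None)
    return hit

def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# All traceable lines of clamp, from the bytecode's line table.
all_lines = {ln for _, ln in dis.findlinestarts(clamp.__code__)
             if ln is not None}

# Requirement-based tests covering only the normal and low range:
missed = all_lines - lines_executed(clamp, [(5, 0, 10), (-1, 0, 10)])
# 'return hi' is never exercised -> a gap the analysis would flag.
```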

Deficiencies and errors should be reported.

The following documents (life cycle data) are related to verification (for each there is a
number of aspects that should be included):
• software verification plan
• software verification cases and procedures
• software verification results

Guidelines are given for handling previously developed code, change of application or
development environment, and software verification tool qualification.

6.6 IEC 60601-1-4


The list below shows actions to be done at verification.
• Verification of implementation of safety requirements
• A verification plan shall be made documenting: strategies, activities, techniques,
tools, criteria.
• Results shall be documented and analysed.

The list below shows actions to be done at validation.


• Verification of the system safety
• A validation plan shall be made
• Results shall be documented and analysed
• Independence issues shall be handled

In Annex DDD the V-model is shown with validation and verification activities.

6.7 EN 50128
Two development life cycles are shown (one is the V-model) where verification and
validation activities are explicitly shown. The annexes give recommendations in rela tion
to SIL for documentation, verification and testing, Software validation, and standard
clauses to be assessed. For some of these more detailed information is given as well.

From the document cross-reference table the following documents related to testing shall
be made: SW verification plan, SW integration test plan, SW/HW integration test plan,
SW validation plan, Data test plan, SW requirement test specification, SW requirement
verification report, SW architecture and design verification report, SW module test
specification, SW module verification report, SW source code verification report, SW
module test report, SW integration test report, Data test report, SW/HW integration test
report, and SW validation report.

The SW requirement test specification contains: test cases with inputs, outputs and
criteria for acceptance.

The SW module test specification defines the degree of test coverage and testing
procedures. The SW module test report contains test results (including test coverage) and
judgements.

The SW verification plan contains criteria, techniques, tools to be used and a description
of activities. The plan shall address: test equipment, what shall be documented, evaluation
of results (including reliability and test coverage), and personnel.

The SW integration test plan contains: test cases, test environment, and test criteria.

Verification independence issues shall be handled according to SIL. After each verification
activity a report shall be produced containing items not fulfilled and detected errors.

The SW requirement verification report contains: adequacy of the software requirements
specification and the test specification.

The SW architecture and design verification report contains: adequacy of the software
architecture specification, the software design specification, and the software integration
test plan.

The SW module verification report contains: adequacy of the software module design
specification, the software module test specification, the decomposition into software
modules, and the software module test reports.

The SW source code verification report contains results, including conformance to
specifications and quality assurance according to SIL.

The SW integration test report contains results with judgements. If there is a failure, the
reasons for it shall be recorded.

The SW/HW integration test plan contains: test cases, test environment, and test criteria.

The SW/HW integration test report contains: test cases, results and judgement, and
opinion of verifier.

For validation, analysis and testing are the main activities. An independent party shall
evaluate the results of validation. The SW validation plan contains: the strategy and the
identification of steps for demonstrating the adequacy of the requirements, architecture,
design, and module design specifications.

The SW validation report contains: results (including test coverage) and judgements, and
identity of included items.

7 Summarized and referenced standards


Below is a list of standards that are safety-related but not evaluated in this work.

7.1 DEF STAN 00-55


Requirements for Safety Related Software in Defence Equipment

The UK Ministry of Defence has published this standard for programmable electronic
defence equipment. Two parts exist: Requirements and Guidance. Def Stan 00-55 is a
sector-specific standard for IEC 61508 Safety Integrity Level 4 software. Reference to
Def Stan 00-56 is included. The text below is a quotation from the standard. SRS means
Safety Related Software and SCS is Safety Critical Software.

“0.2 ….This standard places particular emphasis on describing the procedures necessary
for specification, design, coding, production and in-service maintenance and modification
of SCS (safety integrity level S4). It also details the less rigorous approaches that are
required to produce software of lower safety integrity levels (S1 to S3).”

“1.1 This Standard specifies the requirements for all SRS used in Defence Equipment. It
relates only to software and does not deal with safety of the whole system. Evidence of
the safety principles applied during the development of the SRS contributes to the overall
system safety case.
1.2 This Standard contains requirements for the tools and support software used to
develop, test, certify and maintain SRS through all phases of the project life cycles.”

The standard can be downloaded from http://www.dstan.mod.uk/

7.2 DEF STAN 00-56


Safety Management Requirements for Defence Systems

The UK Ministry of Defence has published this standard for programmable electronic
defence equipment. Two parts exist: Requirements and Guidance. This standard defines a
systematic process for the safety analysis of defence equipment. The text below is a
quotation from the standard.

“0.2 This standard provides uniform requirements for implementing a system safety
programme in order to identify hazards and to impose design techniques and management
controls to identify, evaluate and reduce their associated risks to a tolerable level.”

“1.2 The purpose of this Part of the Standard is to define the safety programme
management procedures, the analysis techniques and the safety verification activities
which are applicable during the project lifecycle. This is in order to minimize the
possibility of a system entering service with unacceptable safety characteristics.”

The standard can be downloaded from http://www.dstan.mod.uk/



7.3 ENV 1954


Internal and external fault behaviour of safety related electronic parts of gas
appliances

This European prestandard applies to programmable electronic systems for gas
installations, including safety-relevant electronic actuators, sensors, converters, etc.
Definitions and requirements are given for fault behaviour, fault detection, hardware
development and software development. Several definitions for complex electronics and
relating to error avoidance in controls using software are listed. A classification with
three classes is specified for evaluating the design of an electronic system. Failure modes
for use in FMEA of normal and complex electronic components are specified.

7.4 EN 50126, EN 50129


Railway applications

• EN 50126 Railway applications - The specification and demonstration of
Reliability, Availability, Maintainability and Safety (RAMS)
• EN 50129 Railway applications - Safety related electronic systems for signalling

The three railway standards EN 50126, EN 50128 (evaluated in this report) and EN
50129 belong together and are extensions of IEC 61508. They describe the processes to
be followed for the safety of railway applications. A specific notion is the safety case,
which is a set of arguments intended to convince a third party, e.g. a safety authority,
that the system is safe and can be used.

EN 50126 is the top-level document that defines the overall process. It defines Safety
Integrity Levels and the scope for EN 50128 and EN 50129.

EN 50129 defines the activities for developers, manufacturers, and assessors. In EN
50129 a process is defined for implementation of management of reliability, availability,
maintainability and safety, denoted by the acronym RAMS. EN 50129 also addresses the
approval process for individual systems.

EN 50128 is focused on software as discussed earlier. It is the function of EN 50126 and
EN 50129 to specify the safety functions allocated to software. EN 50128 specifies those
measures necessary to achieve these requirements.

7.5 IEC 300-3-9


Dependability Management - Part 3: Application Guide – Section 9: Risk analysis of
Technological Systems

This standard is focused on risk analysis and its purpose is to “provide guidelines for
selecting and implementing risk analysis techniques”. The standard does not focus on risk
management and risk control. A process is defined containing the following steps: scope
definition, hazard identification and consequence evaluation, risk estimation, verification,
documentation, and analysis update.

7.6 IEC 60880


Software for Computers in the Safety Systems of Nuclear Power Plants

The purpose of this standard is to give guidance for production, operation, and
maintenance of “highly reliable software required for computers to be used in the safety
systems of nuclear power plants for safety functions” and “This includes safety actuation
systems, safety system support features, and the protection systems.” The standard is
widely used and describes the whole software lifecycle. Seven lifecycle phases are
defined: Software requirements analysis and specification, Development, Verification,
Hardware software integration, System validation, Operations, and Maintenance and
modification.

7.7 IEC 61511


Functional safety - Safety Instrumented Systems for the process industry sector

This standard is an application of the generic standard IEC 61508 and used for the
process industry.
Part 1 - General framework, definitions, system software and hardware requirements
This part is equivalent in scope with Parts 1, 2, 3 and 4 of IEC 61508. This is the
normative part.

Part 2 - Guidelines on the application of IEC 61511-1


This part is equivalent in scope with Part 6 of IEC 61508

Part 3 - Guidelines on hazard & risk analysis


This part is equivalent in scope with Part 5 of IEC 61508

Preliminary versions exist and the three parts of the standard may be published in 2002.

7.8 IEC 62061


Safety of Machinery - Functional Safety - Electrical, Electronic and Programmable
Electronic Control Systems

This standard is an application of the basic safety publication IEC 61508 to the machinery
sector. Programmable electronic systems are handled more extensively in this standard
than in the European standard EN 954-1 for safety-related parts of machine control
systems. The work is in progress, and the standard should not be expected to be published
before 2003. Below is a quotation of the scope of the standard:

“This International Standard specifies requirements and makes recommendations for the
design, integration and validation of safety-related electrical, electronic and
programmable electronic control systems (SRECS) for machines (see note 1). It is
applicable to control systems used, either singly or in combination, to carry out safety
functions on machines which are not portable by hand while working, including a group
of machines working together in a coordinated manner.”

7.9 ISO 9000


Quality Management System

There is an overview called “Executive Overview - ISO 9000 Quality Management
System” that can be found at e.g.
http://fox.nstn.ca/~cottier/overview/ISO_9000/iso.html and is a good introduction.

7.10 ISO 12207


Software life cycle processes

ISO 12207 is a generally accepted standard for software lifecycle processes. This
standard gives a general framework that has to be specified further. The standard defines
and groups processes to be used in a software life cycle.
• Primary Processes: Acquisition, Supply, Development, Operation, and
Maintenance.
• Supporting Processes: Documentation, Configuration Management, Quality
Assurance, Verification, Validation, Joint Review, Audit, and Problem
Resolution.
• Organization Processes: Management, Infrastructure, Improvement, and
Training.

The following roles are used: Acquirer, Supplier, Developer, Operator, Maintainer.

More information can be obtained from http://www.12207.com

7.11 ISO 13849-1


Safety of machinery - Safety-related parts of control systems

This is the international equivalent of the European standard EN 954-1. The content is in
principle exactly the same; however, the references are naturally different.

7.12 MIL-STD-882D
Mishap Risk Management (System Safety)

This standard can be downloaded from
http://www.afmc.wpafb.af.mil/organizations/HQ-AFMC/SE/systuff/882d.doc. The text
below is a quotation from it and defines the scope:

“1.1 Scope. This document outlines a standard practice for conducting system safety.
The system safety practice as defined herein conforms to the acquisition procedures in
DoD Regulation 5000.2-R and provides a consistent means of evaluating identified
risks. Mishap risk must be identified, evaluated, and mitigated to a level acceptable (as
defined by the system user or customer) to the appropriate authority and compliant with
federal (and state where applicable) laws and regulations, Executive Orders, treaties,
and agreements. Program trade studies associated with mitigating mishap risk must
consider total life cycle cost in any decision. When requiring MIL-STD-882 in a
solicitation or contract and no specific paragraphs of this standard are identified, then
apply only those requirements presented in section 4.”

8 Conclusions
8.1 Methodology
The different views are a practical way to approach a standard in a shorter time than
would otherwise be possible. In this evaluation much of the vocabulary is borrowed from
the standard under consideration; thus it is an extra help to study the terminology and
definitions often listed at the beginning of a standard. Many general words have a largely
common meaning, e.g. independence and safety integrity level (although the definitions
vary), while the definitions of e.g. dependability, fault, and failure unfortunately vary
with the standard.

The summary and judgement are plain English texts that describe the essence of the
standard. The mappings are formal but give a direct overview of processes and artefacts
(by name) and referenced standards. A further improvement would be to include qualities
with the mappings, as discussed under methodology earlier. Some examples are shown
below.

• For processes: the activities included, excluded, recommended, not recommended etc.
• For artefacts: details included/excluded/added, shall or should requirements etc.
• For referenced standards: whole or part of, shall or should requirements etc.
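As a rough illustration, such mapping entries and their qualities could be captured in a small data structure. The sketch below is our own assumption about a possible representation; the class name, field names and quality keys ("activities", "strength" etc.) are hypothetical and are not taken from the report's actual mapping tables.

```python
from dataclasses import dataclass, field

# Hypothetical representation of one mapping entry. The quality keys
# ("activities", "strength" etc.) are illustrative assumptions only.
@dataclass
class MappingEntry:
    standard: str           # e.g. "EN 50128"
    item: str               # name of the process, artefact, or referenced standard
    kind: str               # "process" | "artefact" | "reference"
    qualities: dict = field(default_factory=dict)

# Illustrative entries, one per mapping set in the bullet list above.
entries = [
    MappingEntry("EN 50128", "Software Validation", "process",
                 {"activities": "included", "strength": "shall"}),
    MappingEntry("EN 50128", "Software Validation Report", "artefact",
                 {"details": "included", "strength": "shall"}),
    MappingEntry("EN 50128", "EN 50126", "reference",
                 {"coverage": "whole", "strength": "should"}),
]

# Overview by kind, mirroring the three bullet points above.
for kind in ("process", "artefact", "reference"):
    print(kind, "->", [e.item for e in entries if e.kind == kind])
```

A representation like this would make it straightforward to add further quality fields, or a new mapping set (e.g. for techniques and measures), without changing the overall structure.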

The methodology could also be expanded in other ways. One way is to introduce a new
mapping set, e.g. for techniques and measures. Another is to take a new perspective,
other than general and test, e.g. management or organisation. Adding a new standard for
analysis within the current scope could be done directly and without problems.

Metrics can be improved in several ways. We can think of clarity metrics such as
counting long words, words per sentence etc. Other metrics could be the number of
"shall" vs. "should" requirements, or the number of forward references within the
standard (many forward references indicate that "a person has to know the standard
before reading it"). It is very important to point out that such metrics must be relied
on with great care; it is really impossible to reflect a quality for humans by stating
one or more numbers. In this perspective the metric used in this evaluation might be
sufficient for its purpose.
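As a sketch of how such clarity metrics might be computed automatically, the following counts "shall" vs. "should" occurrences and measures average sentence length and the proportion of long words. The function and its thresholds (e.g. seven or more letters counting as "long") are our own assumptions for illustration, not part of the evaluation in this report.

```python
import re

# Illustrative clarity metrics: "shall" vs. "should" counts, average
# sentence length, and the share of long words. Thresholds are assumed.
def clarity_metrics(text: str) -> dict:
    words = re.findall(r"[A-Za-z]+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "shall": sum(1 for w in words if w.lower() == "shall"),
        "should": sum(1 for w in words if w.lower() == "should"),
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "long_word_ratio": sum(1 for w in words if len(w) >= 7) / max(len(words), 1),
    }

sample = ("The supplier shall document the plan. "
          "The plan should be reviewed before implementation.")
m = clarity_metrics(sample)
print(m["shall"], m["should"])  # 1 1
```

Even this simple sketch shows why such numbers must be treated with care: the counts say nothing about whether a "shall" requirement is actually testable or clearly worded.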

Finally, it should be noted that the driving force behind the mappings and judgement is
the extracted standard versions. These are in some cases difficult to produce but give a
significant indication of the clarity of the standard.

8.2 Feasibility of SPICE


As described earlier, the capability model of SPICE is not used. The intention of the
capability model is to assess the maturity of work within a company. We can see that the
whole spectrum is represented here: EN 954 does not map well at all, IEC 61713 maps
extremely well, and the rest of the standards are somewhere in between. One reason for
not mapping well is that the evaluated standards focus on safety-related aspects, and
what is considered one process in SPICE is mapped to several different
processes/activities in a given standard. The opposite relation, i.e. a process in a
standard mapping to more than one SPICE process, is not common and occurs only for
IEC 61508 and EN 50128.

SPICE also contains processes for organisation and management, which are not addressed
at all, or only briefly, in the evaluated standards. But the benefits of SPICE are that it is
generally accepted, that it takes an overall view of all processes, and that both
safety-related and non-safety-related development can be mapped. Note the close
resemblance with ISO/IEC 12207, which is widely used today.

8.3 The test perspective


Test is here a means for verification and validation, used together with analyses and
reviews. The proportion of these aspects relative to the corresponding standard as a whole
is approximately constant, with the possible exceptions of RTCA/DO-178B and EN
50128, where they seem more emphasized. The test perspective can be developed further
by using mapping sets and metrics as described above for the general perspective.

However, for all standards verification and validation are important processes. Plans are
produced for their performance and the results are documented. Validation concerns both
safety functions and SIL. Most of the standards recognise that verification and validation
are integral/support processes, i.e. they are applied at several (or all) phases of the
software lifecycle.

As a general summary, validation and verification activities shall be planned, documented
and placed under configuration management, since they constitute the proof of the quality
delivered.

8.4 Could standards be compared?


As shown above, the selected standards differ to a large extent concerning strictness,
scope, focus, layout, guidance etc., and it may be tempting to assume that each is unique
in some way. On the other hand, most of them concern software development and much is
the same:

• there must be some kind of gathering of requirements
• there must be a design phase and mapping of requirements to implementation
• there must be integration between software and hardware
• there must be verification and validation
• there must be documentation
• there must be configuration management
• etc.

Whether processes are iterative, follow the V-model etc. is of secondary concern and
could, at least to a significant extent, be considered independently. Thus there are more
software development standards around than necessary.
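The shared elements listed above could be treated as a simple coverage checklist when comparing standards. The sketch below is a hypothetical illustration of that idea; the element names paraphrase the bullet list, and the example coverage values are placeholders, not the evaluation results of this report.

```python
# Common lifecycle elements, paraphrasing the bullet list above.
COMMON_ELEMENTS = [
    "requirements gathering",
    "design",
    "hardware/software integration",
    "verification and validation",
    "documentation",
    "configuration management",
]

def uncovered(standard_coverage: dict) -> list:
    """Return the common elements a standard does not explicitly address."""
    return [e for e in COMMON_ELEMENTS if not standard_coverage.get(e, False)]

# Placeholder coverage for a hypothetical standard, with one assumed gap.
example = {e: True for e in COMMON_ELEMENTS}
example["hardware/software integration"] = False  # hypothetical gap
print(uncovered(example))  # ['hardware/software integration']
```

Such a checklist would make the overlap between standards explicit and could support the argument that fewer, more general software development standards would suffice.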

Note the strong influence of IEC 61508 as a generic standard useful for developing more
specific standards. There is a risk with such a big standard that, for example, a developer
follows the standard exactly and nothing more, and does not reflect on whether anything
is missing, that is, something specific to his/her application development.

So, which of the evaluated standards is the best? Since the standards differ in scope, we
have to look at the set of evaluated standards from the viewpoint of different roles. Some
conclusions concerning roles are presented below.

• For a software developer, do not consider EN 954. IEC 61508 is the overall
answer but might be too much to start with if more limited aspects are of interest,
e.g. at a first survey. IEC 61713 is too general since it only gives guidance and
focuses on dependability aspects (according to its definition). RTCA/DO-178B
and EN 50128 resemble each other even though they are used for different
application areas; one of these is probably a good compromise. A specific aspect
of EN 50128 is its extensive list of techniques and measures (69 altogether) for
increasing safety. If EN 50128 is chosen, EN 50126 and EN 50129 should also be
considered. IEC 60601-1-4 is too specialised if general software development is
considered.
• For a standard developer, start with IEC 61508 if it concerns development and
EN 954 if it concerns requirements for machinery.
• For assessing safety integrity level (or corresponding name) EN 954 is suitable
for principal behaviour (resistance to faults) and IEC 61508 for quantitative
values and requirements on techniques, measures and documentation.
• For process improvement and assessment use SPICE as the overall reference
(perhaps also its capability model).
• For specific medical equipment and if IEC 60601-1 is used IEC 60601-1-4 is the
choice.
• If parameterised software is essential, RTCA/DO-178B and EN 50128 are
primary candidates since they explicitly mention this aspect. They are also
primary candidates if verification is the main issue or if activities related to SIL
are of primary interest.

8.5 How to use the results


Why should someone read this report? We can see different needs, some of which are
described below.

Persons using a specific standard might find it insufficient and would like to use another
established standard instead, adding a minimum of company-specific requirements. A
natural way could then be to study IEC 61508 and SPICE. By conforming to these, much
information could be retrieved directly, e.g. via the Internet, concerning practical devices
and guidance.

In a company, the use of more than one standard gives rise to extra costs, e.g. for the
education of personnel, documentation etc. By comparing the standards internally, or by
introducing a new, more general standard, it is possible to reach a common view and
achieve synergy effects.

If an enterprise wants to change its business goals to include new types of applications, a
comparison with the corresponding standards is necessary to see the influence on the
company organisation and methods.

Persons in a company may realise that they have to follow an established standard but not
know which one to choose. For example, could the same one be used for safety-related
and non-safety-related applications?

9 References
[1] ISO/IEC TR 15504-1 to -9, 1998 (SPICE)
Information technology – Software process assessment

[2] ISO 13849-1 (EN 954-1), First edition, 1999-11-15
Safety of machinery – Safety-related parts of control systems
Part 1: General principles for design

[3] prEN 954-2, Draft, December 1999
Safety of machinery – Safety-related parts of control systems
Part 2: Validation

[4] IEC 61508-1, First edition, December 1998
Functional safety of electrical/electronic/programmable electronic safety-related systems
Part 1: General requirements

[5] IEC 61508-2, First edition, May 2000
Functional safety of electrical/electronic/programmable electronic safety-related systems
Part 2: Requirements for electrical/electronic/programmable electronic safety-related
systems

[6] IEC 61508-3, First edition, December 1998
Functional safety of electrical/electronic/programmable electronic safety-related systems
Part 3: Software requirements

[7] IEC 61713, First edition, June 2000
Software dependability through the software lifecycle processes – Application guide

[8] RTCA/DO-178B, December 1, 1992
Software Considerations in Airborne Systems and Equipment Certification

[9] IEC 60601-1-4, Edition 1.1, April 2000
Medical Electrical Equipment – Part 1: General Requirements for Safety – 4. Collateral
Standard: Programmable Electrical Medical Systems

[10] EN 50128, March 2001
Railway Applications: Software for Railway Control and Protection Systems

[11] Debra S. Herrmann, Software Safety and Reliability
IEEE Computer Society, ISBN 0-7695-0299-7

[12] James W. Moore, Software Engineering Standards
IEEE Computer Society, ISBN 0-8186-8008-3

10 Annex 1: Project
10.1 Overall objectives
The overall purpose of the project is to ease the understanding of standards for safety-
related applications and to convey a judgement of the standards to the reader without
him/her having to study them in detail. This overall purpose can be split into sub-
objectives according to the following list:
1. To thoroughly analyse and compare six specially selected standards for safety-
related applications and to reference and briefly describe other standards for
safety-related applications.
2. To define a methodology for comparing standards and to show that it is
applicable for additional standards.
3. To evaluate standards for safety-related applications from a test perspective.

10.2 Work organisation


The work has been carried out by two Nordic parties, DELTA (Denmark) and SP
(Sweden), with the following roles:

• SP (Sweden) as evaluator and project manager.
• DELTA (Denmark) as evaluator.

Since DELTA and SP have somewhat different scopes the work has been approached
from a process side and from an artefact side. This was of importance when describing
the informal aspects of standards. For example, if opinions differed significantly then it
was possible to conclude that a standard leaves room for interpretations and vice versa.

The project has been active from March 2001 until December 2001.

10.3 Presentation of results


The final report is written jointly by SP and DELTA and exists in two forms: a Word
document (this one) and an HTML version available on the Internet.

Preliminary results have been presented at the RTiS (Real Time in Sweden) conference in
Halmstad on August 22, 2001 and at the international testing conference EuroSTAR in
Stockholm on November 21, 2001.

10.4 Data of participating partners


DELTA, Denmark
DELTA is an independent, self-governing organisation. It holds authorisation from the
Danish Minister of Industry and Business as a technological service institute and is
affiliated to the Danish Academy of Technical Sciences.

DELTA has the following divisions in Denmark: Electronic Testing, Microelectronics,
Software Engineering, Light & optics, Acoustic & Vibration, Technological-Audiological
Laboratory, Jyske EMC.

DELTA has the following division in Norway: DELTA Electronic Testing, Norway.

DELTA has the following division in Sweden: DELTA Development Technology AB.

The safety-engineering group at DELTA is part of the Electronic Testing division. The
safety-engineering group works with consultancy and participates in customers'
development projects. The activities of the safety-engineering group are mainly
concerned with the following directives:
• Machinery Directive
• Low Voltage Directive
• Atex Directive
• Medical Device Directive
• R&TTE Directive.

The work of the safety-engineering group is often undertaken in collaboration with other
groups specialised in, e.g., the testing of fire alarm and security equipment, and EMC testing.

SP, Sweden
SP, Swedish National Testing and Research Institute, is the national institute for technical
evaluation, testing, certification, metrology and research.

Technical evaluation covers a large number of activities such as testing, calculations,
analysis, inspection, measurement, calibration etc., which are provided within a wide
range of technological and industrial fields. SP’s department of Physics and Electronics
has acquired a long experience in supporting development and services to verify
electronics with strong demands on function, reliability and safety. Environmental stress
testing from specification to type testing, electrical safety, classification of degree of
protection provided by enclosure, EMC and radio testing are all examples of our broad
competence.

Certification of products and quality systems in accordance with national and
international regulations includes both voluntary and mandatory certification over a wide
range of products and technologies.

Research and development, which is a major activity, consists of two closely linked parts.
One concentrates on fundamental research in the field of testing and metrology. The other
is concerned with applied work in technological and industrial sectors. Safety of
machinery is a sector to which considerable testing and research resources are devoted.

SP is involved in a considerable number of projects and comparative investigations in
conjunction with approximately a dozen sister institutes. Part of this work is realised
within the EC research programmes.

In organisations such as Eurolab, and various EC, EFTA and UN bodies, the ongoing
work is aimed at harmonising requirements, test methods and applications in order to
achieve services that are as technically relevant, efficient and reliable as possible. In other
bodies, such as Nordtest and Euromet, work is devoted to the technical development of
methods of testing and calibration, carried out by experts from the various countries
working closely together.

11 Annex 2: Extracted version of EN 954-1


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

2 Normative references
• EN 292-1:1991,
Safety of machinery — Basic concepts, general principles for design — Part 1:
Basic terminology, methodology.
• EN 292-2:1991/A1:1995,
Safety of machinery — Basic concepts, general principles for design — Part 2:
Technical principles and specifications.
• EN 418,
Safety of machinery — Emergency stop equipment, functional aspects —
Principles for design.
• EN 457,
Safety of machinery — Auditory danger signals — General requirements, design
and testing (ISO 7731:1986 modified)
• EN 614-1,
Safety of machinery — Ergonomic design principles — Part 1: Terminology and
general principles.
• EN 842,
Safety of machinery — Visual danger signals — General requirements, design
and testing.
• EN 981,
Safety of machinery — System of auditory and visual danger and information
signals.
• EN 982,
Safety of machinery — Safety requirements for fluid power systems and their
components — Hydraulics.
• EN 983,
Safety of machinery — Safety requirements for fluid power systems and their
components — Pneumatics.
• prEN 999:1995,
Safety of machinery — The positioning of protective equipment in respect of
approach speeds of parts of the human body.
• EN 1037,
Safety of machinery — Prevention of unexpected start-up.
• EN 1050:1996,
Safety of machinery — Principles for risk assessment.
• EN 60204-1:1992,
Safety of machinery — Electrical equipment of machines — Part 1: General
requirements (IEC 204-1:1992, modified)
• EN 60447:1993,
Man-machine interface (MMI) — Actuating principles (IEC 447:1993)
• EN 60529,
Degrees of protection provided by enclosures (IP Code) (IEC 529:1989)
• EN 60721-3-0,

Classification of environmental conditions — Part 3: Classification of groups of
environmental parameters and their severities — Introduction (IEC 721-3-0:1984
+ A1:1987)
• IEC 50 (191):1990,
International Electrotechnical Vocabulary. Chapter 191: Dependability and
quality of service.

4 General considerations
4.1 Safety objectives in design
Designed and constructed so that the principles of EN 1050 are fully taken into account at
different modes.

4.2 General strategy for design


From the risk assessment (see EN 1050) of the machine, the designer shall decide the
contribution to the reduction of risk that needs to be provided by each safety-related part
of the control system. The hierarchy for the strategy in reducing risk is given in EN 292-
1:1991, clause 5. The designer shall declare category(ies) and special aspects concerning
the design. A specific design requirement is given for critical functions.

4.3 Process for selection and design of safety measures


4.3.1 General
This process is iterative, looping back through the procedure at any step.
4.3.2 Step 1: Hazard analysis and risk assessment
Identify the hazards present at the machine during all modes of operation and at each
stage in the life of the machine by following the guidance in EN 292-1 and EN 1050.
Assess the risk arising from those hazards and decide the appropriate risk reduction for
that application in accordance with EN 292-1 and EN 1050.
4.3.3 Step 2: Decide measures for risk reduction by control means
Decide the design measures at the machine and/or the provision of safeguards to provide
the risk reduction.
4.3.4 Step 3: Specify safety requirements for the safety-related parts of the control
system
Specify the safety functions (see clause 5 and other referenced documents) to be provided
in the control system. Specify how the safety functions will be met and select the
category(ies) for each part and combinations of parts within the safety-related parts of the
control system (see clause 6).
4.3.5 Step 4: Design
Design the safety-related parts of the control system according to the specification
developed in step 3 and to the general strategy for design in 4.2. List the features included
in the design which provide the rationale for the category(ies) achieved. Verify the design
at each stage to ensure that the safety-related parts fulfil the requirements from the
previous stage in the context of the specified safety function(s) and category(ies).
4.3.6 Step 5: Validation
Validate the achieved safety functions and category(ies) against the specification in step
3. Redesign as necessary (see clause 8). It is also necessary to validate the safety-related
parts of the control system in conjunction with the entire control system and as part of the
machine. This should be specified by the machine designer or the appropriate Type C
safety standard.

4.4 Principles for ergonomic design


Quality aspects on the interface between persons and the safety-related parts of control
systems are considered. The safety requirements for observing ergonomic principles
given in EN 292-2:1992, 3.6, should apply.

5 Characteristics of safety functions


The designer (or Type-C standard maker) shall include the necessary safety functions
from the list to achieve the measures of safety required of the control system for the
specific application. The designer (or Type-C standard maker) shall ensure that the
requirements of all these International Standards are satisfied for the selected safety
functions in the table. Additional requirements are stated for the safety functions: 5.2 Stop
function, 5.3 Emergency stop function, 5.4 Manual reset, 5.5 Start and restart, 5.6
Response time, 5.7 Safety-related parameters, 5.8 Local control function, 5.9 Muting,
5.10 Manual suspension of safety functions, 5.11 Fluctuations, loss and restoration of
power sources.

6 Categories
The safety-related parts of control systems shall (should is used in 6.2) be in accordance
with the requirements of one or more of the categories: B, 1, 2, 3, 4.

7 Fault consideration
If additional faults should be considered, they should be listed and the method of
validation should also be clearly described. The designer shall declare, justify and list all
fault exclusions.

8 Validation
8.1 General
The validation of the safety-related parts of control systems should contain management
and execution of validation activities (test specifications, testing procedures, analysis
procedures) and documentation (auditable reports of all validation activities and
decisions).

8.2 Validation plan


Contains validation strategy. The plan should be developed concurrently with the design
of the safety-related parts of the control system or can be specified by the relevant Type-
C standard. The plan should include a description of all the requirements for validation by
analysis and testing.

8.3 Validation by analysis

8.4 Validation by testing


Test is made of: specified safety functions (using results from risk assessment), specified
categories, dimensioning and compliance to environmental parameters.

8.5 Validation report


It is made at the conclusion of the validation process. Contents aspects are stated. The
results shall be documented and retained in an auditable form.

9 Maintenance
The provisions for the maintainability of the safety-related part(s) of a control system
shall follow the principles of EN 292-2:1991, 6.2.1 and EN 292-2:1991/A1:1995, annex
A, 1.6. All information for maintenance shall comply with EN 292-2:1991, 5.5.1 e).

10 Information to be provided to the user


The principles of EN 292-2:1991, clause 5 and other relevant documents, e.g. EN
60204-1:1992, clauses 18 and 19, shall be applied. Contents aspects are given for user
information.
• limits of the safety-related parts to the category(ies) selected and any fault
exclusions (including maintenance aspects for them)

• clear descriptions of the interfaces to the safety-related parts of control systems
and protective devices
• maintenance (see clause 9)
• maintenance checklists
• ease of accessibility and replacing of internal parts
• means for easy and safe trouble shooting

12 Annex 3: Extracted version of EN 954-2


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

This standard has the status of an application standard (Type B1).

2 Normative references
• EN 292-1: 1991,
Safety of machinery – Basic concepts, general principles for design – Part 1:
Basic terminology, methodology
• EN 954-1:1996,
Safety of machinery – Safety-related parts of control systems – Part 1: General
principles for design

3 Validation process
The validation shall demonstrate that each safety-related part meets the requirements of
EN 954-1 in accordance with the validation plan.

Figure 1: Overview of the validation process

3.2 Validation plan


The validation plan shall identify and describe the requirements for carrying out the
validation process of the specified safety functions and their categories.

3.3 Documents for validation


Documents shall be included in the validation process: specifications, drawings and
specifications, block diagrams, circuit diagrams, functional description of the circuit
diagrams, time sequence diagrams, component list, analysis of all relevant faults, an
analysis of the influence of processed materials, category specific information, software
documentation.

Documents should include:


1) a specification which is clear and unambiguous and states the safety performance the
software is required to achieve, and
2) evidence that the software is designed to achieve the required safety performance, and
3) details of tests (in particular test reports) carried out to prove that the required safety
performance is achieved.

3.4 Validation record


Validation by analysis and testing shall be recorded.

4 Validation by analysis

5 Validation by testing
5.1 General
A test plan shall be produced. Test records shall be produced.

5.2 Environmental conditions


Environmental conditions, range of stable test conditions shall be recorded.

6 Fault lists
6.2 Specific fault lists
A specific fault list shall be generated as a reference document.

7 Validation of safety functions

8 Validation of categories

9 Validation of environmental requirements


Validation shall be made by analysis and, if necessary, by testing. Guidance is given for
the validation process.

10 Validation of maintenance requirements


The validation process shall demonstrate by analysis that the maintenance requirements
as specified in EN 954-1: 1996, clause 9, paragraph 2, have been implemented.

13 Annex 4: Extracted version of IEC 61508-1


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

2 Normative references
• ISO/IEC Guide 51:1990,
Guidelines for the inclusion of safety aspects in standards
• IEC Guide 104:1997,
Guide to the drafting of safety standards, and the role of Committees with safety
pilot functions and safety group functions
• IEC 61508-2,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 2: Requirements for electrical/electronic/programmable
electronic safety-related systems
• IEC 61508-3:1998,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 3: Software requirements
• IEC 61508-4:1998,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 4: Definitions and abbreviations
• IEC 61508-5:1998,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 5: Examples of methods for the determination of safety
integrity levels
• IEC 61508-6,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 6: Guidelines on the application of parts 2 and 3
• IEC 61508-7,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 7: Overview of techniques and measures

5 Documentation
5.2 Requirements
5.2.1 The documentation shall contain sufficient information for each phase of the
overall, E/E/PES and software safety lifecycles completed.
5.2.4 Unless justified in the functional safety planning or specified in the application
sector standard, the information to be documented shall be as stated in the various clauses
of this standard.
5.2.9 The documents or set of information shall have a revision index (version numbers).
5.2.11 All relevant documents shall be revised, amended, reviewed, approved and be
under the control of an appropriate document control scheme.

6 Management of functional safety


6.2 Requirements
6.2.1 Consider the overall, E/E/PES or software safety lifecycle phases to be applied, the
information to be documented (see clause 5), the selected measures and techniques used
to meet the requirements of a specified clause or subclause (see IEC 61508-2, IEC 61508-
3 and 61508-6), the functional safety assessment activities (see clause 8). The procedures
for ensuring prompt follow-up and satisfactory resolution from hazard and risk analysis

(see 7.4), functional safety assessment (see clause 8), verification activities (see 7.18),
validation activities (see 7.8 and 7.14), configuration management (see 6.2.1 o), 7.16 and
IEC 61508-2 and IEC 61508-3).

Training of staff, and retraining at periodic intervals, shall be carried out.

Also procedures for analysing operations and maintenance performance, audits,


documentation activities, approval procedure and authority for modifications are
included.

6.2.2 The activities specified as a result of 6.2.1 shall be implemented and progress
monitored.
6.2.3 The requirements developed as a result of 6.2.1 shall be formally reviewed by the
organizations concerned, and agreement reached.
6.2.5 Suppliers providing products or services to an organization having overall
responsibility shall have an appropriate quality management system.

7 Overall safety lifecycle requirements


7.1 General
7.1.1 Introduction
7.1.1.3 This is termed the E/E/PES safety lifecycle and forms the technical framework for
IEC 61508-2. The software safety lifecycle forms the technical framework for IEC
61508-3.
7.1.1.4 Iteration, however, is an essential and vital part of development through the
overall, E/E/PES and software safety lifecycles.

Figure 2 – Overall safety lifecycle


Figure 3 – E/E/PES safety lifecycle (in realisation phase)
Figure 4 – Software safety lifecycle (in realisation phase)
Figure 5 – Relationship of overall safety lifecycle to E/E/PES and software safety
lifecycles

Concept
Overall scope definition
Hazard and risk analysis
Overall safety requirements
Safety requirements allocation
Overall operation and maintenance planning
Overall safety validation planning
Overall installation and commissioning planning
E/E/PE safety-related systems: realisation
Other technology safety-related systems: realisation
External risk reduction facilities: realisation
Overall installation and commissioning
Overall safety validation
Overall operation, maintenance and repair
Overall modification and retrofit
Decommissioning or disposal

Table 1 – Overall safety lifecycle: overview (only the safety lifecycle phase column is
shown here).

7.1.4 Requirements

7.1.4.1 If another overall safety lifecycle than figure 2 is used, it shall be specified during
the functional safety planning.
7.1.4.2 The requirements for the management of functional safety (see clause 6) shall run
in parallel with the overall safety lifecycle phases.
7.1.4.4 Each phase of the overall safety lifecycle shall be divided into elementary
activities with the scope, inputs and outputs specified for each phase.

7.2 Concept
The information and results concerning concept requirements shall be documented.

7.3 Overall scope definition


The information and results concerning overall scope definition requirements shall be
documented.

7.4 Hazard and risk analysis


7.4.2 Requirements
7.4.2.1 A hazard and risk analysis shall be undertaken.
7.4.2.11 The information and results that constitute the hazard and risk analysis shall be
documented.
7.4.2.12 The information and results that constitute the hazard and risk analysis shall be
maintained for the EUC and the EUC control system throughout the overall safety
lifecycle, from the hazard and risk analysis phase to the decommissioning or disposal
phase.

7.5 Overall safety requirements


7.5.2 Requirements
7.5.2.2 The necessary risk reduction shall be determined for each identified hazardous
event.
7.5.2.3 Where an appropriate application sector international standard exists, such a
standard may be used.
7.5.2.7 The specification for the safety functions and the specification for the safety
integrity requirements shall together constitute the specification for the overall safety
requirements.

7.6 Safety requirements allocation


7.6.2 Requirements
7.6.2.13 The information and results of the safety requirements allocation acquired
together with any assumptions and justifications made, shall be documented.

7.7 Overall operation and maintenance planning


7.7.2 Requirements
7.7.2.1 A plan shall be prepared.
7.7.2.2 The routine maintenance activities that are carried out to detect unrevealed faults
should be determined by a systematic analysis.

7.8 Overall safety validation planning


7.8.2 Requirements
7.8.2.1 A plan shall be developed.
7.8.2.2 The information from 7.8.2.1 shall be documented and shall constitute the plan for
the overall safety validation of the E/E/PE safety-related systems.

7.9 Overall installation and commissioning planning


7.9.1 Objectives
7.9.2 Requirements

7.9.2.1 A plan for the installation of the E/E/PE safety-related systems shall be developed.
7.9.2.2 A plan for the commissioning of the E/E/PE safety-related systems shall be
developed.
7.9.2.3 The overall installation and commissioning planning shall be documented.

7.10 Realisation: E/E/PES


7.10.2 Requirements
The requirements that shall be met are contained in IEC 61508-2 and IEC 61508-3.

7.11 Realisation: other technology


Not covered in this standard.

7.12 Realisation: external risk reduction facilities


Not covered in this standard.

7.13 Overall installation and commissioning


7.13.2 Requirements
7.13.2.1 Installation activities shall be carried out in accordance with the plan for the
installation of the E/E/PE safety-related systems.
7.13.2.3 Commissioning activities shall be carried out in accordance with the plan for the
commissioning of the E/E/PE safety-related systems.
7.13.2.4 The information documented during commissioning shall include documentation
of commissioning activities, references to failure reports, resolution of failures and
incompatibilities.

7.14 Overall safety validation


7.14.2 Requirements
7.14.2.1 Validation activities shall be carried out in accordance with the overall safety
validation plan for the E/E/PE safety-related systems.
7.14.2.2 All equipment used for quantitative measurements, as part of the validation
activities, shall be calibrated against a specification traceable to a national standard or to
the vendor specification.
7.14.2.3 The information documented during validation shall include documentation in
chronological form of the validation activities, the version of the specification for the
overall safety requirements being used, the safety function being validated (by test or by
analysis), tools and equipment used, along with calibration data, the results of the
validation activities, configuration identification of the item under test, the procedures
applied and the test environment, discrepancies between expected and actual results.
7.14.2.4 When discrepancies occur between expected and actual results, the analysis
made, and the decisions taken on whether to continue the validation or issue a change
request and return to an earlier part of the validation, shall be documented.
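The documentation items listed in 7.14.2.3 can be sketched as a record type; the field names and the discrepancy check below are our own illustration, not mandated by the standard:

```python
# Sketch (ours, not from the standard): one chronological entry of the
# overall safety validation documentation required by 7.14.2.3.
from dataclasses import dataclass

@dataclass
class ValidationRecord:
    requirements_spec_version: str  # version of the overall safety requirements spec
    safety_function: str            # the safety function being validated
    method: str                     # "test" or "analysis"
    tools_and_calibration: str      # tools and equipment used, with calibration data
    configuration_id: str           # configuration identification of the item under test
    procedure: str                  # procedure applied
    test_environment: str
    expected_result: str
    actual_result: str

    def discrepancy(self) -> bool:
        # 7.14.2.4: a discrepancy triggers a documented analysis and a
        # decision to continue or to issue a change request.
        return self.expected_result != self.actual_result
```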

7.15 Overall operation, maintenance and repair


7.15.2 Requirements
7.15.2.1 The following shall be implemented:
– the plan for maintaining the E/E/PE safety-related systems
– the operation, maintenance and repair procedures for the E/E/PE safety-related systems
(see IEC 61508-2)
– the operation and maintenance procedures for software (see IEC 61508-3)
7.15.2.2 Implementation of the items specified in 7.15.2.1 shall include initiation of the
following actions:
– the implementation of procedures
– the following of maintenance schedules
– the maintaining of documentation

– the carrying out, periodically, of functional safety audits (see 6.2.1 k))
– the documenting of modifications that have been made to the E/E/PE safety-related
systems
7.15.2.3 Chronological documentation of operation, repair and maintenance of the
E/E/PE safety-related systems shall be maintained.
7.15.2.4 The exact requirements for chronological documentation will be dependent on
the specific application and shall, where relevant, be detailed in application sector
standards.

7.16 Overall modification and retrofit


7.16.2 Requirements
7.16.2.1 Prior to carrying out any modification or retrofit activity, procedures shall be
planned (see 6.2.1).
7.16.2.2 The modification and retrofit phase shall be initiated only by the issue of an
authorized request under the procedures for the management of functional safety (see
clause 6).
7.16.2.3 An impact analysis shall be carried out.
7.16.2.4 The results described in 7.16.2.3 shall be documented.
7.16.2.6 All modifications that have an impact on the functional safety of any E/E/PE
safety-related system shall initiate a return to an appropriate phase of the overall,
E/E/PES or software safety lifecycles. All subsequent phases shall then be carried out in
accordance with the procedures specified for the specific phases in accordance with the
requirements in this standard.
7.16.2.7 Chronological documentation shall be established and maintained.

7.17 Decommissioning or disposal


7.17.2 Requirements
7.17.2.1 Prior to any decommissioning or disposal activity, an impact analysis shall be
carried out.
7.17.2.2 The results described in 7.17.2.1 shall be documented.
7.17.2.3 The decommissioning or disposal phase shall only be initiated by the issue of an
authorized request under the procedures for the management of functional safety (see
clause 6).
7.17.2.5 Prior to decommissioning or disposal taking place a plan shall be prepared.
7.17.2.6 If any decommissioning or disposal activity has an impact on the functional
safety of any E/E/PE safety-related system, this shall initiate a return to the appropriate
phase of the overall, E/E/PES or software safety lifecycles. All subsequent phases shall
then be carried out in accordance with the procedures specified in this standard for the
specified safety integrity levels for the E/E/PE safety-related systems.
7.17.2.7 Chronological documentation shall be established and maintained.

7.18 Verification
7.18.2 Requirements
7.18.2.1 For each phase of the overall, E/E/PES and software safety lifecycles, a plan for
the verification shall be established concurrently with the development for the phase.
7.18.2.3 The verification shall be carried out according to the verification plan.
7.18.2.4 Information on the verification activities shall be collected and documented as
evidence that the phase being verified has, in all respects, been satisfactorily completed.

8 Functional safety assessment


8.2 Requirements
8.2.3 The functional safety assessment shall be applied to all phases throughout the
overall, E/E/PES and software safety lifecycles.

8.2.5 If tools are used as part of design or assessment for any overall, E/E/PES or
software safety lifecycle activity, they should themselves be subject to the functional
safety assessment.
8.2.7 The functional safety assessment activities for the different phases of the overall,
E/E/PES and software safety lifecycles shall be consistent and planned.

Annex A
(Informative)
Example documentation structure

Overall safety lifecycle phase – Information

Concept – Description (overall concept)
Overall scope definition – Description (overall scope definition)
Hazard and risk analysis – Description (hazard and risk analysis)
Overall safety requirements – Specification (overall safety requirements, comprising:
overall safety functions and overall safety integrity)
Safety requirements allocation – Description (safety requirements allocation)
Overall operation and maintenance planning – Plan (overall operation and maintenance)
Overall safety validation planning – Plan (overall safety validation)
Overall installation and commissioning planning – Plan (overall installation);
Plan (overall commissioning)
Realisation – Realisation of E/E/PE safety-related systems (see IEC 61508-2 and
IEC 61508-3)
Overall installation and commissioning – Report (overall installation);
Report (overall commissioning)
Overall safety validation – Report (overall safety validation)
Overall operation and maintenance – Log (overall operation and maintenance)
Overall modification and retrofit – Request (overall modification);
Report (overall modification and retrofit impact analysis);
Log (overall modification and retrofit)
Decommissioning or disposal – Report (overall decommissioning or disposal impact
analysis);
Plan (overall decommissioning or disposal);
Log (overall decommissioning or disposal)
Concerning all phases – Plan (safety);
Plan (verification);
Report (verification);
Plan (functional safety assessment);
Report (functional safety assessment)

Table A.1 – Example documentation structure for information related to the overall safety
lifecycle
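Since Table A.1 is essentially a mapping from lifecycle phase to the documents it produces, a completeness check over it can be sketched as follows (entries abbreviated; the helper function is our own illustration, not part of the standard):

```python
# Sketch (ours): Table A.1 as a phase-to-documents mapping, plus a check
# reporting which phases still lack their example documentation.
DOCS = {
    "Concept": ["Description (overall concept)"],
    "Overall scope definition": ["Description (overall scope definition)"],
    "Hazard and risk analysis": ["Description (hazard and risk analysis)"],
    "Overall safety requirements": ["Specification (overall safety requirements)"],
    "Safety requirements allocation": ["Description (safety requirements allocation)"],
    "Overall operation and maintenance planning":
        ["Plan (overall operation and maintenance)"],
    "Overall safety validation planning": ["Plan (overall safety validation)"],
    "Overall installation and commissioning planning":
        ["Plan (overall installation)", "Plan (overall commissioning)"],
    "Realisation": ["Realisation of E/E/PE safety-related systems"],
    "Overall installation and commissioning":
        ["Report (overall installation)", "Report (overall commissioning)"],
    "Overall safety validation": ["Report (overall safety validation)"],
    "Overall operation and maintenance": ["Log (overall operation and maintenance)"],
    "Overall modification and retrofit":
        ["Request (overall modification)",
         "Report (overall modification and retrofit impact analysis)",
         "Log (overall modification and retrofit)"],
    "Decommissioning or disposal":
        ["Report (overall decommissioning or disposal impact analysis)",
         "Plan (overall decommissioning or disposal)",
         "Log (overall decommissioning or disposal)"],
}

def missing_documentation(produced: dict) -> list:
    """Phases of Table A.1 whose example documents are not all present
    in the 'produced' mapping (phase -> list of documents)."""
    return [phase for phase, docs in DOCS.items()
            if not set(docs) <= set(produced.get(phase, []))]
```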

E/E/PES safety lifecycle phase – Information

E/E/PES safety requirements – Specification (E/E/PES safety requirements, comprising:
E/E/PES safety functions and E/E/PES safety integrity)
E/E/PES validation planning – Plan (E/E/PES safety validation)
E/E/PES design and development:
E/E/PES architecture – Description (E/E/PES architecture design, comprising:
hardware architecture and software architecture);
Specification (programmable electronic integration tests);
Specification (integration tests of programmable electronic and non-programmable
electronic hardware)
Hardware architecture – Description (hardware architecture design);
Specification (hardware architecture integration tests)
Hardware module design – Specification (hardware modules design);
Specifications (hardware modules test)
Component construction and/or procurement – Hardware modules;
Report (hardware modules test)
Programmable electronic integration – Report (programmable electronic and software
integration test) (see table A.3)
E/E/PES integration – Report (programmable electronic and other hardware
integration test)
E/E/PES operation and maintenance procedures – Instruction (user);
Instruction (operation and maintenance)
E/E/PES safety validation – Report (E/E/PES safety validation)
E/E/PES modification – Instruction (E/E/PES modification procedures);
Request (E/E/PES modification);
Report (E/E/PES modification impact analysis);
Log (E/E/PES modification)
Concerning all phases – Plan (E/E/PES safety);
Plan (E/E/PES verification);
Report (E/E/PES verification);
Plan (E/E/PES functional safety assessment);
Report (E/E/PES functional safety assessment)

Table A.2 – Example documentation structure for information related to the E/E/PES
safety lifecycle

Software safety lifecycle phase – Information

Software safety requirements – Specification (software safety requirements, comprising:
software safety functions and software safety integrity)
Software validation planning – Plan (software safety validation)
Software design and development:
Software architecture – Description (software architecture design) (see table A.2 for
hardware architecture design description);
Specification (software architecture integration tests);
Specification (programmable electronic and software integration tests);
Instruction (development tools and coding manual)
Software system design – Description (software system design);
Specification (software system integration tests)
Software module design – Specification (software module design);
Specification (software module tests)
Coding – List (source code);
Report (software module test);
Report (code review)
Software module testing – Report (software module test)
Software integration – Report (software module integration test);
Report (software system integration test);
Report (software architecture integration test)
Programmable electronic integration – Report (programmable electronic and software
integration test)
Software operation and maintenance procedures – Instruction (user);
Instruction (operation and maintenance)
Software safety validation – Report (software safety validation)
Software modification – Instruction (software modification procedures);
Request (software modification);
Report (software modification impact analysis);
Log (software modification)
Concerning all phases – Plan (software safety);
Plan (software verification);
Report (software verification);
Plan (software functional safety assessment);
Report (software functional safety assessment)

Table A.3 – Example documentation structure for information related to the software
safety lifecycle

14 Annex 5: Extracted version of IEC 61508-2


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

2 Normative references
• IEC 60050(371):1984,
International Electrotechnical Vocabulary – Chapter 371: Telecontrol
• IEC 60300-3-2:1993,
Dependability management – Part 3: Application guide – Section 2: Collection of
dependability data from the field
• IEC 61000-1-1:1992,
Electromagnetic compatibility (EMC) – Part 1: General – Section 1: Application
and interpretation of fundamental definitions and terms
• IEC 61000-2-5:1995,
Electromagnetic compatibility (EMC) – Part 2: Environment – Section 5:
Classification of electromagnetic environments – Basic EMC publication
• IEC 61508-1:1998,
Functional safety of electrical/electronic/programmable electronic safety-related
systems – Part 1: General requirements
• IEC 61508-3:1998,
Functional safety of electrical/electronic/programmable electronic safety-related
systems – Part 3: Software requirements
• IEC 61508-4:1998,
Functional safety of electrical/electronic/programmable electronic safety-related
systems – Part 4: Definitions and abbreviations
• IEC 61508-5:1998,
Functional safety of electrical/electronic/programmable electronic safety-related
systems – Part 5: Examples of methods for the determination of safety integrity
levels
• IEC 61508-6,
Functional safety of electrical/electronic/programmable electronic safety-related
systems – Part 6: Guidelines on the application of parts 2 and 3
• IEC 61508-7:2000,
Functional safety of electrical/electronic/programmable electronic safety-related
systems – Part 7: Overview of techniques and measures
• IEC Guide 104:1997,
The preparation of safety publications and the use of basic safety publications
and group safety publications
• ISO/IEC Guide 51:1990,
Guidelines for the inclusion of safety aspects in standards
• IEEE 352:1987,
IEEE guide for general principles of reliability analysis of nuclear power
generating station safety systems

7 E/E/PES safety lifecycle requirements


7.1.3 Requirements
7.1.3.1 If another E/E/PES safety lifecycle is used, it shall be specified during functional
safety planning (see clause 6 of IEC 61508-1), and all the objectives and requirements of
each subclause of IEC 61508-2 shall be met.

7.1.3.2 The procedures for management of functional safety (see clause 6 of IEC 61508-
1) shall run in parallel with the E/E/PES safety lifecycle phases.
7.1.3.3 Each phase of the E/E/PES safety lifecycle shall be divided into elementary
activities, with the scope, inputs and outputs specified for each phase (see table 1).
7.1.3.4 Unless justified during functional safety planning, the outputs of each phase of the
E/E/PES safety lifecycle shall be documented (see clause 5 of IEC 61508-1).

Figure 2 – E/E/PES safety lifecycle (in realisation phase)

E/E/PES safety requirements specification


E/E/PES safety validation planning
E/E/PES design and development
E/E/PES integration
E/E/PES installation, commissioning, operation and maintenance procedures
E/E/PES safety validation
E/E/PES modification
E/E/PES verification
E/E/PES functional safety assessment

Table 1 – Overview – Realisation phase of the E/E/PES safety lifecycle, only column for
safety lifecycle phase or activity is shown here.

7.2 E/E/PES safety requirements specification


7.2.2 General requirements
7.2.2.1 The specification of the E/E/PES safety requirements shall be derived from the
allocation of safety requirements, specified in 7.6 of IEC 61508-1, and from those
requirements specified during functional safety planning (see clause 6 of IEC 61508-1).
7.2.2.3 The specification of the E/E/PES safety requirements shall contain the
requirements for the E/E/PES safety functions (see 7.2.3.1) and the requirements for
E/E/PES safety integrity (see 7.2.3.2).
7.2.3 E/E/PES safety requirements

7.3 E/E/PES safety validation planning


7.3.2 Requirements

7.4 E/E/PES design and development


7.4.2 General requirements
7.4.2.1 The design of the E/E/PE safety-related system shall be created in accordance
with the E/E/PES safety requirements specification (see 7.2), taking into account all the
requirements of 7.4.
7.4.2.5 When independence between safety functions is required (see 7.4.2.3 and 7.4.2.4)
then the following shall be documented during the design: the method of achieving
independence, the justification of the method.
7.4.2.7 The developer of the E/E/PE safety-related system shall review the requirements
for safety-related software and hardware to ensure that they are adequately specified.
7.4.2.10 During the design and development activities, the significance (where relevant)
of all hardware and software interactions shall be identified, evaluated and documented.
7.4.3 Requirements for hardware safety integrity
7.4.3.1 Architectural constraints on hardware safety integrity
7.4.3.1.1 Any fault exclusions shall be justified and documented.
7.4.3.2 Requirements for estimating the probability of failure of safety functions due
to random hardware failures
7.4.4 Requirements for the avoidance of failures

7.4.4.3 Maintenance requirements, to ensure the safety integrity of the E/E/PE safety-
related systems is kept at the required level, shall be formalised at the design stage.
7.4.4.5 During the design, E/E/PES integration tests shall be planned and documented.
7.4.5 Requirements for the control of systematic faults
7.4.6 Requirements for system behaviour on detection of a fault
7.4.7 Requirements for E/E/PES implementation
7.4.7.3 The following information shall be available for each safety-related subsystem:
a) a functional specification of those functions and interfaces of the subsystem which can
be used by safety functions;
o) documentary evidence that the subsystem has been validated.
7.4.7.6 A previously developed subsystem shall only be regarded as proven in use when
there is adequate documentary evidence which is based on the previous use of a specific
configuration of the subsystem (during which time all failures have been formally
recorded, see 7.4.7.10), and which takes into account any additional analysis or testing.
7.4.7.10 Only previous operation where all failures of the subsystem have been
effectively detected and reported (for example, when failure data has been collected in
accordance with the recommendations of IEC 60300-3-2) shall be taken into account
when determining whether the above requirements (7.4.7.6 to 7.4.7.9) have been met.
7.4.8 Requirements for data communications

7.5 E/E/PES integration


7.5.2 Requirements
7.5.2.3 The integration of safety-related software into the PES shall be carried out
according to 7.5 of IEC 61508-3.
7.5.2.4 Appropriate documentation of the integration testing of the E/E/PE safety-related
systems shall be produced. If there is a failure, the reasons for the failure and its
correction shall be documented.
7.5.2.5 During the integration and testing, any modifications or change to the E/E/PE
safety-related systems shall be subject to an impact analysis which shall identify all
components affected and the necessary re-verification activities.

7.6 E/E/PES operation and maintenance procedures


7.6.2 Requirements
7.6.2.1 E/E/PES operation and maintenance procedures shall be prepared which shall
specify the following:
c) the documentation which needs to be maintained on system failure and demand rates
on the E/E/PE safety-related systems;
d) the documentation which needs to be maintained showing results of audits and tests on
the E/E/PE safety-related systems;
e) the maintenance procedures to be followed when faults or failures occur in the E/E/PE
safety-related systems, including
– procedures for fault diagnoses and repair,
– procedures for revalidation,
– maintenance reporting requirements;
f) the procedures for reporting maintenance performance shall be specified. In particular:
– procedures for reporting failures,
– procedures for analysing failures;
g) procedures for maintaining the tools and equipment.
7.6.2.2 The E/E/PE safety-related system operation and maintenance procedures shall be
continuously upgraded.

7.7 E/E/PES safety validation


7.7.2 Requirements

7.7.2.1 The validation of the E/E/PES safety shall be carried out in accordance with a
prepared plan (see also 7.7 of IEC 61508-3).
7.7.2.3 Each safety function specified in the requirements for E/E/PES safety (see 7.2),
and all the E/E/PES operation and maintenance procedures shall be validated by test
and/or analysis.
7.7.2.4 Appropriate documentation of the E/E/PES safety validation testing shall be
produced.
7.7.2.5 When discrepancies occur (i.e. the actual results deviate from the expected results
by more than the stated tolerances), the results of the E/E/PES safety validation testing
shall be documented.

7.8 E/E/PES modification


7.8.2 Requirements
7.8.2.1 Appropriate documentation shall be established and maintained for each E/E/PES
modification activity.
7.8.2.2 Manufacturers or system suppliers shall maintain a system to initiate changes as a
result of defects being detected.
7.8.2.4 After modification, the E/E/PE safety-related systems shall be reverified and
revalidated.

7.9 E/E/PES verification


7.9.2 Requirements
7.9.2.1 The verification of the E/E/PE safety-related systems shall be planned
concurrently with the development (see 7.4), for each phase of the E/E/PES safety
lifecycle, and shall be documented.
7.9.2.6 The result of each verification activity shall be documented.
7.9.2.7 For E/E/PES safety requirements verification, after E/E/PES safety requirements
have been established (see 7.2), and before the next phase (design and development)
begins, verification shall
b) check for incompatibilities between
– the E/E/PES safety requirements (7.2),
– the safety requirements allocation (IEC 61508-1),
– the E/E/PES tests (see 7.4), and
– the user documentation and all other system documentation.
7.9.2.8 For E/E/PES design and development verification, after E/E/PES design and
development (see 7.4) has been completed and before the next phase (integration) begins,
verification shall
c) check for incompatibilities between
– the E/E/PES safety requirements (7.2),
– the E/E/PES design and development (7.4), and
– the E/E/PES tests (see 7.4).
7.9.2.9 For E/E/PES integration verification, the integration of the E/E/PE safety-related
systems shall be verified to establish that the requirements of 7.5 have been achieved.
7.9.2.10 Test cases and their results shall be documented.

8 Functional safety assessment


The requirements for functional safety assessment are as detailed in clause 8 of IEC
61508-1.

Annex A
(normative)
Techniques and measures for E/E/PE safety-related systems: control of
failures during operation

Procedural and organisational techniques and measures are necessary throughout the
E/E/PES safety lifecycle to avoid introducing faults.

Annex B
(normative)
Techniques and measures for E/E/PE safety-related systems: avoidance
of systematic failures during the different phases of the lifecycle

Annex C
(normative)
Diagnostic coverage and safe failure fraction
Carry out a failure mode and effect analysis.

15 Annex 6: Extracted version of IEC 61508-3


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

2 Normative references
• IEC 61508-1:1998,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 1: General requirements
• IEC 61508-2,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 2: Requirements for electrical/electronic/programmable
electronic safety-related systems
• IEC 61508-4:1998,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 4: Definitions and abbreviations of terms
• IEC 61508-5:1998,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 5: Examples of methods for the determination of safety
integrity levels
• IEC 61508-6,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 6: Guidelines on the application of parts 2 and 3
• IEC 61508-7,
Functional safety of electrical/electronic/programmable electronic safety-
related systems – Part 7: Overview of techniques and measures
• ISO/IEC Guide 51:1990,
Guidelines for the inclusion of safety aspects in standards
• IEC Guide 104:1997,
Guide to the drafting of safety standards, and the role of Committees with safety
pilot functions and safety group functions

6 Software quality management system


6.2 Requirements
6.2.1 The requirements are as detailed in 6.2 of IEC 61508-1.
6.2.2 The functional safety planning shall define the strategy for the software
procurement, development, integration, verification, validation and modification to the
extent required by the safety integrity level of the E/E/PE safety-related system.
6.2.3 Software configuration management should
c) maintain all configuration items, at least the following: safety analysis and
requirements; software specification and design documents; software source code
modules; test plans and results; pre-existing included software components and packages;
all tools and development environments.
d) document modification requests, and document the (partial) integration testing which
justifies the baseline (see 7.8);
e) document configuration status, release status and modifications;
f) formally document the release of safety-related software.
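The configuration items of 6.2.3 c) and the formal release requirement of 6.2.3 f) can be sketched as a simple release gate; the item names follow the clause, but the gate logic is our own illustration:

```python
# Sketch (ours, not mandated by IEC 61508-3): a release gate over the
# configuration items that 6.2.3 c) says shall be maintained.
REQUIRED_ITEMS = {
    "safety analysis and requirements",
    "software specification and design documents",
    "software source code modules",
    "test plans and results",
    "pre-existing software components and packages",
    "tools and development environments",
}

def ready_for_release(baseline: set, release_documented: bool) -> bool:
    """A baseline may be released only if every required configuration
    item is present and the release is formally documented (6.2.3 f))."""
    return REQUIRED_ITEMS <= baseline and release_documented
```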

7 Software safety lifecycle requirements


7.1 General

7.1.2 Requirements
7.1.2.1 A safety lifecycle for the development of software shall be selected and specified
during safety planning in accordance with clause 6 of IEC 61508-1.
7.1.2.4 A V-model is used.
7.1.2.8 If at any stage of the software safety lifecycle, a change is required pertaining to
an earlier lifecycle phase, then that earlier safety lifecycle phase and the following phases
shall be repeated.

7.2 Software safety requirements specification


7.2.2 Requirements
7.2.2.5 The software developer shall establish procedures for resolving any disagreements
over the assignment of the software safety integrity level.

7.3 Software safety validation planning


7.3.2 Requirements
7.3.2.2 The plan for validating the software safety shall be made.
7.3.2.4 As part of the procedure for validating safety-related software, the scope and
contents of the planning shall be reviewed with the assessor if required by the safety
integrity level (see 8.2.12 of IEC 61508-1).

7.4 Software design and development


7.4.2 General requirements
7.4.3 Requirements for software architecture
7.4.4 Requirements for support tools and programming languages
7.4.4.3 The programming language shall have a translator/compiler which has either a
certificate of validation to a recognised national or international standard, or it shall be
assessed to establish its fitness for purpose.
7.4.4.5 Coding standards shall be used for the development of all safety-related software.
7.4.5 Requirements for detailed design and development
7.4.5.2 The following information should be available prior to the start of detailed design:
the specification of requirements for software safety (see 7.2); the description of the
software architecture design (see 7.4.3); the plan for validating the software safety (see
7.3).
7.4.6 Requirements for code implementation
7.4.7 Requirements for software module testing
7.4.7.3 The results of the software module testing shall be documented.
7.4.7.4 The procedures for corrective action on failure of test shall be specified.
7.4.8 Requirements for software integration testing
7.4.8.1 Software integration tests shall be specified concurrently during the design and
development phase.
7.4.8.4 The results of software integration testing shall be documented.
7.4.8.5 During software integration, any modification or change to the software shall be
subject to an impact analysis which shall determine all software modules impacted, and
the necessary re-verification and re-design activities.

7.5 Programmable electronics integration (hardware and software)


7.5.2 Requirements
7.5.2.1 Integration tests shall be specified during the design and development.
7.5.2.6 During the integration testing of the safety-related programmable electronics
(hardware and software), any modification or change to the integrated system shall be
subject to an impact analysis which shall determine all software modules impacted, and
the necessary re-verification activities.
7.5.2.7 Test cases and their results shall be documented for subsequent analysis.

7.5.2.8 The integration testing of the safety-related programmable electronics (hardware
and software) shall be documented, stating the test results.
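The impact analysis of 7.5.2.6, which must determine all software modules impacted by a change, can be sketched over a module dependency graph; the graph representation and traversal below are our own illustration, not prescribed by the standard:

```python
# Sketch (ours): a change to one module marks it, and every module that
# depends on it directly or transitively, for re-verification.
def impacted(modules: dict, changed: str) -> set:
    """modules maps each module name to the set of modules it depends on.
    Returns the changed module plus all transitive dependants."""
    result = {changed}
    grew = True
    while grew:  # iterate until no new dependants are found
        grew = False
        for module, deps in modules.items():
            if module not in result and deps & result:
                result.add(module)
                grew = True
    return result
```

For example, with c depending on b and b depending on a, a change to a would mark all three for re-verification, leaving an unrelated module d untouched.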

7.6 Software operation and modification procedures


7.6.1 Objective
7.6.2 Requirements
The requirements are given in 7.6 of IEC 61508-2 and 7.8 of this standard.

7.7 Software safety validation


7.7.2 Requirements
7.7.2.2 The validation activities shall be carried out as specified during software safety
validation planning (see 7.3).
7.7.2.3 The results of software safety validation shall be documented.
7.7.2.6 The supplier and/or developer shall make available the documented results of the
software safety validation and all pertinent documentation to the system developer, to
enable the requirements of IEC 61508-1 and IEC 61508-2 to be met.
7.7.2.7 All equipment used for validation shall be qualified.
7.7.2.8 Test cases and their results shall be documented for subsequent analysis and
independent assessment as required by the safety integrity level (see 8.2.12 of IEC 61508-
1).

7.8 Software modification


7.8.2 Requirements
7.8.2.1 Prior to carrying out any software modification, software modification procedures
shall be made available (see 7.16 of IEC 61508-1).
7.8.2.2 A modification shall be initiated only on the issue of an authorised software
modification request under the procedures specified during safety planning (see clause 6).
7.8.2.4 The impact analysis results of the proposed software modification shall be
documented.
7.8.2.5 All modifications which have an impact on the functional safety of the E/E/PE
safety-related system shall initiate a return to an appropriate phase of the software safety
lifecycle. All subsequent phases shall then be carried out in accordance with the
procedures specified for the specific phases in accordance with the requirements in this
standard. Safety planning (see clause 6) should detail all subsequent activities.
7.8.2.8 Details of all modifications shall be documented.
7.8.2.9 Information (for example a log) on the details of all modifications shall be
documented. The documentation shall include the reverification and revalidation of data
and results.
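The record-keeping requirements of 7.8.2 (authorised request, documented impact analysis, lifecycle re-entry, re-verification) can be pictured as a small data structure. The following sketch is purely illustrative; the field names and checks are assumptions, not part of IEC 61508-3:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ModificationRecord:
    """One entry in a software modification log (hypothetical fields)."""
    request_id: str                # authorised modification request (7.8.2.2)
    impact_analysis: str           # documented impact analysis results (7.8.2.4)
    affects_safety: bool           # safety-impacting changes re-enter the lifecycle (7.8.2.5)
    reentry_phase: Optional[str]   # phase re-entered, if safety is affected
    reverification: List[str] = field(default_factory=list)  # re-verification results (7.8.2.9)

def validate(rec: ModificationRecord) -> List[str]:
    """Flag records that do not satisfy the documentation requirements."""
    problems = []
    if not rec.impact_analysis:
        problems.append("missing impact analysis (7.8.2.4)")
    if rec.affects_safety and rec.reentry_phase is None:
        problems.append("safety-impacting change without lifecycle re-entry (7.8.2.5)")
    if rec.affects_safety and not rec.reverification:
        problems.append("no re-verification results documented (7.8.2.9)")
    return problems
```

Such a check only enforces that the required records exist; judging their adequacy remains a task for assessment.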

7.9 Software verification


7.9.2 Requirements
7.9.2.1 The verification of software shall be planned (see 7.4) concurrently with the
development, for each phase of the software safety lifecycle, and this information shall be
documented.
7.9.2.6 All essential information from phase N of the software safety lifecycle needed for
the correct execution of the next phase N+1 shall be available and should be verified.
7.9.2.7 Subject to 7.1.2.1, the following verification activities shall be performed:
a) verification of software safety requirements (see 7.9.2.8)
b) verification of software architecture (see 7.9.2.9)
c) verification of software system design (see 7.9.2.10)
d) verification of software module design (see 7.9.2.11)
e) verification of code (see 7.9.2.12)
f) data verification (see 7.9.2.13)
g) software module testing (see 7.4.7)
h) software integration testing (see 7.4.8)
i) programmable electronics integration testing (see 7.5)
j) software safety requirements testing (software validation) (see 7.7)
7.9.2.12 The source code shall be verified by static methods to ensure conformance to the
specified design of the software module (see 7.4.5), the required coding standards (see
7.4.4), and the requirements of safety planning (see 7.3).
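Static verification of source code (7.9.2.12) means checking the code against its design and coding standards without executing it. As a toy illustration only, the sketch below flags one hypothetical coding-standard rule violation (bare `except:` handlers) in Python source; real projects would use qualified static-analysis tools and their own rule sets:

```python
import ast

def check_no_bare_except(source: str) -> list:
    """Return line numbers of bare 'except:' handlers -- a stand-in for one
    rule of a project coding standard (7.4.4), checked statically."""
    tree = ast.parse(source)
    violations = []
    for node in ast.walk(tree):
        # An ExceptHandler with no exception type is a bare 'except:'
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            violations.append(node.lineno)
    return violations
```

The point is only that conformance to coding rules can be checked mechanically on the source text, which is what "static methods" require.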

8 Functional safety assessment


8.1 The objective and requirements of clause 8 of IEC 61508-1 apply to the assessment of
safety-related software.

8.2 Unless otherwise stated in application sector international standards, the minimum
level of independence of those carrying out the functional safety assessment shall be as
specified in 8.2.12 of IEC 61508-1.

Annex A
(normative)
Guide to the selection of techniques and measures
The ranking of the techniques and measures is linked to the concept of effectiveness used
in IEC 61508-2. For a particular application, the appropriate combination of techniques or
measures is to be stated during safety planning.

16 Annex 7: Extracted version of IEC 61713


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

1 Scope
This guide is intended to be used to support IEC 60300-3-6 and overall software life-
cycle as defined in ISO/IEC 12207.

2 Normative references
• IEC 60050(191),
International Electrotechnical Vocabulary (IEV) – Chapter 191: Dependability
and quality of service
• IEC 60300-2:1995,
Dependability management – Part 2: Dependability programme elements and
tasks
• IEC 60300-3-6:1997,
Dependability management – Part 3: Application guide – Section 6: Software
aspects of dependability
• IEC 61160,
Formal design review
• ISO/IEC 12207,
Information technology – Software life cycle processes
• ISO 8402,
Quality management and quality assurance – Vocabulary.

5 Dependability activities in the primary software processes


5.1.1 Specification of dependability requirements
5.1.2 Selection of supplier
If the supplier is being considered as a potential software developer, his development
process activities should be assessed according to the guidelines given in 5.3. If the
supplier is a software developer, then his maintenance support organization structure and
product support procedures should be assessed in accordance with the maintenance
process dependability guidelines in 5.5.

New software product:


If the supplier is a software developer, then the methods he will use to measure
dependability as defined in the requirement specification should be assessed. If software
is to be developed, the supplier's software development and implementation process
should be reviewed in accordance with the development process dependability guidelines
(see 5.3).

Information on software process maturity should be made available for review, if applicable.


5.1.3 Preparation of contracts
5.1.4 Supplier monitoring
Monitoring of the supplier’s review, audit, verification and validation activities (see
clause 6).
5.1.5 Acceptance and completion
Conduct of the acceptance test. The acquirer will accept the deliverable software
from the supplier when all the defined dependability acceptance conditions are satisfied.

5.2 Supply process


5.2.1 Initiation
The supplier should specifically review the dependability requirements of the acquirer
and decide whether to bid or accept.
5.2.2 Preparation of response
5.2.3 Contract
5.2.4 Planning
The supplier should develop and document plans based upon the planning requirements.
Planning items to be considered under Project Management Plans are:
a) Project management planning – project organizational structure, responsibility and
authority – (project management plan)
b) Engineering environment for development, operation or maintenance – (system
engineering management plan-SEMP, software development plan-SDP)
c) Work breakdown structure including software products, services, resources, software
size, etc. – (WBS)
d) Management of the quality characteristics of the software product – (software QA
plan)
e) Reliability program including reliability growth test, software stress testing, etc –
(reliability program plan)
f) Failure mode and criticality analysis, single-point failure and isolation fault – software
FMECA
g) Management of the safety, security, and other critical requirements – (software safety
plan, computer security plan)
h) Subcontractor management plan
i) Verification and validation approach and agent – (IV&V plan)
j) Risk management, which involves potential technical, cost, and schedule risks – (risk
management plan)
k) Training of personnel – (training plan)
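Items a) to k) amount to an inventory of plan documents. A trivial completeness check over that inventory could look as follows; the required set is merely an assumption drawn from the list above, and in practice it would follow the contract:

```python
# Illustrative checklist of the plan documents named in items a)-k) of 5.2.4;
# the actual required set depends on the contract and project.
REQUIRED_PLANS = {
    "project management plan", "SEMP", "SDP", "WBS",
    "software QA plan", "reliability program plan", "software FMECA",
    "software safety plan", "subcontractor management plan",
    "IV&V plan", "risk management plan", "training plan",
}

def missing_plans(available):
    """Return the plans still to be produced before planning is complete."""
    return REQUIRED_PLANS - set(available)
```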
5.2.5 Execution and control
a) The supplier should implement and execute the project management plans referred to
in 5.2.4.
b) If the supplier is developing a software product, then the dependability activities
defined in 5.3 should be carried out.
c) If the supplier is using software contractors, he will act as an acquirer when acquiring
the software from the contractors. It is important therefore that the supplier, with respect
to his software contractors, should carry out the dependability activities defined in 5.1.
d) If the supplier is operating the software product or service, then the dependability
activities defined in 5.4 should be carried out.
e) If the supplier is maintaining the software product, then the dependability activities
defined in 5.5 should be carried out.
f) If the supplier is supporting the software product, then the support services required to
maintain the specified availability level, or which are the result of a specified requirement
or solution to a requirement, should be identified.
g) The areas of control should include monitoring and controlling the progress of these
dependability requirements, the progress of technical performance, their associate costs,
and schedules and reporting of project status. This section should also include problem
identification relating to the dependability function, recording, analysis and resolution.
5.2.6 Review and evaluation
The supplier organization is responsible for ensuring that the processes are in existence
and functional.
a) The supplier should carry out the review and evaluation activities defined in the project
plan and communicate the documented results to the acquirer if it is specified in the
contract or is required by the acquirer.
b) The review activities should include specific reference to dependability requirements.
c) Satisfaction of dependability requests or specific standards or regulations should be
demonstrated to the acquirer.
5.2.7 Delivery and completion
The acquirer should ensure that all specified or contracted software-related activities have
been completed or have achieved a mutually acceptable state before the software product
or service is delivered.
a) The supplier should deliver the software product or service as specified in the contract
and which has been the subject of the supply process dependability activities described in
5.2.1 to 5.2.6.
b) The supplier should support the delivered software product according to the
dependability activities defined in 5.5.

5.3 Development process


If the acquirer requires the supplier to develop a software product, it is important that all
the relevant activities of the development process (see ISO/IEC 12207) and the
supporting life-cycle processes are carried out. In particular, the review and evaluation
activities of the supporting life-cycle processes (see 5.2.6) should be carried out by the
supplier. Therefore, methods for continuously identifying, prioritizing, monitoring,
managing and tracking risks are recommended. The software developer’s activities should
include the use of prototyping, modelling and simulation.
5.3.1 Process implementation
The process implementation activity defines or selects a software life-cycle model
appropriate to the scope, magnitude and complexity of the project. The activities and
tasks of the development process are mapped onto the selected life-cycle model, and
appropriate tools, methods, software languages and standards are selected by the
developer to enable the activities of the development process to be carried out (see
IEC 60300-3-6). If the developer has a dependability programme (see IEC 60300-2), the
development process for software products should be documented and referenced in the
dependability programme (see IEC 60300-3-6).
5.3.2 System requirement analysis
The purpose is to produce a specification of functions, a specification of dependability
functions and requirements for them. Every relevant regulation or dependability standard
should be taken into account in the system requirements analysis and the dependability
function specification.
5.3.3 System architectural design
5.3.4 Software requirements analysis
To produce a specification of functions for each module or item of software, including
use of standards.
5.3.5 Software architectural design
5.3.6 Software detailed design
a) The developer should have well-documented and established software detailed design
activities.
c) The design input and output data at each stage of the software development process
should be recorded, analysed and documented for use in project review and to assist
product design improvements.
d) The procedures for design change control and reviews should be specified and
implemented (see IEC 61160).
e) The design verification process should be documented and implemented.
g) User documentation should be developed and updated as required in conjunction with
the software. It should be consistent with the revision level of the software.
5.3.7 Software coding and testing
a) The developer should have well-documented and established software coding and
testing procedures.
b) The procedure for software product test and validation should be documented and
should include a corrective action procedure.
c) There should be a fully documented, coordinated procedure for reporting software
defects and tracking their subsequent correction.
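Item c) asks for a documented, coordinated procedure for reporting defects and tracking their correction. One way to picture such tracking is as a small state machine; the statuses and transitions below are hypothetical, not taken from the guide:

```python
# Hypothetical defect-report workflow for 5.3.7 c): the statuses and the
# transitions a tracking procedure might permit.
ALLOWED = {
    "open":      {"analysed"},
    "analysed":  {"corrected", "rejected"},
    "corrected": {"verified"},
    "verified":  {"closed"},
    "rejected":  {"closed"},
}

class DefectReport:
    def __init__(self, report_id: str):
        self.report_id = report_id
        self.status = "open"
        self.history = ["open"]   # auditable trail, as full documentation requires

    def move_to(self, new_status: str) -> None:
        """Advance the report; disallowed jumps (e.g. open -> closed) are refused."""
        if new_status not in ALLOWED.get(self.status, set()):
            raise ValueError(f"{self.status} -> {new_status} not permitted")
        self.status = new_status
        self.history.append(new_status)
```

The refusal of disallowed jumps is what makes the procedure "coordinated": a defect cannot be closed without passing through correction and verification.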
5.3.8 Software integration
The integration test results should be presented in an auditable form. The procedure for
software integration, system test and installation should be fully documented.
5.3.9 Software qualification testing
a) The developer should conduct qualification testing in accordance with any specific
dependability qualification requirements.
b) The developer should evaluate test coverage and conformance with any dependability
requirements.
c) There should be a full programme of technical reviews, internal audits and change
control reviews for any specific qualification tests.
5.3.10 System integration
a) The procedure for system integration and test should be fully documented.
b) The system integration test results should be subject to a full programme of technical
reviews, internal and external audits.
5.3.11 System qualification testing
The qualification test results should be audited and a qualification report prepared.
5.3.12 Software installation
a) The developer should install the software product or system according to the
installation documentation and verify that it is installed and operating as required.
b) Conformance with any specified installation related dependability requirements should
be verified and documented.
5.3.13 Software acceptance support
Given by the developer.

5.4 Operation process


5.4.1 Process implementation
a) The operation procedures should be documented and available to the users. The
manuals should be kept up to date via an active user document update service.
c) Customers' complaints during system operation and servicing should be retained via a
documented reporting procedure and analysed for prompt corrective action where
appropriate.
d) Procedures should be defined which specify any routine actions (for example backing
up or initialising data, start-up or shutdown actions) which are necessary in order to meet
the identified dependability functions.
f) Procedures should be defined which specify how the software should be tested in its
operational environment and how the results of the testing are to be linked to maintenance
actions or to any reliability growth programme that is being implemented.
5.4.2 Operational testing
a) Assessment of the system dependability will require the establishment of a framework
for collection of data, selection of the appropriate software reliability assessment
technique and comparison of the assessed dependability with the specified system
dependability function requirements.
c) Records should be kept of any changes in operational procedure so that any related
changes in system reliability can be identified and operational procedures improved.
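Item a) calls for collecting operational data and applying a reliability assessment technique. The crudest such technique is an observed-MTBF point estimate; the sketch below illustrates only that simple case, and a real assessment would apply proper reliability-growth models:

```python
def observed_mtbf(failure_times_h, total_hours: float) -> float:
    """Crude point estimate of mean time between failures from operational
    data: total operating time divided by the number of failures observed.
    'failure_times_h' is the list of failure timestamps (hours) collected
    under the data-collection framework of 5.4.2 a)."""
    if not failure_times_h:
        raise ValueError("no failures observed; this simple estimate is undefined")
    return total_hours / len(failure_times_h)
```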
5.4.3 System operation
a) A check for compliance with those dependability functions that specify the type and
frequency of the maintenance support activities should be carried out.
b) The user documentation should be checked, if possible using an audit. The results,
with a report of any corrective actions, should be reported to the acquirer.
c) Records should be kept of any changes in operational procedure after delivery.

5.4.4 User support


c) The supplier should have a procedure for receiving problem reports or enhancement
requests from the user, generating solutions and implementing the resulting software
update on the user’s system.
e) The supplier should record all support incidents for analysis and feedback to the
development and maintenance processes.
f) The supplier should provide a full programme of user training.

5.5 Maintenance process


5.5.1 Process implementation
The process implementation activity consists of the tasks that enable the maintainer to
develop, document and execute procedures for carrying out the activities of the
maintenance process described in 5.5.2 to 5.5.6. The supplier should ensure that there is a
set of documented procedures in place for receiving, recording and tracking problem
reports and modification requests from users and providing feedback to users. The
supplier should ensure that the documented procedures are being implemented by both
user and supplier, including training.
5.5.2 Problem and modification analysis
This concerns analysis of the problem report or modification request. The supplier should
check for compliance with the original dependability function requirements. The
maintainer should document each problem report or modification request, the results of
the analysis and the implementation options developed from the analysis results.
5.5.3 Modification implementation
Analysis to determine which documentation and associated software items need to be
modified followed by implementation of the identified modifications, test and evaluation
of the modified software and documentation items. The development process and all its
associated activities (see 5.3) should be used.
5.5.4 Maintenance review/acceptance
Carried out by the maintainer to determine the integrity of the modified system and to
obtain approval for completion of the modification. A review of the integration,
qualification and installation test results for dependability function compliance should be
carried out following completion of the modification implementation activity tasks (see
5.5.3).
5.5.5 Migration
This applies when migration from a previous to a new operational environment is to be
made. Migration is made in the same way as problem and modification analysis (5.5.2),
modification implementation (5.5.3) and maintenance review/acceptance (5.5.4). A
migration plan should be developed, documented and executed. The user should be given
full information concerning the migration.
5.5.6 Software retirement
A retirement plan should be developed and documented. The user should include
archiving of the retiring software product, documentation and data in the retirement plan.

6 Dependability activities in the supporting software processes


The supporting software life-cycle processes include documentation, configuration
management, quality assurance, verification, validation, joint review, audit, and problem
resolution.

a) The information produced by the life-cycle processes should be documented and


maintained to the standards defined in the documentation process.
c) The joint review, audit and problem resolution processes activities should form an
integral part of all the software life-cycle processes.
d) The verification process determines whether the software has been produced according
to the requirements and conditions defined in the software life-cycle processes and the
validation process determines whether the software has been made according to the
specified requirements and should be carried out as the software progresses through its
life cycle via the primary and supporting processes. A final verification and validation
should form part of the product acceptance plan.

7 Dependability activities in the organizational software life-cycle processes


The organizational life-cycle processes include management, infrastructure, improvement
and training processes.

a) The management process contains the activities and tasks used to carry out product
management, project management and task management of the primary and supporting
processes.
b) The infrastructure includes hardware, software, tools, techniques, and standards for
development, operation or maintenance of the software.
c) The improvement process is a process for establishing, assessing, measuring,
controlling and improving a software life-cycle process. The process activities establish a
suite of organizational processes required to implement the primary and supporting
processes, set up an assessment mechanism to ensure their continuing effectiveness and
implement improvements considered necessary as a result of the assessment.
d) The training process provides and maintains trained personnel for the acquisition,
supply, development, operation and maintenance processes. The acquirer should confirm
that the supplier of the software has planned and implemented a training programme
including training documentation. The training plan should be reviewed for compliance
with requirements.

Annex B
(informative)
Interaction of users with primary software life-cycle processes

Acquisition process:
• Acquirer: performs acquisition activities; negotiates and signs contract
• Supplier: assesses statement of work
• Operator: ensures requirement specification meets operator needs

Supply process:
• Acquirer: signs contract with supplier; monitors supplier’s activities
• Supplier: performs supply activities

Development process:
• Acquirer: witnesses qualification testing
• Supplier: reviews execution of development procedures
• Developer: performs development activities

Operation process:
• Acquirer: reviews performance of operational system
• Developer: defines operating procedures
• Operator: performs operation activities

Maintenance process:
• Supplier: provides maintenance support for acquirer
• Developer: makes software maintainable (e.g. good code structure and documentation)
• Operator: performs version control and updates training for operators
• Maintainer: performs maintenance activities

17 Annex 8: Extracted version of RTCA/DO-178B
The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

1.0 INTRODUCTION
Words such as "shall" and "must" are avoided. Generally system processes are not
considered.

2.0 SYSTEM ASPECTS RELATING TO SOFTWARE DEVELOPMENT


The flow of information between the system safety assessment process and the system
design process, as described in these sections, is iterative. System requirements are input
to the software life cycle processes.
2.1.2 Information Flow from Software Processes to System Processes
Information aspects are stated.

2.2 Failure Condition and Software Level


2.2.3 Software Level Determination

2.3 System Architectural Considerations


2.3.1 Partitioning
2.3.2 Multiple-Version Dissimilar Software
2.3.3 Safety Monitoring

2.4 System Considerations for User-Modifiable Software, Option-Selectable


Software and Commercial Off-The-Shelf Software
Conditions for user modification are given. COTS software included in airborne systems
or equipment should satisfy the objectives of this document and augmented if necessary.

2.5 System Design Considerations for Field-Loadable Software

2.6 System Requirements Considerations for Software Verification


The system requirements should be developed into software high-level requirements that
are verified by the software verification process activities.

2.7 Software Considerations in System Verification

3.0 SOFTWARE LIFE CYCLE


Integral processes are performed concurrently with the software development processes
throughout the software life cycle. A project defines one or more software life cycle(s) by
choosing the activities for each process, specifying a sequence for the activities, and
assigning responsibilities for the activities. Results of a prototype evaluation are used to
refine the requirements and to mitigate development and technical risks. The various parts
of the selected software life cycle are tied together with a combination of incremental
integration process and software verification process activities. Transition criteria are
used to determine whether a process may be entered or re-entered, i.e. iterative. Every
input to a process need not be complete before that process can be initiated, if the
transition criteria established for the process are satisfied. If a process acts on partial
inputs, subsequent inputs to the process should be examined to determine that the
previous outputs of the software development and software verification processes are still
valid.
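Transition criteria acting as entry gates for processes can be sketched as a simple predicate; the criteria shown are invented for illustration and are not taken from DO-178B:

```python
# Illustrative transition-criteria gate: a process may be entered or
# re-entered when every criterion established for it is satisfied, even
# if some of its inputs are still only partial.
def may_enter(process: str, satisfied: set, criteria: dict) -> bool:
    """True if all transition criteria defined for 'process' are satisfied."""
    return criteria[process] <= satisfied

# Hypothetical criteria for a hypothetical 'coding' process.
criteria = {
    "coding": {"requirements baselined", "design reviewed"},
}
```

If the process later acts on partial inputs, the surrounding text still requires the subsequent inputs to be examined so that earlier outputs remain valid; the gate alone does not discharge that obligation.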

4.0 SOFTWARE PLANNING PROCESS


4.1 Software Planning Process Objectives
To define the means of producing software.

4.2 Software Planning Process Activities


Software development standards should be defined, and methods and tools chosen.
Aspects on user-modifiable code are included. Provide definition of and coordination
between the software development and integral processes. The software planning process
should include a means to revise the software plans. The software plans and software
development standards should be under change control and reviewed.

4.3 Software Plans


The software life cycle(s), including the inter-relationships between the processes, their
sequencing, feedback mechanisms, and transition criteria are determined. Development
and revision of the software plans are coordinated. Software plans specify the
organizations that will perform those activities. Quality aspects of software plans are
addressed. The software plans comprise the Plan for Software Aspects of Certification,
the Software Development Plan, the Software Verification Plan, the Software
Configuration Management Plan and the Software Quality Assurance Plan.

4.4 Software Life Cycle Environment Planning


Define the methods, tools, procedures, programming languages and hardware that will be
used to develop, verify, control and produce the software life cycle data (section 11) and
software product.
4.4.1 Software Development Environment
Software development environment methods and tools should be chosen. Specification
and documentation aspects on use of tools are included. Software verification process
activities or software development standards, which include consideration of the software
level, should be defined.
4.4.2 Language and Compiler Consideration
The software planning process considers language and compiler features when choosing a
programming language and planning for verification. The verification planning should
provide a means of reverification.
4.4.3 Software Test Environment
Define the methods, tools, procedures and hardware that will be used to test the outputs of
the integration process. The differences between the target computer and the emulator or
simulator, and the effects of these differences on the ability to detect errors and verify
functionality, should be considered. Detection of those errors should be provided by other
software verification process activities and specified in the Software Verification Plan.

4.5 Software Development Standards


Included are the Software Requirements Standards, the Software Design Standards and
the Software Code Standards. The software verification process uses these standards as a
basis.

4.6 Review and Assurance of the Software Planning Process


Each process produces evidence that its outputs can be traced to their activity and inputs.

5.0 SOFTWARE DEVELOPMENT PROCESSES


5.1 Software Requirements Process
Inputs include the system requirements, the hardware interface and system architecture
from the system life cycle process, and the Software Development Plan and the Software
Requirements Standards from the software planning process. Output is the software high-
level requirements, the Software Requirements Data (subsection 11.9). Derived high-
level requirements are indicated to the system safety assessment process. Incorrect inputs
should be reported as feedback to the input source processes for clarification or
correction.

5.2 Software Design Process


The software design process inputs are the Software Requirements Data, the Software
Development Plan and the Software Design Standards. The software high-level
requirements are refined through one or more iterations in the software design process to
make the Design Description (subsection 11.10) that includes the software architecture
and the low-level requirements (both should be traceable) that can be used to implement
Source Code. Derived low-level requirements are checked and provided to the system
safety assessment process. Feedback at incorrect inputs is provided to the system life
cycle process, the software requirements process, or the software planning process. Extra
requirements included concerning User-Modifiable Software and hierarchy for
requirements.

5.3 Software Coding Process


The coding process inputs are the low-level requirements and software architecture from
the software design process, and the Software Development Plan and the Software Code
Standards. The Source Code (subsection 11.11) and object code are produced by this
process based upon the software architecture and the low-level requirements. The Source
Code should be traceable to the Design Description. Inadequate or incorrect inputs
detected during the software coding process should be provided to the software
requirements process, software design process or software planning process as feedback
for clarification or correction.

5.4 Integration Process


The inputs are the software architecture from the software design process, and the Source
Code and object code from the software coding process. The outputs are the Executable
Object Code, as defined in subsection 11.12, and the linking and loading data (subsection
11.16). Feedback at incorrect inputs is provided to the software requirements process, the
software design process, the software coding process or the software planning process.
Additional requirements for patches and deactivated code are included. The methods for
handling deactivated code should comply with the software plans. The SCM process
should effectively track the patch. Provide evidence that the patch satisfies all objectives
of the software developed by normal methods. Justification in the Software
Accomplishment Summary for the use of a patch is included.

5.5 Traceability
Software traceability concerns both directions, system requirement <=> high-level
requirement. Traceability between system requirements and software requirements,
traceability between the low-level requirements and high-level requirements, and
traceability between Source Code and low-level requirements should be provided.
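A traceability requirement like this can be checked mechanically: every item at one level must trace to at least one item at the adjacent level, in both directions. A minimal sketch of such a check (the identifiers are illustrative, not DO-178B artefacts):

```python
# 'links' maps a higher-level requirement to the lower-level requirements
# that implement it, e.g. high-level requirement -> low-level requirements.
def untraced(links: dict, high: set, low: set):
    """Return (high-level items with no trace down, low-level items with no
    trace up); both sets must be empty for full bidirectional traceability."""
    down = {h for h in high if not links.get(h)}
    up = low - set().union(*links.values())
    return down, up
```

The same check applies unchanged between system and software requirements, or between low-level requirements and source code modules.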

6.0 SOFTWARE VERIFICATION PROCESS


To detect and report errors. Removal of the errors is an activity of the software
development processes.

6.2 Software Verification Process Activities


The inputs include the system requirements, the software requirements and architecture,
traceability data, Source Code, Executable Object Code, and the Software Verification
Plan. The outputs are recorded in Software Verification Cases and Procedures (subsection
11.13) and Software Verification Results (subsection 11.14). This includes traceability
between the software requirements and the test cases and between the code structure and
the test cases. Further aspects on traceability and quality aspects on verification are
described. Deficiencies and errors should be reported to the software development
processes for clarification and correction.

6.3 Software Reviews and Analyses


6.3.1 Reviews and Analyses of the High-Level Requirements
Detect and report requirements errors introduced during the software requirements
process. Quality aspects of high-level requirements, including traceability from system
requirements, and conformance to the Requirements Standards are included.
6.3.2 Reviews and Analyses of the Low-Level Requirements
Detect and report requirements errors introduced during the software design process.
Quality aspects of low-level requirements, including traceability from high-level
requirements, and conformance to the Software Design Standards are included.
6.3.3 Reviews and Analyses of the Software Architecture
Detect and report errors introduced during the development of the software architecture.
This includes compatibility with high-level requirements. Quality aspects of the software
architecture and conformance to the Software Design Standards are included.
6.3.4 Reviews and Analyses of the Source Code
Detect and report errors introduced during the software coding process. Compliance with
the low-level requirements and software architecture is important. Quality aspects of the
Source Code, including traceability from low-level requirements, and conformance to the
Software Code Standards are included.
6.3.5 Reviews and Analyses of the Outputs of the Integration Process
6.3.6 Reviews and Analyses of the Test Cases, Procedures and Results

6.4 Software Testing Process


6.4.1 Test Environment
6.4.2 Requirements-Based Test Case Selection
6.4.3 Requirements-Based Testing Methods
6.4.4 Test Coverage Analysis

7.0 SOFTWARE CONFIGURATION MANAGEMENT (SCM) PROCESS


The SCM process is applied as defined by the software planning process (section 4) and
the Software Configuration Management Plan (subsection 11.4). Outputs of the SCM
process are recorded in Software Configuration Management Records (subsection 11.18)
or in other software life cycle data.

7.1 Software Configuration Management Process Objectives


The SCM process works in cooperation with the other software life cycle processes.

7.2 Software Configuration Management Process Activities


The SCM process continues throughout the service life of the airborne system or
equipment.
7.2.1 Configuration Identification
Applied to each item of software life cycle data and each configuration item.
7.2.2 Baselines and Traceability
A software product baseline should be defined in the Software Configuration Index
(subsection 11.16). A baseline or configuration item should be traceable either to the
output it identifies or to the process with which it is associated.
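The configuration identification and baseline activities above can be pictured with a minimal record structure; the field names and values are illustrative assumptions, not terms defined by the standard.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ConfigItem:
    """A configuration item: uniquely identified software life cycle data."""
    item_id: str   # e.g. a document or source-file identifier (illustrative)
    version: str

@dataclass
class Baseline:
    """A named, frozen set of configuration items (cf. the Software
    Configuration Index)."""
    name: str
    items: frozenset = field(default_factory=frozenset)

    def trace(self, item_id):
        # A baseline should be traceable to the items it identifies.
        return [ci for ci in self.items if ci.item_id == item_id]

product = Baseline("SW-1.0", frozenset({
    ConfigItem("SRS", "B"),
    ConfigItem("main.c", "12"),
}))
print([ci.version for ci in product.trace("main.c")])  # ['12']
```

Freezing the item set mirrors the idea that a baseline, once established, is only changed through the change control activity.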
7.2.3 Problem Reporting, Tracking and Corrective Action
Problem reports are handled according to subsection 11.17. Problem reports that require corrective
action should invoke the change control activity.
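The problem reporting, tracking and corrective-action flow can be sketched as a small state machine; the state names and the transition rule are an illustrative assumption, not part of the standard.

```python
# Illustrative problem-report life cycle: a report requiring corrective
# action must pass through change control before it can be closed.
ALLOWED = {
    "open":              {"analyzed"},
    "analyzed":          {"in_change_control", "closed"},  # direct close only if no change needed
    "in_change_control": {"verified"},
    "verified":          {"closed"},
}

class ProblemReport:
    def __init__(self, report_id):
        self.report_id = report_id
        self.state = "open"
        self.history = ["open"]

    def move_to(self, new_state):
        # Reject transitions that skip tracking or change control steps.
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

pr = ProblemReport("PR-042")
pr.move_to("analyzed")
pr.move_to("in_change_control")  # corrective action invokes change control
pr.move_to("verified")
pr.move_to("closed")
print(pr.history)
```

The recorded history is the kind of audit trail the SCM Records (subsection 11.18) are meant to preserve.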

7.2.4 Change Control


Software life cycle processes should be repeated from the point at which the change
affects their outputs. Throughout the change activity, software life cycle data affected by
the change should be updated and records should be maintained for the change control
activity. The change control activity is aided by the change review activity.
7.2.5 Change Review
This includes assessment of the impact on safety-related requirements with feedback to
the system safety assessment process.
7.2.6 Configuration Status Accounting
This includes definition of the data to be maintained and the means of recording and
reporting status of this data.
7.2.7 Archive, Retrieval and Release
7.2.8 Software Load Control
Records should be kept that confirm software compatibility with the airborne system or
equipment hardware.
7.2.9 Software Life Cycle Environment Control
The software life cycle environment tools are defined by the software planning process
and identified in the Software Life Cycle Environment Configuration Index (subsection
11.15). Configuration identification should be established for the Executable Object Code
(or equivalent) of the tools used to develop, control, build, verify, and load the software.
The SCM process for controlling qualified tools should comply with the objectives
associated with Control Category 1 or 2 data (subsection 7.3), as specified in paragraph
12.2.3, item b.
c. Unless 7.2.9 item b applies, the SCM process for controlling the
Executable Object Code (or equivalent) of tools used to build and load the software
should comply with the objectives associated with Control Category 2 data, as a
minimum.

7.3 Data Control Categories


Two categories are defined: Control Category 1 (CC1) and Control Category 2 (CC2). These categories are related to
the configuration management controls placed on the data.

8.0 SOFTWARE QUALITY ASSURANCE PROCESS


The SQA process is applied as defined by the software planning process (section 4) and
the Software Quality Assurance Plan (subsection 11.5). Outputs of the SQA process
activities are recorded in Software Quality Assurance Records (subsection 11.19) or other
software life cycle data.

8.1 Software Quality Assurance Process Objectives


Software development processes and integral processes comply with approved software
plans and standards. The transition criteria for the software life cycle processes are
satisfied.

8.2 Software Quality Assurance Process Activities


Deviations from the software plans and standards are detected, recorded, evaluated,
tracked and resolved. Approved deviations are recorded. The SQA process should
produce records of the SQA process activities (subsection 11.19).

8.3 Software Conformity Review

9.0 CERTIFICATION LIAISON PROCESS


The certification liaison process is applied as defined by the software planning process
(section 4) and the Plan for Software Aspects of Certification (subsection 11.1).

9.1 Means of Compliance and Planning


The Plan for Software Aspects of Certification (subsection 11.1) defines the software
aspects of the airborne system or equipment within the context of the proposed means of
compliance. This plan also states the software level(s) as determined by the system safety
assessment process.

9.2 Compliance Substantiation


Submit the Software Accomplishment Summary (subsection 11.20) and Software
Configuration Index (subsection 11.16) to the certification authority.

9.3 Minimum Software Life Cycle Data That Is Submitted to Certification Authority
Plan for Software Aspects of Certification, Software Configuration Index, Software
Accomplishment Summary.

9.4 Software Life Cycle Data Related to Type Design


Unless otherwise agreed by the certification authority, the regulations concerning
retrieval and approval of software life cycle data related to the type design applies to:
Software Requirements Data, Design Description, Source Code, Executable Object Code,
Software Configuration Index, Software Accomplishment Summary

10.0 OVERVIEW OF AIRCRAFT AND ENGINE CERTIFICATION


10.2 Software Aspects of Certification
The certification authority assesses the Plan for Software Aspects of Certification.

10.3 Compliance Determination


For the software, compliance with the certification basis is accomplished by reviewing
the Software Accomplishment Summary.

11.0 SOFTWARE LIFE CYCLE DATA


Software life cycle data, such as minutes and memoranda (if intended to be used for this
purpose), may be placed under control. This data should be defined in the software plan
that defines the process for which the data will be produced.

11.1 Plan for Software Aspects of Certification


Summarizes the justification provided by the system safety assessment process. A
summary of each software life cycle and its processes, the system life cycle processes and
the certification liaison process responsibilities is also given.

11.2 Software Development Plan


It may be included in the Plan for Software Aspects of Certification. A description of the
software life cycle processes to be used to form the specific software life cycle(s) to be
used on the project, including the transition criteria for the software development
processes.

11.3 Software Verification Plan


This includes organizational responsibilities within the software verification process and
interfaces with the other software life cycle processes. The transition criteria for entering
the software verification process are defined in this plan.

11.4 Software Configuration Management Plan.


Establishes the methods to be used to achieve the objectives of the software configuration
management (SCM) process. The transition criteria for entering the SCM process are
defined. A definition of the software life cycle data produced by the SCM process,
including SCM Records, the Software Configuration Index and the Software Life Cycle
Environment Configuration Index. The means of applying SCM process requirements to
sub-tier suppliers.

11.5 Software Quality Assurance Plan


This includes the SQA activities that are to be performed for each software life cycle
process. A list is given below.
• The transition criteria for entering the SQA process.
• The timing of the SQA process activities in relation to the activities of the
software life cycle processes.
• A definition of the records to be produced by the SQA process.
• A description of the means of ensuring that sub-tier suppliers processes and
outputs will comply with the SQA Plan.

11.6 Software Requirements Standards


Define the methods, rules and tools to be used to develop the high-level requirements.
This includes the method to be used to provide derived requirements to the system
process.

11.7 Software Design Standards


Define the methods, rules and tools to be used to develop the software architecture and
low-level requirements.

11.8 Software Code Standards


Define the programming languages, methods, rules and tools to be used to code the
software.

11.9 Software Requirements Data


This includes a definition of the high-level requirements including the derived
requirements.

11.10 Design Description


This includes a definition of the software architecture and the low-level requirements that
will satisfy the software high-level requirements.

11.11 Source Code


Consists of code and the compiler instructions for generating the object code from the
Source Code, and linking and loading data.

11.12 Executable Object Code

11.13 Software Verification Cases and Procedures


Detail how the software verification process activities are implemented.

11.14 Software Verification Results


Produced by the software verification process activities.

11.15 Software Life Cycle Environment Configuration Index (SECI)


Identifies the configuration of the software life cycle environment.

11.16 Software Configuration Index (SCI)


Identifies the configuration of the software product. Reference is given to the Software
Life Cycle Environment Configuration Index (subsection 11.15).

11.17 Problem Reports



Identify and record software product anomalous behavior, process non-compliance with
software plans and standards, and deficiencies in software life cycle data.

11.18 Software Configuration Management Records


The results of the SCM process activities are recorded in SCM Records.

11.19 Software Quality Assurance Records.


The results of the SQA process activities are recorded in SQA Records.

11.20 Software Accomplishment Summary


Showing compliance with the Plan for Software Aspects of Certification. References the
software life cycle data.

12.0 ADDITIONAL CONSIDERATIONS


The intention to use previously developed software is stated in the Plan for Software
Aspects of Certification.

12.1 Use of previously developed software


12.1.1 Modifications to Previously Developed Software
This applies where outputs of the previous software life cycle processes comply with this document.
a. The revised outputs of the system safety assessment process should be reviewed
considering the proposed modifications.
b. If the software level is revised, the guidelines of paragraph 12.1.4 should be
considered.
e. Areas affected by the change should be reverified considering the guidelines of section
6.
12.1.2 Change of Aircraft Installation
b. If functional modifications are required, the guidelines of paragraph 12.1.1, should be
satisfied.
c. If the previous development activity did not produce outputs required to substantiate
the safety objectives of the new installation, the guidelines of paragraph 12.1.4, should be
satisfied.
12.1.3 Change of Application or Development Environment
It may require activities (e.g. tool qualification, tests, reviews) in addition to software life
cycle process activities which address modifications.
12.1.4 Upgrading A Development Baseline
b. Software aspects of certification should be based as determined by the system safety
assessment process.
c. Software life cycle data from a previous development should be evaluated to ensure
that the software verification process objectives of the software level are satisfied for the
new application.
e. If use of product service history is planned the guidelines of paragraph 12.3.5 should be
considered.
f. The applicant should specify the strategy for accomplishing compliance with this
document in the Plan for Software Aspects of Certification.
12.1.5 Software Configuration Management Considerations
In addition to the guidelines of section 7.
a. Traceability from the software product and software life cycle data of the previous
application to the new application.
b. Change control that enables problem reporting, problem resolution, and tracking of
changes to software components used in more than one application.
12.1.6 Software Quality Assurance Considerations
In addition to the guidelines of section 8:

b. Assurance that changes to the software life cycle processes are stated in the software
plans.

12.2 Tool Qualification


A tool qualification process is used.
c. The software configuration management process and software quality assurance
process objectives for airborne software should apply to software tools to be qualified.
The intention to use the tool is stated in the Plan for Software Aspects of Certification.
12.2.1 Qualification Criteria for Software Development Tools
a. If a software development tool is to be qualified, the software development processes
for the tool should satisfy the same objectives as the software development processes of
airborne software.
c. The applicant should demonstrate that the tool complies with its Tool Operational
Requirements (subparagraph 12.2.3.2).
12.2.2 Qualification Criteria for Software Verification Tools
It should be achieved by demonstration that the tool complies with its Tool Operational
Requirements.
12.2.3 Tool Qualification Data
Plan for Software Aspects of Certification should specify the tool to be qualified and
reference the tool qualification data. For software development tools, the tool
qualification data should be consistent with
the data in section 11:
(1) A Tool Qualification Plan describes the tool qualification process and description and
use of the tool and satisfies the same objectives as the Plan for Software Aspects of
Certification.
(2) Tool Operational Requirements satisfies the same objectives as the Software
Requirements Data.
(3) A Tool Accomplishment Summary satisfies the same objectives as the Software
Accomplishment Summary.
12.2.3.2 Tool Operational Requirements
b. User information, such as installation guides and user manuals.
c. A description of the tool's operational environment.
12.2.4 Tool Qualification Agreement
The certification authority gives its agreement to the use of a tool in two steps according
to the list below.
• For software development tools, agreement with the Tool Qualification Plan.
• For software verification tools, agreement with the Plan for Software Aspects of
Certification of the airborne software.
• For software development tools, agreement with the Tool Accomplishment
Summary.
• For software verification tools, agreement with the Software Accomplishment
Summary of the airborne software.

12.3 Alternative Methods


The applicant should specify in the Plan for Software Aspects of Certification, and obtain
agreement from the certification authority.
12.3.1 Formal Methods
12.3.2 Exhaustive Input Testing
12.3.3 Considerations for Multiple-Version Dissimilar Software Verification
If the software verification process is modified evidence should be provided that the
software verification process objectives are satisfied and that equivalent error detection is
achieved for each software version. Changes in software verification process activities
may be agreed with the certification authority, if the changes are substantiated by
rationale that confirms equivalent software verification coverage. The tool qualification
process may be modified.
12.3.4 Software Reliability Models
Rationale for the model should be included in the Plan for Software Aspects of
Certification, and agreed with by the certification authority.
12.3.5 Product Service History
If equivalent safety for the software can be demonstrated by the use of the software's
product service history, some certification credit may be granted. The data described
should be specified in the Plan for Software Aspects of Certification.

Annex A
(Normative)
PROCESS OBJECTIVES AND OUTPUTS BY SOFTWARE LEVEL
Guidelines are given in tables A-1 to A-10.

18 Annex 9: Extracted version of IEC 60601-1-4
The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

1.203 Relationship to other standards


1.203.1 IEC 60601-1
For MEDICAL ELECTRICAL EQUIPMENT, this Collateral Standard complements IEC
60601-1 and its amendments.
1.203.2 Particular Standards
A requirement in a Particular Standard takes priority over the corresponding requirement
in this Collateral Standard.
1.203.3 Normative references
• IEC 60601-1: 1988,
Medical electrical equipment – Part 1: General requirements for safety
Amendment No. 1 (1991) Amendment No. 2 (1995)
• IEC 60601-1-1:1992,
Medical electrical equipment – Part 1: General requirements for safety – 1.
Collateral Standard: Safety requirements for medical electrical systems
• IEC 60788:1984,
Medical radiology – Terminology
• ISO 9000-3:1991,
Quality management and quality assurance standards – Part 3: Guidelines for the
application of ISO 9001 to the development, supply and maintenance of software
• ISO 9001:1994,
Quality systems – Model for quality assurance in design, development,
production, installation and servicing

6 Identification, marking and documents


6.8 ACCOMPANYING DOCUMENTS
6.8.201 All relevant information regarding significant RESIDUAL RISK including
descriptions of the HAZARDS and any actions by the OPERATOR or the USER
necessary to avoid/mitigate them shall be placed in both the INSTRUCTIONS FOR USE
and the RISK MANAGEMENT FILE.

SECTION 9: ABNORMAL OPERATION AND FAULT CONDITIONS;


ENVIRONMENTAL TESTS
52 Abnormal operation and fault conditions
52.201 Documentation
52.201.1 Documents produced from application of this standard shall be maintained and
shall form part of the quality records; see figure 201. This should be done in accordance
with 6.3 of ISO 9000-3.
52.201.2 These documents, herein referred to as the RISK MANAGEMENT FILE, shall
be approved, issued and changed in accordance with a formal configuration management
system. This should be done in accordance with 6.2 of ISO 9000-3.
52.201.3 A RISK MANAGEMENT SUMMARY shall be developed throughout the
DEVELOPMENT LIFE-CYCLE as part of the RISK MANAGEMENT FILE.
Compliance is checked by inspection of the RISK MANAGEMENT FILE.

Figure 201 – Content of RISK MANAGEMENT FILE and RISK MANAGEMENT SUMMARY

52.202 RISK management plan


52.202.1 The MANUFACTURER shall prepare a RISK management plan.
52.202.2 This plan shall include the following.
a) scope of the plan, defining the project or product and the DEVELOPMENT LIFE-
CYCLE phases for which the plan is applicable;
b) the DEVELOPMENT LIFE-CYCLE to be applied (see 52.203), including a
VERIFICATION plan and a VALIDATION plan;
c) management responsibilities in accordance with 4.1 of ISO 9001;
d) RISK management process;
52.202.3 If the plan changes during the course of development, a record of the changes
shall be kept. Compliance is checked by inspection of the RISK MANAGEMENT FILE.

52.203 DEVELOPMENT LIFE-CYCLE


52.203.2 The DEVELOPMENT LIFE-CYCLE shall be divided into phases and tasks,
with a well-defined input, output and activity for each.
52.203.3 The DEVELOPMENT LIFE-CYCLE shall include integral processes for RISK
management.
52.203.4 The DEVELOPMENT LIFE-CYCLE shall include documentation
requirements.
52.203.5 RISK management activities shall apply throughout the DEVELOPMENT
LIFE-CYCLE as appropriate; see 52.204. Compliance is checked by inspection of the
RISK MANAGEMENT FILE.
52.203.6 Where appropriate, a defined system for problem resolution within and between
all phases and tasks of the DEVELOPMENT LIFE CYCLE shall be developed and
maintained as part of the RISK MANAGEMENT FILE. The system may include
reporting of potential or existing SAFETY and/or performance problems.

52.204 RISK management process


52.204.1 A RISK management process includes RISK analysis and RISK control.
52.204.2 The process shall be applied throughout the DEVELOPMENT LIFE-CYCLE.
52.204.3 RISK analysis
52.204.3.1 HAZARD ANALYSIS
52.204.3.1.1 HAZARD identification shall be carried out as defined in the RISK
management plan; see 52.202.
52.204.3.1.6 Matters considered shall include, among other things, user interface and
INSTRUCTIONS FOR USE, third party software.
52.204.3.1.8 The methods used shall be documented in the RISK MANAGEMENT
FILE.
52.204.3.1.9 The results of the application of the methods shall be documented in the
RISK MANAGEMENT FILE.
52.204.3.1.10 Each identified HAZARD and its initiating causes shall be recorded in the
RISK
MANAGEMENT SUMMARY. Compliance is checked by inspection of the RISK
MANAGEMENT FILE.
52.204.3.2 RISK estimation
52.204.3.2.3 The SEVERITY level categorization method shall be recorded in the RISK
MANAGEMENT FILE.
52.204.3.2.4 The likelihood estimation method shall be either quantitative or qualitative
and shall be recorded in the RISK MANAGEMENT FILE.

52.204.3.2.5 The estimated RISK shall be recorded against each HAZARD in the RISK
MANAGEMENT SUMMARY. Compliance is checked by inspection of the RISK
MANAGEMENT FILE.
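The qualitative severity/likelihood estimation referred to above can be illustrated by a simple risk matrix. The category names, the additive scoring rule and the acceptability thresholds below are hypothetical examples, not values from the standard; an actual RISK management plan would define its own.

```python
# Illustrative qualitative risk estimation: severity x likelihood -> risk class.
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
LIKELIHOOD = ["improbable", "remote", "occasional", "probable", "frequent"]

def estimate_risk(severity, likelihood):
    s = SEVERITY.index(severity)
    l = LIKELIHOOD.index(likelihood)
    score = s + l                 # simple additive combination (assumption)
    if score <= 2:
        return "acceptable"
    if score <= 4:
        return "ALARP"            # tolerable if reduced as low as reasonably practicable
    return "unacceptable"

# Record the estimated risk against each identified hazard (hypothetical hazards).
hazards = {
    "H1 overdose": ("catastrophic", "occasional"),
    "H2 delayed alarm": ("marginal", "remote"),
}
summary = {h: estimate_risk(s, l) for h, (s, l) in hazards.items()}
print(summary)
```

The resulting per-hazard classification is the kind of entry the RISK MANAGEMENT SUMMARY records, with the method itself documented in the RISK MANAGEMENT FILE.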
52.204.4 RISK control
52.204.4.4 Adequate USER information on the RESIDUAL RISK.
52.204.4.5 The requirement(s) to control the RISK shall be documented in the RISK
MANAGEMENT SUMMARY (directly or as a cross reference).
52.204.4.6 An evaluation of the effectiveness of the RISK controls shall be recorded in
the RISK MANAGEMENT SUMMARY. Compliance is checked by inspection of the
RISK MANAGEMENT FILE.

52.205 Qualification of personnel


The design and modification of a PEMS shall be considered as an assigned task in
accordance with 4.18 of ISO 9001. Compliance is checked by inspection of the
appropriate files.

52.206 Requirement specification


52.206.1 For the PEMS and each of its subsystems (e.g. for a PESS) there shall be a
requirement specification.
52.206.2 The requirement specification shall detail the functions that are RISK-related.

52.207 Architecture
52.207.5 The architecture specification shall be made

52.208 Design and implementation


52.208.1 Where appropriate, the design shall be decomposed into subsystems, each
having a design and test specification.
52.208.2 Descriptive data regarding the design environment shall be included in the RISK
MANAGEMENT FILE.

52.209 VERIFICATION
52.209.2 A VERIFICATION plan shall be produced to show how the SAFETY
requirements for each DEVELOPMENT LIFE-CYCLE phase will be verified.
52.209.3 The VERIFICATION shall be performed according to the VERIFICATION
plan. The results of the VERIFICATION activities shall be documented, analyzed and
assessed.
52.209.4 A reference to the methods, techniques and results of the VERIFICATION shall
be included in the RISK MANAGEMENT SUMMARY.

52.210 VALIDATION
52.210.2 A VALIDATION plan shall be produced to show that correct SAFETY
requirements have been implemented.
52.210.3 The VALIDATION shall be performed according to the VALIDATION plan.
The results of
VALIDATION activities shall be documented, analyzed and assessed.
52.210.5 All professional relationships of the members of the VALIDATION team with
members of the design team shall be documented in the RISK MANAGEMENT FILE.
52.210.7 A reference to the methods and results of the VALIDATION shall be included
in the RISK MANAGEMENT FILE. Compliance is checked by inspection of the RISK
MANAGEMENT FILE.

52.211 Modification
52.211.1 If any or all of a design results from a modification of an earlier design then
either all of this standard applies as if it were a new design or the continued validity of
any previous design documentation shall be assessed under a modification/change
procedure.
52.211.2 All relevant documents in the DEVELOPMENT LIFE-CYCLE shall be revised,
amended, reviewed, approved under a document control scheme in accordance with 4.5.2
of ISO 9001 or equivalent. Compliance is checked by inspection of the RISK
MANAGEMENT FILE.

52.212 Assessment
52.212.1 Assessment shall be carried out to ensure that the PEMS has been developed in
accordance with the requirements of this standard and recorded in the RISK
MANAGEMENT FILE. Compliance is checked by inspection of the RISK
MANAGEMENT FILE.

Annex BBB
(informative)
Rationale
Iteration of portions of the process is expected, but no requirements have been given
because the need to repeat processes is unique to a particular project. Iterations also arise
from the more detailed understanding that emerges during the design process.

RISK management process:


For a particular medical application under consideration, a Particular Standard will
provide more specific methods for managing RISK, including pass/fail requirements.

Annex DDD
(informative)
Development life-cycle
Figure DDD.1 – A DEVELOPMENT LIFE-CYCLE model for PEMS

Table DDD.1 – Suggested correlation of the documentation requirement to the
DEVELOPMENT LIFE-CYCLE phases
Only the column containing the documents is listed below, not the complete table.

DOCUMENT LIST
Identified hazards and their initiating causes
Estimated RISK
Requirements to control RISK
RISK management plan
Development life-cycle
PEMS requirement specification
Verification plan
Validation plan
Subsystem (e.g. PESS) requirement specification
PEMS architecture specification
PESS architecture specification
Subsystem design specification
Subsystem test specification
Verification methods and results
Validation methods and results
Evaluation of effectiveness of the RISK controls
Residual RISK
Assessment report
RISK management summary

19 Annex 10: Extracted version of EN 50128


The extracted version has been produced by starting from the original standard text and
then removing examples, guidelines, details and motivations and further by focusing on
aspects related to processes, artefacts and referenced standards. The ratio of the extracted
information to the original is 1:10 or higher. Thus, the extracted version cannot be used
as a substitute for the original standard.

2 Normative references
• EN 50126
Railway applications - The specification and demonstration of Reliability,
Availability, Maintainability and Safety (RAMS)
• EN 50129 (at draft stage)
Railway applications - Safety related electronic systems for signalling
• EN 50159-1
Railway applications - Communication, signalling and processing systems Part 1:
Safety-related communication in closed transmission systems
• EN 50159-2
Railway applications - Communication, signalling and processing systems Part 2:
Safety-related communication in open transmission systems
• EN ISO 9001
Quality systems - Model for quality assurance in design/development,
production, installation and servicing
• EN ISO 9000-3
Quality management and quality assurance standards ΠPart 3: Guidelines for the
application of ISO 9001:1994 to the development, supply, installation and
maintenance of computer software

4 Objectives and conformance


Requirements, techniques and measures are related to software safety integrity level.
4.5 If a technique or measure is ranked as highly recommended (HR) in the tables then
the rationale for not using that technique should be detailed and recorded either in the
Software Quality Assurance Plan or in another document referenced by the Software
Quality Assurance Plan. This is not necessary if an approved combination of techniques
given in the corresponding table is used.
4.6 If a technique or measure is proposed to be used that is not contained in the tables
then its effectiveness and suitability shall be recorded in either the Software Quality
Assurance Plan or in another document referenced by the Software Quality Assurance
Plan.
4.7 Compliance with the requirements of a particular clause and their respective
techniques and measures detailed in the tables shall be assessed by the inspection of
documents required by this standard, other objective evidence, auditing and the
witnessing of tests.
4.8 This European Standard requires the use of a package of techniques and their correct
application. These techniques are required from the tables and detailed in the
bibliography.

5 Software safety integrity levels


5.2 Requirements
5.2.1 There shall be produced, in accordance with EN 50126 and EN 50129, a System
Requirements Specification, System Safety Requirements Specification, System
Architecture Description and System Safety Plan.

5.2.6 The software safety integrity level shall be specified in the Software Requirements
Specification (clause 8). If different software components have different software safety
integrity levels, these shall be specified in the Software Architecture Specification (clause
9).

6 Personnel and responsibilities


6.2 Requirements
6.2.1 As a minimum, the supplier and/or developer and the customer shall implement the
relevant parts of EN ISO 9001, in accordance with the guidelines contained in EN ISO
9000-3.
6.2.2 Except at software safety integrity level zero, the safety process shall be
implemented under the control of an appropriate safety organisation which is compliant
with the "Safety Organisation" sub-clause in the "Evidence of Safety Management"
clause of EN 50129.
6.2.4 It is highly recommended that the training, experience and qualifications of all
personnel involved in all the phases of the Software Lifecycle, including management
activities, be justified with respect to the particular application, except at software safety
integrity level zero.
6.2.5 The justification contained in 6.2.4 shall be recorded in the Software Quality
Assurance Plan.
6.2.9 The parties responsible for the various clauses are listed below.
• Software Requirements Specification (clause 8) Designer
• Software Requirements Test Specification (clause 8) Validator
• Software Architecture (clause 9) Designer
• Software Design and Development (clause 10) Designer
• Software Verification and Testing (clause 11) Verifier
• Software/Hardware Integration (clause 12) Designer
• Software Validation (clause 13) Validator
• Software Assessment (clause 14) Assessor

7 Lifecycle issues and documentation


7.2 Requirements
7.2.1 A lifecycle model for the development of software shall be selected. It shall be
detailed in the Software Quality Assurance Plan in accordance with clause 15 of this
European Standard.
7.2.2 Quality Assurance procedures shall run in parallel with lifecycle activities and use
the same terminology.
7.2.3 All activities to be performed during a phase shall be defined prior to the phase
commencing. Each phase of the software lifecycle shall be divided into elementary tasks
with a well defined input, output and activity for each of them.
7.2.4 The Software Quality Assurance Plan shall describe which verification steps and
reports are required.
7.2.6 Traceability of documents shall be provided for, and the relationship with other
documents shall be documented. In addition, each document, except documents for COTS
software (see 9.4.5) or previously developed software (see 9.4.6), shall be written.
7.2.8 To the extent required by the software safety integrity level, the documents listed in
the Documents Cross Reference Table (see below) shall be produced.
7.2.10 A Document Cross Reference Table (shown in the standard) is used for showing
the relation between documents, phases and clauses.

Below only the column listing the documents is shown.



System Requirements Specification


System Safety Requirements Specification
System Architecture Description
System Safety Plan
SW Quality Assurance Plan
SW Configuration Management Plan
SW Verification Plan
SW Integration Test Plan
SW/HW Integration Test Plan
SW Validation Plan
SW Maintenance Plan
Data Preparation Plan
Data Test Plan
SW Requirements Specification
Application Requirements Specification
SW Requirements Test Specification
SW Requirements Verification
SW Architecture Specification
SW Design Specification
SW Arch. and Design Verification Report
SW Module Design Specification
SW Module Test Specification
SW Module Verification Report
SW Source Code
SW Source Code Verification Report
SW Module Test Report
SW Integration Test Report
Data Test Report
SW/HW Integration Test Report
SW Validation Report
SW Assessment Report
SW Change Records
SW Maintenance Records
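The Document Cross Reference Table of 7.2.10 relates each document to lifecycle phases, which makes a simple completeness check possible. The phase assignments below are illustrative assumptions, not the standard's actual table.

```python
# Illustrative cross-reference: document -> lifecycle phase (assumed values).
CROSS_REF = {
    "SW Requirements Specification": "requirements",
    "SW Requirements Test Specification": "requirements",
    "SW Architecture Specification": "architecture",
    "SW Module Test Report": "testing",
    "SW Validation Report": "validation",
}

def missing_documents(produced, phase):
    """Documents required for a phase that have not yet been produced."""
    required = {doc for doc, p in CROSS_REF.items() if p == phase}
    return sorted(required - set(produced))

produced = ["SW Requirements Specification"]
print(missing_documents(produced, "requirements"))
```

In practice the set of required documents would also depend on the software safety integrity level, as 7.2.8 notes.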

8 Software requirements specification


8.2 Input documents
System Requirements Specification, System Safety Requirements Specification, System
Architecture Description, Software Quality Assurance Plan

8.3 Output documents


Software Requirements Specification, Software Requirements Test Specification

8.4 Requirements
8.4.1 The Software Requirements Specification shall express the required properties of
the software. These properties are all (except safety) defined in ISO/IEC 9126. The
software safety integrity level shall be derived as defined in clause 5 and recorded in the
Software Requirements Specification.
8.4.2 To the extent required by the software safety integrity level the Software
Requirements Specification shall be expressed and structured in such a way that it is
traceable back to all documents mentioned under 8.2.
8.4.13 A Software Requirements Test Specification shall be developed from the Software
Requirements Specification. This test specification shall be used for verification of all the
requirements as described in the Software Requirements Specification.
8.4.15 Traceability to requirements shall be provided, and means shall be provided to
allow this to be demonstrated throughout all phases of the lifecycle.

9 Software architecture
9.2 Input documents
Software Requirements Specification, System Safety Requirements Specification, System
Architecture Description, Software Quality Assurance Plan

9.3 Output documents


Software Architecture Specification

9.4 Requirements
9.4.1 The proposed software architecture shall be established by the software supplier
and/or developer and detailed in the Software Architecture Specification.
9.4.2 The Software Architecture Specification shall consider the feasibility of achieving
the Software Requirements Specification at the required software safety integrity level.
9.4.3 The Software Architecture Specification shall identify, evaluate and detail the
significance of all hardware/software interactions. As required by EN 50126 and EN
50129, the preliminary studies concerning the interactions between hardware and
software shall have been recorded in the System Safety Requirements Specification.
9.4.5 If COTS software is to be used at software safety integrity levels 1 or 2, it shall be
included in the software validation process. For levels 3 or 4, COTS software shall be
included in the validation testing. Error logs shall exist and shall be evaluated.
9.4.6 If previously developed software is to be used then it shall be clearly documented.
There shall be evidence that interface specifications to other modules which are not being
re-verified, re-validated and re-assessed are being followed.

10 Software design and implementation


10.2 Input documents
Software Requirements Specification, Software Architecture Specification, Software
Quality Assurance Plan

10.3 Output documents


Software Design Specification, Software Module Design Specification, Software Module
Test Specification, Software Source code and supporting documentation, Software
Module Test Report

10.4 Requirements
10.4.1 The Software Requirements Specification and the Software Architecture
Specification shall be available, although not necessarily finalised, prior to the start of the
design process.
10.4.3 The Software Design Specification shall describe the software design based on a
decomposition into modules with each module having a Software Module Design
Specification and a Software Module Test Specification.
10.4.9 To the extent required by the software safety integrity level, the programming
language selected shall have a translator/compiler which has one of the following:
i) a "Certificate of Validation" to a recognised National/International standard;
ii) an assessment report which details its fitness for purpose;
iii) a redundant signature control based process that provides detection of the translation
errors.
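Option iii) above accepts a translator whose errors are caught by redundancy rather than by prior certification. One way to read this (a sketch under the assumption that the same source is translated by two independent toolchains and the artefacts compared; the file names and the bit-level comparison granularity are illustrative, not prescribed by the standard) is:

```python
# Sketch of a redundant, signature-based check on translator output
# (10.4.9 iii): the same source is translated by two independent
# toolchains, the resulting artefacts are fingerprinted, and a mismatch
# flags a potential translation error. File names are assumptions.
import hashlib


def signature(path):
    """SHA-256 fingerprint of a build artefact."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def check_translation(artefact_a, artefact_b):
    """Return True if both toolchains produced identical output."""
    return signature(artefact_a) == signature(artefact_b)
```

In practice, diverse compilers rarely produce bit-identical binaries, so the comparison would more plausibly be applied to a normalised intermediate representation or to behavioural test results; the hashing shown here only illustrates the detection principle.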
10.4.11 For any alternative language, details of its fitness for purpose shall be recorded
in the Software Architecture Specification or Software Quality Assurance Plan.
10.4.12 Coding standards shall be developed and used for the development of all
software. These shall be referenced in the Software Quality Assurance Plan (see 15.4.5).
10.4.14 Software Module Testing: Each module shall have a Software Module Test
Specification against which the module shall be tested. A Software Module Test Report
shall be produced; it shall include the test cases, and their results shall be recorded in a
machine-readable form for subsequent analysis.
10.4.18 Traceability shall be provided from requirements to the design or other objects
which fulfil them, and from design objects to the implementation objects which
instantiate them. The output of the traceability process shall be subject to formal
configuration management.
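The traceability demanded by 10.4.18 — requirements down to design objects, design objects down to implementation objects — can be sketched as two mappings plus a completeness check. The identifiers below are invented for illustration; a real project would hold these links in its configuration management system:

```python
# Minimal traceability sketch for 10.4.18. Requirement, design and
# implementation identifiers are illustrative assumptions.
req_to_design = {"REQ-1": ["MOD-A"], "REQ-2": ["MOD-B"]}
design_to_impl = {"MOD-A": ["module_a.c"], "MOD-B": ["module_b.c"]}


def untraced_requirements(requirements, req_to_design, design_to_impl):
    """Requirements with no complete chain down to an implementation object."""
    missing = []
    for req in requirements:
        designs = req_to_design.get(req, [])
        if not any(design_to_impl.get(d) for d in designs):
            missing.append(req)
    return missing
```

Running such a check at each phase boundary gives an auditable demonstration that no requirement has been dropped between design and implementation.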

11 Software verification and testing


11.2 Input documents
System Requirements Specification, System Safety Requirements Specification, Software
Requirements Specification, Software Requirements Test Specification, Software
Architecture Specification, Software Design Specification, Software Module Design
Specification, Software Module Test Specification, Software Source code and supporting
documentation, Software Quality Assurance Plan, Software Module Test Report

11.3 Output documents


Software Verification Plan, Software Requirements Verification Report, Software
Architecture and Design Verification Report, Software Module Verification Report,
Software Source Code Verification Report, Software Integration Test Plan, Software
Integration Test Report

11.4 Requirements
11.4.1 A Software Verification Plan shall be created.
11.4.3 The Software Verification Plan shall describe the activities to be performed to
ensure correctness and consistency with respect to the products and standards provided as
input to that phase.
11.4.9 The results of each verification shall be retained in a form defined or referenced in
the Software Verification Plan such that it is auditable.
11.4.10 After each verification activity a verification report shall be produced. The verification
reports shall address the following:
i) items which do not conform to the Software Requirements Specification, Software
Design Specification or Software Module Design Specifications;
ii) items which do not conform to the Software Quality Assurance Plan;
11.4.11 Software Requirements Verification: Once the Software Requirements
Specification has been established, verification shall address:
i) the adequacy of the Software Requirements Specification in fulfilling the requirements
set out in the System Requirements Specification, the System Safety Requirements
Specification and the Software Quality Assurance Plan;
ii) the adequacy of the Software Requirements Test Specification as a test of the Software
Requirements Specification;
iii) the internal consistency of the Software Requirements Specification.
The results shall be recorded in a Software Requirements Verification Report.
11.4.12 Software Architecture and Design Verification: After the Software Architecture
Specification and the Software Design Specification have been established, verification
shall address:
i) the adequacy of the Software Architecture Specification and the Software Design
Specification in fulfilling the Software Requirements Specification;
ii) the adequacy of the Software Design Specification in fulfilling the Software
Architecture Specification;
iii) the adequacy of the Software Integration Test Plan as a set of test cases for the
Software Architecture Specification and the Software Design Specification;
iv) the internal consistency of the Software Architecture and Design Specifications.
The results shall be recorded in a Software Architecture and Design Verification Report.

11.4.13 Software Module Verification: After each Software Module Design Specification
has been established, verification shall address:
i) the adequacy of the Software Module Design Specification in fulfilling the Software
Design Specification;
ii) the adequacy of the Software Module Test Specification as a set of test cases for the
Software Module Design Specification;
iii) the decomposition of the Software Design Specification into software modules and
the Software Module Design Specifications.
iv) the adequacy of the Software Module Test Reports as a record of the tests carried out
in accordance with the Software Module Test Specification.
The results shall be recorded in a Software Module Verification Report.
11.4.14 Software Source Code Verification: To the extent demanded by the software
safety integrity level the Software Source Code shall be verified to ensure conformance to
the Software Module Design Specification and the Software Quality Assurance Plan. This
shall include a check to determine whether the coding standards have been applied
correctly. The results shall be recorded in a Software Source Code Verification Report.
11.4.15 A Software Integration Test Report shall be produced as follows:
i) a Software Integration Test Report shall be produced stating the test results and
whether the objectives and criteria of the Software Integration Test Plan have been met.
ii) the Software Integration Test Report shall be in a form that is auditable;
iii) test cases and their results shall be recorded, preferably in machine-readable form for
subsequent analysis;
v) the identity and configuration of the items verified shall be recorded.
11.4.16 For software/hardware integration, see 12.4.8.

12 Software/hardware integration
12.2 Input documents
System Requirements Specification, System Safety Requirements Specification, System
Architecture Description, Software Requirements Specification, Software Requirements
Test Specification, Software Architecture Specification, Software Design Specification,
Software Module Design Specification, Software Module Test Specification, Software
Source code and supporting documentation, Hardware documentation.

12.3 Output documents


Software/Hardware Integration Test Plan, Software/Hardware Integration Test Report

12.4 Requirements
12.4.1 For software safety integrity levels greater than zero, a Software/Hardware
Integration Test Plan shall be created.
12.4.7 Test cases and their results shall be recorded, preferably in machine readable form
for subsequent analysis.
12.4.8 A Software/Hardware Integration Test Report shall be produced as follows:
i) the Software/Hardware Integration Test Report shall state the test results and whether the
objectives and criteria of the Software/Hardware Integration Test Plan have been met.
ii) the Software/Hardware Integration Test Report shall be in a form that is auditable;
iii) test cases and their results shall be recorded, preferably in a machine-readable form
for subsequent analysis.
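Both 11.4.15 iii) and 12.4.7 ask for test cases and results to be recorded in machine-readable form for subsequent analysis. As one possible realisation (CSV and the field names are assumptions; the standard does not prescribe a format), the recording step could look like:

```python
# Sketch of machine-readable test records (11.4.15 iii / 12.4.7).
# CSV is one possible format; the column names are illustrative.
import csv


def record_results(path, results):
    """Write (test_id, description, verdict) rows for subsequent analysis."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["test_id", "description", "verdict"])
        writer.writerows(results)
```

Keeping results in such a form makes the reports auditable and lets later phases (validation, assessment) re-analyse them mechanically.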

13 Software validation
13.2 Input documents
Software Requirements Specification, All Hardware and Software Documentation,
System Safety Requirements Specification

13.3 Output documents



Software Validation Plan, Software Validation Report

13.4 Requirements
13.4.3 A Software Validation Plan shall be established and detailed in suitable
documentation.
13.4.7 The Software Validation Plan shall identify the steps necessary in fulfilling the
safety requirements set out in the System Safety Requirements Specification. The
Validator shall check that the verification process is complete.
13.4.10 The results of the validation shall be documented in the Software Validation
Report in an auditable form.
13.4.11 Once hardware/software integration is finished, a Software Validation Report
shall be produced as follows:
i) it shall state whether the objectives and criteria of the Software Validation Plan have
been met.
ii) it shall state the test results and whether the whole software on its target machine
fulfils the requirements set out in the Software Requirements Specification;
iii) an evaluation of the test coverage on the requirements of the Software Requirements
Specification shall be provided;
13.4.13 Any discrepancies found, including detected errors, shall be clearly identified in a
separate section of the Software Validation Report and included in any release note that
accompanies the delivered software.
13.4.14 The software shall be tested against the Software Requirements Test
Specification. These tests shall show that all of the requirements in the Software
Requirements Specification are correctly performed. The results shall be recorded in a
Software Validation Report.
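The coverage evaluation required by 13.4.11 iii) amounts to checking what fraction of the Software Requirements Specification is exercised by at least one test. A minimal sketch (requirement and test identifiers are invented for illustration):

```python
# Sketch of the requirements coverage evaluation of 13.4.11 iii):
# the fraction of specification items exercised by at least one test.
# Identifiers are illustrative assumptions.
def requirements_coverage(requirements, tested):
    """Return (coverage ratio, list of untested requirements)."""
    untested = [r for r in requirements if r not in tested]
    if not requirements:
        return 1.0, untested
    ratio = (len(requirements) - len(untested)) / len(requirements)
    return ratio, untested
```

The list of untested requirements is exactly what the Software Validation Report would need to flag, since 13.4.14 requires every requirement to be shown correctly performed.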

14 Software assessment
14.2 Input documents
System Safety Requirements Specification, All Hardware and Software Documentation

14.3 Output documents


Software Assessment Report

14.4 Requirements
14.4.2 Software with a Software Assessment Report from another Assessor does not need
to undergo an entirely new assessment.
14.4.5 The Assessor shall assess that the software of the system is fit for its intended
purpose and responds correctly to safety issues derived from the System Safety
Requirements Specification.
14.4.9 The Assessor shall produce a report for each review that shall detail his assessment
results.

15 Software quality assurance


15.2 Input documents
All the documents available at each stage of the lifecycle.

15.3 Output documents


Software Quality Assurance Plan, Software Configuration Management Plan. The plans
shall be issued at the beginning of the project and updated during the lifecycle.

15.4 Requirements

15.4.1 The supplier and/or developer shall have and use as a minimum a Quality
Assurance System compliant with EN ISO 9000 series, to support the requirements of
this European Standard. EN ISO 9001 accreditation is highly recommended.
15.4.2 As a minimum, the supplier and/or developer and the customer shall implement for
the software development the relevant parts of EN ISO 9001, in accordance with the
guidelines contained in EN ISO 9000-3.
15.4.3 The supplier and/or developer shall prepare and document, on a project by project
basis, a Software Quality Assurance Plan to implement the requirements of 15.4.1 and
15.4.2 of this European Standard.
15.4.5 All activities, actions, documents, etc. required by all the sections of EN ISO
9000-3 and of this European Standard (annex A included) shall be specified or referenced
in the Software Quality Assurance Plan. None of the lists in EN ISO 9000-3 shall be
presumed to be exhaustive.
15.4.6 As a minimum, configuration management shall be carried out in accordance with
the guidelines contained in EN ISO 9000-3.
15.4.7 The adequacy and results of Software Verification Plans shall be examined.
15.4.8 The supplier and/or developer shall establish, document and maintain procedures
for External Supplier Control. New software shall be developed and maintained in
conformity with the Software Quality Assurance Plan of the Supplier or with a specific
Software Quality Assurance Plan prepared by the external supplier in accordance with the
Software Quality Assurance Plan of the Supplier.
15.4.9 The supplier and/or developer shall establish, document and maintain procedures
for Problem Reporting and Corrective Actions. These procedures shall implement the
relevant parts of EN ISO 9001, covering re-test, re-verification, re-validation and re-
assessment. As a minimum, problem reporting and corrective action management shall be
applied in the software lifecycle starting immediately after Software Integration and
before the starting of formal Software Validation, also covering the whole phase of
Software Maintenance.

16 Software maintenance
16.2 Input documents
All documents.

16.3 Output documents


Software Maintenance Plan, Software Change Records, Software Maintenance Record

16.4 Requirements
16.4.1 As a minimum, maintenance shall be carried out in accordance with the guidelines
contained in EN ISO 9000-3.
16.4.2 Maintainability shall be designed into the software system, in particular by
following the requirements of clause 10 of this European Standard. ISO/IEC 9126 should
also be considered.
16.4.3 Procedures for the maintenance of software shall be established and recorded in
the Software Maintenance Plan.
16.4.4 The maintenance activities shall be audited against the Software Maintenance
Plan, at intervals defined in the Software Quality Assurance Plan.
16.4.7 External supplier control, problem reporting and corrective actions shall be
managed with the same criteria specified in the relevant paragraphs of the Software
Quality Assurance clause.
16.4.8 A Software Maintenance Record shall be established for each Software Item
before its first release, and it shall be maintained, in addition to meeting the requirements
of EN ISO 9000-3 for "Maintenance Records and Reports".
16.4.9 A Software Change Record shall be established for each maintenance activity.

17 Systems configured by application data


17.2 Input documents
Software Requirements Specification, Software Architecture Specification

17.3 Output documents


Application Requirements Specification, Data Preparation Plan, Data Test Plan, Data Test
Report

17.4 Requirements
17.4.1 Data Preparation Lifecycle
17.4.1.1 Application Requirements Specification
This shall include standards that the application must comply with.
17.4.1.2 Overall Installation Design
17.4.1.3 Data Preparation
The data preparation process shall include the production of specific information (e.g.
control tables), production of the data source code and its compilation, checking and other
verification activities, and testing of the application data.
17.4.1.4 Integration and Acceptance
The system shall be commissioned as a fully operational system, and a final acceptance
process shall be carried out on the complete installation.
17.4.1.5 Validation and Assessment
Validation and assessment activities shall audit the performance of each stage of the life-
cycle.
17.4.2 Data Preparation Procedures and Tools
Specific data preparation procedures and tools shall be developed to allow the data
preparation lifecycle specified in 17.4.1 to be followed.
17.4.2.1 At the Software Design phase for the system configured by application data, a
Data Preparation Plan shall be produced.
17.4.2.2 The Data Preparation Plan shall allocate a safety integrity level to any hardware
or software tools used in the data preparation lifecycle.
17.4.2.3 Where new notations are introduced, the necessary user documentation shall be
provided and training shall also be given.
17.4.2.4 The verification, test, validation and assessment reports required to demonstrate
that the data preparation has been carried out in accordance with the plan shall be
produced. This information shall be contained in the Data Test Plan and the results shall
be recorded in the Data Test Report.
17.4.2.5 All data and associated documentation shall be subject to the configuration
management requirements of section 15 of this standard. Configuration management
records shall be created.
17.4.3 Software Development
17.4.3.1 The system safety integrity level will determine the standards to be applied.

The relevant figures are listed below.

Figure 2 – Software Safety Route Map


Figure 3 – Development Lifecycle 1
Figure 4 – Development Lifecycle 2

Annex A
(normative)
Criteria for the Selection of Techniques and Measures
If a Highly Recommended technique or measure is not used then the rationale behind not
using it should be detailed in the Software Quality Assurance Plan or in another
document referenced by the Software Quality Assurance Plan;

If a Not Recommended technique or measure is used then the rationale behind using it
should be detailed in the Software Quality Assurance Plan or in another document
referenced by the Software Quality Assurance Plan.

Annex B
(informative)
Bibliography of techniques
69 techniques are described.
