
IMPROVING THE SOFTWARE PROCESS FOR EMBEDDED SYSTEMS

Hans R. Brands

Anton H. Evers

Hasselt (B), June 1999

Class 302 & 312


INTRODUCTION

Philips Digital Video Systems Product Services (DVS-PS) in Hasselt, Belgium, develops and produces innovative consumer electronics products such as CD-recordable/rewritable audio recorders, DVD players, and digital set-top boxes for the global market. These products contain embedded software of ever-increasing complexity, while the market demands an ever-decreasing time-to-market. Like other software groups within Philips, the Hasselt software group faces this challenge with a two-path strategy, consisting of architecture improvement and software process improvement. Our contribution to this conference deals with software process improvement and the use of the Capability Maturity Model (CMM) as a reference model for process improvement.

This paper first describes the CMM and then its application at the software
development department of Philips DVS-PS.

THE CAPABILITY MATURITY MODEL

Background, software project internals

Any software project can be characterised by three factors: the required functionality (including quality), the allowed cost and the allowed lead-time. At the start of a software project these three factors should be clearly defined in order to be able to control the project in a reasonable way. Dozens of project disasters can be traced back to uncertainty about one of these factors. We will come back to the control of these external factors in the next section.

Figure 1. The software development phases over time: Spec., Design, Code, Test.

Having established these external factors, a first step towards control of the project is making a work breakdown. Traditionally, a software project is divided into a specification, design, code and test phase (Fig. 1). In order to be able to control the development, these phases are further divided into smaller work packages. The sizes of the work packages and their required effort are estimated, and an analysis of the risks we face is made. A schedule is drawn up. In other words, planning the project is necessary before we can measure whether the project is on schedule. Once the project has started, regular measurement of the progress is necessary to gain insight into and oversight of the project. We need insight into the project in order to control it; we need oversight in order to be able to report at an appropriate level to interested parties inside and outside the project: we need project tracking (Fig. 2).

Figure 2. The software development phases plus project management (planning & tracking).

The quality of the outcome of the project is highly dependent on the quality of the project deliverables and on the extent to which the agreed software process is executed. In order to control the quality of processes and products, a suitable form of quality assurance should be established (Fig. 3).

Figure 3. Quality Assurance added.

The larger the software project, the more intermediate deliverables are made available to project team members. And as changes to deliverables are daily practice, each of the team members should have access to the latest accepted version of the deliverables that he or she needs. A suitable form of configuration management should be put in place to retain control over the change requests, the problem reports and the changing deliverables, be they documents or code (Fig. 4). It should be noted that the quality assurance and configuration management activities have been extended to last before and after the project; both types of activities are most fruitful when they are embedded in the standing organisation.

Figure 4. Complete view of a software project: planning & tracking, the development phases (Spec., Design, Code, Test), quality assurance and configuration management.

Having software project planning, software project tracking, software quality assurance and software configuration management in place means having basic control over the software project internals. What about the software project externals?

Background, software project externals

Controlling a software project means controlling the internal processes and controlling the external influences on the project. The major external influences on a software project are changes to the requirements, be they functional requirements (including quality requirements) or financial or lead-time boundary conditions. Controlling them means managing the requirements and the contract, so it seems appropriate to have a requirements management process available in order to balance changing requirements with the contractual project obligations.

Moreover, complex software projects are often executed by subcontracting (parts of) the project, for reasons of efficiency, available capacity or knowledge. It seems obvious that, in order to control the project, a software subcontract management process should be present.

In conclusion, controlling external influences means at least requirements management and (if applicable) subcontract management.

So much for controlling the external and internal factors and influences in order to retain control over a software project. What is the relationship between controlling a software project and the Capability Maturity Model?

History of the Capability Maturity Model (CMM)

The CMM was developed by the Software Engineering Institute (SEI) of Carnegie Mellon University in response to the Department of Defense's wish to evaluate the capability of its software suppliers. It was first published in 1993 in the form of Technical Reports of the SEI. Since then it has evolved and has been used by major industries all over the world to assess their software processes and to guide the improvement of those processes. The model evolved from studies at the Software Engineering Institute by Watts Humphrey and his colleagues (Refs. 1 and 2). They studied the characteristics of software developing organisations. It turned out that organisations at a similar level of maturity shared a common focus in their internal processes. This led to the Capability Maturity Model, which identifies five levels of maturity of software developing organisations, namely the Initial Level (1), the Repeatable Level (2), the Defined Level (3), the Managed Level (4) and the Optimising Level (5).

The levels of the CMM

Organisations can improve their capability by managing their processes along the lines of the CMM, starting from their initial level (usually Level One). What is to be found within the various levels?

Figure 5. Levels of the CMM: Initial (1), Repeatable (2), Defined (3), Managed (4), Optimising (5).


The Repeatable Level

A Level One software process is characterised as ad hoc or chaotic, with only a limited number of defined processes. Basically, any software process that is not a Level Two or higher process is at the Initial Level. About 70% of software development groups operate at this level.

A Level Two software process is characterised as a software process that has the above-mentioned elementary controlling processes operational. These internal and external controlling processes (Requirements Management, Software Project Planning, Software Project Tracking and Oversight, Software Quality Assurance, Software Subcontract Management and Software Configuration Management) are identified as Key Process Areas (KPAs) for Level Two in the Capability Maturity Model. It should be noted that these KPAs have little software-specific content; they are general project control process areas.

The Defined Level

Once a software organisation has gained basic control over its software projects, other areas become subject to improvement. The Defined Level is characterised by seven Key Process Areas that define and maintain the software process. The seven KPAs are:

Organisation Process Focus: the area that establishes and improves a common
understanding of the overall software process.

Organisation Process Definition: the area that defines and maintains the organisation's standard software process, of which the projects' software processes are documented instantiations.

Training Program: the area that defines the development of software skills in
the organisation.

Integrated Software Management: the area that integrates the standard software process with the planning and tracking Level 2 KPAs, leading to synergy between engineering processes and management processes.

Software Product Engineering: the area that is concerned with the technical
software engineering activities and their coherent relations.

Intergroup Co-ordination: the area that is concerned with the relations between
the software development organisation and surrounding disciplines and
organisations, such as hardware development, quality, service and
manufacturing.

Peer Reviews: the area that is concerned with the defect removal process, which can be implemented through inspections and walk-throughs.

Having these seven KPAs effectively in place means that the organisation behaves as an organisation with a Defined Level software process, as a Level Three organisation. The next maturity level is the Managed Level (Level 4). What about the KPAs that serve this level?

The Managed Level

The aim of a Level 4 organisation is to control its software processes quantitatively. Figures from the review processes reveal the need for improving the design process; figures from the tracking process feed back into the estimating database; statistically significant figures are used to tune the training plan and the coding rules. The Key Process Areas that are relevant to such an organisation are:

Quantitative Process Management, aimed at controlling the software processes quantitatively, and

Software Quality Management, aimed at controlling the (quality of the) software products quantitatively.

The organisation that is able to control its software processes in this way may call itself an organisation with a Managed Level software process. It seems obvious that only a limited number of organisations have been able to grow to this level of capability.
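
The paper does not show how such quantitative control is performed in practice. As an illustration only (not the DVS-PS method, and with invented numbers), one common technique is statistical process control over a process metric such as review defect density:

```python
# Hedged sketch: statistical process control applied to a review metric
# (defects found per reviewed page). All numbers are invented for illustration.
from statistics import mean, stdev

# Historical baseline taken from earlier, stable design reviews.
baseline = [0.8, 1.1, 0.9, 1.4, 1.0, 0.7, 1.2, 1.1, 0.9, 1.0]
centre = mean(baseline)
sigma = stdev(baseline)
upper = centre + 3 * sigma
lower = max(0.0, centre - 3 * sigma)
print(f"centre line {centre:.2f}, control limits [{lower:.2f}, {upper:.2f}]")

# New reviews are compared against the baseline limits; a point outside them
# signals a special cause, for example an unusually defect-prone design that
# reveals the need for improving the design process.
for review, density in [("review A", 1.2), ("review B", 2.6)]:
    status = "in control" if lower <= density <= upper else "out of control"
    print(f"{review}: {density:.2f} defects/page -> {status}")
```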

The Optimising Level

Yet there is another step to take for mature organisations. The step towards the Optimising Level is characterised by:

Defect Prevention, fighting the common causes of defects instead of the defects themselves, by means of a systematic and planned approach;

Technology Change Management, aimed at the systematic and planned evaluation and introduction of new technologies that improve quality and productivity;

Process Change Management, aimed at continuous and planned software process improvement with participation of the whole organisation.

For those of us who have been trying to improve productivity with the aid of CASE tools, the Capability Maturity Model tells us that successful introduction should be built on a mature process!

The KPA internals

Each Key Process Area is described with a number of goals that have to be
achieved in order to have the KPA fully satisfied.

The way to reach the goal may be different for various organisations; however,
the CMM gives guidance by offering a number of key practices for each KPA. The key
practices address institutionalisation or implementation. In the model the key practices
are organised by the following common features: Commitments to Perform, Abilities to
Perform, Activities Performed, Measurements and Analysis, and Verifying
Implementation.

The Commitment to Perform key practices describe the organisation's formal understanding of the need for the key process; policy statements and the assignment of leadership or sponsorship roles are documented here.

The Abilities to Perform key practices ask for proper budgeting, for an adequate organisational structure, and for training in and orientation to the KPA.

The Activities Performed key practices guide the actions that people in various roles in the organisation should perform in order to make the key process effective. These activities may differ from organisation to organisation, but if you start from scratch, you may use them literally as a starting point.

The Measurements and Analysis key practices mostly describe status determinations used to control and improve the key process.

The Verifying Implementation key practices describe whether senior management, the project manager and the quality assurance function verify the key process practices regularly and on an event-driven basis.
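
To make this structure concrete, a key process area can be thought of as a set of goals plus key practices grouped by the five common features. The sketch below is hypothetical: the goal texts paraphrase the CMM's Requirements Management goals, and the satisfaction flags and practice entries are invented.

```python
# Hedged sketch of the KPA structure described above. The goal texts paraphrase
# the CMM's Requirements Management goals; everything else is invented.
from dataclasses import dataclass, field

@dataclass
class KeyProcessArea:
    name: str
    level: int
    goals: dict = field(default_factory=dict)          # goal text -> satisfied?
    key_practices: dict = field(default_factory=dict)  # common feature -> practices

    def fully_satisfied(self) -> bool:
        return bool(self.goals) and all(self.goals.values())

rm = KeyProcessArea(
    name="Requirements Management",
    level=2,
    goals={
        "Requirements are controlled as a baseline for engineering and management": True,
        "Plans, products and activities are kept consistent with the requirements": False,
    },
    key_practices={
        "Commitments to Perform": ["A written organisational policy for managing requirements"],
        "Verifying Implementation": ["SQA reviews the requirements management activities"],
    },
)
print(f"{rm.name} (Level {rm.level}) fully satisfied: {rm.fully_satisfied()}")
```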

Software Process Improvement using the CMM

A precondition for a successful improvement program is management awareness. If the organisation does not have a business case for software process improvement, any attempt at improvement is bound to fail. In most cases, however, the need for project predictability, a decrease in project lead-time or an increase in product quality is a management issue.

The Philips approach to improvement programs is depicted in Fig. 6. We will provide more details in the next part of this presentation. For now we focus on the assessment of the current situation. Here the Capability Maturity Model can be used as the yardstick to identify the current software capability of a given software development organisation. The measurement or assessment of the organisation's capability can be performed by independent consultancy organisations, by specialised departments, or by the software organisation itself. In the latter case guidance by a specialist is recommended.

Figure 6. The improvement cycle: management awareness, define objectives, create awareness, assess current situation, define improvement plan, establish commitment, schedule improvements, execute improvements, verify results.

From the results of the assessment the areas of improvement can easily be identified, and the CMM provides guidance on the Key Process Areas that can be most fruitfully implemented. The CMM recommends improving level by level; however, if business needs call for another priority, or if an organisation already has practices in place, other implementation tactics may be followed. The various activities to implement improvements and their priorities appear in an improvement plan, which, after having been agreed by the parties involved, leads to an improvement schedule.

The details of the improvement areas can be found in the detailed descriptions of
the CMM Key Process Areas (Ref. 2).

In designing an improved process, the CMM, used as a reference model for measuring the capability of a software organisation, is also often used as the reference model for designing an improved software organisation. We believe an even richer model should be used, drawing on the treasures of the software engineering community as well as our own organisational treasures.

For that purpose we investigated software process models and designed a general model for the software process as a reference for the DVS Product Services software process. The model, baptised the Smiley model (Ref. 5), is based on IEEE 1074, ISO 12207, the CMM and proprietary software process models. An impression of this reference model is given in Fig. 7. The primary software creation process is depicted in the centre from top to bottom. Management processes are in the hat of the smiley, whereas supporting processes are in the mouth.

Figure 7. The Smiley reference model of a software process.

What are our experiences using the CMM to improve our software process?

IMPLEMENTATION OF THE IMPROVEMENT PROCESS AT PHILIPS DVS-PS

The software engineering group in Hasselt is part of a development organisation that develops complete consumer products: hardware, software, the production process, casing and manuals. DVS-PS does not have a marketing function; it develops and produces for other Philips business units. All hardware development is concentrated in Hasselt, Belgium. Software development is distributed over Eindhoven (The Netherlands), Dublin (Ireland), Bangalore (India), Grenoble (France) and Hasselt. Software sizes increase rapidly: the first CD player contained 4 KB of software; current CD players already contain 64 KB. CD audio recorders contain about 512 KB, while set-top boxes can have up to 3 MB of software.

Before a product can be released for production, the Quality Department verifies that the product complies with government regulations and adheres to internal production standards. Of course it is also verified that the product fulfils the requirements of the customer.

The software engineering group in Hasselt consists of 66 engineers (June 1999), most of whom have an electronic engineering background. Because there is more work than we can handle, we also hire people from software houses (45) and subcontract to partners (10 people involved). Part of the staff works in other Philips labs in order to acquire knowledge for new projects. There has been strong growth in the department: in the past year we have doubled the number of engineers.

A typical software project lasts 9 months and is staffed with 5 to 15 engineers. Larger projects are split into sub-teams of this size. A project in DVS-PS can also be part of a larger project that is run together with other Philips development labs. Projects can have strong interrelations when a family of products is developed. The software is always developed in conjunction with the hardware; in most cases the key components are being developed as well. An overall project leader is responsible for the total development and the transfer to the factory. These overall project leaders are more hardware oriented, so software project leaders are assigned to manage the software part of the projects.

The Software Improvement Process

Early in 1996 the current DVS-PS development organisation was created with a crew of 25. The software engineers had different backgrounds and experience in projects of varying maturity, with different working practices. ISO 9000 procedures were available describing a generic software plan, the documents in software development, and quality control. The Quality Department only looked at the functionality of the product as perceived by the consumer and did not have a real software focus. By that time we had projects with a reported lead-time slip of 6 to 12 months, resulting in missed opportunities. One project celebrated its 1000th bug found in beta testing. The first experiments with subcontracting resulted in the software engineers physically relocating to the Hasselt premises.

To make everybody aware of the importance of the software process in combating these disasters, a consultant was invited to give a lecture about Software Process Improvement (SPI). SPI is a major focus in all Philips software development groups and uses the Capability Maturity Model (CMM) as a vehicle. Not only the software engineers attended, but also the overall project leaders and development management. In this lecture a rough outline of process improvement in software development was given. The following steps can be distinguished (Fig. 6):

1. Management awareness. It is of utmost importance that the management team understands why the software process has to be improved. Reasons can be predictability of lead-time and cost, quality of the product, diminishing risks, et cetera.

2. Define objectives. What are the goals and targets of the software improvement process? Can they be linked to the mission of the organisation?

3. Awareness. Create awareness in all layers of the organisation concerned: all software developers, overall project leaders, account managers and customers.

4. Assessment. Check the current status of the development organisation, either by an external or an internal assessment. We chose a guided Interim Maturity Evaluation, in which an external consultant guides the assessment performed by about 8 Philips software engineers. Guidelines and checklists are available in books or on the website of the Software Engineering Institute (Ref. 3). Scores were between 4 and 6 on a scale of 0 to 10.

5. Improvement plan. Don't try to improve everything at once. Select areas where you can gain results quickly. Although it is an item for CMM Level 3, we started with introducing reviews. Planning was also selected as an early improvement item. The improvement process was managed as a normal project with a dedicated SPI co-ordinator as project leader. We chose an external consultant with SPI experience. Do not select one of your own project leaders, to avoid a priority conflict in the organisation.

6. Execution. The SPI project was divided into a number of subprojects. Each subproject was assigned to a working group. These working groups were given a time budget of 5 to 10% of their normal working hours. Most subprojects delivered a procedure, a template or a checklist.

7. Verification. It is advisable to verify the status of the software process twice a year. This can be achieved by holding an Interim Maturity Evaluation. It will show the progress and highlight points needing attention. Then a new improvement cycle is started.

Experiences and Results

The approach was successful, as all software engineers participated in the improvement process. The moment a working procedure was agreed upon, it was deployed without resistance. A disadvantage is that it sometimes takes a long time before a deliverable is ready. This is caused by the high pressure on the software development projects: the organisation then tends to prefer short-term satisfaction of the customer to working on medium-term improvements.

Requirements engineering.

We introduced use cases (a technique used in object-oriented development) to describe the user interface of our products. A typical product is described in hundreds of pages. Although this process is very time-consuming, it is appreciated by the customer as well as the developers: it gives both parties a good idea of the functionality of the product and minimises surprises at the moment the product is created. The requirements are frozen at an early stage (milestone Specification Release). After this milestone, changes are discouraged. A Change Control Board discusses all proposed changes and decides whether they will be implemented.

Project planning

Projects are planned according to a checklist containing the process knowledge of the organisation. Breakdowns are made into work packages with a lead-time of one week. Estimates are made by the software project leader and discussed with the engineers who carry out the tasks. All dependencies on other departments are also planned and communicated.
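
As a minimal illustration of such a breakdown (the package names, efforts and dependencies below are invented, not taken from a DVS-PS project), a plan of roughly one-week work packages could be recorded and summed as follows:

```python
# Hedged sketch of a work breakdown: one-week work packages with effort
# estimates and dependencies. All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class WorkPackage:
    name: str
    phase: str             # Spec. / Design / Code / Test
    effort_hours: int      # estimate discussed with the engineer who does the work
    depends_on: tuple = ()

plan = [
    WorkPackage("User-interface use cases", "Spec.", 80),
    WorkPackage("Front-panel driver design", "Design", 40, ("User-interface use cases",)),
    WorkPackage("Front-panel driver coding", "Code", 40, ("Front-panel driver design",)),
    WorkPackage("Front-panel driver module test", "Test", 24, ("Front-panel driver coding",)),
]

total = sum(wp.effort_hours for wp in plan)
print(f"{len(plan)} work packages, {total} estimated hours")
for wp in plan:
    after = ", ".join(wp.depends_on) if wp.depends_on else "-"
    print(f"  {wp.phase:<7}{wp.name:<32}{wp.effort_hours:>4} h   after: {after}")
```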

Project tracking

Every week a tracking is done of released deliverables, budget spent and hours spent. The customer is informed and can follow the process closely. It is immediately clear when the project is behind schedule or when more hours have been spent than agreed in the budget.

An example of such a chart is given in Fig. 8.


Figure 8. Example of a tracking sheet: 'DREAM: Milestones Completed versus Planned', showing the Original plan, the New Agreed plan (747.5), the Last Estimate and the Completed milestones over project weeks 722 to 830.

The 'Completed' line shows that in week 804 a delay is reported, even relative to the last estimate. The reason here was that the team was waiting for input from others.
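
The paper does not describe the bookkeeping behind Fig. 8. As a minimal sketch (with invented week labels and counts), the cumulative planned and completed milestone counts per week could be compared and a slip flagged like this:

```python
# Hedged sketch of weekly milestone tracking: cumulative milestones planned
# versus completed. Week labels and counts are invented for illustration.
planned   = {"802": 60, "804": 66, "806": 72}
completed = {"802": 60, "804": 62, "806": 65}

for week, plan_count in planned.items():
    done = completed[week]
    slip = plan_count - done
    note = "on schedule" if slip <= 0 else f"{slip} milestones behind plan"
    print(f"week {week}: planned {plan_count}, completed {done} -> {note}")
```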

Subcontract management

For selecting our suppliers a procedure has been established. A prospective supplier first has to provide details about its quality system. Then a two-day audit is held, in which the software development manager, the software quality officer and a software purchaser visit the company and rate it according to a checklist. This checklist covers general information (age, number of employees, financial health, et cetera), the role of management, quality assurance, design/engineering and marketing/customer support. The questions for design/engineering and quality assurance basically check whether the supplier is operating at CMM Level 2. Depending on the results, a supplier can be put on the shortlist. In case some criteria are not met, improvement actions can be agreed upon.
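
The checklist itself is not reproduced in the paper. As a hypothetical sketch (the scores and thresholds below are invented), the ratings per checklist category could be aggregated into a shortlist decision like this:

```python
# Hedged sketch of aggregating supplier-audit ratings (a 0-10 scale is assumed).
# Category names follow the text above; scores and thresholds are invented.
ratings = {
    "general information": 7,
    "role of management": 6,
    "quality assurance": 8,
    "design/engineering": 7,
    "marketing/customer support": 5,
}

average = sum(ratings.values()) / len(ratings)
weakest = min(ratings, key=ratings.get)
shortlisted = average >= 6.0 and ratings[weakest] >= 5

print(f"average rating {average:.1f}, weakest area: {weakest} ({ratings[weakest]})")
print("shortlisted" if shortlisted else "improvement actions to be agreed first")
```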

Each project is subject to a make-or-buy decision. In a meeting it is discussed whether it is necessary or preferable to put out a project or part of it. Decision factors are strategic value, available resources, necessary skills, approval of our customer, et cetera. When it is decided to put out, a request for proposal is sent to one or more suppliers. In a second meeting the proposals are evaluated and a supplier is selected. It is also possible to decide to do the project ourselves when the proposals are not sufficient or when circumstances have changed (for instance, resources becoming available due to a cancelled project). When the supplier has been selected, a contract negotiation is started to agree in detail on a project management plan with deliveries, costs, reporting, acceptance criteria, et cetera.

During execution of the contract it is normal that we receive weekly reports from the supplier about the project status. All deliverables (such as designs, code and test reports) are reviewed. Preferably the supplier uses the same configuration management tool as we do, in order to facilitate multi-site configuration management.

We now have three years of experience with an Irish software supplier who has assisted us in developing the CDR application software for several projects. These projects were all delivered according to the agreed development time and budget. Furthermore, some small projects were issued to a development group in Vienna and to an Indian software company. Only the latter experience was not satisfactory, because we were both in a learning phase and did not operate at CMM Level Two. In addition, two projects are running together with the Philips Software Centre in Bangalore; this development site operates at CMM Level Three.

Quality assurance

Within the software group a team of Software Quality Inspectors has been established. These inspectors are part of a project team but also report to the Software Quality Assurance Manager (the head of the team). The inspectors check whether the product deliverables and the software creation process are according to the agreed procedures. They played an important role as review moderators when that role was introduced. More and more we see a shift from product verification towards process verification and support. The budget for this role takes 5% of the project budget; this excludes the time that team members spend in reviews.

In addition, the Quality Department audits the projects regularly. At milestone meetings a report is issued about the status of the deliverables and the process.

Configuration management

To ensure a proper administration of project deliverables, all documents, sources, make files, meeting reports et cetera are kept in a configuration management database. A tool makes sure that developers only have access to their private area or to released components. A configuration manager is responsible for promotion from private to test and from test to released software. Once software is released, it cannot be changed anymore. Thus it is always possible to react to problems in released software, as all material remains available.
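
A minimal sketch of the promotion rule just described (private to test to released, with released items frozen); the class and the file name are ours for illustration, not part of the tool used at DVS-PS:

```python
# Hedged sketch of the promotion model described above. The state names follow
# the text (private -> test -> released); everything else is invented.
class ConfigurationItem:
    PROMOTIONS = {"private": "test", "test": "released"}

    def __init__(self, name: str):
        self.name = name
        self.state = "private"

    def promote(self) -> str:
        # Only the configuration manager promotes items; released items are frozen.
        if self.state == "released":
            raise RuntimeError(f"{self.name} is released and cannot be changed")
        self.state = self.PROMOTIONS[self.state]
        return self.state

item = ConfigurationItem("cd_servo.c")
print(item.name, item.promote())   # private -> test
print(item.name, item.promote())   # test -> released
```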

The configuration manager is responsible for administering the status of a problem report or change request in a separate tool. Action holders are informed and tracked. When a new release is issued, a report describes all problems solved and all changes implemented. All known bugs are also described.

Configuration management takes about 5% of the total project budget. The configuration management task is usually combined with the quality inspector task for another project.

Organisation process focus.

The software group has established a Software Engineering Process Group (SEPG), which is responsible for the SPI project. The SPI project is planned and executed like a development project, having an SPI project leader and an SPI configuration manager, a change control board, and quality control through external auditors. The results of the SPI project are disseminated through regular group meetings, training for new employees and the DVS-PS intranet web site.

Organisation process definition.

With the aid of the Smiley reference process model, 16 working groups (one
group for each process) designed process and procedure descriptions, as well as templates
and examples for documents. The complete software process contains some 25 process
descriptions and some 80 procedures. Each project tailors the standard software process
to its own needs. The project plan is based on the project specific software process.

Intergroup relation with hardware development.

Our historic growth was from hardware-interrelated software development to hardware-related software development. During our Software Process Improvement discussions about the standard and templates for requirements, we found out that our requirements specifications were still of a mixed type: we tried to develop software starting from a system specification without an intermediate software specification. This had more consequences, as we found out. During the parallel software and hardware developments, there was no one in the organisation responsible for the system architecture. This problem has been solved in the meantime.

We already mentioned that hardware and software development occur in parallel. Without a proper plan identifying the synchronisation moments between the two development groups, software suffered from the delivery of untested boards and hardware suffered from a lack of software assistance for hardware testing.

As a first step towards solving this problem, the hardware group started developing debug, diagnostic, service and test software themselves, which meant that they tried to hire software engineers for hardware development support. However, in the current software market, software engineers prefer to be employed by a software group rather than by a hardware group.

We therefore decided to have the debug software developed by the hardware group and the diagnostic, service and test software by the software group. This turns out to be a better solution, provided the hardware group improves its development process to include project planning and tracking activities similar to those for software.

As a result, the MMA hardware group has started a Hardware Process Improvement program!

The relation with the Quality Department

Where three years ago the Quality Department was merely detecting problems in the product during beta testing, it now participates in project audits. Furthermore, it is able to detect the point in time at which it makes sense to start beta testing. Fig. 9 shows the actual number of bugs found during integration and alpha testing, together with the prediction from week 916 onwards. In week 916 it does not make sense to start beta testing, but already in week 916 we are able to see that beta testing can start by week 928.

Figure 9. Bug tracking prediction: the cumulative actual number of bugs found during integration and alpha testing (weeks 843 to 916) and the GO prediction from week 916 onwards, levelling off at around 660 bugs.
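
The paper does not name the model behind the 'GO prediction' curve; the label suggests a Goel-Okumoto software reliability growth model, which fits the cumulative defect count with m(t) = a(1 - exp(-b t)). The sketch below (using invented counts, not the Fig. 9 data, and assuming NumPy and SciPy are available) shows how such a prediction could be produced:

```python
# Hedged sketch: fitting a Goel-Okumoto curve m(t) = a * (1 - exp(-b*t)) to a
# cumulative defect count. The counts below are invented, not the Fig. 9 data.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 21)  # weeks of integration and alpha testing
found = np.array([45, 90, 130, 165, 200, 230, 260, 285, 310, 330,
                  350, 370, 388, 404, 418, 432, 444, 456, 466, 476], dtype=float)

(a, b), _ = curve_fit(goel_okumoto, weeks, found, p0=(600.0, 0.05))
print(f"estimated total defects a = {a:.0f}, detection rate b = {b:.3f}")

# Predict ahead; beta testing could start once the weekly increment (the slope
# of the curve) drops below a limit chosen by the Quality Department.
for t in (24, 28, 32):
    weekly = goel_okumoto(t, a, b) - goel_okumoto(t - 1, a, b)
    print(f"week {t}: predicted cumulative {goel_okumoto(t, a, b):.0f}, "
          f"about {weekly:.0f} new defects that week")
```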


Intergroup co-ordination.

Most of the project co-ordination takes place at the overall project level. Planning and tracking of deliveries from and to the software group are the responsibility of the software project leader.

The current process definition activity generates an overview of all links from the software processes to the external processes, which will be used for co-ordination purposes.

Training program.

We extended the scope of the KPA Training Program to include all human resource management activities, including hiring, firing, appraisal et cetera. The training program procedures, including the identification of the skills and needs of the organisation, of the project and of the person, and the resolution of those needs, were designed and implemented.

Integrated Software Management.

The planning and tracking activities from CMM Level Two have been upgraded to monitoring and control activities. With metrics defined for each process, the monitoring and control process simply monitors these metrics. At a later stage, a software quality management process will set boundary targets for the performance indicators that are calculated from the metrics.
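
A hypothetical sketch of that kind of monitoring (the indicator names and boundary targets below are invented; the text only says that such targets will be set later):

```python
# Hedged sketch of monitoring process metrics against boundary targets for
# performance indicators. Indicator names and target bands are invented.
targets = {
    "schedule deviation (%)": (0, 10),
    "review effort (% of project budget)": (3, 8),
    "open problem reports": (0, 50),
}
measured = {
    "schedule deviation (%)": 7,
    "review effort (% of project budget)": 2,
    "open problem reports": 31,
}

for indicator, (low, high) in targets.items():
    value = measured[indicator]
    verdict = "within target" if low <= value <= high else "needs attention"
    print(f"{indicator}: {value} (target {low}-{high}) -> {verdict}")
```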

CMM audits

The first assessment that the software group performed was an Interim Maturity Evaluation guided by an external consultant. We found out that we had a Level One maturity process. After about half a year we did another Interim Maturity Evaluation, which made us believe that we had nearly attained Level Two. The external assessment of November 1997 told us the truth: we attained a Repeatable Level rating for the KPAs Requirements Management and Project Tracking and a Defined Level rating for the KPAs Peer Reviews and Process Focus. The other four Level Two KPAs were only partly satisfied.

Since then we have focused more on a formal definition of our Level Two processes, including the policy statements in our organisation documents and their independent verification. In the external assessment of July 1998 this resulted in a 'fully satisfied' rating for all Level Two KPAs, as can be seen from the Kiviat diagram in Fig. 10a. These scores were confirmed in the Interim Maturity Evaluations of November 1998 and March 1999.
Scoring: score > 8: fully satisfied; 5 < score < 8: partly satisfied; 0 < score < 5: not satisfied.

Figure 10a. Results of consecutive assessments (Sep-96, Nov-97, Jul-98, Nov-98, Mar-99) for the CMM Level 2 KPAs: Requirements Management (RM), Project Planning (PP), Project Tracking (PT), Subcontract Management (SM), Quality Assurance (QA) and Configuration Management (CM).

Since the third quarter of 1998 the software group has focused on improvement of the CMM Level 3 KPAs, resulting in a better process definition, with the aid of the Smiley model, and in improvement of our HRM procedures, including training development and training program execution. The results of the measurements are depicted in Fig. 10b.
Figure 10b. Results of consecutive assessments (Nov-98, Mar-99) for the CMM Level 3 KPAs: Organisation Process Focus (PF), Organisation Process Definition (PD), Training Program (TP), Integrated Software Management (IM), Software Product Engineering (PE), Intergroup Co-ordination (IC) and Peer Reviews (PR).

One of the lessons we learned was that we had much too high expectations of our own software processes, and that improving considerably means a real change and a real challenge. Furthermore, we learned what the CMM really is and what it is worth: it is really worth the investment! At this moment we believe that we are able to repeat the same type of projects in a similar way as we have done before; our estimates for delivery dates are accurate to within 10%; our staff is working under less pressure. And we have grown to be a reliable partner in hardware-related software development.

References:

1. W.S. Humphrey, Managing the Software Process, Addison-Wesley, Reading, MA, 1989.

2. M.C. Paulk et al., The Capability Maturity Model: Guidelines for Improving the Software Process, Addison-Wesley, Reading, MA, 1994.

3. The Software Engineering Institute website: http://www.sei.cmu.edu/.

4. The Philips website: http://www.philips.com/.

5. The Vision Consort website: http://www.visionconsort.nl/.
