
Copyright © 2003 Information Systems Audit and Control Association. All rights reserved. www.isaca.org.

The COBIT Maturity Model in a Vendor Evaluation Case
By Andrea Pederiva

The maturity model provided by the COBIT Management Guidelines for the 34 COBIT IT processes is becoming an increasingly popular tool to manage the timeless issue of balancing risk and control in a cost-effective manner. Control Objectives for Information and related Technology (COBIT) is published by the IT Governance Institute (ITGI) and the Information Systems Audit and Control Foundation (ISACF).

The COBIT Maturity Model is an IT governance tool used to measure how well developed the management processes are with respect to internal controls. The maturity model allows an organization to grade itself from nonexistent (0) to optimized (5). Auditors can exploit this capability to help management fulfill its IT governance responsibilities, i.e., exercise effective responsibility over the use of IT just like any other part of the business.1

A fundamental feature of the maturity model is that it allows an organization to measure as-is maturity levels and to define to-be maturity levels, as well as the gaps to fill. As a result, an organization can discover practical improvements to the system of internal controls of IT. However, maturity levels are not a goal; rather, they are a means to evaluate the adequacy of the internal controls with respect to company business objectives.

In volume 6, 2002, of the Information Systems Control Journal, the article "Control and Governance Maturity Survey: Establishing a Reference Benchmark and a Self-assessment Tool," by Erik Guldentops, CISA, CISM, Wim Van Grembergen, Ph.D., and Steven De Haes, discusses the results of the 2002 ISACA survey on the maturity level of 15 COBIT IT processes. According to the article, the survey's target processes were selected a year earlier by interviewing a group of 20 IT and senior experts.

The ISACA survey results can be used as a reference benchmark and a self-assessment tool. They cover a broad range of countries, industries and size groups, making them useful for numerous companies worldwide.

In an engagement experience, this author participated on a team that used the COBIT Maturity Model to benchmark four possible vendors and then compared its results to the ISACA survey results. The process undertaken, the lessons learned and the results are discussed in the remainder of this article.

Main Issues and Lessons Learned
At the beginning of this benchmarking effort, there were two main issues:
• The need for a criterion to choose the processes to benchmark
• The need for a method to measure the vendors' maturity levels with respect to the COBIT Maturity Model

The processes to benchmark were chosen by scoring the COBIT IT processes on a risk-importance basis, from the point of view of a potential customer. This task followed a logic similar to the one in the risk assessment form of the COBIT Implementation Tool Set.

The definition of a method to measure the maturity level required more effort, in part because the desire was for a method precise and efficient enough to allow for interaction with potential vendors. A questionnaire was developed, together with a ranking system to compute the maturity level from the questionnaire results. While the approach was not unusual, a few new ideas were used that proved to be valuable. (These new ideas subsequently have been tested by other AIEA2 colleagues.)

The method used is not strictly incremental and, therefore, does not satisfy the COBIT Maturity Model's incremental criterion,3 which has to be checked "a posteriori."4 However, the method proved to be strong with respect to the objective of benchmarking the four organizations under examination, and the results were logical given the knowledge collected on the organizations during the benchmarking effort. Moreover, it appears that the method can be further developed to build a strictly incremental approach.5

Finally, the comparison between the benchmarking results and the ISACA survey results provided a basis for an overall discussion on the distribution of the "strongest" and "weakest" areas.

Benchmarking Results
Figure 1 displays the results of the benchmarking effort as compared with the 2002 ISACA survey results. The four organizations were essentially software makers with an independent consultancy branch; hence, the benchmark values refer to the ISACA 2002 survey results for the Europe, Middle East and Africa (EMEA) small6 IT service providers. The ISACA value was computed as the weighted mean of the respondents' distribution over the six possible maturity levels (0 to 5).
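The weighted mean just described can be sketched in a few lines of Python. The respondent counts below are invented for illustration; they are not taken from the 2002 ISACA survey data.

```python
# Sketch of the weighted-mean computation behind the ISACA reference
# values: each respondent places the organization at one of the six
# maturity levels (0-5), and the reference value is the mean level
# weighted by the number of respondents at each level.
# NOTE: the respondent counts below are hypothetical.

def weighted_mean_level(distribution):
    """distribution[k] = number (or share) of respondents choosing level k."""
    total = sum(distribution)
    return sum(level * count for level, count in enumerate(distribution)) / total

# Hypothetical respondent counts for levels 0 through 5:
respondents = [2, 8, 20, 30, 25, 5]
print(round(weighted_mean_level(respondents), 2))  # prints 2.92
```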
The benchmarking results reveal that three of the four companies are aligned among themselves and with the ISACA sample, with the exception of a few processes. However, one company

INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 3, 2003


almost always has a maturity level far below the other companies.

The results can be better understood with the following facts:
• CO4 was the only participant in the benchmark that was not ISO9000-certified.
• The respondents are IT vendors with consultancy branches; hence, IT processes (including AI5 Install and Accredit Systems) are their core processes.
• The processes in which the ISACA sample has a score lower than the benchmarking sample appear to be the typical core processes of an application systems developer and vendor. This suggests a possibly different composition of the benchmarking sample and the ISACA sample for the EMEA small IT service providers.
• CO3 is the IT subsidiary of a publicly owned group with a strong quality and organizational culture, in part due to its defense-related experience.

Figure 1—Benchmarking the Results
(Radar chart, not reproduced here: maturity levels from 0.00 to 5.00 for the benchmarked companies and the ISACA sample across the processes PO5—Manage the IT investment, PO10—Manage projects, AI1—Identify solutions, AI2—Acquire and maintain application software, AI5—Install and accredit systems, AI6—Manage changes, DS5—Ensure systems security, DS10—Manage problems and incidents, and M1—Monitor the process.)

Benchmark Method
The method used in the benchmark is based on a questionnaire derived from the COBIT Maturity Model, and it relies on a "scenario" concept, i.e., every maturity level is considered to be a scenario. A maturity level scenario includes the description of the organization and the internal controls of a company satisfying the requirements of a specific maturity level.

The questionnaire is intended to capture the compliance of the IT organization under investigation with the diverse scenarios describing each maturity level. Based on the questionnaire results, an algorithm computes a "compliance" vector that describes the compliance of the organization with every scenario. Then, it uses the vector to compute the maturity level as a weighted average of the organization's compliance with respect to each scenario.

The Questionnaire
To arrange the questionnaire, the maturity level descriptions of the COBIT Maturity Model were studied. It was concluded that the descriptions of the maturity levels could be viewed as sets of "atomic" statements, each of which can be true or false, or partially true or partially false. The examination led to the realization that a compliance value could be computed for each maturity level by collecting and then combining a compliance value for each statement.

Based on this concept, the maturity level descriptions were split into separate statements, and all statements in the maturity level descriptions appear as separate items in the questionnaire.7

Figure 2 displays an example of how the questionnaire statements were derived for the maturity model of process PO10 Managing Projects.

Figure 2—Questionnaire Construction for PO10 Managing Projects (maturity levels 0 and part of 1)

Maturity level description (0 Nonexistent): Project management techniques are not used, and the organization does not consider business impacts associated with project mismanagement and development project failures.
Questionnaire statements:
• The organization does not use project management techniques.
• The organization does not report on project mismanagement effects or development project failures.

Maturity level description (1 Initial/Ad hoc): The organization is generally aware of the need for projects to be structured, and it is aware of the risks of poorly managed projects. The use of project management techniques and approaches within IT is a decision left to individual IT managers. Projects are generally poorly defined.
Questionnaire statements:
• The organization is generally aware of the need for projects to be structured.
• The organization is aware of the risks of poorly managed projects.
• The use of project management techniques and approaches within IT is a decision left to individual IT managers.
• Projects are generally poorly defined.

To obtain a compliance value for each statement, the following question was asked: "With respect to your organization, how much do you agree with the following statements?" Four possible answers were provided: not at all, a little, quite a lot or completely.

The answers were mapped to the following compliance values: 0, 0.33, 0.66 and 1 (figure 3).8

Figure 3—Compliance Level Numeric Values

Agreement with statement    Compliance value
Not at all                  0
A little                    0.33
Quite a lot                 0.66
Completely                  1

The Algorithm
When the questionnaire is complete, each maturity level has a set of statements, each with its own compliance value of 0, 0.33, 0.66 or 1. For example, figure 4 shows the fully compiled questionnaire for the level 3 maturity model of process PO10 Managing Projects, with the compliance values for each statement.



Figure 4—Questionnaire for the Level 3 Maturity Model of Process PO10

Statement (answer given, compliance value):
 1. The IT project management process and methodology have been formally established and communicated. (quite a lot, 0.66)
 2. IT projects are defined with appropriate business and technical objectives. (quite a lot, 0.66)
 3. Stakeholders are involved in the management of IT projects. (completely, 1)
 4. The IT project organization and some roles and responsibilities are defined. (completely, 1)
 5. IT projects have defined and updated schedule milestones. (quite a lot, 0.66)
 6. IT projects have defined and managed budgets. (completely, 1)
 7. IT project monitoring relies on clearly defined performance measurement techniques. (a little, 0.33)
 8. IT projects have formal post-system implementation procedures. (quite a lot, 0.66)
 9. Informal project management training is provided. (completely, 1)
10. Quality assurance procedures and post-system implementation activities have been defined, but are not broadly applied by IT managers. (quite a lot, 0.66)
11. Policies for using a balance of internal and external resources are being defined. (completely, 1)
Total level 3 compliance: 8.63

The compliance value for the scenario can be computed as the average of the compliance values of its statements. In the case of maturity level 3, it is equal to 8.63/11 = 0.78.

Working in the same way on the other maturity levels, one can compute a compliance value for all the maturity levels from 0 to 5. Figure 5 provides an example of the possible results of this computation.

Figure 5—Computation of the Maturity Level Compliance Values

Maturity   Sum of statement         Number of maturity     Maturity level
level      compliance values (A)    level statements (B)   compliance value (A/B)
0          0.00                     2                      0.00
1          0.00                     9                      0.00
2          3.00                     6                      0.50
3          8.63                     11                     0.78
4          6.97                     9                      0.77
5          6.31                     8                      0.79

A key idea was the ability to see the compliance values as a description of the "contribution" of each maturity level scenario to the overall maturity level of the organization. To this end, the compliance values were normalized so that they sum to 1, as shown in the example in figure 6.

Figure 6—Computation of the Normalized Compliance Vector

         Not normalized           Normalized
Level    compliance values (A)    compliance values [A/Sum(A)]
0        0.00                     0.000
1        0.00                     0.000
2        0.50                     0.176
3        0.78                     0.275
4        0.77                     0.272
5        0.79                     0.277
Total:   2.84                     1

Finally, the summary maturity level for the process was computed by combining the normalized compliance values for each maturity level, weighting each one by its level, as shown in figure 7.

Figure 7—Computation of the Summary Maturity Level

         Normalized
Level    compliance values (B)    Contribution (Level*B)
0        0.000                    0.00
1        0.000                    0.00
2        0.176                    0.35
3        0.275                    0.83
4        0.272                    1.09
5        0.277                    1.38
Total maturity level: 3.65

Lessons Learned
Because of its construction criteria, the questionnaire is aligned completely with the maturity model and is fairly detailed with respect to the maturity requirements. This proved useful in supporting subsequent discussions aimed at identifying the key points that were enabling or preventing the organization from reaching a given maturity level.

The experience also suggested that, to exploit the benefits of the maturity level paradigm, it is useful to allow a company to grade itself in the full range from 0 to 5 rather than having to choose a position in the coarse grid (0, 1, 2, ... 5).

As a suggestion, in performing a benchmarking effort, first discuss the maturity requirements without showing the maturity level to which the questions belong. This will reduce any bias by the respondents that can be present when they know the effects of the answers on the final result.

Further work must be done to assemble a strictly incremental approach to the measurement of the maturity level, as required by the COBIT Maturity Model. However, for comparison purposes, the method presented here proved to be efficient, providing strong results and facilitating the subsequent discussions on the use of the COBIT Maturity Model as an effective management tool.
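The scoring pipeline walked through in figures 4 to 7 can be condensed into a short Python sketch. The level 3 statement scores are those of figure 4; for the other levels, only the per-level totals of figure 5 are shown in the article, so their compliance values are entered directly.

```python
# Sketch of the maturity-level algorithm described above:
# 1) average the statement compliance values of each level scenario (figures 4-5),
# 2) normalize the six per-level compliance values so they sum to 1 (figure 6),
# 3) take the level-weighted sum of the normalized values (figure 7).

def maturity_level(level_compliance):
    """level_compliance[k] = average statement compliance of level k."""
    total = sum(level_compliance)
    normalized = [c / total for c in level_compliance]           # figure 6
    return sum(level * w for level, w in enumerate(normalized))  # figure 7

# Level 3, from figure 4: eleven statement scores averaging 8.63/11 = 0.78.
level3 = [0.66, 0.66, 1, 1, 0.66, 1, 0.33, 0.66, 1, 0.66, 1]
assert round(sum(level3) / len(level3), 2) == 0.78

# Per-level compliance values (the A/B column of figure 5):
compliance = [0.00, 0.00, 0.50, 0.78, 0.77, 0.79]
print(round(maturity_level(compliance), 2))  # prints 3.65, matching figure 7
```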
To make the method applicable beyond comparison, as with planning improvements (as-is and to-be levels, gap analysis), it must manage partial compliance at the lower levels. That is not an issue in the method presented here, but for improvements one wants to see consistency at the lower levels before evaluating the contributions at the higher levels.

Summary
The COBIT Management Guidelines do not suggest any special methodology to measure the maturity level of the IT processes, and many approaches can be followed. However, to use the COBIT Maturity Model as an effective management tool, companies must develop an efficient methodology to measure the maturity level of their IT processes. This article has presented the results of a benchmarking effort based on the COBIT Maturity Model and the method used to measure a maturity level for the IT processes. Although the method is not strictly incremental, as required by the Maturity Model, it proved to be an efficient tool to measure the maturity level for comparison purposes. It also provided ideas that can be developed and used to build a strictly incremental method.

Endnotes
1 Hardy, Gary, "Make Sure Management and IT Are on the Same Page: Implementing an IT Governance Framework," Information Systems Control Journal, volume 3, 2002
2 The Associazione Italiana IS Auditors (AIEA) is the Milan ISACA chapter, in Italy.
3 COBIT Management Guidelines, IT Governance Institute and Information Systems Audit and Control Foundation, 2000, p. 100
4 For example, one has to check "a posteriori" whether a company that the method rates at, for example, 3.5 really satisfies all the conditions to meet maturity level 3.
5 This author has recently engaged in another benchmarking effort involving some 20 small Italian banks. That effort involves a strictly incremental approach.
6 Staff greater than 150 people but less than 1,500 people.
7 When the descriptions were split into distinct statements, some statements partially lost their context and became less understandable; for this reason, some statements were modified for the sake of clarity or to avoid ambiguities.
8 It was discovered that equivalent questions and sets of possible answers could be used. For example, the following alternative question was considered: "How do you rank the following statements in the range true, partially true, not completely false, false?" Some alternatives were discussed, and the one described in the article was chosen.

Andrea Pederiva
is a manager at Deloitte & Touche in Treviso, Italy. He has developed vast experience in the field of management and control of information systems, with specific reference to the development and the assurance of internal controls for IT organizations. He has experience in privacy, IT auditing, quality management, IT security, project management and project risk management.
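The "a posteriori" check mentioned in endnote 4 can be sketched as a simple consistency test. The 0.66 threshold below (the "quite a lot" compliance value) is an assumption of this sketch, not a figure from the article, and level 0 is skipped because its statements describe the absence of practices.

```python
# Sketch of the "a posteriori" check from endnote 4: a strictly
# incremental reading requires a company rated at, say, 3.5 to
# essentially satisfy every scenario up to level 3.
# ASSUMPTION: the 0.66 threshold (the "quite a lot" value) is this
# sketch's choice, not a value given in the article.
import math

def satisfies_lower_levels(score, level_compliance, threshold=0.66):
    """True if every level from 1 up to floor(score) has compliance >= threshold.

    Level 0 (nonexistent) is skipped: compliance with its statements
    would indicate immaturity, not maturity.
    """
    return all(c >= threshold for c in level_compliance[1:math.floor(score) + 1])

# The benchmark example rated 3.65, but levels 1 and 2 score only
# 0.00 and 0.50 (figure 5), so the strict criterion is not met:
print(satisfies_lower_levels(3.65, [0.00, 0.00, 0.50, 0.78, 0.77, 0.79]))  # prints False
```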

Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by the Information Systems Audit and Control Association, Inc. Membership in the association, a voluntary organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal.

Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of the Information Systems Audit
and Control Association and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors' employers, or the editors of this Journal. Information Systems Control Journal
does not attest to the originality of authors' content.

© Copyright 2003 by Information Systems Audit and Control Association Inc., formerly the EDP Auditors Association. All rights reserved. ISCA™ Information Systems Control Association™

Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the
association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles
owned by the Information Systems Audit and Control Association Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume,
and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the
association or the copyright owner is expressly prohibited.

www.isaca.org

