The maturity model provided by the COBIT Management Guidelines for the 34 COBIT IT processes is becoming an increasingly popular tool to manage the timeless issue of balancing risk and control in a cost-effective manner. Control Objectives for Information and related Technology (COBIT) is published by the IT Governance Institute (ITGI) and the Information Systems Audit and Control Foundation (ISACF).
The COBIT Maturity Model is an IT governance tool used to measure how well developed the management processes are with respect to internal controls. The maturity model allows an organization to grade itself from nonexistent (0) to optimized (5). Such capability can be exploited by auditors to help management fulfill its IT governance responsibilities, i.e., exercise effective responsibility over the use of IT just like any other part of the business.1
A fundamental feature of the maturity model is that it allows an organization to measure as-is maturity levels and define to-be maturity levels, as well as the gaps to fill. As a result, an organization can discover practical improvements to the system of internal controls of IT. However, maturity levels are not a goal, but rather a means to evaluate the adequacy of the internal controls with respect to company business objectives.
In volume 6, 2002, of the Information Systems Control Journal, the article “Control and Governance Maturity Survey: Establishing a Reference Benchmark and a Self-assessment Tool,” by Erik Guldentops, CISA, CISM, Wim Van Grembergen, Ph.D., and Steven De Haes, discusses the results of the 2002 ISACA survey on the maturity level of 15 COBIT IT processes. According to the article, survey target processes were selected a year prior by interviewing a group of 20 IT and senior experts.
The ISACA survey results can be used as a reference benchmark and a self-assessment tool. The results of the survey cover a broad range of countries, industries and size groups, making them useful for numerous companies worldwide.
In an engagement experience, this author participated on a team that used the COBIT Maturity Model to benchmark four possible vendors, and then compared its results to the ISACA survey results. The process undertaken, as well as the lessons learned and the results, is discussed in the remainder of this article.
At the beginning of this benchmarking effort, there were two main issues:
• The need for a criterion to choose the processes to benchmark
• The need for a method to measure the vendors’ maturity levels with respect to the COBIT Maturity Model
The processes to benchmark were chosen by scoring the COBIT IT processes on a risk-importance basis, from the point of view of a potential customer. This task followed a logic similar to the one in the risk assessment form of the COBIT Implementation Tool Set.
The definition of a method to measure the maturity level required more effort, in part because the desire was for a method precise and efficient enough to allow for interaction with potential vendors. A questionnaire and a ranking system were developed to compute the maturity level from the questionnaire results. While the approach was not unusual, a few new ideas were used that proved to be valuable. (These new ideas subsequently have been tested by other AIEA2 colleagues.)
The method used is not strictly incremental and, therefore, does not satisfy the COBIT Maturity Model’s incremental criterion,3 which has to be checked “a posteriori.”4
However, the method proved to be strong with respect to the objective of benchmarking the four organizations under examination, and the results were logical given the knowledge collected on the organizations during the benchmarking effort. Moreover, it appears that the method can be further developed to build a strictly incremental approach.5
Finally, even though the two sets of results were obtained with different methods, the comparison between the benchmarking results and the ISACA survey results provided a basis for an overall discussion of the distribution of the “strongest” and “weakest” areas.

Benchmarking Results
Figure 1 displays the results of the benchmarking effort as compared with the 2002 ISACA survey results. The four organizations were essentially software makers with an independent consultancy branch; hence, the benchmark values refer to the ISACA 2002 survey results related to the Europe, Middle East and Africa (EMEA) small6 IT service providers. The ISACA value was computed as the weighted mean of the respondents’ distribution on the six possible maturity levels (0 - 5).
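As a quick illustration of how such a reference value is obtained, the sketch below computes the weighted mean from a respondent distribution. The respondent counts are invented for the example; they are not the ISACA survey data.

```python
# Hypothetical respondent distribution over maturity levels 0-5
# (made-up counts, for illustration only; not the ISACA survey data).
respondents = {0: 2, 1: 8, 2: 15, 3: 12, 4: 5, 5: 1}

# Weighted mean: each level is weighted by its share of respondents.
total = sum(respondents.values())
reference_value = sum(level * count for level, count in respondents.items()) / total
print(round(reference_value, 2))  # 2.3
```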
The benchmarking results reveal that three of the four companies are aligned among themselves and with the ISACA sample, with the exception of a few processes. However, one company
process PO10 Managing Projects, with the compliance values for each statement.

    Maturity level 3 statements                              Answer       Compliance value
     1  The IT project management process and methodology    Quite a lot  0.66
        have been formally established and communicated.
     2  IT projects are defined with appropriate business    Quite a lot  0.66
        and technical objectives.
     3  Stakeholders are involved in the management of       Completely   1
        IT projects.
     4  The IT project organization and some roles and       Completely   1
        responsibilities are defined.
     5  IT projects have defined and updated schedule        Quite a lot  0.66
        milestones.
     6  IT projects have defined and managed budgets.        Completely   1
     7  IT project monitoring relies on clearly defined      A little     0.33
        performance measurement techniques.
     8  IT projects have formal post-system                  Quite a lot  0.66
        implementation procedures.
     9  Informal project management training is provided.    Completely   1
    10  Quality assurance procedures and post-system         Quite a lot  0.66
        implementation activities have been defined, but
        are not broadly applied by IT managers.
    11  Policies for using a balance of internal and         Completely   1
        external resources are being defined.
                                                             Total level: 8.63

(The answer scale is Completely = 1, Quite a lot = 0.66, A little = 0.33, Not at all = 0.)

The compliance value for the scenario can be computed as the average of the compliance values of the statements. In the case of maturity level 3, it is equal to 8.63/11 = 0.78. Working in the same way on the other maturity levels, one can compute a compliance value for all the maturity levels from 0 to 5. Figure 5 provides an example of the possible results of this computation.

Figure 5—Computation of the Maturity Level Compliance Values

    Maturity   Sum of statements        Number of maturity     Maturity level
    level      compliance values (A)    level statements (B)   compliance value (A/B)
    0          0.00                      2.00                  0.00
    1          0.00                      9.00                  0.00
    2          3.00                      6.00                  0.50
    3          8.63                     11.00                  0.78
    4          6.97                      9.00                  0.77
    5          6.31                      8.00                  0.79

Figure 6—Computation of the Normalized Compliance Vector

    Level    Not normalized           Normalized
             compliance values (A)    compliance values [A/Sum(A)]
    0        0.00                     0.000
    1        0.00                     0.000
    2        0.50                     0.176
    3        0.78                     0.275
    4        0.77                     0.272
    5        0.79                     0.277
    Total:   2.84                     1

Finally, the maturity level summary for the process was computed by combining the normalized compliance values for each maturity level, as shown in figure 7.

Figure 7—Computation of the Summary Maturity Level

    Level (A)   Normalized               Contribution
                compliance values (B)    (A*B)
    0           0.000                    0.00
    1           0.000                    0.00
    2           0.176                    0.35
    3           0.275                    0.83
    4           0.272                    1.09
    5           0.277                    1.38
    Total maturity level:                3.65

Lessons Learned
Because of its construction criteria, the questionnaire is aligned completely with the maturity model and fairly detailed with respect to the maturity requirements. This proved useful in supporting subsequent discussions aimed at identifying the key points that were enabling or preventing the organization from reaching a given maturity level.
The experience also suggested that, to exploit the benefits of the maturity level paradigm, it appears useful to allow a company to grade itself in the full range from 0 to 5 rather than having to choose a position in the coarse grid (0, 1, 2, ... 5).
As a suggestion, in performing a benchmarking effort, first discuss the maturity requirements without showing the maturity level to which the questions belong. This will reduce any bias by the respondents that can be present when they know the effects of their answers on the final result.
Further work must be done to assemble a strictly incremental approach to the measurement of the maturity level, as required by the COBIT Maturity Model. However, for comparison purposes, the method presented here proved to be efficient, providing strong results and facilitating the subsequent discussions on the use of the COBIT Maturity Model as an effective management tool.
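The computation illustrated in figures 5 through 7 can be condensed into a short script. The following Python sketch is only an illustration of the scheme described above, not a tool from the engagement; the function names are invented here, and the answer labels for the level 3 statements are reconstructed from their compliance values.

```python
# Weights for the four answer options, as used in the questionnaire scoring
# (completely = 1, quite a lot = 0.66, a little = 0.33, not at all = 0).
WEIGHTS = {"completely": 1.0, "quite a lot": 0.66,
           "a little": 0.33, "not at all": 0.0}

def level_compliance(answers):
    """Average compliance of one maturity level's statements (figure 5, A/B)."""
    values = [WEIGHTS[a] for a in answers]
    return sum(values) / len(values)

def summary_maturity_level(compliance_by_level):
    """Normalize the six compliance values (figure 6), then combine them as
    sum(level * normalized value) into a single maturity level (figure 7)."""
    total = sum(compliance_by_level.values())
    return sum(level * value / total
               for level, value in compliance_by_level.items())

# Maturity level 3 answers for PO10 Managing Projects, reconstructed from
# the compliance values shown in the questionnaire table above.
level3_answers = ["quite a lot", "quite a lot", "completely", "completely",
                  "quite a lot", "completely", "a little", "quite a lot",
                  "completely", "quite a lot", "completely"]
print(round(level_compliance(level3_answers), 2))  # 0.78, i.e., 8.63/11

# Compliance values for all six maturity levels, from figure 5 (column A/B).
compliance = {0: 0.00, 1: 0.00, 2: 0.50, 3: 0.78, 4: 0.77, 5: 0.79}
print(round(summary_maturity_level(compliance), 2))  # 3.65
```

Note that the summary level is a weighted mean of levels 0 through 5 with the normalized compliance values as weights, which is why partial compliance at several levels pulls the result between positions of the coarse grid.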
INFORMATION SYSTEMS CONTROL JOURNAL, VOLUME 3, 2003
To make the method applicable beyond comparison, such as for planning improvements (as-is, to-be, gap analysis), it must manage partial compliance at the lower levels. That is not an issue in the method presented here, but for improvements one wants to see consistency at the lower levels before evaluating the contributions at the higher levels.

Summary
The COBIT Management Guidelines do not suggest any special methodology to measure the maturity level of the IT processes, and many approaches can be followed. However, to use the COBIT Maturity Model as an effective management tool, companies must develop an efficient methodology to measure the maturity level of their IT processes. The results of a benchmarking effort based on the COBIT Maturity Model and the method used to measure a maturity level for the IT processes have been presented. Although the method is not strictly incremental, as required by the Maturity Model, it proved to be an efficient tool to measure the maturity level for comparison purposes. It also provided ideas that can be developed and used to build a strictly incremental method.

Endnotes
1 Hardy, Gary, “Make Sure Management and IT are on the Same Page: Implementing an IT Governance Framework,” Information Systems Control Journal, volume 3, 2002
2 Associazione Italiana IS Auditors (AIEA) is the Milan ISACA chapter, in Italy.
3 COBIT Management Guidelines, IT Governance Institute and Information Systems Audit and Control Foundation, 2000, p. 100
4 For example, one has to check “a posteriori” whether a company that the method rates at, for example, 3.5 really satisfies all the conditions to meet maturity level 3.
5 This author has recently engaged in another benchmarking effort involving some 20 small Italian banks. That effort involves a strictly incremental approach.
6 Staff is greater than 150 people, but less than 1,500 people.
7 When the descriptions were split into distinct statements, some statements partially lost their context and became less understandable; for this reason, some statements were modified for the sake of clarity or to avoid ambiguities.
8 It was discovered that equivalent questions and a set of possible answers could be used. For example, after some discussion, the following question was chosen: “How do you rank the following statements in the range true, partially true, not completely false, false?” Some alternatives were discussed, and the one described was chosen.

Andrea Pederiva
is a manager at Deloitte & Touche in Treviso, Italy. He has developed vast experience in the field of management and control of information systems, with specific reference to the development and the assurance of internal controls for IT organizations. He has experience in privacy, IT auditing, quality management, IT security, project management and project risk management.
Information Systems Control Journal, formerly the IS Audit & Control Journal, is published by the Information Systems Audit and Control Association, Inc. Membership in the association, a voluntary
organization of persons interested in information systems (IS) auditing, control and security, entitles one to receive an annual subscription to the Information Systems Control Journal.
Opinions expressed in the Information Systems Control Journal represent the views of the authors and advertisers. They may differ from policies and official statements of the Information Systems Audit
and Control Association and/or the IT Governance Institute® and their committees, and from opinions endorsed by authors' employers, or the editors of this Journal. Information Systems Control Journal
does not attest to the originality of authors' content.
© Copyright 2003 by Information Systems Audit and Control Association Inc., formerly the EDP Auditors Association. All rights reserved. ISCA™ Information Systems Control Association™
Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the
association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, Mass. 01970, to photocopy articles
owned by the Information Systems Audit and Control Association Inc., for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume,
and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the
association or the copyright owner is expressly prohibited.
www.isaca.org