CIPFA, 1993
Published by Blackwell Publishers, 108 Cowley Road, Oxford OX4 1JF and
238 Main Street, Cambridge, MA 02142, USA.
Peter M. Jackson is Professor of Economics and Director of the Public Sector Economics Research Centre at the University of Leicester.
[Figure 1. Diagram: evaluation of the internal and external environments feeds into setting direction (mission-values-objectives; structure-style; reward-information systems; performance measures) under strategic management; business/service plans and activities follow, with implementation under operational management leading to outputs/outcomes.]
Implementation
The use of performance indicators by public sector
organizations has been on the public sector
management agenda in the UK for about ten years.
Research at the University of Leicester shows that
despite the encouragement and leadership of
agencies such as the National Audit Office, the
Audit Commission and the Accounts Commission
for Scotland, implementation of performance-evaluation systems has been slow, piecemeal and
often half-hearted. In-depth case studies of those
organizations which have successfully introduced a
performance-evaluation system demonstrate clearly
that a critical success factor is the amount of thought
and effort which is put into managing the change
process. Most organizations underestimate the
time it takes to design a performance-evaluation
system, to negotiate it into place, to trial it, debug it
and develop it. This whole cycle can take between
three and four years.
It is the responsibility of the senior management
group to ensure that a system exists that will
produce the strategic, continuous improvement
and change-management performance indicators.
They also need to sanction the resources required
to establish a management information system that
will collect and disseminate such information. Of
course the senior management team do not make
direct decisions on the precise performance
indicators. Successful implementation requires
that those who are responsible for delivering the
¹Research carried out at the Public Sector Economics Research Centre, University of Leicester, into the practice of performance measurement in public service organizations.
Wish-driven Strategies
What makes one organization more successful than
others? This is another way of asking about its
relative superior performance. Success and high
performance are not the 'product of wish-driven
strategy' (Kay, 1993, p.4). Vision, mission and
values are necessary but certainly not sufficient.
Too many strategic management texts incorrectly
assume that, provided an organization has a mission
statement, it will out-perform those that do
not.
If mission and vision remain in the minds of
the senior management team and are not
communicated throughout the organization then
they are valueless. The critical success factors (CSFs)
are an appraisal of the internal strengths and
weaknesses of the organization; an appreciation
and understanding of its environment; and the capacity
to learn and adapt as the uncertain future unfolds.
Strength in these areas must be worked at and
acquired. This is the function of management.
Success is not achieved by accident. Short-run
success gained as a result of chance will be ephemeral
unless strong management exists to capitalize on it.
Success requires a match between internal
capabilities (strengths) and external relationships.
This 'strategic fit' demands an understanding of
the organization's relationship with its suppliers;
its clients (customers); the political system within
which it is embedded; the socio-demographic
environment, etc. Many 'wish-driven strategies'
simply cannot be delivered because the organization […] weakens accountability.
So What?
The trend over the past ten years has been to
produce mission statements, strategies and business/
service plans for public service organizations. Data
have been collected which are then reconstituted
into performance measures and indicators. So
what? Has anything materially changed? Have
services improved; has greater value been
created?
Many of the changes are symbols of good
management practice. Little of substance or
consequence changes as a result of their
introduction. They are the first step on a long
journey. They are necessary but not sufficient.
Performance indicators simply represent
management information. Like all forms of
information they are an input which comes in a
variety of qualities. Some convey clearer signals
than others. However, like all inputs they need to
be worked on and crafted if they are to be useful.
Our research shows that too often decision-makers
delegate the task of collecting, collating and
publishing statistics for performance measurement
to a specialist group. This is in many cases a form
of sidelining performance measurement. The
information contained in the performance
indicators does not really influence decisions about
resource use and will not help decision-makers to
create value in these cases. Sidelining of
performance measurement has been found in local
authorities, central government departments and
GP practices.
If resource use decisions are to be influenced
by the information contained in performance
indicators then they need to be related to incentives
that will change behaviour. For example, the
information signals contained in the new capital
charges in the NHS have no impact on resource use
because the charges are not related to funding.
They do not influence behaviour in terms of asset
use with the result that their introduction has had
no impact on the efficient use of assets.
At present it is impossible to know
the value of performance indicators. It is not
possible to know, let alone evaluate, what has
happened as a result of their use. Whether or not
resources are used more efficiently; whether or not
public services are more effective; and whether or
not more value has been created remain
unanswered questions. The article of faith is that
overall performance is better with the availability of
the information than without.
Important information gaps do, however,
remain. Insufficient resources have been allocated
to analysing the cause and effect relationships
between different means of policy intervention and
outcomes. Until crucial aspects of policy analysis
References
Harris Research Centre (1990), Information for Strategic Management: A Study of Leading Companies. Manchester.
Jackson, P. M. and Palmer, A. (1993), Developing Performance Monitoring in Public Sector Organizations. Management Centre, University of Leicester.
Johnson, H. T. and Kaplan, R. S. (1987), Relevance Lost: The Rise and Fall of Management Accounting. Harvard Business School Press, Boston.
Kay, J. A. (1993), Foundations of Corporate Success. Oxford University Press, Oxford.
Kaplan, R. S. (Ed.) (1990), Measures of Manufacturing Success. Harvard Business School Press, Boston.
Kass, H. D. and Catron, B. L. (1990), Images and Identities in Public Administration. Sage Publications, London.