
A Report of the

Policy Review and Performance


Scrutiny Committee

BUDGET BENCHMARKING

JANUARY 2006

County Council of The City and County of Cardiff


CONTENTS

FOREWORD 2

TERMS OF REFERENCE 3

BACKGROUND 4

KEY FINDINGS 5

RECOMMENDATIONS 6

WHAT IS BENCHMARKING 7

USING FINANCIAL COMPARATOR BENCHMARKS 8

THE DRIVERS 9

BUDGET BENCHMARKING RESEARCH 10

SERVICE LEVEL BENCHMARKS 16

INQUIRY METHODOLOGY 18

LEGAL IMPLICATIONS 19

FINANCIAL IMPLICATIONS 19

POLICY REVIEW AND PERFORMANCE SCRUTINY COMMITTEE MEMBERSHIP 20

FOREWORD

Ensuring that the Council delivers value for money for the people of Cardiff is essential.
Budget Benchmarking is a tool that can assist in this process and involves comparing
financial benchmarks for the Council with the same information from similar Councils.

As the Task & Finish group started to collect comparative data and receive evidence from
witnesses, it became clear that there is a wealth of financial information available which is
often under-utilised because technical and analytical skills rarely combine. It also became
apparent that Cardiff’s demographics are very different from other Authorities within Wales
which, in some cases, makes the selection of comparator authorities difficult. Diverging
performance and financial regimes between England and Wales also compound these
problems.

Whilst these difficulties were apparent, the Task and Finish Group undertook some scrutiny
of financial benchmarks at resource allocation level and at service level, and made some
interesting findings.

It will be noted from our conclusions and recommendations that the Committee considers
that, if fully and professionally implemented, benchmarking can be used to lock the Council
into a cycle of continuous improvement and to develop a culture where it is easier to
question the norm and to make changes.

I wish to express appreciation to officers and expert witnesses who presented evidence to
the Task & Finish group meetings. Their contribution, experiences, and ideas helped us to
formulate recommendations, which are commended to the Executive, and which I believe
if fully implemented could strengthen the culture of continuous improvement within the
Council.

Finally I would like to thank my colleagues on the Task & Finish group for their involvement
and hard work.

Councillor David Walker


Chairperson, Policy Review & Performance Scrutiny Committee

TERMS OF REFERENCE

1. The Policy Review and Performance Scrutiny Committee agreed the following terms
of reference for the Task & Finish Group at its committee meeting of 17 July 2005:

‘To undertake financial comparisons and utilise Benchmark information for


specific Council Services and examine the utility of this information for
scrutiny purposes’

2. To aid this inquiry a research project was commissioned. The Scrutiny Research and
Information Team was commissioned to “indicate how currently held / available local
authority data (financial returns) can be utilised to provide comparative material for
Committee members to assess budget allocation for selected services or functions”.

BACKGROUND

3. Within the public sector there are many parties interested in the financial performance
of an organisation. The public want to be satisfied that they are getting value for
money out of a Council service. Managers want to be able to compare the
performance of their own service to that of others, in order to isolate areas in which
the service could improve, and auditors want to assess the reliability of an
organisation’s financial statements. A tool used by these parties is financial
benchmarking. Benchmarking involves comparing various aspects of a Council’s
operations to those of another, in order to discover areas in which the Council could
improve.

4. At its meeting on the 12th May 2005 the Policy Review and Performance Scrutiny
Committee discussed the approach taken to scrutinising the Executive Budget
Proposals for 2005/06, with a view to identifying an appropriate role for scrutiny in
future budget rounds. The Committee discussed options for better engagement in the
budget with Councillor Mark Stephens, Executive Member – Finance & Economic
Development, Phil Higgins, Corporate Director (Resources), and Christine Salter,
Chief Financial Services Officer. From these discussions the Committee identified a
number of approaches that they would wish to adopt in future budget scrutiny.

5. One such option was Budget Benchmarking which would involve comparing the
budget allocated to various functions in Cardiff with that allocated to the same
functions in other similar authorities. This needed to take account of the relative
priority attached to those functions within the authorities and their relative
performance. It also needed to ensure that “like for like” comparisons were being
undertaken. The Committee resolved to conduct an inquiry to undertake financial
comparisons and utilise benchmark information for specific Council Services and
examine the utility of this information for service and budget planning purposes. This
report highlights the key findings and recommendations arising from the inquiry.

KEY FINDINGS

The key findings from this inquiry are:

6. Benchmarking at operational level using established benchmarking groups, such as
those run by the Association for Public Service Excellence (APSE), is useful for
informing service improvement. Such benchmarking typically involves both
performance and financial comparisons (e.g. unit costs).

7. Benchmarking purely budget information appears to be technically possible, but there
are limitations and caveats in collecting, using, and drawing conclusions from the
data, which greatly limit its use for overall budget planning purposes.

8. There appears to be a level of operational/service area benchmarking activity
ongoing within the authority, but the availability of benchmark data is limited to the
service areas involved and there is no reporting of the pertinent issues to Members.

9. It appears to be unclear how service areas are applying benchmarking information in
shaping their individual services and using benchmarking as a tool for continuous
improvement.

10. The financial comparator benchmarking research for the selected service functions
found that a high spending service did not necessarily have good performance (as
defined by the Comprehensive Performance Assessment (C.P.A.)), and indicated that
most of Cardiff’s example service areas were at the ‘average’ spend-per-head
attained by the comparator group (the selected authorities).

11. The diverging performance assessment regimes in Wales and England have resulted
in more robust arrangements in England via C.P.A. Under the C.P.A. framework,
Value for Money assessments place a greater demand on Local Authorities to
compare their service expenditure with others than does that of the Wales
Programme for Improvement.

12. Cardiff, as the largest city in Wales, has no directly comparable authorities in Wales.
Cardiff has a much larger population, a greater population density and a higher level
of urbanisation making comparison with English authorities more relevant. While
models exist to enable the Council to identify English Comparator authorities, data
available for benchmarking is not generally directly comparable.

13. The APSE benchmarks indicated that central charges (e.g. HR, Finance, Property
Overheads) may vary considerably between authorities; issues relating to this would
be unlikely to surface unless a corporate review of Business Management
Information was carried out within the Council.

RECOMMENDATIONS

14. The Policy Review and Performance Scrutiny Committee recommend that the
Executive:

R1. Integrates benchmarking information into the current Performance Management
arrangements in order to strengthen the culture of continuous improvement.

R2. Commissions APSE to assist in the development of an improved corporate approach
to the use and application of benchmarking across the Council so that opportunities
for continuous improvement are maximised.

R3. Establishes arrangements to ensure that benchmarking data is more widely available
across the Council, and that this information is easily accessible to Members, for
example by sending it to the Members’ Library.

R4. Reviews the appropriateness of benchmarking with other Welsh Authorities, given the
unique nature of Cardiff, and considers establishing links with similar English
Authorities to increase the robustness and validity of comparable budget and
performance information.

R5. Fully supports and encourages a full review of the Wales Programme for
Improvement process to ensure wider benchmarking and comparative assessments,
particularly the use of the C.P.A. Value for Money Assessment.

WHAT IS BENCHMARKING?

15. A benchmark is ‘A surveyor’s mark indicating a point in a line of levels; a standard or
point of reference’. The Task and Finish Group heard from a number of witnesses
that benchmarking is a powerful performance management tool, which can be used
to generate both incremental change and wide-ranging strategic reform for the
organisation. It is a learning process in which information, knowledge and experience
about leading practices are shared through partnerships between organisations. It
allows an organisation to compare itself with others and, in the process, step back
from itself and reflect. In addition, comparative measurement through benchmarking
helps to identify problems and opportunities, and also tests ideas and “gut feelings”
about performance and spend. Benchmarking is an ongoing process for finding
improved ways of doing things.

16. There is an important distinction between benchmarking as a process and specific
benchmarks. Benchmarks are data comparisons, such as financial spend (outturn)
and performance output figures. The Task and Finish Group intended to utilise
financial benchmarks to compare a selection of Cardiff County Council functions with
similar English Authorities.

17. The Task and Finish Group were informed that benchmarks could be categorised into
three types, which were either qualitative or quantitative.

USING FINANCIAL COMPARATOR BENCHMARKS

17. The Task and Finish Group heard from a number of witnesses about the benefits of
financial benchmarking. When used effectively it can identify a paucity of information,
assist in realistic target setting, and inform reviews of service delivery methods and
procedures. The possible shortcomings of undertaking financial comparisons can be
outweighed by the need to point services in the right direction. It was noted that it
can also be used to steer funding to under-resourced areas.

Benchmarking in Practice
Bristol City Council used financial comparison benchmarking to identify
the low performance and high cost of their Local Tax and Benefits Service.
This has led to a change in ethos, and a general shift upwards in
performance, efficiency and improvement.

18. Contextual information about a service is essential when scrutinising financial
comparator benchmarks, as it can assist in creating pictures of efficiency which can
be used to generate questions about a service relating to income, cost reduction and
service quality. Financial benchmark comparisons can also be used as a guide to ask
specific questions about a service. For example:

• the relationship between cost, quality and level of service,
• the relationship between income generation and cost reduction, and
• issues relating to maximising income generation.

19. Financial comparator benchmarking can be utilised at three levels:

a) Strategic resource allocation.
b) Service and financial planning.
c) Service reviews.

20. In order to determine the utility of budget benchmarking the Task and Finish Group
considered that it was necessary to undertake two exercises. The first exercise was
to analyse some financial comparisons of Council functions at the strategic resource
allocation level. The second exercise utilised the financial benchmarks for two
selected service areas using APSE data.

THE DRIVERS

21. The level of Council activity relating to financial benchmarking across the UK varies
considerably. The Task and Finish Group received information, from a number of
witnesses, about the drivers and historical nature for financial comparator
benchmarking within Cardiff, Wales, and England.

22. There are several drivers for financial comparator benchmarking. Within Wales the
key driver is the Wales Programme for Improvement (WPI). This is underpinned by
the Local Government Act 1999, which places a duty on Local Government in Wales
to continuously improve; there is therefore a need to compare with others, in order to
learn best practice and check that services are not excessively expensive when
compared with other areas.

23. In England, within the C.P.A. process, the Value for Money Assessment is a key driver.
Within the Audit Commission’s C.P.A. key lines of enquiry for Value for Money,
English Councils are expected to evidence that costs compare well with others,
allowing for external factors. Based on the evidence available, auditors judge how
effective Councils are at achieving good value for money against a set of criteria
split into four levels (level 4 being the best). At level 3, Councils are expected to
evidence that overall costs and unit costs for key services are low compared to other
councils providing similar levels and standards of service, allowing for local
context. At level 4, Councils are expected to evidence that they benchmark the
costs and quality of their services on a regular basis and that this
information is actively used to review and assess value for money in service provision
and corporately.

24. Figure 1 outlines some of the differences between WPI and CPA.

Figure 1. The Differences between WPI and CPA

WPI: Due to flexibility, direct comparisons between authorities are often not possible.
CPA: Direct comparisons readily available for all assessment areas.

WPI: Assessment results not usually published.
CPA: Assessment results published by the regulator.

WPI: No incentive scheme prescribed.
CPA: Well-defined freedom and flexibility incentives for improvement in performance.

WPI: No intervention measures prescribed.
CPA: If a council is identified as failing, early intervention measures can be applied.

WPI: Applicable to Local Government Authorities.
CPA: To be expanded to cover Fire and Rescue Authorities.

Both: Assessments support the development of proportionate audit and inspection programmes.

Source: Wales Audit Office

25. Welsh Councils make limited use of financial comparison benchmarking. The
diverging performance assessment regimes in Wales and England have resulted in
more robust arrangements in England (via C.P.A.). W.P.I.’s flexibility has resulted in
differing ways of reporting, and thus comparisons are difficult at a service level.

26. Other difficulties faced by Welsh Councils included gaining agreement regarding
definitions of costs, differing treatment of costs and the fact that expenditure
represents a local political decision. Evidence from the Welsh Local Government
Association (WLGA) indicated that there is also a reluctance to include unit costs in
performance measures at a Welsh level. The Wales Audit Office view was that whilst
there is a wealth of financial data available, technical knowledge and analytical skills
rarely combine.

27. The pattern of low usage of financial benchmarking at a Welsh level is reflected
within Cardiff County Council. From a historical perspective (pre-reorganisation) the
level of financial comparisons undertaken via the South Glamorgan Family Groups
benchmarking arrangements was high. Post-reorganisation saw a period of decline in
the number of financial benchmarking groups within the Council.

BUDGET BENCHMARKING RESEARCH

28. The Task and Finish Group commissioned research to provide comparative material
for four selected functions using Revenue Outturn data held by the ODPM and the
Welsh Assembly. This analysis was undertaken on a functional rather than structural
basis. The guidance and definitions for the service functions were found to vary
between England and Wales and the accountancy practices with regard to Trading
Account Services Returns were also different between the two countries. As a result,
the service functions below were selected on the basis of similarity:

Children’s and Families’ Services
Library Services
Development Control
Economic Development

29. The Task and Finish Group were informed that Cardiff is very different from the other
authorities in Wales; it has a much larger population, a greater population density and
a higher level of urbanisation. This makes the selection of comparator authorities
difficult. When benchmarking, it is important that comparisons are made with an
appropriate group.

30. The selection of comparison authorities is not, and never will be, an exact science.
The various models and groupings that exist serve only as signposts for potentially
comparable authorities; these include:

• Local Government Data Unit Wales (LGDUW) Comparator Model
• Family Origins, Crime and Disorder Model
• CIPFA Nearest Neighbour Model
• The ‘Big 11’
• The ‘Big 21’
• The Office For National Statistics (ONS) Classification of Local and
Health Authorities of Great Britain
• Major Cities (Housing) Grouping
• Major Cities 12 (Environmental Services)
• Education Schools Advisory Services Benchmarking Grouping
• APSE Performance Networks

31. The Task and Finish Group were provided with an example of the LGDUW
Comparator Model, a web-based model developed by the Data Unit in 2002. The
model allows the user to select any number of the twenty-three criteria available.
Based on data similarities, English and Welsh authorities are identified in
rank order in terms of comparability with the selected Council. The 23 criteria are
outlined below:

• Total population
• Population of children (less than 16)
• Population of adults (16 and over)
• Population of working age
• Population of retired age
• Proportion of children in the population
• Proportion of adults in the population
• Proportion of working age in the population
• Proportion of retired age in the population
• Population density

• Population density of children
• Population density of adults
• Population density of working age
• Population density of retired age
• Proportion of births in the population
• Proportion of deaths in the population
• Proportion of the total population claiming attendance/disability allowance
• Proportion of the working age population claiming unemployment benefits
• Number of people over working age with jobs
• Proportion of the adult population with jobs
• Proportion of the working age population claiming income support
• Proportion of the working age population claiming job seekers allowance
• Child poverty
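A model of this kind can be sketched in outline: normalise each selected criterion, then rank candidate authorities by their distance from the target authority. This is an illustrative reconstruction only, not the Data Unit's actual implementation, and the authority names and figures below are invented for the example.

```python
# Hypothetical sketch of a comparator model: rank authorities by
# similarity to a target across user-selected criteria.
# All names and figures are illustrative, not real returns data.

def rank_comparators(target, candidates, criteria):
    """Rank candidate authorities by closeness to the target using
    a standard-deviation-normalised Euclidean distance, so that a
    large-magnitude criterion (e.g. total population) does not swamp
    criteria expressed as proportions."""
    spreads = {}
    for name in criteria:
        vals = [target[name]] + [c[name] for c in candidates.values()]
        mean = sum(vals) / len(vals)
        sd = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
        spreads[name] = sd or 1.0  # guard against a zero-variance criterion

    def distance(profile):
        return sum(
            ((profile[n] - target[n]) / spreads[n]) ** 2 for n in criteria
        ) ** 0.5

    # Most similar authority first.
    return sorted(candidates, key=lambda a: distance(candidates[a]))

# Illustrative profiles: (population, density per sq km, % children).
target = {"population": 305_000, "density": 2_200, "pct_children": 19.5}
candidates = {
    "Authority A": {"population": 380_000, "density": 3_300, "pct_children": 19.0},
    "Authority B": {"population": 140_000, "density": 600, "pct_children": 21.0},
}
print(rank_comparators(target, candidates, ["population", "density", "pct_children"]))
```

In this sketch the more urban Authority A ranks first, which mirrors the point made in the report: on criteria such as density and population, English cities can sit closer to Cardiff than smaller Welsh authorities do.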

32. The Task and Finish Group were informed about the ONS model which utilises a
clustering technique where Cardiff is defined as a regional centre.

33. The research team used the LGDUW and ONS Comparator Models to select
authorities similar to Cardiff. Using the two models, plus the two nearest comparators
in Wales, the following Authorities were selected as part of the research exercise:

Bristol, Derby City, Southampton, Bolton, Wigan, Sheffield, Sunderland,
Coventry, Walsall, Leeds, Wakefield, Newport and Swansea.

34. The following data sets were utilised and cross-referenced with the English
Authorities’ CPA scores:

• Spend per head of population
• Absolute expenditure
• Spend as a proportion of total annual budget
• Wales average spend
• English average spend
• Trend for each applicable indicator over two years (02-03 & 03-04) for each
authority
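The core of these data sets is a simple calculation: divide each authority's net outturn for the service by its population, then compare the result with the comparator-group average. The sketch below shows that computation with invented outturn and population figures; it is not the research team's actual spreadsheet.

```python
# Minimal sketch of the spend-per-head benchmark used in the research.
# Outturn and population figures are illustrative, not actual returns.

def spend_per_head(outturn_gbp, population):
    """Net service outturn divided by resident population."""
    return outturn_gbp / population

# {authority: (net outturn in £, population)} for one service, one year.
returns = {
    "Authority A": (24_000_000, 300_000),
    "Authority B": (9_000_000, 150_000),
    "Authority C": (30_000_000, 500_000),
}

per_head = {a: spend_per_head(o, p) for a, (o, p) in returns.items()}
group_average = sum(per_head.values()) / len(per_head)

# Report each authority against the comparator-group average,
# highest spender per head first.
for authority, value in sorted(per_head.items(), key=lambda kv: -kv[1]):
    flag = "above" if value > group_average else "at or below"
    print(f"{authority}: £{value:.2f} per head ({flag} group average £{group_average:.2f})")
```

Repeating the calculation for a second year gives the two-year trend listed above; the caveats in paragraph 39 (differing definitions, grants and trading-account treatment) apply to the inputs, not the arithmetic.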

35. Figure 4 demonstrates that for some services, high levels of spend per head do not
result in high performance ratings. For example, Bolton M.B.C. has the highest
performance ratings within the group but has low spending per head of the
population. Conversely the highest spender within the group, Walsall, has one of the
lowest CPA and Social Services Inspectorate ratings. It was noted that the poorer
performing services may be attempting to front load resources to address their
performance issues.

36. Most of Cardiff’s example service areas were at the ‘average’ spend-per-head
attained by the comparator group (the selected authorities). The Children’s Services
financial benchmarks highlighted this (see figure 5). Library Services in Cardiff were
also placed at the average level for the comparator group (see figure 6).

Authority | Social Care - Children (CPA) | SSI Rating for Social Services (Stars) | Serving Children Well? | Capacity to Improve Children's Services | Spending Levels 2003-04 (1 = Highest Spend Per Head)
Bolton MBC | 4 | 3 | Yes | Excellent | 9
Bristol CC | 2 | 1 | Some | Promising | 3
Coventry CC | 2 | 2 | Some | Promising | 2
Derby CC | 2 | 2 | Some | Promising | 6
Leeds | 3 | 2 | Most | Uncertain | 4
Sheffield | 3 | 2 | Most | Promising | 8
Southampton | 3 | 2 | Most | Promising | 5
Sunderland | 3 | 3 | Most | Excellent | 7
Wakefield | 3 | 2 | Most | Promising | 12
Walsall | 2 | 1 | Some | Promising | 1
Wigan | 3 | 2 | Most | Promising | 11

Figure 4. Children’s Services Financial Comparator Benchmarks

Figure 5: Children’s Services Net Expenditure (02-03 & 03-04) - Spend-per-Head by Selected
Welsh and English Authorities

[Bar chart: spend-per-head (£) in 2002-03 and 2003-04 for each selected authority in the comparator group, with the all-authority average spend-per-head shown for each year.]

Figure 6: Library Services Expenditure (02-03 & 03-04) - Spend-per-Head by Selected
Welsh & English Authorities

[Bar chart: spend-per-head (£) in 2002-03 and 2003-04 for each selected authority in the comparator group, with the all-authority average spend-per-head shown for each year.]

37. Anomalies existed with the Development Control (figure 7) and Economic
Development financial comparisons which adversely affected question generation.
The Welsh Assembly and ODPM Economic Development financial comparisons
were found to cover cost centres from 7 service areas which compounded the
difficulties of exploring the issues within individual services. It was also highlighted
that there were distinctive funding streams within regeneration funding (e.g. Objective
One Funding, New Deal for Communities) which also made financial comparisons
and drawing conclusions problematic.

Figure 7: Development Control ('Core-Planning') Net Expenditure - Spend-per-Head by
Selected Welsh and English Authorities

[Bar chart: spend-per-head (£) in 2002-03 and 2003-04 for each selected authority in the comparator group, with the all-authority average spend-per-head shown for each year; some authorities show negative net spend.]

38. The methodology for selecting comparator Authorities was deemed to be appropriate
and, given Cardiff’s regional status as a Capital City, the need to compare with
regional centres was also justified.

39. The financial benchmarks enabled Members to assess the spending levels for the
specific service functions for the selected authorities. However, the information did not
generate any conclusions. Undertaking spend comparisons appears to be technically
possible, but there are limitations and caveats in collecting and using the data, which
are detailed below:
• Due to devolution there are differences between the definitions of the services
that make comparisons between England and Wales difficult.
• Due to differences in the way Trading Account Services Returns are handled
in Wales and England several service areas cannot be compared accurately.
• There are differing funding structures in England which may skew the results
especially relating to grants.
• The services within the financial returns do not always equate to the service
structure within Cardiff County Council.
• Local Authorities have different needs and priorities and therefore spending
requirements, and different levels of funding from Government.

• Local Authorities have different mixes of fixed and variable costs and thus differing
points at which they are most efficient (within Library Services, other Authorities
(e.g. Newport) have half the level of buildings, which affects the outturn figures).
• Local Authorities operating in areas of higher poverty or rural conditions might
be dealing with factors that increase their outturn.
• Data-gathering difficulties in obtaining timely information from the ODPM.
• Differences in reporting under Financial Reporting Standard 17 (FRS 17).
• Financial comparisons alone do not take account of a range of complicated
factors in relation to differences in delivery of services, current infrastructure in
both regions and social need factors.

SERVICE LEVEL BENCHMARKS

39. The purpose of the second exercise was to scrutinise the APSE operational financial
benchmarks for two specific services. The APSE benchmarking process is organised
via the APSE Performance Network, which provides performance comparisons on a
"like-for-like" basis for local Authorities across the UK.

40. The APSE Performance Network selects comparable authorities (Family Groups) on
the basis of an evaluation of the local characteristics, or context, in which the service is
provided. A series of relevant, valid factors with appropriate importance weightings,
agreed by a group of practitioners, is utilised to bring together comparable
benchmarking partners. Figure 8 provides details of the drivers used in making the
assessment.

Figure 8. APSE Key Drivers for Refuse Collection (Source: APSE)

41. The Members considered a basket of performance measures including unit costs and
other budgetary information for Parks and Waste Management.

42. The APSE data and the explanations of the officers indicated that the level of
expenditure and performance is driven by policy decisions and priorities. For example,
Parks data indicated that Cardiff was in the lower quartile for National Playing Fields
Association (NPFA) Standard Playgrounds per 1,000 children, which historically may
not have been a priority for the Council. In respect of waste, it was noted that policy
decisions were key to increasing performance and generating efficiencies.

43. Care is required when considering performance and financial comparator
benchmarks, as several factors can influence the level of expenditure at service level.
For example, large investments in one year would inflate the reported financial
position of a service disproportionately. It was found that Parks had high central
charges (16%), which are not always within the control of operational managers.

44. The financial comparator information enabled discussions about the relationship
between service expenditure and performance plus factors which increased service
area costs, e.g. overtime or the nature of the parks to be maintained. The recycling
performance and cost information were discussed with the lower cost in Cardiff being
associated with lower recycling activity.

45. It was noted that analysing the statistical information in isolation was not useful but it
can be used to drill down for more contextual information about a service.

46. The scrutiny of the APSE benchmarks alongside explanations from officers enabled
Members to create pictures of efficiency which generated questions about a service
relating to income generation, cost reduction and service quality. The benchmarking
information also assisted in the generation of questions relating to the budgetary
pressures faced by the service area.

47. Some performance indicators did not necessarily reflect reality on the ground. For
example the performance indicator “Households Covered by Kerbside Recycling
Collections” indicated that Cardiff’s figures were within the upper quartile. However,
due to its definition, the indicator measured the potential level of households covered
by kerbside recycling collections, not the actual take-up, which was significantly
lower due to the costs incurred and choices made by
residents.

48. The Task and Finish Group considered that the availability of benchmark data is, in
general, limited to the service areas involved and that there is no reporting of the
pertinent issues to Members.

INQUIRY METHODOLOGY

46. The Scrutiny Committee applies a project management approach to its inquiries,
including mechanisms to consistently prioritise topics suggested for scrutiny (the
PICK process), scoping reports and project plans. The aim of these is to ensure there
is dialogue with the service areas involved in the scrutiny process, with the ultimate
aim of improving overall service delivery and enabling effective scrutiny.

47. The process was undertaken with assistance and advice from Financial Services.

48. Members of the Task & Finish Group received a briefing report providing definitions
relating to benchmarking, the methodological approaches to benchmarking and the
factors to consider when analysing benchmark information.

49. Members received evidence from internal and external witnesses:

• Internal Witnesses: Phil Higgins (Corporate Director, Resources), Mara
Michael (Chief Children’s Services Officer), Paul Orders (Head of Policy &
Economic Development), Elspeth Morris (Libraries and Information
Development Manager), Trevor Gough (Chief Leisure, Libraries and Parks
Officer), Malcom Evans (Chief Regulatory Services Officer), Phillip Sherratt
(Chief Officer, Waste Management), and Mike Clark (Operational Manager,
Parks and Bereavement Services).

• External Organisations: Will Mclaen (Performance and Improvement Adviser,
WLGA), Phillip Ramsdale (Executive Director of IPF), David Rees (Head of
Knowledge and Information, Wales Audit Office), Matthew Hockridge (Senior
Researcher, Wales Audit Office), Carew Reynell (Director of Central
Support Services, Bristol City Council) and Debbie Johns (Principal Adviser,
APSE).

LEGAL IMPLICATIONS

50. The Scrutiny Committee is empowered to enquire, consider, review and recommend
but not to make policy decisions. As the recommendations in this report are to
consider and review matters there are no direct legal implications. However, legal
implications may arise if and when the matters under review are implemented with or
without any modifications. Any report with recommendations for decision that goes to
Executive/Council will set out any legal implications arising from those
recommendations. All decisions taken by or on behalf of the Council must (a) be within
the legal powers of the Council; (b) comply with any procedural requirement imposed
by law; (c) be within the powers of the body or person exercising powers on behalf of
the Council; (d) be undertaken in accordance with the procedural requirements
imposed by the Council e.g. Scrutiny Procedure Rules; (e) be fully and properly
informed; (f) be properly motivated; (g) be taken having regard to the Council's
fiduciary duty to its taxpayers; and (h) be reasonable and proper in all the
circumstances.

FINANCIAL IMPLICATIONS

51. There are no direct financial implications arising from this report. However, financial
implications may arise if and when the matters under review are implemented with or
without any modifications.

POLICY REVIEW & PERFORMANCE SCRUTINY COMMITTEE
MEMBERSHIP 

Councillor David Walker, Chairperson

Councillor Roger Burley
Councillor Joseph Carter
Councillor Paul Chaundy
Councillor Timothy Davies
Councillor Russell Goodway
Councillor John Norman
Councillor John Sheppard
Councillor Simon Wakefield

Scrutiny Services, Cardiff County Council


3/4 The Courtyard, County Hall, Atlantic Wharf, Cardiff CF10 4UW
Tel: 029 2087 2296 Fax: 029 2087 2579 Email: scrutinyviewpoints@cardiff.gov.uk
© 2005 Cardiff County Council

