
Working Council for Chief Information Officers

IT Balanced Scorecards
End-to-End Performance Measurement for the Corporate IT Function

 Drivers of Balanced Scorecard Adoption in IT


 Best-in-Class IT Balanced Scorecard Metrics
 Principles of Balanced Scorecard Design and Metrics Selection
 Scorecard Development and Life-Cycle Management

© 2003 Corporate Executive Board



Working Council for Chief Information Officers
2000 Pennsylvania Avenue NW
Washington, DC 20006
Telephone: 202-777-5000
Facsimile: 202-777-5100

166 Piccadilly
London, W1J 9EF
United Kingdom
Telephone: +44-(0)20-7499-8700
Facsimile: +44-(0)20-7499-9700

www.cio.executiveboard.com

Working Council Staff
Executive Director: Chris Miller
Managing Director: Jaime M. Capellá
Practice Manager: Kris van Riper
Project Managers: Andrew Horne • Matt McWha
Consultants: James Bilodeau • Carsten Schmidt
Senior Analyst: Eric Tinson
Analysts: Kiran Mishra • Michael Scutari • Rich Flanagan • Brett Neely

Creative Solutions Group
Lead Graphic Design Specialists: Jon Prinsky • Elizabeth Sugerman
Publications Editor: Dave Engle
Senior Director: Brian Foster
Directors: Sheldon Himelfarb • Matt Kelly
Associate Directors: Stuart Roberts • Ken Rona • Audrey Taylor

Note to Members

This project was researched and written to fulfill the research requests of several members of the Corporate Executive Board and as a result may not satisfy the information needs of all member companies. The Corporate Executive Board encourages members who have additional questions about this topic to contact the Board staff for further discussion. Descriptions or viewpoints contained herein regarding organizations profiled in this report do not necessarily reflect the policies or viewpoints of those organizations.

Confidentiality of Findings

This project has been prepared by the Corporate Executive Board for the exclusive use of its members. It contains valuable proprietary information belonging to the Corporate Executive Board and each member should make it available only to those employees and agents who require such access in order to learn from the material provided herein, and who undertake not to disclose it to third parties. In the event that you are unwilling to assume this confidentiality obligation, please return this document and all copies in your possession promptly to the Corporate Executive Board.

Legal Caveat

The Working Council for Chief Information Officers has worked to ensure the accuracy of the information it provides to its members. This report relies upon data obtained from many sources, however, and the Working Council for Chief Information Officers cannot guarantee the accuracy of the information or its analysis in all cases. Further, the Working Council for Chief Information Officers is not engaged in rendering legal, accounting, or other professional services. Its reports should not be construed as professional advice on any particular set of facts or circumstances. Members requiring such services are advised to consult an appropriate professional. Neither the Corporate Executive Board nor its programs is responsible for any claims or losses that may arise from (a) any errors or omissions in their reports, whether caused by the Corporate Executive Board or its sources, or (b) reliance upon any recommendation made by the Working Council for Chief Information Officers.

Catalog No.: CIO1L9VDH

Table of Contents

Executive Summary • iv
Member Self-Diagnostic • vii
An Exhaustive Compendium of IT Balanced Scorecard Metrics • 1
Introduction: Principles of Balanced Scorecard Design and Metrics Selection • 9
Best-in-Class IT Balanced Scorecard Metrics • 21
Financial Performance • 22
Project Performance • 26
Operational Performance • 30
Talent Management • 34
User Satisfaction • 40
Information Security • 46
Enterprise Initiatives • 50
Scorecard Development and Life-Cycle Management • 55
Scorecard Rollout • 56
Data Collection and Quality Assurance • 58
Scorecard Review and Revision • 60
Facilitating Scorecard Adoption • 62
Appendix I: Collected IT Balanced Scorecards • 67
Appendix II: IT Balanced Scorecard Tools and Vendors • 91
Order Form • 97
In-Person Research Presentations • 98
Working Council Project Support Desk • 99


Executive Summary

Institutionalizing IT Balanced Scorecards


Multiple Drivers of Greater IT Performance Scrutiny

IT organizations are increasingly adopting the balanced scorecard as a management tool, incorporating financial, operational, talent management, project management, and user satisfaction perspectives into assessments of the performance of the function. This responds to two major categories of challenges:

1. Business-Centric Challenges—These include the CIO’s need to link IT strategy with business strategy, to monitor service levels while at the same time reducing expenses, and to better demonstrate the business value of IT.

2. IT-Centric Challenges—These include the CIO’s need to better manage IT’s human capital and customer satisfaction, to baseline IT’s performance with respect to external providers, to better manage historically politicized resource allocation decisions, and to correct chronic project underperformance.

Design Principles of Exemplar IT Balanced Scorecards

Based on an extensive review of IT balanced scorecards collected from corporate exemplars, the Working Council finds that the most advanced scorecards share six key structural attributes:

1. Simplicity of Presentation—Exemplar enterprise IT balanced scorecards are limited to a single page of 10 to 20 metrics that communicate performance in concise, non-technical language for consumption beyond the IT function.

2. Explicit Links to IT Strategy—Best-in-class IT balanced scorecards are tightly linked with the outputs of the annual IT strategic planning process and help track progress against IT’s key goals and objectives.

3. Broad Senior Executive Commitment—Exemplars involve a mix of senior IT and business leadership in balanced scorecard design, metrics selection, and regular review.

4. Enterprise-Standard Metrics Definitions—Progressive practitioners achieve consensus on and clearly document scorecard metrics definitions, allowing review meetings to focus on decisions rather than debate over the composition or relevance of individual metrics.

5. Drill-Down Capability and Available Context—To supplement a high-level view of IT performance and allow for the detailed review of trends or variance, exemplar balanced scorecards provide visibility into the component elements, source, and context of scorecard metrics.

6. Individual Compensation Linked to Scorecard Performance—To facilitate adoption of the IT balanced scorecard and ensure stakeholder accountability, leading companies link achievement of scorecard targets to individual compensation.

Seeking a More Balanced View of IT Performance

Progressive IT balanced scorecard practitioners track metrics in five key categories, seeking to “balance” traditional supply-side operational metrics with demand-side measures such as customer satisfaction. The five key metrics categories are:

Financial Performance—A set of granular financial metrics allows CIOs to understand IT spending in the context of service levels, strategy implementation, and project progress.

Project Performance—Spurred by a legacy of failed projects and chastened by the reality that large-scale initiatives are most at risk, exemplars utilize the balanced scorecard as a vehicle for tracking of project progress and status, with a focus on enterprise projects.
Operational Performance—Instead of concentrating measurement efforts on system-specific, day-to-day IT operational metrics, best-in-class balanced scorecard practitioners seek to provide an aggregate, customer-focused view of IT’s operations.

Talent Management—Seeking to better manage the IT organization’s human capital, progressive scorecard practitioners track IT staff satisfaction and retention, as well as the attractiveness of the IT department on the external IT skills market.

User Satisfaction—By regularly assessing end-user satisfaction, CIOs can more easily identify service delivery problems and more accurately articulate cost and service quality trade-offs for consideration by business decision makers.

Additional Categories in the Ascent

Exemplars include supplemental metric categories to better understand IT’s performance in two critical areas:

Information Security—The recent surge in hacks and viruses, coupled with increased geopolitical tensions, has led advanced IT balanced scorecard practitioners to elevate information security to the category level, using the scorecard to monitor remediation efforts for known vulnerabilities and track proactive policy and certification efforts.

Enterprise Initiatives—Progressive IT departments are also using their balanced scorecards to highlight IT’s contributions to initiatives of corporate strategic importance (for example, IT integration work in service to post-merger synergy targets).

Scorecard Life-Cycle Management

After designing the IT balanced scorecard and selecting the appropriate metrics, advanced practitioners develop a handful of critical competencies to manage the IT balanced scorecard across its “life cycle”:

Performance Transparency—To facilitate achievement of targets and increase visibility into IT performance, best-in-class scorecard users stage regular executive-level reviews of metric status and gap to goal.

Formalized Data Collection Roles and Responsibilities—Exemplars are creating clearly delineated collection processes and roles to ensure data freshness and accuracy.

Closed-Loop Scorecard Revision—Progressive scorecard practitioners create a closed-loop process for updating categories and metrics as business and IT strategies change.

An IT Balanced Scorecard Deployment Toolkit

The Working Council has included three appendices in this brief to assist IT organizations in developing and deploying IT balanced scorecards. The first reproduces the IT balanced scorecards used by the companies profiled in this brief to provide members with an illustrative sample of scorecard presentation formats. The second is a compendium of metrics, organized by scorecard category, that allows members to qualitatively benchmark themselves against other scorecard practitioners or compress the cycle time required to generate candidate metrics for newly created scorecards. The third appendix contains an overview of the major vendors of balanced scorecard software solutions, designed to help IT executives exploring the purchase of one of these tools better understand potential solutions.
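The structural attributes described in this summary amount to a small data model: a one-page scorecard of 10 to 20 metrics, each belonging to a category and carrying a target, an owner, and drill-down context. A minimal sketch of that model follows; every class name, field, and example value is an illustrative assumption, not drawn from any profiled company.

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    """One scorecard line item: a named measure with a target and an owner."""
    name: str
    category: str        # e.g., "Financial Performance", "User Satisfaction"
    target: float
    actual: float
    owner: str           # metric owner responsible for timely data delivery
    context: str = ""    # drill-down context: benchmarks, history, comments

    def gap_to_goal(self) -> float:
        """Variance reviewed at executive-level scorecard meetings."""
        return self.actual - self.target

@dataclass
class Scorecard:
    """A one-page IT balanced scorecard of 10 to 20 metrics."""
    metrics: list[Metric] = field(default_factory=list)

    def fits_one_page(self) -> bool:
        # The report's simplicity principle: 10 to 20 metrics total.
        return 10 <= len(self.metrics) <= 20

    def by_category(self, category: str) -> list[Metric]:
        return [m for m in self.metrics if m.category == category]

# Example: one metric and its gap to goal.
m = Metric("Critical systems availability (%)", "Operational Performance",
           target=99.5, actual=99.1, owner="Infrastructure lead")
print(round(m.gap_to_goal(), 2))  # -0.4
```

The gap-to-goal and category groupings mirror the review meetings and drill-down behavior the summary describes; a real deployment would add collection frequency and data lineage per metric.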



Member Self-Diagnostic

Assessing the Need for an IT Balanced Scorecard


The following questions are intended to assist CIOs in diagnosing whether an IT balanced scorecard would be a useful addition to their
current IT performance management framework.

1. Can I clearly articulate the link between IT operational and project activities and the organization’s stated strategic business goals? Yes / No

2. Is there a process or mechanism in place to track the impact on service levels and satisfaction of ongoing cost-efficiency efforts? Yes / No

3. Can I describe the performance of the IT function in a concise, non-technical, business-friendly fashion? Yes / No

4. Can I effectively communicate the value that IT creates for the business? Yes / No

5. Can I communicate a holistic perspective of IT performance consistently across various geographies and business units? Yes / No

6. Can I easily compare the performance of my IT function to that of industry competitors or companies with similar geographic dispersion or scale? Yes / No

7. Do IT performance management meetings focus almost solely on discussions of metric comparability and validity rather than on making resource allocation decisions? Yes / No

8. Do I have a sufficient understanding of the progress and status of ongoing IT project work to allow for corrective action if major projects are at risk for scope creep, budget overruns, or schedule delays? Yes / No

Total “No” _______

Diagnostic Evaluation
If four or more “No” answers, then adoption of an IT balanced scorecard
may facilitate IT performance management at your organization.


Member Self-Diagnostic (continued)

Designing and Maintaining a World-Class IT Balanced Scorecard


To assist member companies in the initial design and implementation of IT balanced scorecards, as well as the reevaluation of existing scorecards,
the Working Council has created the following diagnostic questionnaire.

Structuring the Scorecard

1. Does the IT balanced scorecard fit onto a single page (or screen)? Yes / No
2. Are the IT balanced scorecard’s metrics devolved from the goals articulated in the IT strategic plan? Yes / No
3. Are these metrics expressed in non-technical language, allowing business decision makers to easily understand IT performance? Yes / No

Selecting Scorecard Categories

4. Does the IT balanced scorecard supplement financial and operational metrics with categories that track project performance, user satisfaction, and talent management? Yes / No
5. Does the IT balanced scorecard also include categories for information security? Yes / No
6. Does the scorecard outline target levels for each metric that have been agreed upon by both senior IT and business leadership? Yes / No
7. Are scorecard categories, metrics, and weightings revisited on an annual cycle, to ensure continued relevance to changing business needs? Yes / No

Selecting Scorecard Metrics

8. Do financial metrics move beyond simple reporting of total IT spend to help decision makers reallocate IT funding between functional areas, business units, and portfolio categories? Yes / No
9. Are operational metrics aggregated to provide decision makers with a “user’s perspective” of IT performance? Yes / No
10. Does the balanced scorecard’s project performance category include an assessment of compliance with enterprise architecture goals and contribution to corporate business strategies? Yes / No
11. Do measures of customer satisfaction incorporate both end-user and executive perspectives on IT performance? Yes / No
12. Do talent management metrics focus on gauging staff satisfaction, external reputation of the IT organization, and other organizational attributes likely to make the company a destination for high-potential IT talent? Yes / No
13. Does the balanced scorecard build awareness of information security issues by providing senior decision makers with an assessment of the organization’s vulnerability? Yes / No
14. Are metrics designed to track IT’s contribution to major enterprise initiatives aggregated in a single scorecard category, allowing business sponsors to quickly assess IT’s level of support? Yes / No

Subtotal “Yes” _______


© 2003 Corporate Executive Board
Member Self-Diagnostic (continued)

Designing and Maintaining a World-Class IT Balanced Scorecard


Ensuring Data Accuracy and Relevance

15. Do all metrics have clear, well-documented definitions, agreed upon by senior IT and business leadership? Yes / No
16. Does each IT balanced scorecard metric have a defined collection frequency (e.g., monthly, quarterly, annually) based on the volatility of the business strategy it helps enable? Yes / No
17. Have all scorecard metrics been assigned to a metric owner whose compensation is based on his or her timely delivery of required data? Yes / No
18. Is data accuracy verified by local metrics experts before data are published to the scorecard? Yes / No
19. Does the IT balanced scorecard provide readers with the ability to drill down into the data underlying scorecard metrics? Yes / No
20. Does the IT balanced scorecard provide readers with context for changes in performance (for example, historical reference data, external benchmarks, or metric owner comments)? Yes / No

Facilitating Management Decision Making

21. Have business decision makers, metrics owners, and IT leaders received training on the basic concepts and uses of balanced scorecards? Yes / No
22. Is the IT balanced scorecard reviewed on a regular basis by IT and business executives senior enough to make decisions based on scorecard information? Yes / No

Total “Yes” _______

Diagnostic Evaluation
Total Number of “Yes” Answers / Assessment
1–8: IT Balanced Scorecard Novice
9–16: Developing IT Balanced Scorecard Competency
17–22: Balanced Scorecard Exemplar
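The assessment above is a simple tally against fixed bands. A minimal sketch of the scoring logic; the band boundaries come from the table, while the function name and error handling are illustrative additions.

```python
def assess(yes_count: int) -> str:
    """Map the total number of 'Yes' answers on the 22-question design
    diagnostic to the assessment bands defined in the table above."""
    if not 0 <= yes_count <= 22:
        raise ValueError("the design diagnostic has 22 questions")
    if yes_count <= 8:
        return "IT Balanced Scorecard Novice"
    if yes_count <= 16:
        return "Developing IT Balanced Scorecard Competency"
    return "Balanced Scorecard Exemplar"

print(assess(7))   # IT Balanced Scorecard Novice
print(assess(12))  # Developing IT Balanced Scorecard Competency
print(assess(20))  # Balanced Scorecard Exemplar
```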


Roadblocks to Successful Scorecard Use


Process Problems Hinder Successful Scorecard Deployment

Working Council research into IT balanced scorecard deployments at dozens of large organizations reveals that those efforts often fall prey to a set of seven “deadly sins” of balanced scorecard design and use:

1. An IT-Centric View of IT Performance—Failure to involve senior business decision makers in metrics design and selection leads to an operations-biased perspective of IT performance.

2. Measures That Don’t Matter—Poor linkage between IT balanced scorecard metrics and articulated IT strategies results in strategic “drift” between day-to-day IT work and the needs of the business.

3. No Common Ground—The lack of a standard set of metrics definitions leads to divisional, regional, business unit, and local variation, complicating scorecard data aggregation.

4. Overreliance on Tools—Focus on the rollout of data collection tools at the expense of creating a clearly articulated data collection process results in inaccurate, outdated scorecard information, hindering effective decision making.

5. Lack of Drill-Down Capability and Metric Context—Limited access to detailed data and contextual information underlying scorecard metrics stymies trend or variance analysis and complicates scorecard interpretation.

6. Too Many Metrics—A surplus of metrics overwhelms the scorecard reader and leads to suboptimal use of senior decision makers’ limited time.

7. No Individual Impact—The absence of incentives linking individual behavior to IT balanced scorecard use hampers scorecard adoption and achievement of targets.



Seven Deadly Sins
Common Pitfalls Encountered Along the Balanced Scorecard Life Cycle

[Figure: the seven sins plotted by nature of problem (strategic to tactical) against stage of the IT performance measurement process (selection, collection, reporting, usage), each with a pointer to its solution.]

1. An IT-Centric View of IT Performance: Lack of senior business executive involvement in metrics selection and refinement (solution: p. 60)
2. Measures That Don’t Matter: No explicit link between metrics and IT strategy (solution: p. 56)
3. No Common Ground: Lack of standard metrics definitions complicates aggregation (solution: p. 24)
4. Overreliance on Tools: Lack of focus on data collection process leads to inaccurate, outdated data (solution: p. 58)
5. Lack of Drill-Down Capability and Metric Context: Unavailable context for scorecard-level metrics hinders interpretation (solution: p. 44)
6. Too Many Metrics: Lack of aggregation and screening of low-level metrics, resulting in cumbersome reports (solution: p. 32)
7. No Individual Impact: Individuals lack incentive to inflect scorecard performance (solution: p. 62)

Source: Working Council research.



An Exhaustive Compendium
of IT Balanced Scorecard Metrics
Financial Performance Metrics

Project Performance Metrics

Operational Performance Metrics

Talent Management Metrics

User Satisfaction Metrics

Information Security Metrics



Balanced Scorecard Metrics

Financial Performance

Most Common Metrics
Total IT expenditures
Percentage of IT expenditures delivering new functionality
Percentage of “lights on” operating costs (including break/fix, depreciation) versus total IT spend

Project and Investment Cost Performance
Percentage of R&D investment resulting in operational applications
Total value creation from IT-enabled projects
Percentage differential between business case estimates and actual benefits of projects
Percentage of key strategic projects initiated with cost/benefit analysis

Systems and Services Cost
Dollar value of technology assets still in use beyond depreciation schedule
Share of discretionary spending shared by IT
Percentage reduction in maintenance cost of all systems
Average network circuit cost reduction per quarter
PC/laptop software maintenance cost per month per user
Workstation software maintenance cost per month per workstation
E-mail service: cost per month per user
Infrastructure spending
Total maintenance cost
Individual systems cost
Dollars saved through vendor reviews and negotiations
Percentage of year-over-year cost reduction per service
Total cost of ownership of IT services versus external benchmarks
Service unit cost

IT Departmental Cost
IT cost per employee
Total IT spending by geography
Total IT spending by business unit
Expenses compared to revenue per quarter
Year-to-date net book expense
Spend per portfolio category
Performance against IT spending plan
Central IT spend as percentage of total IT spend
Net present value delivered during payback period
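Several of the metrics above are simple ratios over the same spending data. A short sketch of two of them; all figures are entirely hypothetical and only illustrate the arithmetic.

```python
def pct(part: float, whole: float) -> float:
    """Express part as a percentage of whole."""
    return 100.0 * part / whole

# Hypothetical annual figures (millions of dollars).
total_it_spend = 120.0
lights_on_cost = 78.0          # break/fix, depreciation, keep-the-lights-on ops
new_functionality_spend = 30.0
employees = 24_000

# "Lights on" operating costs versus total IT spend.
lights_on_pct = pct(lights_on_cost, total_it_spend)
# Percentage of IT expenditures delivering new functionality.
new_func_pct = pct(new_functionality_spend, total_it_spend)
# IT cost per employee (convert millions back to dollars).
it_cost_per_employee = total_it_spend * 1e6 / employees

print(f"{lights_on_pct:.1f}%")          # 65.0%
print(f"{new_func_pct:.1f}%")           # 25.0%
print(f"${it_cost_per_employee:,.0f}")  # $5,000
```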



Project Performance

Most Common Metrics
Percentage of projects on time, on budget
Percentage of projects compliant with architectural standards

Project Spending and Costs
Total spent on non-compliant projects as a percentage of total project spending
Actual versus planned ROI for implementation of key initiatives
Percentage of projects with completed business case
Percentage of budget allocated to unplanned projects
Share of technology planning projects evolving into full projects
Percentage of projects managed with detailed budget data

Project Timeliness and Delivery
Average project duration
Percentage of projects with detailed project plan
Dollars saved through productivity improvement and reusable code
Percentage of projects started on time
Percentage of project milestones delivered

Project Alignment with IT Strategy
Percentage of project requirements fulfilled via reuse
Percentage of applications deployed on a global basis
Percentage of infrastructure standardization projects of total project pool
Percentage of projects using common project methodology
Percentage of application failures within first 90 days of deployment
Percentage of “at-risk” projects that adopt quality, security, and compliance standards
Increase in project management maturity
Project quality index
Percentage of projects with completed requirements and architecture documents
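Metrics such as “percentage of projects on time, on budget” and “percentage of project milestones delivered” reduce to counting over a project list. A sketch with made-up project records; the record layout and figures are assumptions for illustration only.

```python
# Each record: (name, on_time, on_budget, milestones_delivered, milestones_planned)
projects = [
    ("ERP upgrade",     True,  False, 7, 10),
    ("CRM rollout",     True,  True,  9, 9),
    ("Network refresh", False, True,  4, 8),
]

# "Percentage of projects on time, on budget": both flags must hold.
on_time_on_budget = sum(1 for _, on_time, on_budget, _, _ in projects
                        if on_time and on_budget)
pct_on_time_on_budget = 100.0 * on_time_on_budget / len(projects)

# "Percentage of project milestones delivered": pooled across projects.
delivered = sum(p[3] for p in projects)
planned = sum(p[4] for p in projects)
pct_milestones_delivered = 100.0 * delivered / planned

print(f"{pct_on_time_on_budget:.0f}% of projects on time, on budget")  # 33%
print(f"{pct_milestones_delivered:.0f}% of milestones delivered")      # 74%
```

Note the design choice in the second metric: pooling milestones weights large projects more heavily than averaging each project's own delivery rate would.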



Operational Performance

Most Common Metrics
Key applications and systems availability
Help-desk first-call resolution rate

User-Centric Operational Performance
Global desktop availability (aggregate e-mail/servers/LAN/WAN)
Average number of incidents per user per month (average number of times end user experiences global desktop availability outages per month)
Consistently available and reliable IT services to users
Rate of failure incidents impacting business

Network and Systems Performance
Print server availability
All critical systems and infrastructure have viable business continuity plans by end 2002
System/application database maintained with more than 95 percent accuracy
Activity increase on the hub
E-mail transmit less than 20 seconds (all regions)
Monthly average of network availability consistently more than 99.5 percent
Monthly average of critical systems availability consistently above 99.5 percent
Mean time to repair for all client outages less than two hours
Network uptime
CRM availability
PC/laptop hardware fix or replacement within 48 hours
Total cost of ownership of identified products and services compared to industry standards
End-to-end availability for customer service
Percentage of consumer orders processed online

IT Inventory and Contracts
Complete review of global licenses
Quarterly partner/supplier ratings and in-person review sessions established with all major suppliers and partners
Assets of terminated users placed in global IT asset pool—asset reuse measured
Percentage of applications purchased versus built
Inventory accuracy

Help-Desk Performance
Mean time to repair for all network outages less than 100 minutes
Mean time to repair for all application systems outages less than four hours
Percentage of infrastructure service requests closed within service level agreements

Operational Strategy Adoption
Completion of service transformation with minimum business disruption
All announced changes completed within advertised downtime window
All IT provisioning for terminated users decommissioned in less than 72 hours after notification by HR
Network access terminated for terminated users in less than 72 hours after notification by HR
Percentage of IT architectural plans approved, reviewed, and accepted by business
Number of applications used by more than one line of business
Percentage of desktop PCs standardized
IT effectiveness in resource allocation supporting business objectives
IT solutions meet business needs
Identify and manage strategic alliances with IT partners
Performance of shared services
Decrease average development cost by 10 percent
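Availability and mean-time-to-repair metrics like those above derive from outage logs. A minimal sketch of the arithmetic; the outage durations and the 99.5 percent target (taken from the list above) are illustrative.

```python
# Hypothetical outage log for one 30-day month: durations in minutes.
outages = [42, 15, 88]
minutes_in_month = 30 * 24 * 60  # 43,200

downtime = sum(outages)
# Monthly availability as a percentage of scheduled minutes.
availability = 100.0 * (minutes_in_month - downtime) / minutes_in_month
# Mean time to repair: average minutes of downtime per incident.
mttr = downtime / len(outages)

# Target from the compendium: monthly network availability above 99.5 percent.
meets_target = availability > 99.5

print(f"{availability:.2f}% available")  # 99.66% available
print(f"MTTR {mttr:.1f} min")            # MTTR 48.3 min
print(meets_target)                      # True
```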



Talent Management

Most Common Metrics
Employee morale/satisfaction (multiple-point scale, low to high)
Percentage of individual annual career plan goals met
Overall IT staff retention and attrition rate

Staffing
Percentage of non-entry-level positions filled internally
Average tenure of solid performers (in years)
Percentage of projects led by non-project managers
Percentage of project assignments that are cross-functional
Ratio of skill sets needed to skill sets represented
Performance against staff diversity goals
Number of candidates interviewed per open position
Job changes (counting moves transferring out), national versus international
IT headcount (number of full-time IT staff)
Contractor headcount
Percentage of planned staffing levels
Average years of IT experience

Training and Personal Development
Percentage of employees who have met with direct manager at least once within the past month
Percentage of performance assessment and development plans delivered to employees
Percentage of employees with mentors
Percentage of employees with individual development plans
Percentage of individual training objectives met
Employee “business knowledge” survey performance
Percentage of managers trained in employee motivation
Percentage of staff with appropriate measures for their personal goals
Percentage of IT employees who attended one location- and/or function-specific town meeting per quarter
Share of IT training spent in business units
Number of IT person-hours spent at industry events
Number of training hours per employee per quarter

Corporate Strategy–Related Metrics
Number of awards won by company for use of IT
Competitiveness of current employment offer versus industry
Citation of IT organization in press



User Satisfaction

Most Common Metrics
Overall end-user satisfaction survey
Help-desk first-call resolution rate

Surveys
Overall business executive satisfaction rating

Survey Questions
Perceived versus actual price competitiveness of IT services (1 to 10 scale, from user survey)
Perceived ability to deliver technical/business solutions and services (multiple-point scale)
Quality of communication about available services and new technologies (multiple-point scale)
Help-desk client satisfaction—percentage dissatisfied
Satisfaction with individual operational services (e.g., voice services, network infrastructure)
Contribution to business process improvement
Contribution to business value creation
Contribution to business competitive advantage
Contribution to corporate business strategy

Non-Survey Metrics
Percentage of hardware service requests closed within 48 hours
Percentage of software service requests closed within 24 hours
Degree of alignment of IT services with articulated business priorities
Help-desk tickets per user per month
Percentage of service level agreements not met



Information Security

Most Common Metrics
Percentage of systems compliant with IT security standards
Number of security incidents in operational systems or infrastructure that lead to material loss
Time to respond to incidents
Percentage of network access points covered by intrusion detection systems
Percentage of external partners in compliance with security standards
Percentage of security patches for client and server applications applied within deadline
Percentage of high-level security breaches dealt with in set time
Percentage of new initiatives that receive security and compliance sign-off
Number of security manager training programs
Virus containment 100 percent (zero intrusions)
Percentage of hardware updated with latest virus patch in less than 24 hours after update release
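Patch-compliance metrics such as “percentage of security patches applied within deadline” compare release and application timestamps against a window. A sketch with invented records; the 24-hour window echoes the virus-patch metric above, and all hostnames and timestamps are hypothetical.

```python
from datetime import datetime, timedelta

deadline = timedelta(hours=24)  # patch window, per the metric definition

# Hypothetical log: (host, patch_released, patch_applied or None if pending).
patches = [
    ("web-01", datetime(2003, 5, 1, 9, 0), datetime(2003, 5, 1, 18, 0)),
    ("db-01",  datetime(2003, 5, 1, 9, 0), datetime(2003, 5, 3, 8, 0)),
    ("app-01", datetime(2003, 5, 1, 9, 0), None),
]

# Pending patches count against the metric, not for it.
within = sum(1 for _, released, applied in patches
             if applied is not None and applied - released <= deadline)
pct_within_deadline = 100.0 * within / len(patches)

print(f"{pct_within_deadline:.0f}% patched within deadline")  # 33% patched within deadline
```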



Introduction
Principles of Balanced Scorecard Design and Metrics Selection

A Balanced Model for Corporate Performance Measurement

The Balanced Scorecard Fills a Need for More Holistic Corporate Performance Management

In their January 1992 Harvard Business Review article, “The Balanced Scorecard: Measures That Drive Performance,” Robert S. Kaplan and David P. Norton introduced the concept of the balanced scorecard to businesses as an alternative to the conventional financial-only view of corporate performance. Organizations have embraced the balanced scorecard concept, which supplements financial performance with metrics tracking customer satisfaction, process performance, and learning and growth to create a more holistic approach to corporate performance management. According to Bain & Company, 50 percent of Fortune 1,000 companies currently use balanced scorecards to manage organizational performance.

IT-Specific Balanced Scorecards Becoming More Prevalent

With the current economic downturn, CIOs report more pressure to cut costs and demonstrate how the existing IT budget is being spent. These twin drivers have led CIOs to reexamine ways to more effectively measure the cost and service performance of the IT function. In response, many are deploying balanced scorecards to capture a more business-focused view of performance. A 2002 survey by CIO Insight magazine of 357 senior IT executives found that 24 percent were using an IT balanced scorecard and that another 30 percent were considering deploying one in the future. The 2003 CIO Insight survey of 345 senior IT executives found that one year later this prediction had come to pass, with 30 percent of IT departments using an IT balanced scorecard. This growing adoption of the balanced scorecard as an IT performance management tool is underscored by the Working Council’s 2003 Research Agenda Poll, which reveals that 39 percent of member IT organizations have deployed an IT balanced scorecard.

Growing Scorecard Adoption in IT

Since its creation in 1992 as a more holistic corporate performance management tool, large companies have adopted the balanced scorecard at the corporate level, and are now increasingly adopting scorecards in IT in response to performance pressure.

[Timeline: Kaplan and Norton introduce the balanced scorecard concept in a Harvard Business Review article, 1992.]

[Pie chart: Percentage of Fortune 1,000 companies using balanced scorecards (1999 Bain & Company CEO Management Tools and Techniques Survey): 50 percent have adopted a corporate balanced scorecard. Source: Bain & Company.]

[Bar chart: Balanced scorecard use in IT departments (survey of more than 345 senior IT executives): 24 percent in 2002, 30 percent in 2003. Source: CIO Insight.]

[Pie chart: IT balanced scorecard adoption among Working Council members (survey of 124 CIOs): 39 percent of members using an IT balanced scorecard, 61 percent not. Source: Working Council 2003 Agenda Poll.]


Many Roads Lead to Balanced Scorecards

Balanced Scorecards as a Multifaceted IT Management Tool
During the course of its research, the Working Council has observed seven key drivers of IT balanced scorecard adoption, including both pressures from IT's business "clients" and challenges endemic to IT:

Business-Related Drivers
1. Linking IT Strategy with Business Strategy—IT departments frequently lack an understanding of how their efforts enable corporate strategy. The IT balanced scorecard can help CIOs be more responsive to corporate strategy on an ongoing basis by highlighting IT metrics that are directly linked to corporate strategic goals.

2. Monitoring Service Levels While Cutting Expenses—Over the past several years, most CIOs have seen slowing growth or outright reductions in IT budgets, while meeting service levels becomes more critical as more business processes are enabled by IT. IT balanced scorecards provide a vehicle to help CIOs balance cost cutting and service delivery needs.

3. Demonstrating IT Value to the Business—Reviews of IT operational measures are not always effective in communicating IT's impact on the business to project sponsors or business executives. As a result, articulating why a significant portion of the company's spending is consumed by IT, or why IT might need to reduce its budget at a slower rate than other functional areas or business units, is a challenge for most CIOs. By tracking granular, business-focused metrics, the IT balanced scorecard can help CIOs communicate IT's performance to the rest of the organization.

IT-Specific Drivers
4. Providing a Holistic View of IT Operations—CIOs need to ensure they run the IT function with the same business discipline that characterizes other corporate functions. While operational dashboards provide a snapshot of the performance of specific systems, IT balanced scorecards give CIOs a more holistic presentation of the performance of the IT organization's resources and assets, including staff and end-user satisfaction.

5. Baselining IT's Performance with Respect to External Providers—As companies increasingly bid work competitively with outsourcers, internal IT organizations are being asked to improve their operational and project performance to compete with third parties for available work. Balanced scorecards provide IT with a tool for assessing baseline performance against vendor cost and quality benchmarks, and for tracking corporate IT's progress against gap-closing goals.

6. Depoliticizing Resource Allocation Decisions—At the majority of organizations, IT resource allocation decisions are made without complete, accurate information about project costs, benefits, and risks. The IT balanced scorecard provides a set of agreed-upon, up-to-date, standard metrics for measuring IT service quality and performance, facilitating less politicized funding trade-offs.

7. Correcting Chronic Project Underperformance—Most IT organizations are plagued by projects that fail to deliver—the Standish Group's 2002 CHAOS survey reveals that 90 percent of projects larger than $3 million fail. The IT balanced scorecard provides a vehicle for tracking project progress and status, as well as progress on IT initiatives such as systems retirement, allowing CIOs to make mid-course corrective decisions or cancel projects when necessary.
© 2003 Corporate Executive Board
Drivers of IT Balanced Scorecard Adoption
Scorecard adoption in IT is driven by both business pressures and IT's own internal needs.

1. Linking IT Strategy with Business Strategy
• Problem: Unclear relationship between IT efforts and corporate strategy
• Balanced scorecard tracks metrics that are linked to corporate strategy
[Exhibit: "How Is Your Job Performance Primarily Evaluated?"—survey of 388 CIOs, CTOs, and VPs of IT, November 2002. Contribution to achieving business strategy 42.3%; operational performance 23.4%; financial performance 11.9%; interactions with peers, superiors, and subordinates 11.4%; project completion 5.7%. Source: CIO Insight.]

2. Monitoring Service Levels While Cutting Expenses
• Problem: IT budgets cut while a growing share of business processes is enabled by IT
• Balanced scorecard provides the ability to balance cost and service quality in an informed manner
[Exhibit: Decline in IT Capital Expenditures—reported capital expenditures, 752 corporations. 2002: (10%); 2003 (E): (15%). Source: Goldman Sachs Global Equity Research.]

3. Demonstrating IT Value to the Business
• Problem: Difficulty of tracing IT's impact on business performance
• Balanced scorecard measures a broad range of granular, business-focused metrics
[Exhibit: "Has Pressure to Demonstrate ROI Increased or Decreased in the Past 12 Months?"—survey of 365 CIOs, CTOs, and VPs of IT, December 2002. Increased 60.0%; stayed the same 36.7%; decreased 3.3%. Source: CIO Insight.]

4. Providing a Holistic View of IT Operations
• Problem: IT function traditionally neglects staff management and satisfaction performance
• Balanced scorecard tracks all key organizational resources and assets
[Exhibit: "Does Your IT Department Regularly Measure Customer Satisfaction with IT Services?"—survey of 539 IT executives. Yes, internal employees 42%; yes, external business partners/customers 4%; yes, both internally and externally 17%; no 36%. Source: 2003 State of the CIO Survey, CIO Magazine.]

5. Baselining IT's Performance with Respect to External Providers
• Problem: IT organizations lack data to compare their performance against that of outside service providers
• Balanced scorecard provides baseline performance, which can be used to assess feasibility of outsourcing
[Exhibit: Share of IT Budget Allocated to External Service Providers. 1994: 10%; 2003 (E): 19%. Source: Gartner.]

6. Depoliticizing Resource Allocation Decisions
• Problem: Resource allocation decisions made without project cost, benefit, and risk information
• Balanced scorecard provides standard metrics to inform resource/quality trade-off discussions
[Exhibit: Politicization of Prioritization Process—survey of 1,077 CIOs, CTOs, and VPs of IT, July 2001. Believe prioritization process is politicized 62%; believe prioritization process is depoliticized 38%. Source: CIO Insight.]

7. Correcting Chronic Project Underperformance
• Problem: Little traction of IT projects
• Balanced scorecard provides a single view of overall project status and enables proactive measures to keep projects on track
[Exhibit: IT Project Failure Rates—survey of 35,000 IT projects from 1994 to 2002. Projects larger than $3M: 90% fail; all projects: 23% fail. Source: 2002 CHAOS Survey, The Standish Group.]


Clearing Up the Confusion

Moving Beyond the Aggregation of Operational Data
While the balanced scorecard is a relatively new performance measurement tool for IT groups, most IT organizations have already created some form of dashboard to track operational performance. While both tools are used to measure the performance of the IT function, dashboards and scorecards differ fundamentally across the following five dimensions:

1. Purpose—The IT dashboard is a tool that provides a snapshot of current and past IT performance, and is aimed at facilitating in-the-moment corrections of emerging service problems. Consequently, dashboards frequently include automated alerting functionality. The IT balanced scorecard, by contrast, is optimized to track the implementation of a given set of strategies over time.

2. Audience—IT dashboards focus almost exclusively on tracking operational IT performance, catering primarily to an audience of IT operational managers. The IT balanced scorecard focuses on providing business decision makers, both within and outside of IT, with the information required to make key funding and staffing decisions.

3. Metrics Tracked—While IT dashboards focus on tracking the performance and availability of specific applications or infrastructure (e.g., Web site availability or server uptime), IT balanced scorecards supplement these operational measures with aggregate metrics for tracking IT's performance against organizational goals (for example, in the areas of talent management and customer satisfaction).

4. Data Collection Process—Dashboards are frequently bundled with the systems and services they are designed to track, with hardware vendors like Hewlett-Packard offering built-in server dashboards and a range of third-party vendors providing similar offerings. Despite the availability of automated balanced scorecard solutions from vendors such as SAP and PeopleSoft, which extract and display information from ERP and CRM systems, most companies use spreadsheets or homegrown reporting tools to aggregate IT balanced scorecard data. In many cases, companies also manually collect data for categories such as user satisfaction.

5. Frequency of Data Update—Dashboards track data continuously in order to create near-real-time performance visibility. In contrast, IT balanced scorecards are typically reviewed on a calendar-based cycle, and individual metrics are updated at different collection frequencies, such as quarterly or annually.



Differentiating Dashboards and Scorecards
Migrating from "in-the-moment" operational oversight to influencing strategic management decisions

1. Purpose
• IT operational dashboards: operational—real-time performance tracking and alerting
• IT balanced scorecards: analytical—trend analysis and tracking of strategy execution

2. Audience
• IT operational dashboards: access generally limited to IT management
• IT balanced scorecards: executive-level audience, both within IT and the business

3. Metrics Tracked
• IT operational dashboards: operational performance data about a particular system or process
• IT balanced scorecards: performance against organizational goals

4. Data Collection Process
• IT operational dashboards: automated data collection, in many cases integrated into monitored systems
• IT balanced scorecards: portions of required data collected and aggregated manually

5. Frequency of Data Update
• IT operational dashboards: continuous
• IT balanced scorecards: at set intervals (quarterly, annually, etc.), with individual metrics updated at differing frequencies

Sample dashboard metrics: Web server uptime; SAP availability; help-desk first-call resolution rate.
Sample scorecard metrics: percentage of projects delivering new business functionality; global desktop availability; percentage of applications meeting security standards.

Source: Working Council research.


Building a Better Scorecard

The Structure of the Ideal IT Balanced Scorecard
Based on a review of IT balanced scorecards collected from corporate exemplars, the Working Council finds that the most advanced IT balanced scorecards share six key attributes:

1. Simplicity of Presentation—The best enterprise-wide IT balanced scorecards are limited to a single page of metrics that communicate top-line performance in concise, non-technical language for consumption beyond the IT function. These scorecards typically showcase between 10 and 20 key business-impacting metrics to convey a high-level perspective of corporate IT's performance.

2. Informed by the Goals of the Annual IT Strategic Plan—The most successful IT balanced scorecards are the end result of the annual IT strategic planning process, and track progress against the key goals and objectives articulated in a written IT strategic plan. To help track progress against the execution of these strategies, each scorecard metric is reported as "actual" and "target" for the current period, and is often contrasted with historical performance for the previous period.

3. Broad Senior-Level Ownership—Exemplar IT balanced scorecards are the product of cross-functional collaboration, with a mix of senior IT and business leadership involved in designing the scorecard and selecting and reviewing metrics.

4. Clearly Defined Metrics—Since the scorecard is used by both IT and business decision makers, establishing a common, well-documented set of metric definitions is essential. By creating a shared understanding of scorecard metrics, companies can better focus review meetings on actual decisions rather than on debate over the composition or relevance of individual measures.

5. Drill-Down Capability and Metric Context—While a high-level view of IT performance is the most relevant to senior decision makers, understanding and developing solutions to any problems highlighted by the IT balanced scorecard requires a more granular view of information. To allow for this, the IT balanced scorecard must provide the reader with visibility into the component elements and sources of scorecard metrics, as well as context around variance or trends in scorecard metrics.

6. Links to Individual Compensation—To facilitate the adoption of the balanced scorecard within the IT organization and provide an incentive for staff to achieve scorecard targets, leading companies link scorecard performance to individual compensation.
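The "actual" and "target" reporting described in attribute 2 can be made concrete with a short sketch. This is a hypothetical illustration, not a Working Council artifact: the `ScorecardMetric` class, the 5 percent amber tolerance band, and the sample figures are all assumptions.

```python
# Hypothetical sketch of a scorecard line item with "actual" and "target"
# values and a simple red/amber/green status rule. The class name, the 5%
# amber tolerance band, and the sample figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ScorecardMetric:
    name: str
    actual: float
    target: float
    higher_is_better: bool = True  # True for uptime, False for cost per seat

    def status(self, tolerance: float = 0.05) -> str:
        """Green if the target is met, amber if within tolerance, else red."""
        if self.higher_is_better:
            gap = (self.target - self.actual) / self.target
        else:
            gap = (self.actual - self.target) / self.target
        if gap <= 0:
            return "green"
        return "amber" if gap <= tolerance else "red"

uptime = ScorecardMetric("Critical process uptime", actual=0.90, target=0.98)
cost = ScorecardMetric("Cost of data communications per seat",
                       actual=497, target=450, higher_is_better=False)
print(uptime.status())  # red: 0.90 misses the 0.98 target by more than 5%
print(cost.status())    # red: $497 exceeds the $450 target by roughly 10%
```

A one-page scorecard is then simply a list of such records rendered with their status flags; the drill-down and variance context described in attribute 5 would hang off each record.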



Structural Attributes of the IT Balanced Scorecard Ideal
Six design principles of world-class IT balanced scorecards

1. Simplicity of Presentation
• Single page of key performance categories and metrics
• Non-technical language for easy consumption by business sponsors
• Limited number of metrics (10 to 20)

2. Informed by Goals of Annual Plan
• Categories and metrics directly linked to strategies articulated in annual IT strategic plan
• Provides insight into ongoing progress of strategy execution by tracking performance against goals

3. Broad Senior-Level Ownership
• Representative cross-section of senior IT and business leaders involved in scorecard creation and metrics selection
• Scorecard results are regularly reviewed by CIO and IT and business management

4. Clearly Defined Metrics
• Each metric has a clear definition, agreed on by IT and the business
• Companion scorecard document outlines metric definitions, assumptions, and collection methods

5. Drill-Down Capability and Metric Context
• Scorecard allows for drill down into more granular data underlying metrics
• Metrics annotated with source information and contextual explanation of variance or trends

6. Links to Individual Compensation
• Achievement of balanced scorecard targets linked to individual compensation of IT leadership team

Source: Working Council research.


A Progression in IT Balanced Scorecard Metrics Sophistication

Cardinal Balanced Scorecard Categories
Like corporate balanced scorecards, IT balanced scorecards will differ from company to company based on corporate strategic direction. However, despite the fact that no single scorecard template can provide the optimal set of performance metrics for every IT organization, there is a set of key categories IT balanced scorecards should incorporate to create a truly balanced view of a company's IT performance. The five most common IT balanced scorecard categories include:

1. Financial Performance—Particularly in times of cost cutting, finance is the foundational scorecard category, allowing CIOs to understand IT costs and spending in the context of service levels, strategy implementation, and project progress.

2. Project Performance—New application development projects, infrastructure upgrades, and systems consolidation efforts are often at risk for budget and schedule overruns. To avoid this, exemplar companies use the balanced scorecard as a mechanism for periodic tracking of the progress and business impact of their largest or most important projects.

3. Operational Performance—In many cases, companies already track day-to-day IT operational metrics using operational dashboards; exemplars use IT balanced scorecards to provide a more holistic, customer-focused view of IT's operations (for example, system availability to end users during peak business hours).

4. Talent Management—Exemplar companies use this category to track job satisfaction and retention of key IT staff, as well as the attractiveness of the IT department on the external IT skills market.

5. User Satisfaction—By regularly assessing end-user and executive sponsor satisfaction with IT services, CIOs can more easily identify service delivery problems and more accurately articulate cost and service quality trade-offs for consideration by business decision makers.

Categories in the Ascent
In addition to the previous five metrics categories, Working Council research reveals that IT balanced scorecard exemplars frequently add two categories to their scorecards:

6. Information Security—Exemplar scorecard practitioners use this category of metrics to track security breaches and calibrate the organization's response in terms of spending on preventative measures such as security architecture and training. This is especially important given that the number of reported corporate information security breaches is steadily increasing—U.K.-based MessageLabs reports that e-mail viruses, which represent only a small fraction of all security incidents, doubled in 2002 versus 2001.

7. Enterprise Initiatives—Although the project performance category tracks large IT projects, progressive IT departments are also using their balanced scorecards to track IT's specific contributions to initiatives of corporate strategic importance (for example, IT integration work in service of post-merger synergy targets). This category is often a temporary one, added to and removed from the scorecard with the ebb and flow of enterprise-critical initiatives.
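As a sketch of the customer-focused operational view described above, peak-business-hours availability can be computed from the same raw probe data a dashboard would collect. The hourly sample data and the 08:00 to 18:00 business window below are assumptions for illustration, not a prescribed measurement method.

```python
# Illustrative sketch: availability during peak business hours versus raw
# 24x7 availability. The hourly probe data and the 08:00-18:00 business
# window are assumptions, not a prescribed measurement method.
from datetime import datetime

# (timestamp, system_was_up) samples -- one probe per hour of the day,
# with simulated outages at 09:00 and 14:00
samples = [(datetime(2003, 6, 2, h), h not in (9, 14)) for h in range(24)]

PEAK_HOURS = range(8, 18)  # 08:00 through 17:59

peak = [up for ts, up in samples if ts.hour in PEAK_HOURS]
peak_availability = 100 * sum(peak) / len(peak)
raw_availability = 100 * sum(up for _, up in samples) / len(samples)

print(f"Raw 24x7 availability:  {raw_availability:.1f}%")   # 91.7%
print(f"Peak-hour availability: {peak_availability:.1f}%")  # 80.0%
```

Because both simulated outages fall inside the business window, the scorecard view (80 percent) tells a harsher, more customer-relevant story than the raw uptime figure.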



Covering All the Bases
The balanced scorecard provides a holistic view of IT performance

Categories of IT Balanced Scorecard Metrics (illustrative; each metric reported as Actual / Target)

1. Financial Performance—Connecting service cost with strategy implementation and project progress to facilitate principled trade-offs
• Cost of data communications per seat: $497 / $450
• Relative spending per portfolio category: N.A. / N.A.

2. Project Performance—Concentrating senior executive attention on the largest and most important projects
• Percentage of new development investment resulting in new revenue streams: 65% / 70%
• Percentage of IT R&D investment leading to IT service improvements: 80% / 90%

3. Operational Performance—Providing a customer-focused view of IT operations
• Peak time availability
• Critical process uptime: 90% / 98%

4. Talent Management—Assessing staff job satisfaction, retention, and the attractiveness of the IT workplace
• Retention of high-potential staff: 2% / 4%
• External citations of IT achievement: 3 / 5

5. User Satisfaction—Identifying service delivery problems by assessing end-user satisfaction
• Entire user population: 3.7 / 3.8 (on a 5-point scale)
• Focused executive feedback: 96% / 98%
• Comprehensive perspective: 3.3 / 3.5 (on a 5-point scale)

6. Information Security—Focusing IT's efforts on security spending and training
• Percentage of staff receiving security training: 40% / 70%
• Percentage of external partners in compliance with security standards: 25% / 50%

7. Enterprise Initiatives—Monitoring IT's contribution to initiatives of corporate strategic importance
• Percentage of acquired company systems integrated (M&A category): 55% / 40%
• Number of business process steps enabled by technology (Process Reengineering category): 2 / 3

Source: Working Council research.




Best-in-Class IT Balanced Scorecard Metrics
The following case profiles present more detailed examinations of metrics sophistication in each of the seven IT balanced scorecard
categories. A comprehensive list of scorecard metrics, organized by performance category, is included in the introduction to this brief.

Financial Performance p. 24

Project Performance p. 28

Operational Performance p. 32

Talent Management p. 36, 38

User Satisfaction p. 42, 44

Information Security p. 48

Enterprise Initiatives p. 52


Financial Performance
Institutionalizing Financial Rigor in IT

Financial Discipline at the Heart of the IT Balanced Scorecard
Although the concept of the balanced scorecard was developed in order to overcome the traditional reliance on financial-only metrics for performance measurement, a core set of financial metrics is a key component of exemplar IT balanced scorecards. These metrics are especially critical in the current climate of stagnating or shrinking IT budgets, as accurately tracking and reporting cost and budget performance is a prerequisite for corporate IT organizations. The financial IT balanced scorecard category typically exhibits three progressive levels of metrics sophistication:

An Aggregate Spending Overview—The baseline financial metric for most IT balanced scorecards is some absolute measure of the company's total IT expenditure (for example, total IT spending or IT spending as a percentage of revenue). While important to track on an ongoing basis and an essential measure of IT's financial health, this type of metric fails to provide decision makers with actionable information and context about IT spending's link to the business's strategic imperatives.

Directional Granularity—Advanced IT balanced scorecard practitioners adopt a more nuanced view of IT's financial performance, delineating IT expenditures by geography, business unit, technology category, or technology life-cycle stage. This provides decision makers with at least the directional ability to target the most significant cost areas. Typical metrics include total infrastructure spending, operating cost, or total IT spending by business unit.

Measuring by Portfolio Mix—As a strategic performance management tool, the IT balanced scorecard needs to present the strategic impact of IT investments. Exemplar companies supplement a foundation of granular financial metrics, such as total IT cost and cost of data communication per seat, with an understanding of how these costs compare to industry peers, usually using external benchmarking data. In addition, exemplars track portfolio mix—spending per portfolio category—to anchor IT's financial performance in a broader strategic context and to help decision makers assess spending in each category relative to spending in other categories, as well as to past spending levels.
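The portfolio-mix comparison described above can be sketched in a few lines: compute each category's share of total IT spending and flag categories that deviate materially from an external benchmark. The category names, dollar figures, and five-percentage-point threshold are hypothetical, not drawn from any company profiled in this brief.

```python
# Hypothetical sketch of a portfolio-mix check: share of total IT spending
# per category, flagged where it deviates from a peer benchmark by more
# than a chosen threshold. All figures below are invented for illustration.
spend = {  # current-year spending per portfolio category, $M
    "Customer Care": 0.4,
    "Field Operations": 20.0,
    "Finance": 28.0,
    "IT Efficiency and Support": 86.0,
}
benchmark = {  # peer mix, as a percentage of total IT spending
    "Customer Care": 2.0,
    "Field Operations": 7.0,
    "Finance": 10.0,
    "IT Efficiency and Support": 40.0,
}

total = sum(spend.values())
mix = {cat: 100 * amt / total for cat, amt in spend.items()}

THRESHOLD = 5.0  # percentage points of allowed deviation from benchmark
flagged = {cat: (round(mix[cat], 1), benchmark[cat])
           for cat in mix if abs(mix[cat] - benchmark[cat]) > THRESHOLD}

for cat, (actual, bench) in flagged.items():
    print(f"{cat}: {actual}% of spending vs. {bench}% benchmark")
```

A flagged category is a trigger for investigation rather than automatic reallocation: as the Schlumberger experience recounted in this section shows, apparently low spending may simply sit outside the IT budget.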



Financial Performance
Financial Performance Metric Maturity Trajectory

Relative Maturity of Financial Performance Metrics (illustrative)
Metrics mature along two dimensions: granularity of measure and strength of the metric's link to business outcomes.

Baseline—An Aggregate Spending Overview
Sample metrics:
• Total annual IT spending
• IT spending as a percentage of company revenue

Progressive Practitioner—Directional Granularity
Sample metrics:
• Total IT spending by geography
• Total IT spending by business unit
• Total infrastructure spending

Exemplar—Measuring by Portfolio Mix
Sample metrics:
• Cost of data communications per seat
• Relative spending per portfolio category

Source: Working Council research.


Financial Performance
Establishing a Financial Frame of Reference

Company Background
Schlumberger Limited is a $13.5 billion global technology services company with 76,000 employees of more than 140 nationalities. The company has operations in 140 countries and includes three primary business segments: Schlumberger Oilfield Services, the world's largest oilfield services company and the leading supplier of technology services and solutions to the international oil and gas industry; WesternGeco, jointly owned with Baker-Hughes, the world's largest surface seismic company; and SchlumbergerSema, an IT services company providing consulting, systems integration, and managed services to the energy, public sector, telecommunications, and finance markets. Other Schlumberger businesses include Smart Cards & Terminals, the NPTest semiconductor testing solutions unit, Verification Systems, and Water Services.

Scorecard Background
Schlumberger introduced its IT balanced scorecard in 2002 as part of a larger IT initiative to better understand comparative IT spending levels and align that spending more closely with business needs.

Providing Granular Cost Transparency and Comparative Context
As a foundation for the financial performance category of its IT balanced scorecard, Schlumberger tracks granular financial metrics such as the cost of data communication per seat. In addition, the scorecard provides readers with an understanding of how these costs compare to those of industry peers, using benchmarking data from various sources, including third-party vendors and its own IT services arm. To supplement these granular financial measures, Schlumberger also tracks the percentage of IT spending dedicated to a set of eight portfolio categories, which include business productivity–oriented categories such as finance and human resources, as well as enterprise-centric categories such as competitive positioning, employee efficiency, and IT efficiency and support. This metric of current portfolio "mix," along with historical mix data, allows decision makers at Schlumberger to ensure that each of the portfolio categories receives sufficient funding and to reallocate funding from one category to another to better align with shifting business strategy.

More Effective Identification of Standardization and Portfolio Optimization Opportunities
One benefit Schlumberger reports from its use of the IT balanced scorecard is that it has been able to identify missed standardization opportunities more programmatically. In one instance, a business unit manager looking at the detailed financial metrics on the scorecard realized that the unit's IT cost per seat was higher than the company average. An in-depth review of the unit's IT cost structure revealed that it had not kept pace with the corporate infrastructure standardization initiative, resulting in higher operational costs. A second benefit is that Schlumberger is able to make more informed portfolio decisions. For example, by reviewing the scorecard, the IT department observed that the amount of IT investment directed to customer care applications was unusually low compared to spending in other portfolio categories. This triggered a closer look at customer care spending, which revealed that the spending level appeared low for two reasons: first, the company had just completed the rollout of a major customer care application; and second, some customer care spending was being handled by the company's business units outside of the IT budget. As a result, Schlumberger decided not to adjust spending on customer care upward, despite its initial inclination to do so. Following this early success, Schlumberger's next step in its IT balanced scorecard development is to include business metrics taken from the corporate scorecard.



Financial Performance
Aligning IT Spending with Business Need
Schlumberger supplements a foundation of unit cost measures with measures of IT portfolio mix.

Schlumberger's IT Balanced Scorecard—Sample Metrics

Financial Performance (baseline of granular unit cost metrics)
• Projected cost for IT services
• Year-to-date cost of IT services
• IT cost per seat
• Cost of data communication per seat
• IT spending by portfolio category

Project Performance
• Average application project delay

Operational Performance
• Quarterly activity increase on the corporate knowledge portal
• Percentage of standardized PCs

Resource Management
• IT headcount (excluding contractors)

User Satisfaction
• Tickets per registered user per month
• Help-desk calls
• Help-desk first-call resolution rate

Information Security
• Number of security breaches
• Incident rate
• Percentage of infrastructure protected
• Percentage of sites with valid information security audit

IT Spending by Portfolio Category (illustrative)
Objective: track spending by portfolio category to inform resource allocation decisions. Performance against industry benchmarks puts portfolio mix into comparative perspective, and historical data allow decision makers to track improvements or spotlight areas of potential underspending.

Portfolio Category | Current Percentage of IT Spending | Previous Year | Benchmark
Competitive Positioning | 1% | 0.5% | 2%
Customer Care and Services | 0.2% | 1% | 2%
Field Operations | 10% | 11% | 7%
Finance | 14% | 13.5% | 10%
HR | 2% | 4% | 4%
Employee Efficiency | 1% | 1% | 5%
IT Efficiency and Support | 43% | 40% | 40%
Management, IT Operations, and Other | 28.8% | 29% | 30%

Source: Schlumberger; Working Council research.


Project Performance
Avoiding Project Disaster
Monitoring Ongoing Project Performance to Avoid Surprises
A 2002 study by the Standish Group showed that out of 35,000 IT projects conducted between 1994 and 2002, only 28 percent were successfully completed, 23 percent failed, and the majority—49 percent—either ran over budget or past schedule or ended up with fewer features and functions than planned. In addition, larger projects exhibited a higher rate of failure—68 percent of projects smaller than $500,000 were finished successfully, while only 10 percent of projects larger than $3 million were deemed a success at completion. In addition to consuming IT resources that could be used more effectively elsewhere, these project failures—especially those of larger projects—can have serious repercussions outside of the IT organization. One example of the potential corporate impact of large IT projects that fail to deliver is NIKE’s deployment of i2’s supply chain management software. The well-publicized failure led to a one-quarter sales decrease of $100 million as a result of problems with inventory levels. Another is Hershey, whose $112 million SAP R3 implementation resulted in shipment disruptions just ahead of the 1999 holiday season and was blamed for a 19 percent reduction in third-quarter profits. One of the key failure points cited was a lack of disciplined project planning and management. The examples of NIKE and Hershey are extreme, but by using the IT balanced scorecard, CIOs can potentially prevent project disasters by tracking metrics at three levels:

Project Progress and Status—At a minimum, companies measuring project performance on their IT balanced scorecards include three basic metrics: the percentage of projects delivered on time, the percentage of projects delivered on budget, and the percentage of projects delivered within scope.

Post-Project Review—In addition to tracking project budget, schedule, and scope adherence, more progressive IT balanced scorecard practitioners also add metrics that highlight the performance of completed projects. These companies gather feedback from project staff on the project execution process and from the business sponsor or unit using the final product on the solution’s usefulness and the service received from the project team. In addition, some IT organizations also track the failure rate of deployed technologies as a proxy for the quality of project work.

Project Contribution to Business Goals—Exemplar companies use the IT balanced scorecard to track project alignment with key business goals, supplementing in-progress and postmortem project execution information with an articulation of how specific IT projects enable one or more IT corporate strategic imperatives (for example, systems simplification efforts or business process standardization).
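The three baseline ratios are simple share-of-portfolio calculations. A minimal sketch in Python, assuming each project record carries on-time, on-budget, and within-scope flags (the `Project` record is illustrative, not a schema from the source):

```python
from dataclasses import dataclass

@dataclass
class Project:
    """Illustrative project record; fields are assumptions, not from the source."""
    on_time: bool
    on_budget: bool
    within_scope: bool

def project_status_metrics(projects: list[Project]) -> dict[str, float]:
    """Percentage of projects delivered on time, on budget, and within scope."""
    n = len(projects)
    return {
        "pct_on_time": 100 * sum(p.on_time for p in projects) / n,
        "pct_on_budget": 100 * sum(p.on_budget for p in projects) / n,
        "pct_within_scope": 100 * sum(p.within_scope for p in projects) / n,
    }

metrics = project_status_metrics([
    Project(True, True, True),
    Project(True, False, True),
    Project(False, False, True),
    Project(True, True, False),
])
print(metrics)  # pct_on_time 75.0, pct_on_budget 50.0, pct_within_scope 75.0
```

Reporting all three ratios together matters: a portfolio can look healthy on schedule adherence while quietly slipping on scope.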



Project Performance
Project Performance Metric Maturity Trajectory
Relative Maturity of Project Performance Metrics
Illustrative

Exemplar: Project Contribution to Business Goals
Sample Metrics:
• Percentage of new development investment resulting in new revenue streams
• Percentage of IT R&D investment leading to IT service improvements

Progressive Practitioner: Post-Project Review
Sample Metrics:
• Sponsor satisfaction score
• Early-life failure rate

Baseline: Project Progress and Status
Sample Metrics:
• Percentage of projects delivered on time
• Percentage of projects delivered on budget
• Percentage of projects delivered within scope

Vertical axis: Metric Altitude, from Self-Referential to Linked to Corporate Strategy; horizontal axis: Metric Time Horizon, from Point-in-Time to Life Cycle
Source: Working Council research.




Project Performance
Highlighting IT’s Role as a Corporate Enabler
Company Background
Bowne & Co., the New York–based document, information management, and printing solutions provider, has $1 billion in revenues and 8,400 employees.

Scorecard Background
In 2000, Bowne’s CEO engaged the services of a strategic planning consultant who advocated the creation of corporate and functional balanced scorecards. The IT balanced scorecard, which is directly mapped to Bowne’s IT strategy, has five main categories: financial performance, project performance, operational performance, talent management, and user satisfaction. The Excel-based scorecard includes a list of metrics and owners for each category, a clear definition for each metric, a description of how it will be measured, and most importantly, a list of initiatives that the IT organization must undertake in order to be able to measure performance in each category.

Spotlighting Project Impact on the Business
The project category of Bowne’s IT balanced scorecard includes several metrics that seek to spotlight the direct impact of IT projects on the business. These include measures of:

Competitive Value Creation—Gauges IT’s collaboration with each individual business unit to develop differentiating technology-enabled solutions, as measured by the percentage of new development investment that results in new corporate products or services.

External Partnership Building—Tracks IT’s collaboration with technology vendors and industry experts to identify business uses for specific technologies, as measured by the percentage of vendors on Bowne’s partner “ecosystem” map that the IT organization has partnered with. This map documents existing vendor relationships, current spending levels, and contacts at other organizations that Bowne wishes to partner with.

IT Value Generation—Assesses IT’s performance in realizing the potential business impact of new technologies, as measured by the percentage of technology R&D investment that leads to new IT operational services.

A Higher Corporate Profile Pays Off
By highlighting IT’s enablement of business strategies and proactive approach to more efficient operations, Bowne’s CIO has been able to protect the IT budget during a time of spending cuts across the company. During the 2002 budgeting process, IT was spared any cuts, while IT capital expenditures actually increased by 75 percent.
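The scorecard structure described above (a metric with an owner, a clear definition, a description of how it will be measured, and the initiatives needed to enable measurement) can be sketched as a simple record. Field names and the example definition text are illustrative assumptions, not Bowne’s actual spreadsheet columns:

```python
from dataclasses import dataclass, field

@dataclass
class ScorecardEntry:
    """One row of a category sheet: metric, owner, definition, measurement
    approach, and the initiatives needed before the metric can be reported.
    Field names are illustrative, not Bowne's actual columns."""
    category: str
    metric: str
    owner: str
    definition: str
    measurement: str
    enabling_initiatives: list[str] = field(default_factory=list)

entry = ScorecardEntry(
    category="Project Performance",
    metric="Percentage of new development investment resulting in new products or services",
    owner="Jim Smith",  # illustrative name, per the source's own footnote
    definition="Share of new-development spend tied to launched offerings",  # assumed wording
    measurement="Project accounting tagged to product launches, reviewed quarterly",  # assumed
    enabling_initiatives=[
        "Develop a new product development life-cycle process",
        "Incorporate integration standards",
    ],
)
print(entry.metric)
```

Keeping the enabling initiatives on the same record as the metric is the key design point: a metric without its initiatives cannot yet be reported.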



Project Performance
Ensuring Project Contribution of Business Benefits
Bowne’s IT organization links IT project work with corporate benefits

Bowne & Co.’s IT Balanced Scorecard
Sample Metrics

Financial Performance
• Percentage year-over-year operational savings for specific IT services
• TCO of IT products and services versus industry benchmarks
• Percentage variance in actual return versus business case estimate

Project Performance
(See illustrative table below.)

Operational Performance
• System availability
• Number of help-desk calls per application
• Percentage of existing systems conforming to architectural standards

Talent Management
• Percentage of individual training objectives met
• Product and customer knowledge survey performance
• Percentage of employees with individual development plans

User Satisfaction
• Business sponsor satisfaction score
• End-user satisfaction score

Project Performance (Illustrative)

Objective | Metric | Current Performance | Initiatives
Competitive Value Creation—Partner with business to seize market opportunities by deploying timely, cost-effective, and integrated new solutions (Amy Johnson*) | Percentage of investment in IT new product development resulting in new products or services (Jim Smith*) | 30% | Develop a new product development life-cycle process (Amy Johnson); Incorporate integration standards (Amy Johnson)
External Partnership Building—Create alliances and partnerships to capitalize on new technologies (Amy Johnson) | Percentage “strategic” partnership coverage (Sue Miller*) | 40% | Develop “supplier interdependency” map (Sue Miller); Incorporate supplier contribution on metric reporting process (Amy Johnson)
IT Value Generation—Recognize the potential business value in new technologies (Jim Smith) | Percentage of IT R&D investment resulting in new IT operational services (Jim Smith) | 30% | Develop five-year technology road map, including document life cycle and infrastructure (Jim Smith)

The scorecard provides insight into IT value creation for the business, with objectives and metrics clearly described in nontechnical terms. Both metrics and initiatives have designated owners to ensure accountability for data collection and execution, and the scorecard describes the initiatives that must be taken to enable reporting of designated metrics.

* Names are for illustrative purposes only and do not represent actual employees.
Source: Bowne; Working Council research.

Operational Performance
Drilling Down into the Operations Performance
Optimizing Operational Performance
Most IT organizations track the performance of individual operational systems, including servers, applications, and desktops, using dashboards and alerting functionality to inform operational managers of service quality problems and system failures. However, the IT department’s customers, the end users, are most likely to judge IT’s operational performance based on their personal experience, asking “Does my PC work?” or “Do I have Internet access?” As a result, the typical IT balanced scorecard provides little intelligence to CIOs and business decision makers as to how changes in IT’s performance impact users. Metrics in the operational category of the IT balanced scorecard range in maturity across the following three levels:

Measuring the Performance of Individual Systems—The majority of companies include specific operational data such as network uptime or application availability in the operational category of their IT balanced scorecard. This information is critical for the ongoing management of performance levels, but of limited value as a scorecard metric for two reasons. First, a report produced quarterly will provide operational managers with data only when it is too late to act; and second, granular operational metrics provide little insight into where decision makers can cut costs without adversely affecting service levels that matter to end users.

An Aggregated View of Operational Performance—More advanced balanced scorecard practitioners are aggregating subsets of IT dashboard metrics to present an overview of the performance of particular units of IT service (for example, desktop availability or Web front-end uptime). This approach takes an initial step toward denominating IT operational performance in terms relevant to end users, providing the CIO and senior management with a more informed basis for cost and service-level trade-offs. In addition, IT organizations are also using the operational category to track system compliance with architecture standards.

Operational Performance from the End-User’s Perspective—Exemplar companies take this customer-centric approach a step further, focusing their operational scorecard metrics on how the end user actually experiences IT service and managing IT operations with the end user in mind. While a network outage might severely impact IT’s uptime performance when viewed through the lens of an operational dashboard, it has little impact on end users when it happens in the middle of the night or only impacts backup systems. On the other hand, if a critical application is not available during peak work hours, the impact on end users can be comparable to a total system crash. Consequently, to help improve operational performance, exemplars utilize scorecard metrics that track an aggregate measure of uptime for all systems that impact end users and sensitize those metrics according to the impact of performance problems on critical business processes.



Operational Performance
Operational Performance Metric Maturity Trajectory
Relative Maturity of Operational Performance Metrics
Illustrative

Exemplar: Operational Performance from the End-User’s Perspective
Sample Metrics:
• Peak time availability
• Critical process uptime

Progressive Practitioner: An Aggregated View of Operational Performance
Sample Metrics:
• Web front-end uptime
• Desktop availability
• Percentage of systems compliant with architecture standards

Baseline: Measuring the Performance of Individual Systems
Sample Metrics:
• CRM availability
• LAN uptime

Vertical axis: Level of Metric Detail, from Individual Systems to Aggregated Services; horizontal axis: Metric Relevance to End User, from Low to High

Putting the Business First
“In the past, we’ve tended to take a more internally focused view of IT with our metrics to determine things like data center or network availability. However, customers don’t necessarily care about what is causing a network outage, but more about the fact that the system is down. …Now we’re attempting to measure ourselves on things business people think are important.”
Mike Hyzy
Manager of Global IT Metrics, NCR Corp.

Source: “Formula for ROI,” PC Week (28 September 1998); Working Council research.




Operational Performance
Balancing Business and IT Perspectives on Operations
Company Background
T. Rowe Price Group, Inc., the Baltimore-based asset management firm, has $925 million in revenues and 3,700 employees. The company administers 80 funds and manages assets worth more than $155 billion.

Scorecard Background
IT management reporting practices at T. Rowe Price historically suffered from two flaws. First, reports focused on operational performance in minute detail while neglecting non-technical performance metrics. Second, even for operational metrics, the reports lacked established targets for adequate performance. To provide a holistic view of corporate IT’s performance with respect to financial, project, operational, talent management, and user satisfaction measures, T. Rowe Price created an IT balanced scorecard in early 2001. T. Rowe Price updates the scorecard quarterly and uses it to highlight a continued commitment to streamlining total IT expenditures and to showcase IT’s partnership with the business.

The Scorecard as a Compliance Lever
While most companies track some form of system reliability as part of their scorecard efforts, fewer organizations use their scorecards to track compliance with architectural standards and project management processes or gauge the effectiveness of vendor management efforts. The operational category of T. Rowe Price’s IT balanced scorecard tracks compliance of new project work with existing architecture standards and the IT organization’s standard system development methodology. By highlighting areas where business sponsors or project managers have deviated from T. Rowe Price’s established standards, the scorecard provides a measure of “name and shame” reporting, forcing discussion of non-standard work in the presence of senior decision makers. In addition, T. Rowe Price also includes a “competitive bids” metric as part of the operations category. This metric is aimed at driving down overall IT spending by encouraging sponsors to solicit external bids for new projects.

Managing Performance from the End-User’s Perspective
Recognizing that end-user productivity should be the focus of IT operational metrics, T. Rowe Price tracks the number of system failures that impact the business user’s ability to do work.



Operational Performance
Monitoring User Impact and Standards Compliance
T. Rowe Price utilizes the IT balanced scorecard to track business-focused
operational metrics and monitor architectural compliance

T. Rowe Price’s IT Balanced Scorecard
Sample Metrics
(The scorecard fits onto one page.)

Financial Performance
• Percentage change in EBITDA
• Weighted IT expense ratio versus industry
• Total IT expenditures
• Total expenditure on new functionality
• NPV delivered during payback period

Project Performance
• Percentage of projects delivered on time, on budget, with acceptable customer satisfaction
• Percentage of active projects with approved project plan

Operational Performance
(See illustrative table below.)

Talent Management
• Percentage of non-entry-level positions filled internally
• Average years’ tenure of solid performers
• Number of candidates interviewed per open position
• Percentage of task forces/projects led by non-project managers

User Satisfaction
• End-user perception of system stability
• Performance versus user expectation of delivery speed
• Price competitiveness of IT services versus external benchmark

Operational Performance (Illustrative)

Objective | Metric | Current Quarter Performance | Current Quarter Target | Next Quarter Target | Comments
Integrate Solutions Using Defined Architectures | Percentage of projects complying with existing architectural standards and with standard development methodology | 10% | 30% | 35% | Includes projects in progress before new standards created
Provide Reliable and Functional Systems | Number of failure incidents with business impact per quarter | 3.5 | 3.0 | 2.5 | Severe weather damaged storage capability
Effectively Select and Manage Sourcing Relationships | Percentage of strategic projects bid competitively | 66% | 65% | 70% | On track

High-level metrics focus on system failures with negative business impact, and targets for the current and upcoming quarter allow a more informed assessment of current performance and of the timeline for closing the gap to goal. The scorecard supplements traditional availability metrics with measures of architectural and methodological compliance of proposed and ongoing project work; spotlighting poor standards compliance or methodology adherence at the executive level acts as “name and shame” for business sponsors and project managers.

Source: T. Rowe Price; Working Council research.




Talent Management
Keeping and Attracting Top Performers
Talent at the Heart of the IT Enterprise
The size of many IT departments has been decreasing across the past several years in response to cost pressures, and as a result, CIOs have been forced to focus on talent management in an effort to maximize the productivity of existing staff. This involves both upskilling the entire IT workforce and focusing on the retention of staff with critical skills. IT balanced scorecard practitioners use their scorecards to better manage IT talent in several ways:

Keeping Track of IT Staff—As one of the four categories outlined in the original Kaplan and Norton scorecard, talent management is a category included almost universally in IT balanced scorecards. The most common scorecard talent metrics include total IT staffing, individual and average staff tenure, employee satisfaction, and overall staff retention rate.

Measuring the Skills Bench—More progressive scorecard practitioners supplement basic staff demographics with metrics to track the organization’s existing skill sets and the career goals of IT staff (for example, the number of staff trained in skills identified as critical and the percentage of IT staff with individual development plans). These metrics allow CIOs to better spotlight emerging gaps or opportunities for staff development.

Becoming a Talent Magnet—Exemplar IT organizations use the IT balanced scorecard to focus senior executive attention on IT’s success in creating a “destination” for available talent, tracking metrics like the retention of high-potential staff, the number of press citations of the IT organization and company-developed technologies, or the number of industry awards the company has won for its use of technology.
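The overall and high-potential retention metrics named above reduce to segment-level percentages. A minimal sketch, assuming a simple staff record with `hipo` and `left` flags (an illustrative structure, not an actual HR schema):

```python
# Baseline vs. exemplar talent metrics side by side: overall retention
# and retention of the high-potential ("HIPO") segment.

def retention_rates(staff: list[dict]) -> dict[str, float]:
    """Percentage of staff retained over the period, overall and for HIPOs."""
    def rate(group: list[dict]) -> float:
        return 100 * sum(1 for s in group if not s["left"]) / len(group)
    hipos = [s for s in staff if s["hipo"]]
    return {"overall_retention": rate(staff), "hipo_retention": rate(hipos)}

# Illustrative population: 10 HIPOs (1 leaver) and 90 other staff (20 leavers).
staff = (
    [{"hipo": True, "left": False}] * 9 + [{"hipo": True, "left": True}]
    + [{"hipo": False, "left": False}] * 70 + [{"hipo": False, "left": True}] * 20
)
rates = retention_rates(staff)
print(rates)  # overall 79.0, HIPO 90.0
```

Reporting the segments separately is the point: a healthy overall rate can mask an exodus of exactly the staff who are hardest to replace.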



Talent Management
Talent Management Metric Maturity Trajectory
Relative Maturity of Talent Management Metrics
Illustrative

Exemplar: Becoming a Talent Magnet
Sample Metrics:
• Retention of high-potential staff
• External citations of IT achievement

Progressive Practitioner: Measuring the Skills Bench
Sample Metrics:
• Current skills inventory
• Percentage of staff with individual development plans

Baseline: Keeping Track of IT Staff
Sample Metrics:
• Total number of staff
• Staff tenure (individual and average)
• Employee satisfaction
• Overall retention rate

Vertical axis: Targeted Employee Population, from All Staff to Individual HIPOs; horizontal axis: Focus of IT Brand Building, from Internal to External

Source: Working Council research.




Talent Management
Investing in the Future
Company Background
AstraZeneca PLC, the global pharmaceuticals company, is one of the top five pharmaceuticals companies in the world, with more than $17.8 billion in revenues and 58,000 employees.

Scorecard Background
AstraZeneca introduced the balanced scorecard in 1998 as a tool to help communicate corporate strategy and priorities, ensure alignment of functional initiatives with those strategies and priorities, and monitor execution of specific strategies. While adoption of the balanced scorecard is widespread at AstraZeneca, there are no corporate templates that individual functions must follow. The content of the IS balanced scorecard is dictated by two principal drivers, the strategic priorities of the company and the vision for the IS organization, which are reflected in three key priorities for the IS function:

1. Deliver business value
2. Transform IS efficiency and effectiveness
3. Develop IS leadership and capabilities

The PowerPoint-based scorecard is updated on a quarterly basis. Each scorecard objective is assigned to an owner on the CIO’s global IS management team who updates the metrics that underpin that objective and submits the metrics and commentary to the assembled global IS management team for review and discussion. The final scorecard comprises the strategic agenda for the IS senior management team and is used to track ongoing execution of IS strategy. Once finalized, the scorecard is published on the company intranet. IS line managers are strongly encouraged to discuss the balanced scorecard with their staff and define their team goals in a fashion that is consistent with the corporate IS vision that is encapsulated in the scorecard.

Leadership Development as a Foundation for Value Creation
AstraZeneca’s stated goal is to develop the leadership performance and skills base of its existing IS staff at all levels of the organization. To monitor its progress toward that goal, AstraZeneca’s IS organization tracks the progress of two key objectives in the talent category of its IS balanced scorecard:

Developing and Demonstrating Leadership—The first set of talent management metrics on AstraZeneca’s IS balanced scorecard is designed to measure the development of leadership capabilities within the technology organization. These metrics include the percentage of IS staff, IS leaders, and identified high-potential staff with established development plans, and the percentage of key IS roles filled.

Developing and Demonstrating IT Skills and Disciplines—AstraZeneca’s IS balanced scorecard also tracks performance in creating skills and development opportunities for existing IS staff by including metrics such as training attendance versus capacity and the percentage reduction in skills shortages filled by external consultants. More specifically, AstraZeneca’s IS balanced scorecard reflects the IS organization’s emphasis on cultivating project management skills and developing global service delivery capabilities, tracking metrics that measure compliance with project management methodologies and project manager skills development, as well as those measuring collaborative tool usage, virtual team productivity, and the number of rotational assignments undertaken by IS staff.

A Multipurpose Performance Management Tool
“The IS balanced scorecard acts as an extremely powerful vehicle for communicating AstraZeneca’s IS strategy and priorities to the company, helps ensure that IS initiatives are clearly aligned to IS business priorities and strategy, and allows effective tracking of IS organization success in strategy execution.”
Balvinder Dhillon
AstraZeneca



Talent Management
Developing IS Leaders
AstraZeneca employs its IS balanced scorecard to help execute
on its strategy of identifying and developing IS leaders

AstraZeneca’s IS Balanced Scorecard
Sample Metrics
(Scorecard categories mirror AstraZeneca’s global IS strategic priorities, and the scorecard provides the reader with commentary from the metric owner and an outline of required performance improvement steps.)

Project Performance
• Ongoing benefits captured versus plan
• Project delivery progress versus budget, schedule, and quality milestones

Operational Performance
• Number of business interruptions caused by security intrusion
• Percentage of internal portals adopting architecture guidelines
• Number of material adverse regulatory compliance observations by FDA or other IS audits

Talent Management
(See illustrative table below.)

Develop IS Leadership and Capabilities (Illustrative)

Objective | Metric | Current Performance | Previous Performance
Develop and Demonstrate Leadership Capability | Percentage of IS staff development plan reviewed and actions implemented | 60% | 75%
 | Percentage of incumbent IS leaders with development plan in place | 90% | 92%
 | Percentage of global high-potential staff with development plan in place | 80% | 75%
 | Change management and business acumen programs in place | No | No
 | Number of annual IS rotational assignments | 10 | 11
 | Diversity performance (scale of 1 to 5) | 4 | 4
 | Percentage of key roles filled | 99% | 92%
Develop and Demonstrate IS Professional Skills and Disciplines | Percentage of projects following project management framework | 70% | 70%
 | Percentage of projects with KPIs for visible value demonstration | 80% | 50%
 | Percentage of project managers with development plan and career path | 60% | 40%
 | Percentage increase in project manager capability | 20% | 10%
 | Attendance versus capacity in existing learning programs | 60% | 72%
 | Percentage increase in collaborative tool usage | 40% | 30%
 | Improvement of “virtual” team productivity | +5 | -1
 | IS capabilities versus benchmarks (scale of 1 to 5) | 3 | 2
 | Percentage reduction in skills shortages filled by consultants | 20% | 15%

Metrics concentrate on career planning and capability development of existing IS staff, with a focus on high performers; multiple metrics underscore the IS organization’s emphasis on developing project management skills and its desire to strengthen global IS service delivery capabilities.
Source: AstraZeneca; Working Council research.




Talent Management
Tracking IT’s Brand Equity
Company Background
Schneider National, Inc., the Green Bay, Wisconsin–based transportation and logistics provider, has $2.6 billion in revenues and 21,000 employees. The company operates a fleet of 14,000 trucks and more than 40,000 trailers.

Scorecard Background
Schneider National’s IT balanced scorecard was implemented in 1998 and is composed of four major types of metrics: financial performance, project performance, operational performance, and talent management. The IT organization views the scorecard as a tool to help align IT efforts with the needs of the business and reduce the time senior IT management needs to identify and find solutions to service problems. The scorecard is updated on a monthly basis and reviewed by the CIO, the IT vice presidents, and IT directors. The scorecard includes the target, current month, and year-to-date performance for each metric.

Developing an IT Brand
Schneider’s talent management metrics include typical internally focused measures like staff turnover rates and employee tenure with the company and in IT. However, in service to a corporate initiative to boost brand awareness of Schneider and its products and services, the IT organization also tracks two externally facing metrics: the number of press citations of Schneider’s IT organization and Schneider-developed technologies, and the number of industry awards Schneider has won for its use of technology.

By using the balanced scorecard to track the improvement of the internal employment brand for IT, companies can reduce staff turnover, especially for high-potential staff who would be difficult to replace on the internal talent market and costly to replace on the external market. In fact, Schneider credits the ability to regularly track the metrics in the talent management scorecard category with helping it achieve a significant reduction in annual staff turnover, which has declined from 26 percent before the scorecard was implemented to 7 percent, across a period of five years. Some of this turnover has likely been stemmed as a result of the waning fortunes of the overall economy across the past several years, but Schneider claims that by providing decision makers with a consolidated view of talent metrics, the IT organization is able to more effectively deploy its resources toward improving the internal IT employment offer, with reduced turnover as the result. In addition, by using the scorecard to track press citations and awards, Schneider’s IT organization has also begun developing an external IT brand, becoming a destination for available talent in the industry.



Talent Management
Measuring IT’s Reputation Inside and Outside the Organization
Schneider National utilizes its balanced scorecard to build IT’s brand as an employer of choice

A baseline of internally focused metrics is supplemented with metrics of the external IT “brand.”

Schneider National’s IT Balanced Scorecard
Sample Metrics

Financial Performance
• Central IT budget as a percentage of revenue
• IT maintenance cost per load, per movement
• Desktop total cost of ownership
• Data communications cost as a percentage of revenue

Project Performance
• Percentage of projects on time, on budget, with acceptable satisfaction score
• Percentage of budget allocated to unplanned projects

Operational Performance
• Number of server outages per month
• Reduction of desktop applications
• Number of UNIX servers

Talent Management (Illustrative)

Objective | Metric | Current Month Performance* | Current Month Target* | Year-to-Date Performance*
Create an Attractive IT Environment | 1. Annual turnover rate | 7% | 7% | 8%
 | 2. Average years of employee experience at Schneider National | 8 | 8 | 7.9
 | 3. Percentage of planned staffing levels | 97% | 95% | 94%
Build a World-Class IT Organization | 4. Citations of Schneider National technology organization in the press | 6 | 10 | 14
 | 5. Schneider National technology and corporate awards | 1 | 2 | 1
 | 6. Average years of employee experience in IT | 4.8 | 5 | 5.4

Tracking external citations of Schneider National’s IT organization boosts brand awareness, positively impacting the retention rate and increasing the visibility of IT to available talent.

* Numbers are for illustrative purposes only and do not represent actual performance or targets.
Source: Schneider National; Working Council research.

User Satisfaction
Measuring IT’s Public Perception
User Satisfaction as an Early-Warning Indicator
To combat the historical financial bias of corporate reporting and create a truly “balanced” perspective of corporate performance, Kaplan and Norton advocated the inclusion of customer satisfaction metrics on the balanced scorecard. For IT organizations, these “customers” are, in most cases, end users and business sponsors. Faced with limited performance measurement resources, CIOs have integrated satisfaction-surveying efforts into their IT performance measurement toolkit to track the perception of IT service delivery with three varying degrees of rigor:

Ad Hoc Assessment—In many cases, companies choose to measure end-user satisfaction anecdotally, either because they see no need or are unwilling to incur the potential costs involved in regularly collecting the information. As a result, these scorecards do not truly represent a “balanced” view of IT performance, relying almost exclusively on financial metrics as a proxy for IT value.

Reactive Information Gathering—The majority of companies that do track user satisfaction as part of their IT balanced scorecard process opt for brief, random questionnaires in direct response to user requests or process milestones, such as help-desk calls or project completion. Although this approach provides IT departments with an inexpensive way to collect a sample of user perspectives on IT performance, it may not offer an accurate picture of user satisfaction. Since these surveys tend not to be anonymous, users may respond less candidly, and if the survey sample population is generated based on help-desk tickets, the sample may be biased toward users whose most recent interaction with IT has been extreme, either unusually positive or unusually negative.

A Census-Level View of IT Performance—Best-in-class IT balanced scorecards incorporate the results of formal satisfaction surveys covering the entire user base. Exemplars conduct surveys on at least an annual basis, either surveying the entire user population at once or surveying a smaller but representative sample of the population each quarter. Leading companies also take steps to document executive-level perspectives of IT performance rather than aggregating all user feedback into an undifferentiated average. The results of these two types of satisfaction surveys allow CIOs to more quickly and effectively identify changes in user perception of IT that may be connected to a decline in service levels or problems with vendor contracts.
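The census-level rotation described in this section (surveying a representative slice of the user base each quarter so that the full population is covered once per year) can be sketched as follows. This is an illustrative assumption, not a documented implementation: the hashing scheme, user IDs, and population size of 5,000 are all hypothetical.

```python
import hashlib

def quarterly_cohort(user_id: str, quarter: int) -> bool:
    """Assign each user to exactly one of four stable quarterly cohorts
    (hypothetical sketch; any stable hash partition would do)."""
    digest = hashlib.md5(user_id.encode()).hexdigest()
    return int(digest, 16) % 4 == quarter % 4

# Hypothetical population of 5,000 users
users = [f"user{n}" for n in range(5000)]
cohorts = [[u for u in users if quarterly_cohort(u, q)] for q in range(4)]
```

Because the hash is stable, each user lands in the same cohort every year, and the four quarterly cohorts together cover the entire population exactly once.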



User Satisfaction
User Satisfaction Metric Maturity Trajectory
Relative User Satisfaction Metric Sophistication (Illustrative)

The trajectory plots penetration of the user base (from self-selected users to a full census) against seniority of respondents (from undifferentiated end users to business sponsors/executives):

Baseline: Ad Hoc Assessment
• None; scorecard includes user testimonials (anecdotal evidence)

Progressive Practitioner: Reactive Information Gathering
• “Customer satisfaction” drawn from a random selection of users (with potential sample bias)

Exemplar: A Census-Level View of IT Performance
• “Customer satisfaction” drawn from entire user population (comprehensive perspective)
• “Senior executive satisfaction” drawn from focused executive feedback

Source: Working Council research.




User Satisfaction
Listening to the Voice of the Customer
Company Background
J.D. Edwards & Company, the Denver-based ERP and supply chain software maker, has $904 million in revenues and 4,954 employees. The company serves 6,500 customers in more than 100 countries, and 75 percent of revenues come from professional services.

Scorecard Background
Although not formally called a balanced scorecard, J.D. Edwards’ six corporate initiatives, which include “revenue and growth,” “customer satisfaction,” and the creation of a “knowledgeable and committed workforce,” approximate several of the major categories of an IT balanced scorecard.

An Uncoordinated Approach to Customer Satisfaction
While growing from $300 million in revenues in 1998 to $1 billion in 2000, J.D. Edwards’ IT organization struggled to capture actionable end-user feedback that would allow it to better enable the company’s growth strategies. The IT organization faced two hurdles to these ambitions. First, the company often relied on external benchmarking data to choose IT products and services without input from internal customers. Second, when the IT organization did administer end-user surveys, it did so on an ad hoc basis, with users often receiving multiple surveys from multiple groups within IT. Bombarding end users with too many surveys generated suboptimal response rates, leading to a nonrepresentative view of IT’s performance.

Combating Survey Fatigue
To solve this problem of survey “fatigue,” J.D. Edwards adopted a quarterly satisfaction survey, with each survey covering 25 percent of the company’s approximately 5,000 users. In place since 1999, the survey is composed of 26 questions that gauge user satisfaction with existing IT products and services and requires only 15 minutes to complete. The survey instrument is an internally developed Web-based tool that delivers the survey to IT customers and compiles survey responses and results at a cost of less than $1,000 per year. Survey results are presented as a series of “dials” on the IT intranet. These dials use a scale from one to seven and show average scores for satisfaction with specific IT services; viewers can double-click to view individual comments. A single IT account manager administers the survey, manages the survey process, and analyzes the results in addition to his or her regular account management duties. In addition, the account manager collaborates with IT services managers to ensure that action plans are created to address the problems highlighted by survey results, and works with the other account managers and IT service managers to select additional IT services to measure from a customer satisfaction perspective.

Giving Customers What They Want
The IT department uses the feedback captured to iteratively test service improvements. For example, unfavorable survey feedback about the help desk led to the termination of a contractor’s services and the creation of an internal help desk that was able to meet the established service-level goal. In another instance, quarterly user survey feedback revealed dissatisfaction with the company’s remote access service provider based on significant downtime and unsatisfactory cost. In response to continued user dissatisfaction, the IT organization contracted with an alternative service provider and upgraded its connectivity architecture, resulting in an improved user satisfaction score.
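The one-to-seven satisfaction “dials” described above are, at bottom, per-service averages of survey responses. A minimal sketch; the service names echo the survey categories, but the response values are hypothetical:

```python
from statistics import mean

# Hypothetical responses on the 1 (extremely unsatisfied) to
# 7 (extremely satisfied) scale used by the intranet dials
responses = {
    "Remote access": [2, 3, 2, 4, 3],
    "PC services and support": [6, 5, 7, 6, 5],
}

# One dial per service: the average score, rounded for display
dials = {service: round(mean(scores), 1) for service, scores in responses.items()}
```

A dial near the bottom of the scale (as “Remote access” is here) would flag the kind of service problem J.D. Edwards acted on in its iterative improvement cycle.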



User Satisfaction
Using End-User Feedback to Improve Service Quality
The annual survey generates detailed insight into drivers of user satisfaction and provides a vehicle for iterative improvement of service delivery.

J.D. Edwards IT User Survey • Selected Questions (Illustrative)

IT Usage Assessment (Yes/No Response)
• Off-site network access usage
• Office move or expansion involvement
• Utilize or coordinate voice services
• Request application services
• Conduct training classes in training classroom

IT’s Overall Performance
• Overall performance of IT services
• Ability to deliver technical/business solutions and services
• Communication about available services and new technologies

User Satisfaction with IT Services
• Remote access
• Internal voice services and support
• PC services and support
• IT problem resolution
• Network infrastructure

User Satisfaction with IT Intranet
• Accessibility
• Value-add (write-in response)
• Additional functionality (write-in response)

User Satisfaction with Voice Services
• Voice services information availability
• Value-add (write-in response)
• Additional functionality (write-in response)

Response options: 1. Extremely Unsatisfied; 2. Very Unsatisfied; 3. Unsatisfied; 4. Neutral; 5. Satisfied; 6. Very Satisfied; 7. Extremely Satisfied

Iterative Service Improvement Process (Illustrative): Remote Access Service
• Q1 Survey. Feedback: low user satisfaction with remote access service. Corrective step: notify access provider.
• Q2 Survey. Feedback: dissatisfaction remains. Corrective step: change access provider.
• Q3 Survey. Feedback: partial improvement. Corrective step: roll out new connectivity application.
• Q4 Survey. Feedback: service quality goal is met. Corrective step: none required.

A Call to Action
“To put on a survey and collect all the feedback and then not act on the results would be a mortal sin for everybody.”
Frank Catalfamo, Manager IT Metrics and M&As, J.D. Edwards

Source: J.D. Edwards; Working Council research.


User Satisfaction
Collecting the Executive View of IT Performance
Company Background
Eli Lilly and Company, the Indianapolis-based pharmaceutical manufacturer, has revenues of $11 billion and 43,700 employees.

Scorecard Background
In order to better track IT’s contributions at the enterprise level, the IT organization at Eli Lilly created an IT balanced scorecard in 1999.

Enfranchising Senior Business Decision Makers
Recognizing that internal customer satisfaction surveys often fail to make the important distinction between the feedback of end users and that of truly empowered business decision makers, Eli Lilly’s CIO supplements the IT organization’s end-user satisfaction survey by conducting an annual feedback exercise with 12 senior business executives. Feedback is gathered from each executive by the CIO in a 30- to 60-minute interview, using six qualitative questions designed to gauge senior-level perceptions of IT value creation. The questions ask business leaders to assess:

• IT contribution to business value creation
• IT contribution to business process improvement
• IT contribution to company competitive advantage
• IT contribution to corporate business strategy
• IT operational performance
• Overall satisfaction with IT service

Each respondent is asked to rate the corporate IT function on a scale of 1 to 5, with 5 being very satisfied. The aggregate results of the survey are then presented as part of the IT balanced scorecard, with individual responses kept confidential to encourage an honest appraisal of IT’s performance. The intranet-based scorecard allows decision makers to drill down into top-level scorecard metrics, including underlying data, commentary from the metric “owner,” and an overview of action steps that will be taken to improve performance.

Heeding the Business’s Advice
In addition to the numerical feedback provided by executives, Lilly also solicits qualitative feedback. This not only helps capture executive perceptions of current IT service but also provides an opportunity to gather future service requirements. By capitalizing on this occasion for dialogue, Eli Lilly’s CIO gets both direction for the present and a glimpse into what business sponsors want IT to be in the future. Eli Lilly has used this formal feedback from senior business executives to guide and improve its IT efforts in several concrete ways. In response to feedback that the business divisions required more clarity on IT’s funding decisions, Eli Lilly’s IT organization redesigned its portfolio prioritization process to provide more transparency to business heads. In addition, executive survey feedback prompted Lilly’s IT organization to expand the size and scope of its “reverse mentoring” program, in which junior IT staff serve as mentors to senior business executives, helping them understand IT terms and concepts.
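The aggregation Lilly publishes (a mean per question across the 12 executives, with individual responses withheld) can be sketched as follows. The question labels come from the list above; the ratings themselves are hypothetical:

```python
from statistics import mean

# Hypothetical 1-5 ratings from 12 executives, keyed by question.
# Only the per-question aggregate reaches the scorecard; individual
# responses stay confidential, per Eli Lilly's practice.
ratings = {
    "IT contribution to business value creation": [4, 3, 5, 4, 4, 3, 4, 5, 3, 4, 4, 4],
    "Overall satisfaction with IT service":       [3, 4, 4, 5, 3, 4, 4, 4, 3, 5, 4, 4],
}

scorecard = {question: round(mean(r), 2) for question, r in ratings.items()}
```

Publishing only the aggregate is what makes the confidentiality promise credible, which in turn is what makes the executive ratings candid enough to act on.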



User Satisfaction
Assessing IT Value Creation
Eli Lilly supplements a user satisfaction survey with executive feedback to inform performance improvement efforts

The survey is conducted annually by the CIO in individual sessions with 12 senior business executives; only aggregate results of the executive surveys are published to the scorecard, to encourage honest feedback.

Eli Lilly’s IT Balanced Scorecard • Sample Metrics
• Financial Performance: Economic Value Added (EVA); unit service cost
• Project Performance: project portfolio
• Operational Performance: base investment; service levels versus external benchmarks; compliance with enterprise methodologies and standards
• Talent Management: diversity; progress on people; key capabilities
• User Satisfaction

Executive Management Feedback Collection Form (Illustrative)
Each of the six questions (IT contribution to business value creation, to business process improvement, to company competitive advantage, and to corporate business strategy; IT operational performance; and overall satisfaction with IT service) is rated from 1 (“IT is not at all meeting the needs of the corporation”) to 5 (“IT is exactly meeting the needs of the corporation”). Qualitative feedback gathered alongside the ratings includes executive expectations, perception of IT’s achievement against those expectations, and future service requirements: for example, “Not enough clarity around IT cost” paired with the future requirement “Update portfolio prioritization process.”

An Occasion for Dialogue
“The executive management feedback surveys are a useful tool for understanding and responding to business perceptions of IT’s performance. The results are not as important as the dialogue between IT and the business.”
Roy Dunbar, CIO, Eli Lilly

Source: Eli Lilly; Working Council research.




Information Security
A Proactive Approach to Security
Extended Enterprise Increases Security Risks
As organizations’ reliance on Web-based transactions, remote and wireless computing, and connectivity to external partners continues to grow, the number of potential information security vulnerabilities has also increased. This trend toward greater enterprise permeability, coupled with the tragic events of September 11 and high-profile security incidents such as 2003’s Slammer virus, has caused senior decision makers to focus their attention on the unpredictable and costly information security threats posed to the extended enterprise. IT organizations exhibit three levels of sophistication in managing information security spending using a balanced scorecard:

Reactively Tracking Security Incidents—Most companies have not yet added an information security category to their IT balanced scorecard, instead tracking a handful of technical security measures in the scorecard’s operational category. These measures are aimed primarily at retroactively documenting the number of security incidents detected per period and the time required to respond to those incidents.

Proactively Assessing Potential Vulnerabilities—More advanced balanced scorecard practitioners include metrics to proactively assess potential vulnerabilities. In most cases, these metrics are still included in the operational performance category of the scorecard, not in a stand-alone information security category. They include measures like the percentage of systems compliant with security standards, the percentage of network access points covered by intrusion detection systems, and the percentage of IT security audit issues remedied by the agreed-upon deadline.

Hardwiring Security Compliance—Exemplar IT organizations create a separate information security category on their IT balanced scorecards. By elevating security metrics to the category level, exemplars seek to clearly communicate the urgency of these measures to senior IT staff and business sponsors. Exemplars also supplement retroactive reporting of security incidents by tracking the implementation of, and compliance with, security policies and preventive measures. Typical metrics include the percentage of staff receiving security training and the percentage of external partners in compliance with security standards.
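Each of the proactive and compliance measures above is a simple coverage ratio. A minimal sketch; the counts are hypothetical:

```python
def coverage_pct(covered: int, total: int) -> float:
    """Percentage coverage, the common form of the proactive security
    metrics above (compliant systems, IDS-covered access points, trained
    staff, compliant partners)."""
    return round(100 * covered / total, 1)

# Hypothetical counts for two of the metrics named in the text
systems_compliant = coverage_pct(412, 500)  # systems meeting security standards
ids_coverage = coverage_pct(34, 40)         # access points behind intrusion detection
```

The value of these metrics on a scorecard lies less in the arithmetic than in the denominator: tracking coverage forces the organization to maintain a complete inventory of what should be protected.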



Information Security
Information Security Metric Maturity Trajectory
Relative Maturity of Information Security Metrics (Illustrative)

The trajectory plots the focus of measurement (from operational to policy) against metric posture (from reactive to proactive):

Baseline: Reactively Tracking Security Incidents
• Number of security incidents detected per period
• Average time to respond to incidents

Progressive Practitioner: Proactively Assessing Potential Vulnerabilities
• Percentage of systems in compliance with IT security standards
• Percentage of network access points covered by intrusion detection systems
• Percentage of IT security audit issues fixed by agreed deadline

Exemplar: Hardwiring Security Compliance
• Percentage of staff receiving security training
• Percentage of external partners in compliance with security standards

Source: Working Council research.




Information Security
Safeguarding the Enterprise
Company Background
Schlumberger Limited is a $13.5 billion global technology services company with 76,000 employees of more than 140 nationalities. The company has operations in 140 countries and includes three primary business segments: Schlumberger Oilfield Services, the world’s largest oilfield services company and the leading supplier of technology services and solutions to the international oil and gas industry; WesternGeco, jointly owned with Baker Hughes, the world’s largest surface seismic company; and SchlumbergerSema, an IT services company providing consulting, systems integration, and managed services to the energy, public sector, telecommunications, and finance markets.

Scorecard Background
Schlumberger introduced its IT balanced scorecard in 2002 as part of a larger IT initiative to better understand comparative IT spending levels and align that spending more closely with business needs.

Focusing Attention on Information Security
To better track security spending and the progress of its ongoing information security initiatives, Schlumberger has created a stand-alone information security category on its IT balanced scorecard. Schlumberger tracks a mix of metrics that gauges both past information security performance and proactive preventive measures, including:

Incident Rate—Documents the number of security incidents per 1,000 employees per year. Reported incidents are logged in a central database and categorized by severity as serious, major, and catastrophic to allow for year-over-year comparison and trend analysis.

Percentage of Infrastructure Protected—Focuses on the protection of PCs, servers, and network infrastructure from malicious attacks. Protection regimes include deploying anti-virus software on PCs, maintaining servers to close all known vulnerabilities and applying security patches, and configuring networks according to established security guidelines.

Percentage of Sites Audited for Security—Tracks the percentage of Schlumberger sites that have had an information security audit within the past 12 months. Audits are conducted by Schlumberger’s operational security group, and the results are cataloged in the central database to allow tracking of remediation progress for any identified vulnerabilities.

Assuming a Proactive Security Posture
Schlumberger’s focus on proactive vulnerability identification and remediation provides the IT organization with knowledge of current system vulnerabilities. The scorecard also helps ensure that established security standards are being followed and that security audits are being conducted on an ongoing basis. In addition, by prominently including security metrics on the IT balanced scorecard, Schlumberger increases security awareness among its IT staff and business sponsors.
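Schlumberger’s incident-rate metric is straightforward arithmetic. The sketch below back-computes an illustrative incident count consistent with the 8.6-per-1,000 figure shown on the illustrative scorecard; the count of 654 is an assumption, not a reported number:

```python
def incident_rate_per_1000(incidents: int, employees: int) -> float:
    """Schlumberger-style incident rate: serious/major/catastrophic
    information security incidents per 1,000 employees per year."""
    return round(incidents / employees * 1000, 1)

# Hypothetical: ~654 logged incidents across the company's 76,000 employees
rate = incident_rate_per_1000(654, 76_000)
```

Normalizing by headcount is what makes the rate comparable year over year and against external benchmarks, even as the workforce grows or shrinks.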



Information Security
Increasing Information Security Awareness and Performance
Schlumberger includes both retroactive and proactive metrics in its security category

The scorecard includes both retroactive and proactive security measures, with information security elevated to the category level and each metric clearly defined.

Schlumberger’s IT Balanced Scorecard • Sample Metrics
• Financial Performance: projected cost for IT services; year-to-date cost of IT services; IT cost per seat; cost of data communication per seat; spending by portfolio category
• Project Performance: average application project delay
• Operational Performance: quarterly activity increase on the corporate knowledge portal; percentage of standardized PCs
• Resource Management: IT headcount (excluding contractors)
• User Satisfaction: tickets per registered user per month; help-desk calls; help-desk first-call resolution rate
• Information Security (detailed below)

Information Security Category (Illustrative)
Objective: Safeguard the Enterprise

Metric: Incident Rate per 1,000 Employees per Year
Definition: Number of catastrophic/major/serious information security incidents per 1,000 employees per year within the company
Current performance: 8.6 • Benchmark: 7.7

Metric: Percentage of Infrastructure Protected
Definition: Indicator of the degree of protection from malicious attack of PCs, servers, and network infrastructure; protection for PCs means the device has updated antivirus and security patches; protection for servers means no high vulnerabilities; protection for network infrastructure means configuration according to published standards
Current performance: 40% • Benchmark: 29%

Metric: Percentage of Sites with Valid Information Security Audit
Definition: Percentage of sites that have conducted an information security audit within 12 months; audits are conducted by Operations Security and management personnel
Current performance: 10% • Benchmark: 40%

Source: Schlumberger; Working Council research.




Enterprise Initiatives
Giving Corporate Business Initiatives Their Due
Detailing IT’s Contribution to Enterprise Initiatives
In addition to its day-to-day operational role, IT often plays a critical part in enabling discontinuous enterprise-level business initiatives, such as post-merger integration or business process reengineering. By setting aside one category of the IT balanced scorecard to track IT’s contribution to the execution of these major business initiatives, CIOs are able to better manage the function’s enablement efforts and communicate those efforts to senior business decision makers and initiative sponsors. These communications efforts range from the nonexistent to the programmatic:

An IT-Centric View of Performance—In most cases, IT balanced scorecards focus solely on IT’s efforts to better manage its own activities. The link between corporate strategic initiatives and the metrics tracked on the IT balanced scorecard is tenuous at best, as the scorecard has not been developed in conjunction with the corporate strategy, but rather as a stand-alone IT performance measurement tool.

An Uncoordinated View of IT’s Contribution—More advanced practitioners use the IT balanced scorecard to manage the effective execution of IT work related to enterprise initiatives. However, relevant metrics are typically dispersed across the scorecard’s operational and project categories, making it difficult for the CIO to articulate, and for business owners of enterprise initiatives to understand, IT’s full contribution to the business’s efforts.

Spotlighting Enterprise Business Initiatives—Exemplar companies create a separate category on the IT balanced scorecard that tracks metrics of IT’s contribution to major corporate projects or initiatives, often over a multiyear period for initiatives such as merger integration. These metrics act as a vehicle for communicating IT’s value creation and are usually removed from the scorecard once the initiative has been completed.



Enterprise Initiatives
Enterprise Initiatives Metric Maturity Trajectory
Relative Maturity of Enterprise Initiatives Metrics (Illustrative)

The trajectory plots the altitude of the link to strategy (from IT strategy to corporate strategy) against metric location on the scorecard (from dispersed to consolidated):

Baseline: An IT-Centric View of Performance
• No metrics linked to execution of enterprise business initiatives

Progressive Practitioner: Uncoordinated View of IT’s Contribution
• Percentage of acquired systems retired, in Operational Performance category
• Number of business process enablement projects completed, in Project Performance category

Exemplar: Spotlighting Enterprise Business Initiatives
• Percentage of acquired company systems integrated, in M&A category
• Number of business process steps enabled by technology, in Process Reengineering category

Source: Working Council research.




Enterprise Initiatives
Putting IT on the Enterprise Stage
Organization Background
The Federal Aviation Administration has an annual budget of $14 billion and is responsible for oversight of air traffic control, airport regulation and safety, and aircraft regulation and certification.

Scorecard Background
In early 2001, the FAA’s IT department deployed an IT balanced scorecard that tracks strategy implementation in six key areas: financial performance, operational performance, talent management, user satisfaction, security and safety, and eGovernment. The FAA CIO and the CIOs of each of the FAA’s divisions review the scorecard on a weekly basis, and the same group reviews the scorecard’s categories and metrics yearly to ensure that they are aligned with the organization’s stated strategy. The scorecard includes the current status, two-year targets, and interim milestones for each metric, and documents potential risks that could jeopardize the achievement of those targets. It is published on the FAA’s intranet site and is visible to the entire IT organization.

Regulation Pushes eGovernment onto the IT Balanced Scorecard
The 1996 Clinger-Cohen Information Technology Management Reform Act requires that all United States federal departments and agencies create and implement an enterprise IT architecture plan. More recently, the 2002 Presidential Management Agenda calls for all federal departments and agencies to establish an enterprise architecture by 2004 and to move services to the Web as part of the eGovernment initiative. This urgency in developing and implementing an enterprise architecture and migrating to electronic service delivery has led the FAA to create a stand-alone eGovernment category on the IT balanced scorecard, distinguishing these efforts from the other project-related metrics tracked in the scorecard’s operational performance category.

Improving Customer Service and Ease of Use
The eGovernment category of the FAA’s IT balanced scorecard includes three main objectives. The first is to make interaction with the FAA easier and faster through the migration to Web-based technologies. The second is to ensure the consistent delivery of services by adopting standard guidelines for Web architecture, interface design, and content presentation. Finally, the FAA aims to ensure that data shared internally and externally are timely and accurate.

Generating Cost Savings Through Higher Visibility
By elevating the IT organization’s eGovernment efforts to the category level on the IT balanced scorecard, the FAA has not only increased IT staff awareness of their importance, but has also made those efforts more visible to both the organization’s IT and non-IT leadership. This increased visibility allowed the CIOs of the individual lines of business to realize more than $2.5 million in savings per year by collectively purchasing standard infrastructure.



Enterprise Initiatives
Increasing Awareness of IT’s Contribution to Enterprise Initiatives
The FAA’s IT organization elevates eGovernment efforts to the category level to underscore IT’s enterprise contribution

Current and next-year targets, as well as project milestones, are listed to inform “gap to goal” discussions, and potential risks to success are listed in advance to allow creation of mitigation strategies. The key agency-level eGovernment project receives its own scorecard category, allowing the IT organization to monitor its “Webification” and enterprise architecture ambitions.

FAA’s IT Balanced Scorecard • Sample Metrics
• Financial Performance: percentage of IT investment dollars managed using best practices as determined by appraisal; percentage reduction in adaptation cost per system
• Operational Performance: percentage of program components compliant with the “to be” enterprise architecture; percentage of shared components, standards, and infrastructures across multiple applications
• Talent Management: percentage of program managers with project management certification
• User Satisfaction: number of information quality complaints received
• Security and Safety*
• eGovernment (detailed below)

eGovernment Category (Illustrative)

Objective: Reduce the burden on customers by better leveraging Web-based technologies
Metric: Percentage of applications migrated to the Web
Current performance: 90% • Current-year target: 100%
Milestones: inventory baselined; e-authentication service interfaces defined
Potential risks: tools for e-authentication services may be required

Objective: Ensure effective service delivery capabilities
Metric: Percentage of FAA Web sites that comply with Web architecture, requirements, style, and content guidelines
Current performance: 7% • Current-year target: 10%
Milestones: milestones to be collected and updated by CIO office
Potential risks: possible change in due date

Objective: Ensure that data and information that are used to conduct critical agency business or publicly disseminated are timely, accurate, accessible, understandable, and secure
Metric: Percentage of FAA information systems critical to agency business or used to disseminate information publicly that use approved data elements
Current performance: 60% new, 15% legacy • Current-year target: 50% new, 10% legacy
Milestones: systems using approved data elements baselined; additional 250 data elements approved; full governance process developed
Potential risks: ability to monitor compliance will be challenging

* Metrics not released by FAA.

Source: FAA; Working Council research.





Scorecard Development
and Life-Cycle Management
Balanced scorecard exemplars view the use of IT balanced scorecards as an ongoing process rather than a discrete event. To ensure that the
IT balanced scorecard is a useful decision-making tool, exemplars regularly refresh the scorecard to maintain its alignment with the IT
strategy, adjusting the categories and metrics accordingly. In addition to regular revision of scorecard categories and metrics, exemplars also
take steps to help ensure the timeliness and accuracy of scorecard data and facilitate scorecard adoption, codifying data collection roles and
responsibilities and creating templates and incentives to encourage scorecard use.

Scorecard Rollout p. 56

Data Collection and Quality Assurance p. 58

Scorecard Review and Revision p. 60


Facilitating Scorecard Adoption p. 62




Scorecard Rollout
A Process, Not a Document
Company Background
Bowne & Co., a New York-based document, information management, and printing solutions provider, has $1 billion in revenues and 8,000 employees.

The Balanced Scorecard as a Closed-Loop Process
Bowne’s IT organization implemented its balanced scorecard in 2000 as part of a corporate-wide scorecard initiative, spurred by the arrival of a new CEO. This scorecard implementation and life-cycle process involves seven key steps:

1. Kick Off Training for IT Staff—Bowne kicks off its balanced scorecard initiative with a three-day, off-site training session for all of the company’s divisional and functional senior managers, including those from IT. The training includes a basic overview of balanced scorecard concepts and terminology, but the session focuses on an exercise to help participants initially align their unit’s strategy with the corporate strategy and then develop a set of metrics to measure contribution to each strategy. This process also allows each unit leader to communicate the unit’s own substrategies to his or her peers.

2. Ongoing Strategy Mapping—Each year, Bowne creates a top-down corporate strategy which becomes the basis for a corporate balanced scorecard. Each business unit and function, including IT, then devolves its strategies from the corporate strategic plan.

3. Metrics Selection—Building off of the IT strategy, a team including the corporate CTO and his six direct reports—senior directors for global operations, client integration, enterprise architecture, and corporate systems, and vice presidents of IT, HR, and Finance—creates a set of recommended metrics to track IT’s progress against each strategic pillar. These recommendations are then presented to the CIO for final approval.

4. Metrics Definition—The same CTO-led team then creates a definition for each of the metrics on the IT balanced scorecard. This definition includes a one-sentence description, an overview of the measurement technique used to track the metric—for example, a user survey or a dashboard—and a list of initiatives that must be undertaken to enable the tracking of the metric using the outlined technique (for instance, the development or purchase of new reporting software).

5. Assigning Metric Ownership—Each of the metrics on the scorecard is assigned an owner who reports directly to the CTO. The metric owners are directly involved in the scorecard creation and update process, and a percentage of their bonus is contingent on their scorecard-related duties.

6. Data Collection and Quality Assurance—The frequency of data collection varies by metric according to several factors, including the cost of data collection, the corporate financial reporting cycle, and the volatility of the business climate.

7. Scorecard Review and Revision—The CIO and the other 11 Bowne corporate officers meet every six months to review the scorecards for every area, including IT. The applicability of all metrics is reviewed annually by the CIO, the CTO, and their direct reports, and approximately 20 percent of metrics turn over each year.

Beyond the initial consulting fees for the balanced scorecard training, Bowne’s IT organization invested approximately 120 person-days in managing the balanced scorecard process in 2002, with the effort distributed largely across the 20-person IT management team.
Scorecard Rollout
The IT Balanced Scorecard Life Cycle
Bowne creates a “closed-loop” IT balanced scorecard process
Bowne’s IT Balanced Scorecard Adoption Process
Illustrative

1. Kick Off Training for IT Staff
• Balanced Scorecard 101 for divisional and functional senior managers
• Initial strategy alignment and metrics selection exercise

2. Ongoing Strategy Mapping
• Annual IT strategy devolved from corporate strategy

3. Metrics Selection
• Team of CTO and direct reports creates list of metrics
• List refined by using analysis of each potential metric's strengths and weaknesses
• Final approval by CIO

4. Metrics Definition
• CTO-led team creates standard definitions for all metrics, defines measurement techniques and data collection processes, and outlines initiatives that must be completed to allow tracking of metrics

5. Assigning Metric Ownership
• Owners assigned to each metric are responsible for scorecard completion
• Owners report to CTO and their bonuses are linked to their scorecard-related duties

6. Data Collection and Quality Assurance
• Data collection frequency varies by metric based on cost of collection, the corporate financial reporting cycle, and volatility of the business climate

7. Scorecard Review and Revision
• CIO, CTO, and corporate officers review scorecard every six months
• Metrics revisited annually by CTO-led group

Process Costs
• Initial consulting fees
• $10,000 license fee for software and $3,500/year maintenance and support costs
• 120 person-days/year for ongoing process management
Source: Bowne; Working Council research.

Data Collection and Quality Assurance


Making Data Collection a Part of the Job
Assigning Data Collection Responsibility
While establishing and continually updating categories and metrics is critical to the success of an IT balanced scorecard, those metrics are useless if the underlying data are outdated or inaccurate. Most IT organizations define responsibility for metrics collection only loosely, adding names of "metrics owners" to their scorecards. Exemplars define a clear collection process, designating an owner for each metric and, in some cases, basing a portion of the owner's compensation on timely, accurate delivery of data. By creating greater accountability for reporting, CIOs maximize the scorecard's decision-making effectiveness, ensuring the completeness and accuracy of data as well as its comparability over time.

Company Background
Cemex, S.A. de C.V., the Mexico-based manufacturer of aggregates, concrete, and cement, has $6.7 billion in revenues and 26,000 employees. The company has operations in more than 30 countries.

Scorecard Background
Cemex's IT balanced scorecard was created as part of a corporate balanced scorecard initiative launched in 2001 by the corporate VP of planning and finance. IT was among the first functional areas to adopt the balanced scorecard methodology, and since then the scorecard has undergone some changes, including a reduction in the number of objectives and metrics measured. A subset of the IT balanced scorecard metrics feeds directly into the enterprise-level balanced scorecard.

Building a Data Collection 'Infrastructure'
Cemex IT initially struggled with the collection of balanced scorecard data because of varying levels of metric standardization and a lack of explicit collection responsibility. To help ensure scorecard data are aggregated in a timely, accurate fashion from across its distributed operations, Cemex's IT organization has laid out an eight-step process for scorecard data collection:

1. Data Request—To populate the scorecard in advance of its quarterly review, a data collection coordinator (a staff member of a corporate CIO direct report) requests information from a network of regional collection agents.

2. Data Collection—The four regional collection agents (Asia, North America, South & Central America, and Europe) coordinate with a network of local metric experts to collect the individual data elements required to create scorecard metrics.

3. Data Proofing—Once the required data have been extracted from source systems and aggregated by the regional collection agents, they are passed along to the collection coordinator, who conducts an initial check of data completeness and accuracy.

4. Data Verification—If the required data are not present or a particular data point seems out of line with past performance, the collection coordinator then directly taps the relevant local metric expert for verification.

5. Data Finalization—Once the local metric expert verifies the accuracy of the data, they are passed by the collection coordinator to the owners of the objectives on Cemex's IT balanced scorecard. There are 16 objective owners, drawn from the ranks of Cemex's IT directors and managers.

6. Data Analysis—The objective owners are charged with analysis of the data related to their assigned objectives. Cemex's IT organization deliberately absolves objective owners of the mechanics of data collection, believing their time is better spent on trend and variance analysis and metric target setting.

7. Scorecard Creation—Post-analysis, the objective owners work with the collection coordinator to create the balanced scorecard, aggregating metrics for each of their objectives and providing comments on performance against pre-defined targets.

8. Quarterly Presentation—Finally, the objective owners present the balanced scorecard to the CIO and IT management quarterly. This group identifies areas for performance improvement and develops initiatives to address those problems. The scorecard is then made available to all IT staff via the IT intranet, while only the metrics that flow upward to the corporate balanced scorecard are shared outside of IT.

The collection and quality assurance process, from initial request by the collection coordinator to delivery of the requested information to objective owners, takes approximately two weeks and involves 30 IT staff, each of whom spends a small portion of their time on balanced scorecard activities. Time spent by individuals varies slightly with the number of metrics collected in each region, and Cemex's IT organization claims that the total time investment of all staff involved is equivalent to 1 to 2 FTEs for two weeks.
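The proofing and verification hand-off in steps 3 through 5 amounts to a simple data-quality gate. The sketch below is hypothetical: the function names, the 50 percent variance threshold, and the sample figures are invented for illustration and are not Cemex's actual tooling.

```python
# Hypothetical sketch of a proofing/verification gate in the style of
# Cemex's steps 3-5: a data point that is missing or far off its
# historical values is routed back to the local metric expert before
# being finalized for the objective owner.

def proof(data_point, history, tolerance=0.5):
    """Accept a data point unless it is absent or more than 50 percent
    off the historical mean (the tolerance is an invented example)."""
    if data_point is None:
        return False
    mean = sum(history) / len(history)
    return abs(data_point - mean) <= tolerance * mean

def finalize(raw, history, verify):
    """Pass proofed data through; route suspect points to the metric
    expert (the `verify` callback) for correction first."""
    return raw if proof(raw, history) else verify(raw)

# Example: a monthly ticket count triple the historical average is
# flagged and replaced with the expert-verified figure.
history = [100, 110, 95]
corrected = finalize(330, history, verify=lambda v: 105)
print(corrected)  # 105
```

The design point mirrors the case study: the coordinator's automated check only decides *whether* to escalate; the authoritative correction always comes from the local metric expert.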
Data Collection and Quality Assurance
Routinizing Scorecard Data Aggregation
Cemex’s IT organization clearly defines a data collection process and roles to ensure timely, accurate scorecard data
Cemex’s IT Balanced Scorecard Data Collection Process
Illustrative
Illustrative

1. Data Request
• Collection coordinator (a direct report to director level) requests data from regional collection agents

2. Data Collection
• Four regional collection agents coordinate with local metric experts to collect requested data

3. Data Proofing
• Collection coordinator receives data from regional collection agents and conducts initial accuracy check
• In case of data inaccuracies, data are sent back to local metric experts for verification

4. Data Verification
• Local metric experts verify data and communicate them back to collection coordinator

5. Data Finalization
• Collection coordinator receives final data and passes them to objective owner

6. Data Analysis
• Objective owner receives and analyzes data for assigned objective

7. Scorecard Creation
• Collection coordinator aggregates data and populates balanced scorecard template

8. Quarterly Scorecard Presentation
• IT's balanced scorecard is presented to CIO and IT management quarterly by objective owners
• After presentation, scorecard is made available to IT department

Where the Going Gets Tough
"It is easy to define a great model on paper for a balanced scorecard. However, a consistent and reliable process for data collection and analysis is key in order to make it work."
Sergio J. Escobedo
IT Planning, Cemex

Source: Cemex; Working Council research.



Scorecard Review and Revision


Making the Scorecard a Living Document
Cultivating a Long-Term Balanced Scorecard Perspective
While offering a snapshot of IT's current performance, the IT balanced scorecard can also help CIOs and business decision makers track IT performance over time. This requires a sustained commitment by IT management, including review of the scorecard on an ongoing basis to align scorecard metrics with changing business goals and strategies, and the creation of a process to help drive adoption of the scorecard at all levels of the IT organization.

Company Background
Corning Incorporated, the Corning, New York–based manufacturer of fiber-optic and telecommunication equipment, has $3.2 billion in revenues and 23,200 employees.

Scorecard Background
Although the balanced scorecard has not been adopted at the corporate level, Corning's IT organization has been using a balanced scorecard since 2000. The scorecard is composed of four categories: value creation, customer-facing process excellence, internal process excellence, and talent management.

Ensuring Scorecard Evolution
To maintain the continued viability of its IT balanced scorecard, Corning adapts it to keep pace with shifting business and IT strategies. The company is currently using the fourth generation of its IT balanced scorecard, and while the four main categories have remained the same on a year-over-year basis, the individual metrics within those categories have changed over time. For example, in response to the sharp decline of its telecommunications-related businesses in 2001, Corning modified the objective of its financial performance category from enabling business profitability to cash conservation and profitability within IT to reflect the "new reality" of required cost cutting in 2002. In addition, the weightings of metrics carried over from year to year have been changed, and new metrics have been added to underscore the need for cost discipline within IT. Corning's aim is to change the set of scorecard metrics only as much as is absolutely required in order to preserve a comparable multi-year view of IT performance.

Building a Repeatable Process
To calibrate the scorecard's metrics with the IT strategy, senior IT leaders engage in an annual series of three meetings. During the initial meeting in June, the CIO and senior IT managers review the current scorecard metrics to evaluate whether each can be mapped to some aspect of the coming year's IT strategy, identifying opportunities for removing old metrics or areas where new metrics are needed. The second meeting occurs in September, when the group focuses on evaluating a list of potential new metrics for use in the coming year. A third meeting in December results in the selection of new metrics and their addition to the scorecard.

Moving from Lagging to Leading Indicators of IT Performance
Across the past four years, this reevaluation process has also shifted the focus of Corning's IT balanced scorecard metrics from lagging indicators of IT performance toward more forward-looking measures, which Corning calls "windshield" metrics. For example, the operational metrics on Corning's scorecard have shifted from tracking post-deployment service interruptions in 2000 to improving the pre-deployment planning process in 2003, helping the company to avoid potential problems before they occur. In addition, Corning has used the review to reduce the number of individual scorecard metrics from 15 in 2002 to 7 in 2003.
Scorecard Review and Revision
Ensuring Continued Scorecard Comparability and Relevance
Corning adjusts its IT strategy and metrics to respond to the changing business environment…
Objectives and Metrics for Financial Performance Category of Corning’s IT Balanced Scorecard
2000–2003

As corporate business strategies change in response to external economic factors, metrics that are retained in the scorecard change weighting based on changing business priorities, and new metrics are added to reflect new demands on IT. (Metric weightings appear in parentheses.)

2000
Objective: Maximize the value of dollars invested in IT
Metrics:
• Contribution to business cost and revenue objectives (100)
• Percentage core service offerings at benchmark (50)
• Percentage IT spend in application development (50)
• Variance to global IT budget (50)

2001
Objective: Complete, commit to, and communicate a comprehensive IT strategy for identified organization
Metrics:
• Identified business/functional units have an IT strategy (125)
• Contributions to business profit objectives (75)
• Percentage IT planned spending in application development (50)

2002
Objective: Today's reality—conserve cash, return organization to profitability, protect the future
Metrics:
• Performance to spending targets (90)
• Strengthen IT strategies (160)

2003
Objective: Support the company's strategy at a significantly lower cost
Metrics:
• Meet commitments for worldwide IT cost (150)
• Update and integrate unit IT strategies (100)

…while migrating from lagging to leading measures at the level of individual metrics
Operational Performance Metrics of Corning’s IT Balanced Scorecard
2000–2003

2000 (Metrics focus on performance of systems that have already been deployed):
• Application availability (100)
• Percentage core infrastructure offerings achieving commitments (100)

2001 (Forward-looking operational expectations added to existing metrics):
• Application availability (100)
• Percentage core infrastructure offerings achieving commitments (75)
• IT expectations established for Corning employees (75)

2002 (Measures of customer feedback added to proactively improve service offering):
• Percentage of service commitments met (150)
• Implement customer feedback process and system (100)

2003 (Metrics focus on pre-deployment systems planning to avoid potential future problems):
• Strengthen and expand processes to plan and coordinate work (250)

Source: Corning; Working Council research.



Facilitating Scorecard Adoption


Encouraging Scorecard Use at All Levels of the IT Organization
Driving Scorecard Adoption
Corning's second step in creating a sustainable IT balanced scorecard process was driving the adoption of the scorecard throughout all levels of the IT organization. To achieve this, Corning employs two strategies:

Creating a Customizable Template—Corning's enterprise-level scorecard is fed by IT scorecards for each IT functional area and business unit–based IT organization. The corporate scorecard's four categories are evenly weighted for purposes of incentive compensation, and while each functional or business unit–based IT organization must use the corporate-standard balanced scorecard template, they are free to alter the category weightings as they see fit, as those weightings serve to determine their functional or business unit–level incentive payouts. For example, in the case of the head of infrastructure services, a portion of his incentive payout would be linked to the performance of the enterprise-level scorecard, while another portion would be contingent upon functional scorecard performance. That said, business unit scorecards must include metrics in each of the four categories, though individual categories can be weighted as low as 5 percent. By using a system of tiered scorecards, Corning aims to encourage a baseline level of scorecard standardization while allowing each individual IT group the freedom to focus its scorecard efforts on specific categories or metrics.

Linking Scorecard Use to Individual Compensation—To encourage the initial adoption of the IT balanced scorecard during its first two years of use, Corning linked the compensation of its 10 most senior IT staff to scorecard performance. To reinforce this link between overall IT and individual performance, Corning appends a document that it calls an "odometer" to the balanced scorecard. This document lists more detailed targets than those reflected on the balanced scorecard, as well as actual performance for each scorecard metric. These automotive metaphors are derived from a larger IT improvement initiative called "Journey to Excellence." The CIO chose to continue the metaphor in the IT balanced scorecard, referring to it as the "road map" for IT improvement. The "odometer" is the device that Corning uses to assess its progress along the "journey" and includes a maximum of 1,000 "miles," with 250 miles allocated to each of the scorecard's equally weighted categories. Within a scorecard category, each metric target has an assigned "mileage," with each incremental mile representing an improvement in performance. The readings on the odometer are linked to the performance bonuses for individual IT staff and can result in payouts ranging from zero percent to 140 percent of potential bonus payments. The rule of thumb for bonus calculation is that if total mileage is below 600 miles out of a possible 1,000, no bonus is paid out. Based on the early use of this process with the 10 senior IT executives, Corning has expanded the program, and by 2002 Corning had linked scorecard performance with the bonus structure of approximately 200 corporate IT staff.

Balancing Individual and Enterprise Interests
"If you don't link the balanced scorecard to individual performance, all you have is a nice-looking document."
Richard Fishburn
CIO, Corning Incorporated
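The odometer arithmetic above (250-mile category caps, a 600-mile bonus floor, and payouts topping out at 140 percent) can be sketched as follows. This is an illustrative sketch only: the function names and sample figures are invented, and since the report specifies only the 0 to 140 percent range and the 600-mile floor, the linear scaling between those two points is an assumption.

```python
# Illustrative sketch of Corning-style "odometer" bonus math (names and
# sample figures invented; linear payout scaling is an assumption).

def category_miles(earned_targets):
    """Sum the mileage of met targets, capped at the 250-mile category max."""
    return min(sum(earned_targets), 250)

def bonus_payout_pct(total_miles, threshold=600, max_miles=1000, max_pct=140):
    """Zero below the 600-mile floor; otherwise scale up to 140% at 1,000
    miles (the interpolation between the two published points is assumed)."""
    if total_miles < threshold:
        return 0.0
    return max_pct * (total_miles - threshold) / (max_miles - threshold)

# Example: miles earned on metric targets in each equally weighted category.
scorecard = {
    "Financial Performance": [90, 100],   # 190 of 250 miles
    "Project Performance": [150, 60],     # 210 of 250 miles
    "Operational Performance": [250],     # 250 of 250 miles
    "Talent Management": [80, 120],       # 200 of 250 miles
}

total = sum(category_miles(v) for v in scorecard.values())
print(total, round(bonus_payout_pct(total), 1))  # 850 miles -> 87.5% payout
```

Note how the hard floor mirrors the case study's "rule of thumb": an IT group at 599 miles earns nothing, so the scheme rewards broad achievement across all four categories rather than excellence in one.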



Facilitating Scorecard Adoption
Making the Scorecard Relevant to the Whole IT Organization
To facilitate scorecard adoption, Corning creates a tiered system of customizable scorecards for each functional and business unit IT group, and makes a substantial portion of bonus compensation contingent on achievement of scorecard goals.

Tiered Scorecard Structure (Illustrative)

Enterprise-Level IT Balanced Scorecard (fed by functional and business unit IT scorecards; the corporate scorecard weights all categories equally):
• Financial Performance: 25%
• Project Performance: 25%
• Operational Performance: 25%
• Talent Management: 25%

Infrastructure Services Scorecard:
• Financial Performance: 30%
• Project Performance: 30%
• Operational Performance: 20%
• Talent Management: 20%

Display Technologies Scorecard:
• Financial Performance: 25%
• Project Performance: 35%
• Operational Performance: 15%
• Talent Management: 25%

A set of core metrics is required at the business unit/functional level, but units and functions can add supplemental metrics; they use the corporate template but can weight each category as they see fit.

2003 Scorecard Performance "Odometer*" for Talent Management (Illustrative)

Objective: Effectively transition the workforce to new model (Max. = 100 miles)
By the end of February 2002, each unit CIO and the IT shared services leader will have a transition plan for their employees.
• All units meet their plan, some late [ ] 60 miles
• All units meet their plan [x] 80 miles
• All units meet their plan, some early [ ] 100 miles
These measures will be confirmed by a random survey of employees taken Q2, Q3, and Q4.

Objective: Meet employees' learning plans (Max. = 150 miles)
Each unit CIO will count the total number of learning plan items on 2003 Learning Plans and measure achievement of those plans. All will use the following definitions of success:
a. 88 percent of learning plan items met [ ] 90 miles
b. 94 percent of learning plan items met [ ] 120 miles
c. 100 percent of learning plan items met [x] 150 miles

Total Actual/Possible: 230/250 miles

Clearly defined performance targets are linked with value to the individual; bonus compensation of 200 IT employees depends to varying degrees on the "odometer mileage" of the corporate balanced scorecard.
* For a more detailed version of Corning’s odometer please see p. 84. Source: Corning; Working Council research.

Designing and Implementing an IT Balanced Scorecard


Key Takeaways
1. Basic Principles of Scorecard Design—Exemplar IT balanced scorecards exhibit six key characteristics:
a concise set of top-level metrics expressed in nontechnical terms; metrics devolved from the
annual IT strategic plan; senior-level scorecard ownership; enterprise-standard metrics definitions;
scorecard drill-down capability and metric content; and clear links between individual compensation
and scorecard performance.

2. Metrics in the Ascent—In addition to the cardinal IT balanced scorecard categories of financial
performance, project performance, operational performance, customer satisfaction, and talent
management, progressive companies are elevating metrics tracking information security and
enterprise initiatives to the category level to ensure they receive the required IT and business sponsor
visibility.

3. Ensuring Continued Scorecard Relevance—Exemplar IT balanced scorecard practitioners create closed-loop processes for updating scorecard categories and metrics as business and IT strategies change.



Designing and Implementing an IT Balanced Scorecard (continued)
Key Takeaways
4. Formalizing Data Collection—Scorecards are only useful decision-making tools if decision makers
have confidence in the freshness and accuracy of the information they contain. Consequently,
exemplar IT balanced scorecard practitioners are creating clearly defined collection processes and
task-specialized roles to ensure data quality.

5. Facilitating Scorecard Adoption—In addition to linking individual compensation with IT balanced scorecard performance, exemplars are also deploying tiered scorecard structures to help drive adoption across the IT organization.

Questions for Discussion


1. What are the major hurdles to the adoption of an IT balanced scorecard?
2. What key “lessons learned” do experienced IT balanced scorecard practitioners have for IT organizations
that are considering implementing an IT balanced scorecard?
3. What metrics are companies tracking on their IT balanced scorecards that lead to more effective
decision making?
4. What specific decisions are companies using their IT balanced scorecard to make? What positive
outcomes have they seen as a result?
5. What strategies have companies employed to reduce the administrative cost of maintaining an IT
balanced scorecard?


Appendix I
Collected IT Balanced Scorecards
This appendix includes complete versions of the corporate IT balanced scorecards used by the companies profiled in this brief. In some cases the scorecard format and language have been slightly modified to allow for greater comparability and to remove company-specific terminology.

Schlumberger p. 68 Federal Aviation Administration p. 78

Bowne p. 69 Cemex p. 80

T. Rowe Price p. 72 Corning p. 82

AstraZeneca p. 74 J.D. Edwards (End-User Survey) p. 85

Eli Lilly p. 77


Schlumberger’s Information Technology Balanced Scorecard


(Each metric is tracked against current performance, a benchmark, and variation; the values are omitted here.)

Financial Performance
• Projected cost for IT services
• Year-to-date cost for IT services
• IT cost per seat
• Cost of data communication per seat

Quality of Service and User Satisfaction
• Tickets per registered user per month
• Help-desk first-call resolution rate

Project Timeliness
• Average application project delay

Core Services
• Activity increase on corporate knowledge portal
• Percentage of standardized PCs

Portfolio Strategy
• Competitive positioning
• Percentage of spending by category: customer care and services; field operations; finance; HR; employee efficiency; IT efficiency and support; management, IT operations, and other

People
• IT headcount

Projected Cost of Outsourcing
• To SchlumbergerSema
• To SchlumbergerSema–Network and Infrastructure Solutions
• To external providers

Security
• Incident rate
• Percentage of infrastructure protected
• Percentage of sites with valid IS audit

Source: Schlumberger; Working Council research.



Bowne’s Information Technology Balanced Scorecard
Financial
(Each objective lists its metrics and supporting initiatives; current performance, owner, and status columns are omitted.)

Objective: Maximize earnings contribution
• Metric: Standard cost for service compared to industry
• Initiative: Develop cost models for products and services

Objective: Enable revenue growth
• Metric: Percentage utilization of Bowne technology versus business demand plan
• Metric: Return on technology investment
• Initiative: Implement process for time reporting, to enable utilization tracking
• Initiative: Develop a reporting process to monitor the return on the technology investment

Objective: Optimize IT investment
• Metric: Percentage of the projected ROI achievement
• Initiative: Audit business cases for validation of ROI percentages

Objective: Reduce cost base
• Metric: Year-over-year percentage of operating savings of like services
• Initiative: Centralize contract and license negotiations and purchasing efforts where efficiencies would result

Client

Objective: Understand Bowne strategies and processes
• Metric: General client survey results
• Initiative: Develop client survey

Objective: Provide undisputed IT expertise
• Metric: General client survey results
• Initiative: Partner with Marketing to showcase IT skills (e.g., create technical competitive analysis sheets for products, tech papers)
• Initiative: Develop client survey

Objective: Deliver superior service at a competitive price
• Metric: Total cost of ownership of identified products and services compared to industry standards
• Metric: Percentage technology availability (includes product delivery, unplanned business interruptions)
• Initiative: Conduct benchmarking study comparing IT costs versus market (companies with similar services and levels)
• Initiative: Pilot SLAs
• Initiative: Pilot technology availability measurement process

Objective: Enable productivity workflow improvements
• Metric: Percentage variance of actual ROI versus ROI proposed in business case
• Metric: Actual versus planned business productivity for implementation of key initiatives
• Initiative: Audit business cases
• Initiative: Implement estimating system
• Initiative: Develop project plan for composition work
• Initiative: Develop technology road map
• Initiative: Implement NetFax
• Initiative: Integrate BowneLink for eDistribution

Objective: Provide Bowne with competitive advantages
• Metric: General client survey results
• Initiative: Create a technical competitive analysis sheet for products
Source: Bowne; Working Council research.


Bowne’s Information Technology Balanced Scorecard


Secure the Base

Objective: Manage product and service quality
• Metric: Percentage of technology availability (includes product delivery, unplanned business interruptions)
• Metric: Internal audit (including standards, policies, procedures, percentage on project central)
• Initiative: Implement project tracking and monitoring process
• Initiative: Pilot SLAs
• Initiative: Pilot technology availability measurement process
• Initiative: Develop standard process for products and services implementation (including system development, testing, deployment)

Objective: Protect information assets
• Metric: Security audit results
• Initiative: Establish security infrastructure (including security office, strategy, policies)
• Initiative: Develop asset library
• Initiative: Data center consolidation and redundancy

Objective: Optimize IT processes
• Metric: Number of tickets per application
• Metric: Percentage of systems adhering to enterprise systems standards
• Initiative: Develop root cause analysis process
• Initiative: Define, publish, and enforce enterprise systems standards based on best practices
• Initiative: Develop enterprise technology inventory

Partner with Clients

Objective: Improve and expand products and solutions
• Metric: Customer (business sponsor) satisfaction grade
• Initiative: Implement standards and process for IT products and solutions

Objective: Supply cost-efficient capacity that can be flexed with business cycles
• Metric: Percentage of utilization versus plan (labor and equipment)
• Initiative: Develop a process for time reporting to enable utilization tracking

Objective: Supply and enable global Client Relationship Management capabilities
• Metric: Percentage of technology availability of CRM system
• Initiative: Complete CRM implementation

Objective: Align with the business units and processes
• Metric: Percentage completion of identified IT-business alignments among business units and between the business units and processes
• Metric: Adherence to timelines for delivery of IT-business alignments
• Initiative: Complete IT-business alignments with three business units, operations platforms, and one corporate function
Source: Bowne; Working Council research.



Bowne’s Information Technology Balanced Scorecard
Create Breakthrough Value

Objective: Partner with Business to seize market opportunities by deploying timely, cost-effective, and integrated new solutions
• Metric: Percentage of investment in IT new product development resulting in new production products or services
• Initiative: Develop a new product development life-cycle process

Objective: Create alliances and partnerships to capitalize on new technologies
• Metric: Percentage "strategic" partnership coverage
• Initiative: Develop "supplier ecosystem" map

Objective: Recognize the potential business value in new technologies
• Metric: Percentage of IT R&D investment resulting in operational application
• Initiative: Develop five-year technology road map (including document life cycle and infrastructure)

People

Objective: Recruit, train, and develop a diverse high-performance workforce
• Metric: Regretted turnover
• Metric: Percentage of successful targeted selection
• Metric: Percentage of individual training objectives met
• Metric: Employee survey results
• Initiative: Assess management needs for new-hire selection skills
• Initiative: Provide targeted new-hire selection training as required
• Initiative: Implement plans for pilot group

Objective: Develop employee product knowledge alignment with the business
• Metric: Results of technology employee "business knowledge" survey (product knowledge and customer alignment tools)
• Initiative: Establish focus groups with other business units to provide product knowledge exchange
• Initiative: Customer-supplier mapping conducted with business segments
• Initiative: Develop technology "Business Knowledge Survey"

Objective: Align compensation and incentives to the business strategy
• Metric: Employee survey results
• Metric: Percentage of objectives met (relative percentage of objective—e.g., threshold, target, stretch)
• Initiative: Align and enhance existing compensation and incentives system
• Initiative: Implement performance management system
• Initiative: Implement the S.M.A.R.T. goal-setting program

Objective: Provide opportunities for career growth
• Metric: Percentage of employees with individual development plans
• Metric: Percentage of goals achieved as documented in the S.M.A.R.T. system
• Metric: Employee survey results
• Initiative: Implement training
• Initiative: Refine and implement descriptions and policies for technical and managerial career tracks (the career ladder)
• Initiative: Develop five-star skills matrix assessment development program for the support center
Source: Bowne; Working Council research.

© 2003 Corporate Executive Board Appendix I: Collected IT Balanced Scorecards 71



T. Rowe Price’s Information Technology Balanced Scorecard


Financial Performance
Objective | Metric | Current Quarter Performance | Current Quarter Target | Next Quarter Target | Status | Comments
Increase EBITDA
  Percentage change in EBITDA
Maximize fund performance
  Weighted expense ratio versus industry
  Percentage of funds rated by Morningstar at four stars or above
Manage IT expenditures
  Total IT expenditures (millions)
  Total expenditures on delivering new functionality
Maximize business units’ ability to add value
  Percentage of strategic projects initiated with a cost-benefit analysis
  NPV delivered during payback period

Partner User Satisfaction


Objective | Metric | Current Quarter Performance | Current Quarter Target | Next Quarter Target | Status | Comments
“Keep my systems running”
  Systems stability (partner survey)
“Quickly implement solutions”
  IT’s sense of urgency (partner survey)
“Demonstrate a competitive price”
  IT’s ability to demonstrate a competitive price (partner survey)
  Price competitiveness—external benchmark
“Offer me innovations that create business value”
  Innovation to enhance business value (partner survey)

Source: T. Rowe Price; Working Council research.



T. Rowe Price’s Information Technology Balanced Scorecard
Internal Process (Project and Operational Performance)
Objective | Metric | Current Quarter Performance | Current Quarter Target | Next Quarter Target | Status | Comments
Operational Excellence
  Integrate solutions using defined architectures, platforms, and processes
    Percentage of projects complying with architectural standards
  Provide reliable and functional systems
    Rate of failure incidents impacting business
  Effectively select and manage sourcing relationships
    Percentage of strategic solutions competitively bid
  Deliver according to plan
    Percentage of active projects delivered on time, on budget, with customer satisfaction
    Percentage of active projects with approved project plan
Business Unit Alliance
  Provide world-class customer service
    External customer rating
    World-class service (partner survey)
    First-call resolution rate
  Understand business unit strategies and operations
    Percentage of IT training spent in business units
Solutions Leadership
  Propose compelling business cases for IT solutions
    Percentage of discretionary spending sponsored by IT
  Anticipate applications of technology in the Financial Services industry
    Number of IT person-hours spent at industry events
    Percentage of pilot projects evolving into full projects

Learning and Growth (Talent Management)


Objective | Metric | Current Quarter Performance | Current Quarter Target | Next Quarter Target | Status | Comments
Hire, develop, and retain solid performers
  Percentage of non-entry-level positions filled internally
  Average tenure of solid performers (years)
  Number of candidates interviewed per open position
Foster an environment that encourages and recognizes contribution
  Contribution recognition rating (associate survey)
Communicate and lead at all levels
  Communication and leadership rating (associate survey)
  Percentage of task forces and projects led by those other than project managers
Source: T. Rowe Price; Working Council research.




AstraZeneca’s Information Systems Vision and Balanced Scorecard


AstraZeneca’s Vision for IS Globally

AstraZeneca’s IS Balanced Scorecard

Deliver Business Value


Objective | Metric (Illustrative) | Current Status | Previous Status | Actions Taken | Comments
Be a creative, fast, effective organization
  Number of business units incorporated into shared service model
  Access management project implementation progress versus milestones
  Percentage of staff incorporated into corporate directory
Growth through key products
  Percentage of Priority 1 requirements delivered for key brands
  Number of brands supported globally by IS
  Number of IS activities identified to support individual brands
Win in the United States
  U.S. ERP project implementation progress versus milestones
  Campaign management tools implementation versus milestones
Secure the flow of new products
  Number of clinical studies utilizing IS tools
  R&D supply chain project implementation progress versus milestones
Be the first choice for customers
  Percentage of consumer data migrated to new database
  Percentage of help-desk tools updated

Source: AstraZeneca; Working Council research.



AstraZeneca’s Information Systems Balanced Scorecard
Transform IS Efficiency and Effectiveness
Objective | Metric (Illustrative) | Current Status | Previous Status | Actions Taken | Comments
Improve quality and regulatory compliance process and practices
  Self-assessments for IS quality policy in place
  Percentage of IS units with trained quality manager
  Annual avoided IS costs
  Percentage of targeted savings delivered
  KPIs and measures of benefits developed
  Vendor audit database rolled out
  Percentage reduction in re-validation work
  Corrective actions from audits implemented
  Rating versus goals for quality workshops
  Global quality training content developed
Step up project portfolio management
  IS portfolio management process in place
  IS “value model” established
  Two-year IS portfolio model ready
  Agreement on IS portfolio by all stakeholders
Deliver information management basics to support informed decision making
  Percentage of intranet policies and standards implemented
  Number of intranet sites killed versus reduction target
  Percentage of portal projects with robust business case
  Annual avoided costs from global licenses
  Project progress versus schedule estimates
  Percentage of projects using standard business case
Identify and implement application support best practices
  Application support best practices document available
  Percentage of business units operating with new application best practice guideline
Manage the global IT infrastructure
  Actual IT infrastructure cost savings versus plan
  End-user satisfaction survey versus target
  Number of material adverse regulatory compliance observations by FDA or other IS audits
  Number of business interruptions caused by security intrusion

Source: AstraZeneca; Working Council research.




AstraZeneca’s Information Systems Balanced Scorecard


Transform IS Efficiency and Effectiveness (continued)
Objective | Metric (Illustrative) | Current Status | Previous Status | Actions Taken | Comments
Develop KPIs
  Initiative KPIs developed
  IT infrastructure services KPIs
  Support and maintenance KPIs
  Portfolio and project KPIs
  Financial and resource management KPIs
  Supplier KPIs
  Leadership KPIs
  Security KPIs

Develop IS Leadership and Capabilities


Objective | Metric (Illustrative) | Current Status | Previous Status | Actions Taken | Comments
Develop and demonstrate leadership capability
  Percentage of IT staff development plans reviewed and actions implemented
  Percentage of incumbent IS leaders with development plan in place
  Percentage of global high-potential staff with development plan in place
  Change management and business acumen programs in place
  Number of annual IS rotational assignments
  Diversity performance (scale of 1 to 5)
  Percentage of key roles filled
Develop and demonstrate IT professional skills and disciplines
  Percentage of projects following project management framework
  Percentage of projects with KPIs for visible value demonstration
  Percentage of project managers with development plan and career path
  Percentage increase in project manager capability
  Attendance versus capacity of existing learning programs
  Percentage increase in collaborative tool usage
  Improvement in “virtual” team productivity
  IS capabilities versus benchmarks (scale of 1 to 5)
  Percentage reduction in skills shortages filled by consultants
Source: AstraZeneca; Working Council research.



Eli Lilly’s Information Technology Balanced Scorecard

Source: Eli Lilly; Working Council research.



FAA’s Information Technology Balanced Scorecard


eGovernment
Objective | Metric | Current Performance | Current Year Target | Status | Milestones | Risks
Ensure effective service delivery capabilities
  Web site usability as determined by formal external appraisal
  Percentage of agency Web sites that are compliant with agency and Federal policies, rules, and regulations
  Achieving “green” on eGovernment scorecard
  Percentage of automated transactions using e-authentication services
  Milestones:
  • First report due to Department of Transportation CIO in Q2 FY03
Reduce the burden on agency customers by better leveraging Web-based technologies
  Percentage of transactions fully automated
  Reduction in customer time to perform automated services versus manual baseline
  Reduction in agency time and cost to deliver automated services versus manual baseline
  Increase in scores on customer satisfaction survey for service quality and availability
  Number of Federal agencies with which information, applications, or enterprise license agreements are shared
  Milestones:
  • Inventory baselined
  • Agreement reached with Department of Transportation on inventory and definition of “complete”
  • e-authentication strategy baselined
  • e-authentication service interfaces defined
  • e-authentication services available
  • Department of Transportation acknowledges completion
Ensure that data and information that are used to conduct critical agency business or publicly disseminated are timely, accurate, accessible, understandable, and secure
  Percentage of critical data and information that meet quality standards
  Percentage of critical data and information for which there is adequate “infrastructure” to maintain quality, including registration, stewardship, and a quality process
  Number of information quality complaints received and percentage resolved
  Increase in score on customer satisfaction survey for satisfaction with agency data and information
  Percentage of information systems critical to agency business or used to disseminate information publicly that use approved data elements
  Milestones:
  • Systems using approved data elements baselined
  • Additional 250 data elements approved
  • Full governance process developed and approved
  • Governance in place to ensure new systems use approved data elements

Source: FAA; Working Council research.



FAA’s Information Technology Balanced Scorecard
Business Value
Objective | Metric | Current Performance | Current Year Target | Status | Milestones | Risks
Provide the right mix of qualified IT professionals and IT tools for each business need of the agency
  Number of professional IT certificates held by the workforce in areas deemed critical to the agency, such as security, architecture, Web management, and IT program management
  Milestones:
  • Agreement reached on criteria used to decide if a certified program manager is needed to run a program, including different levels of certification
  • Agreement reached on set of recognized credentials
  • Policy decided on how to address existing program managers
  • Policy decided on how to phase in certification requirement for new program managers
  • Agreement reached on who specifically will be certified and when
  • Agreements reached with certifying organizations
Standardize and simplify the enterprise architecture to ensure IT investments are aligned with FAA business processes
  Percentage of components, standards, and infrastructures shared across multiple applications
  Extent to which architectures are documented in accordance with Office of Management and Budget and Department of Transportation requirements
  Percentage of local business unit financial and human resources systems retired because of functionality provided by Enterprise Resource Planning system
  Milestones:
  • Agreement on governance process for architecture

Source: FAA; Working Council research.



Cemex’s Information Technology Balanced Scorecard


Financial
Objective Metric Current Performance Status
Optimize cost of IT services Total cost of IT operations per employee
Total cost of IT operations as percentage of total sales

Customer
Objective Metric Current Performance Status
Ensure consistently available and reliable services to IT users Percentage compliance with service level agreements
Ensure that IT solutions meet the business needs Business alignment index (executive management survey)
Meet employee IT service needs Level of service-mindedness and professionalism (end-user survey)

Source: Cemex; Working Council research.



Cemex’s Information Technology Balanced Scorecard
Process
Objective Metric Current Performance Status
Excel at managing projects Percentage of deviation from plan of projects exceeding budgeted time
Percentage deviation from plan of projects exceeding budget
Percentage of IT projects budget managed by program management office
Operate data centers efficiently System availability
Number of unplanned system/network stoppages
Percentage of servers outside of CPU utilization standards
Maintain reliable office computing and telecom infrastructures Network reliability index
Percentage of obsolescence (PC, data, and voice networks)
Provide effective user support for IT services Weighted average of incident/problem resolution time by priority
Percentage of incidents resolved at first level
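Cemex’s “weighted average of incident/problem resolution time by priority” metric can be computed along these lines. The priority weights below are hypothetical, since the report does not publish them; only the shape of the calculation is being illustrated.

```python
# Hypothetical sketch of a priority-weighted average resolution time,
# as in Cemex's user-support metric. Weights and data are illustrative.

# Each incident: (priority, resolution time in hours)
incidents = [("P1", 2.0), ("P1", 4.0), ("P2", 8.0), ("P3", 24.0)]

# Higher-priority incidents count more heavily toward the average.
weights = {"P1": 5.0, "P2": 2.0, "P3": 1.0}

total_weight = sum(weights[p] for p, _ in incidents)
weighted_avg = sum(weights[p] * hours for p, hours in incidents) / total_weight

print(round(weighted_avg, 2))  # → 5.38
```

A plain average of the same data would be 9.5 hours; the weighting pulls the metric toward the fast-resolved high-priority incidents, which is the point of tracking it by priority.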

Learning and Innovation


Objective Metric Current Performance Status
Develop operational, coordination, and project management capabilities in the IT staff
  Percentage of IT staff certified in project management
  Percentage of IT staff certified in coordination skills
  Percentage of IT staff certified in operational cost structure understanding
Partner proactively with business leaders to establish trustworthy communication
  Percentage of eGroups initiatives with IT participation
  Business partner relationship index (end-user and executive survey)
Develop a customer service orientation within IT
  Percentage of IT staff completing courses in customer service
  Percentage of IT staff passing courses in SLA development

Source: Cemex; Working Council research.



Corning’s Information Technology “Road Map”


A Journey to Excellence
To facilitate its ongoing performance improvement initiative, “Journey to Excellence,” Corning’s IT organization has adopted a set of automotive metaphors. The IT balanced scorecard is the “road map” for IT improvement and includes a maximum of 1,000 “miles,” with each mile designating further progress toward established goals. Each of the scorecard’s four categories is equally weighted at 250 total miles. Corning supplements the “road map” with an “odometer”—a more detailed view of scorecard metrics that allows Corning to assess its progress against goals and is one of the inputs used to determine bonus compensation. For additional details about Corning’s “odometer,” please see page 63.
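Corning’s mile-based scoring amounts to a lookup: each measure publishes a scale mapping achievement levels to miles, and a result earns the best step it meets. The two scales below are taken from the odometer tables in this appendix; the scoring function itself is an illustrative reading of the scheme, not Corning’s actual calculation.

```python
# Illustrative sketch of Corning's "odometer" scoring. The two scales are
# copied from the odometer tables; the scoring logic is an assumed reading.

# Each measure: list of (threshold, miles) steps, most demanding step last.
SCALES = {
    "worldwide_it_spend_vs_goal": [(1.00, 90), (0.98, 120), (0.965, 150)],
    "project_performance_index": [(1.8, 75), (2.0, 100), (2.2, 125)],
}

def miles_for(measure: str, result: float, lower_is_better: bool = False) -> int:
    """Return the miles of the best scale step whose threshold the result meets."""
    earned = 0
    for threshold, miles in SCALES[measure]:
        met = result <= threshold if lower_is_better else result >= threshold
        if met:
            earned = max(earned, miles)
    return earned

# Spending came in at 97% of goal (lower is better): meets the 98% step.
print(miles_for("worldwide_it_spend_vs_goal", 0.97, lower_is_better=True))  # → 120
# Project index averaged 2.1: meets the 2.0 step but not the 2.2 step.
print(miles_for("project_performance_index", 2.1))  # → 100
```

Summing each category’s earned miles against its 250-mile cap, and the four categories against the 1,000-mile total, yields the single “odometer” reading Corning feeds into bonus compensation.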

Corning’s “Road Map” for 2003: IT Strategic Goals

Value
  Objective: Support the company’s strategy at a significantly lower cost
  Measures of Success:
  1. Meet commitments for worldwide IT costs (150 miles)
  2. Update and integrate unit IT strategies (100 miles)
  Total: 250 miles
Process Excellence
  Objective: Deliver the projects and services which our customers require
  Measures of Success:
  3. Meet project commitments (125 miles)
  4. Meet service commitments (125 miles)
  Total: 250 miles
IT Excellence
  Objective: Standardize the processes which are critical to our success
  Measures of Success:
  5. Strengthen and expand processes to plan and coordinate work (250 miles)
  Total: 250 miles
People
  Objective: Effectively transition to, and begin operating in, a new organizational model
  Measures of Success:
  6. Effectively transition the workforce to the new model (100 miles)
  7. Meet employees’ learning plans (150 miles)
  Total: 250 miles

Source: Corning; Working Council research.



Corning’s Information Technology “Odometer”
Value—Support the company’s strategy at a significantly lower cost
Objective Metric Potential Score Performance Scale
Meet commitments for worldwide IT cost
  Metric: Worldwide IT spending performance versus goal (potential score: 150 miles)
  Scale: 100% of goal = 90 miles; 98% of goal = 120 miles; 96.5% of goal = 150 miles
Update and integrate unit IT strategies
  Metric: Business and staff unit strategy creation maturity trajectory; includes IT strategy refreshment, identification of shared services leverage points, and creation of a program portfolio (potential score: 100 miles)
  Scale: Unit IT strategy refreshed to reflect changes in business strategy = 60 miles; strategy refreshed, including opportunities to leverage global shared services organizations = 80 miles; strategy refreshed, shared services leveraged, and 2004 programs incorporated into operating plan = 100 miles

Process Excellence—Deliver the projects and services which our customers require
Objective Metric Potential Score Performance Scale
Meet project commitments
  Metric: Project performance index for 20 representative projects selected by unit CIO/program office; evaluation dimensions include business value and capabilities delivered, financial and schedule performance, and project management quality (potential score: 125 miles)
  Scale: Average score 1.8 = 75 miles; average score 2.0 = 100 miles; average score 2.2 = 125 miles
Meet service commitments
  Metric: SLA compliance for 10 key shared services; shared service managers/unit CIOs select services (potential score: 100 miles)
  Scale: Minimal threshold of commitments met = 6 miles/service; all commitments met = 8 miles/service; all commitments met, plans in place to reduce costs in 2004 = 10 miles/service
  Metric: Average score across all units on general manager satisfaction survey; five-point scale where 1 = does not meet expectations, 3 = meets expectations, 5 = exceeds expectations (potential score: 25 miles)
  Scale: Average score 3.0 = 15 miles; average score 3.5 = 20 miles; average score 4.0 = 25 miles

Source: Corning; Working Council research.



Corning’s Information Technology “Odometer”

IT Excellence—Standardize the processes which are critical to our success


Objective Metric Potential Score Performance Scale
Strengthen and expand processes to plan and coordinate work
  Metric: Definition, implementation, and measurement of the following processes: performance excellence, business case analysis, demand management, prioritization, and time tracking (potential score: 250 miles)
  Scale: Process fully defined, pilots complete = 30 miles/process; process fully deployed and measured = 40 miles/process; 2004 improvement plans in place = 50 miles/process
People—Effectively transition to, and begin operating in, a new organizational model
Objective Metric Potential Score Performance Scale
Effectively transition the workforce to the new model
  Metric: Number of unit CIOs and IT shared services leaders creating a transition plan for their employees within the designated time frame (potential score: 100 miles)
  Scale: All units meet their plan, some late = 60 miles; all units meet their plan = 80 miles; all units meet their plan, some early = 100 miles
Meet employees’ learning plans
  Metric: Percentage of staff development plan objectives met (potential score: 150 miles)
  Scale: 88% of development plan objectives met = 90 miles; 94% = 120 miles; 100% = 150 miles
Total: 1,000 miles

Source: Corning; Working Council research.


J.D. Edwards’ IT User Satisfaction Survey

The J.D. Edwards Information Technology department would like you to take a few minutes and complete the following satisfaction survey. The valuable feedback you provide will be used to improve the tools and services IT provides the JDE Enterprise. If any question does not seem to apply to you, please skip to the next question. When you are finished, please click the Submit Survey button at the end of the survey.
Thank you for your time and help.

General Information
1. Please rate your overall SATISFACTION with J.D. Edwards IT services
   a. Overall performance of J.D. Edwards IT Services
   b. Ability to deliver technical/business solutions and services
   c. Communication about available services and new technologies
   d. Ease of doing business
2. Please rate the CONFIDENCE you have that the core IT services will be up and performing to your expectations (Core services are Phone, PC, Network, Servers in the Data Center, and Business Applications)
3. Please rate IT’s ability to be PROACTIVE in providing solutions to meet your business needs.

Network
4. Please rate your SATISFACTION with the J.D. Edwards network and network services (Connectivity to Internet, E-mail, Applications, Data Files, and Network Printing).
   a. Overall performance
   b. Speed and reliability of Internet access
   c. Speed and reliability of access to data on file servers
   d. Speed and reliability of e-mail system
   e. Speed and reliability of network printing
5. What could IT do to better meet your needs for the network and network services?

PC Support
6. Please rate your satisfaction with PC services and support
   a. Overall performance
   b. Promptness in responding to your problems/issues
   c. PC equipment provided meets your needs
   d. Ability of staff to understand and address the issue
   e. Satisfaction with initial PC setup or EOL replacement
7. What could IT do to better meet your PC needs and requirements?

Source: J.D. Edwards; Working Council research.




J.D. Edwards’ IT User Satisfaction Survey

Help Desk
8. Please rate your satisfaction with the IT Help Desk
   a. Overall performance
   b. Timeliness in providing repairs or services
   c. Courteousness of HD staff
   d. Ability of HD staff to understand and address the issue
   e. Timeliness of status updates
9. How could IT improve the Help Desk to better meet your needs?

Application Support
10. Please rate your SATISFACTION with maintenance and support of corporate applications (OneWorld, Expense Express, Ariba, Rolling Thunder, Augeo, Aprimo, On-Track, Personic)
    a. Overall performance of corporate applications
    b. Quality of corporate applications
    c. Promptness in responding to your application problems and issues
    d. Ongoing support of corporate applications
    e. Ability to deliver approved strategic business systems changes
    f. Ease of use of corporate applications
11. How can IT improve the corporate applications to better meet your needs? Please be specific regarding which application you are providing feedback on (OneWorld, Expense Express, Ariba, Rolling Thunder, Augeo, Aprimo, On-Track, Personic).

Voice Services
12. Please rate your SATISFACTION with the IT voice services (Phones, Cellular, Pager, Calling Cards)
    a. Overall performance of internal voice services and support
    b. Information provided to you about voice services
    c. Promptness in responding to your questions/concerns
    d. Timeliness in providing solutions
    e. Completeness of solutions
13. What could IT do to better meet your needs for voice services?

Field Office Support
14. Please rate your satisfaction with your local field office support. If you work in the Denver Corporate office please skip to question #16.
    a. Overall performance of local IT staff
    b. Courteousness of local IT staff
    c. Ability of local IT staff to understand and address the problem/issue
    d. Communication by local staff about available services and new technologies
    e. Follow-up and closure of reported problems.
15. How could IT improve your local services to better meet your needs?

Source: J.D. Edwards; Working Council research.



J.D. Edwards’ IT User Satisfaction Survey

Remote Access
16. Please rate your satisfaction with IT remote access (or dial-up) services. If you do not utilize remote access in your position at JDE please skip to question #18.
    a. Ability to use broadband services to connect to JDE
    b. Ability to use wireless access in any location that has it available
    c. Communication about available remote access services
    d. Reliability of dial-up access
    e. Help Desk support of broadband services
17. How could IT improve remote access to better meet your needs?
18. We welcome any other general comments you may have about J.D. Edwards IT.

Thank you for completing the survey.

Please submit the survey by clicking on the button below.

Submit

Source: J.D. Edwards; Working Council research.






Appendix II
IT Balanced Scorecard Tools and Vendors

Kaplan and Norton’s balanced scorecard concept has spawned a new complement of tools and vendors aimed at automating balanced scorecard data collection and presentation. Some of these vendors provide tools that enable the presentation of data, offering Web-based solutions with structured data input and graphical display. In addition, leading database and ERP vendors are creating scorecard modules for their product suites, providing their customers with an easily integrated tool that aggregates and displays basic scorecard data.

Despite this wealth of off-the-shelf offerings, Working Council research reveals that most companies do not seek outside help or buy application packages to produce scorecard reports; instead, they use Excel or similar homegrown solutions that require manual data input.

The following vendor briefs are designed to provide members who are considering the purchase of balanced scorecard software tools with an overview of the major players in the balanced scorecard space. Each brief includes vendor contact information, an outline of selected functionality, and a selection of major customers.




Balanced Scorecard Software Tools


Software that aggregates data from source systems and presents them in a balanced scorecard format. Data are usually semiautomatically or manually
updated.
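At minimum, what these tools automate is comparing each metric’s current value against its target and rendering a status flag. A minimal sketch, with illustrative metric names and a hypothetical five percent “yellow” tolerance band, not drawn from any vendor’s product:

```python
# Minimal sketch of the core function scorecard software automates:
# compare a metric's current value with its target and render a status.
# Metric names, values, and the 5% tolerance band are illustrative.

def status(current: float, target: float, higher_is_better: bool = True,
           tolerance: float = 0.05) -> str:
    """Green if the target is met, yellow if within tolerance, red otherwise."""
    ratio = current / target if higher_is_better else target / current
    if ratio >= 1.0:
        return "green"
    if ratio >= 1.0 - tolerance:
        return "yellow"
    return "red"

# metric: (current value, target, higher_is_better)
scorecard = {
    "SLA compliance (%)": (96.0, 99.0, True),
    "Cost per user ($)": (1080.0, 1000.0, False),
    "First-call resolution (%)": (78.0, 75.0, True),
}

for metric, (current, target, higher) in scorecard.items():
    print(f"{metric}: {status(current, target, higher)}")
```

The vendors below differ mainly in how the `current` values arrive—pulled automatically from ERP or OLAP sources, or typed in by hand—and in how richly the resulting statuses are presented.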

Vendor and Product Contact Information Selected Functionality Selected Customers


Hyperion 1344 Crossman Avenue • Integrated dashboard and scorecard application Airbus
Sunnyvale, CA • Budgeting and forecasting tools can directly feed scorecard tools Boston Beer Company
U.S.A. 94089 • Business intelligence platform integrates Hyperion reporting tools with existing DHL
Tel: 408-744-9500 systems (CRM, ERP, legacy systems) France Telecom
Web: http://www.hyperion.com
Hyperion Performance
Scorecards

Panorama Business Views 50 John Street • Scorecard presentation tool Bank of America
Suite 600 • Modular, Web-based system providing drill-down capability Canadian Dept. of National Defence
Toronto, Ontario • Application training and integration services available GlaxoSmithKline
M5V 3E3, Canada Verizon Communications
Tel: 800-449-3804
pbviews Web: http://www.pbviews.com

ProDacapo Barnhusgatan 4, 4 tr • Web-based balanced scorecard, dashboard, and cost management tools ABB
SE-111 23 Stockholm, Sweden • Training, consulting, and tool implementation and integration services available ICA
Tel: +46 (0)8-622-25-00 • Pre-built with connectors to most existing IT systems Scania
Web: http://www.prodacapo.com
ProDacapo Balanced
Scorecard

QPR Sörnäisten rantatie 27 • Web-based balanced scorecard presentation tool Canon


A, 3. kerros • Integrates with SQL data warehouse Electrolux
00500 Helsinki, Finland Swisscom
Tel: +358 (0)9 4785 411 StoraEnso
Web: http://www.qpr.com
QPR Scorecard



Balanced Scorecard Software Tools and Consulting
Vendors that provide both balanced scorecard training and consulting and software for data collection and presentation.

Vendor and Product Contact Information Selected Functionality Selected Customers


Cognos 3755 Riverside Drive • Product suite for strategy planning, monitoring, and presenting scorecard data BMW
Ottawa, ON • Uses OLAP data from any source Dow
K1G 4K9, Canada • Graphical presentation of data with historical view Harrah’s
Tel: 613-738-1440 • Consulting and training services available KeyCorp
Web: http://www.cognos.com NASA
Enterprise Scorecard Siemens

CorVu 3400 West 66th Street • Scorecard presentation tool that allows for drill-down into both the scorecard Bowne
Suite 445 data and strategy maps
Edina, MN • Consulting and training services available
U.S.A. 55435
Web: http://www.corvu.com
RapidScorecard

Crystal Decisions 895 Emerson Street • Web-based reporting and analysis tool Aetna
Palo Alto, CA • Allows for ad hoc customization of reports Canfor Corporation
U.S.A. 94301-2413 • Out-of-the-box connectivity for major ERP systems Fox Filmed Entertainment
Tel: 800-877-2340 • Consulting and training services available
Web: http://www.crystaldecisions.com
Crystal Performance
Scorecards


Bolt-Ons to Enterprise Systems


Presentation applications that integrate with vendors’ enterprise resource planning suites to extract data and present them in a balanced scorecard format.

Vendor and Product Contact Information Selected Functionality Selected Customers


Oracle 500 Oracle Parkway • Part of Oracle’s Business Intelligence suite First Union
Redwood City, CA • Presentation tool that reports defined KPIs and extracts data from existing Mobistar
U.S.A. 94065 Oracle systems
Tel: 650-506-7000
Business Intelligence Web: http://www.oracle.com
Applications

PeopleSoft 4460 Hacienda Drive • Presentation tool automatically extracts data from relevant modules of Danske Bank
Pleasanton, CA PeopleSoft suite Entergy
U.S.A. 94588-8618
Tel: 800-380-SOFT (7638)
Web: http://www.peoplesoft.com
Enterprise Scorecard

SAP Neurottstrasse 16 • Part of SAP Strategic Enterprise Management suite Avaya


69190 Walldorf, Germany • Automatically collects data from relevant SAP modules The Coca-Cola Company
Tel: +49-6227-7-47474 Henkel
Web: http://www.sap.com The Woolwich
mySAP Business
Intelligence Solution

SAS 100 SAS Campus Drive • Aggregates and presents data from SAS applications in a balanced scorecard Generali-Providencia
Cary, NC format ING Bank
U.S.A. 27513-2414 Quaker Chemical
Tel: 919-677-8000
Web: http://www.sas.com
SAS Balanced
Scorecard



With Sincere Appreciation
Special Thanks
The Working Council for Chief Information Officers would like to express its gratitude to those individuals and organizations that have
been so generous with their time and expertise throughout the preparation of this study.

Mr. Balvinder Dhillon, IS Manager, AstraZeneca PLC
Ms. Ruth Harenchar, Chief Information Officer, Bowne & Co.
Mr. Sergio Escobedo, IT Planning, Cemex, S.A. de C.V.
Mr. Richard Fishburn, Vice President and Chief Information Officer, Corning Incorporated
Mr. Roy Dunbar, Vice President and Chief Information Officer, Eli Lilly and Company
Mr. Robert Rovinsky, Program Director, Strategy & Investment Analysis Division, Federal Aviation Administration
Mr. Frank Catalfamo, Manager, IT Metrics and M&As, J. D. Edwards & Company
Mr. Jan Chodzko, IT Architect, Schlumberger Limited
Mr. Bob Gravien, Vice President, Application Development, Schneider National, Inc.
Mr. Michael Butler, IT Finance, T. Rowe Price Group, Inc.

Partial List of Participating Companies


Agere Systems Inc.
Agilent Technologies, Inc.
Amersham plc
AXA Australia
Bechtel Corporation
E.ON AG
Exxon Mobil Corporation
Hallmark Cards, Inc.
KEMET Corporation
KeyCorp
L.L. Bean, Inc.
Microsoft Corporation
Philip Morris Companies Inc.
SBC Communications
Yellow Corporation



Working Council for Chief Information Officers
ORDER FORM
IT Balanced Scorecards: End-to-End Performance Measurement for the Corporate IT Function is intended for broad dissemination among senior executives
and management. Members are welcome to unlimited copies without charge. Online ordering is available at www.cio.executiveboard.com. Alternatively, you
can call the Publications Department at 202-777-5921, e-mail your order to orders@executiveboard.com, or fax in the order form on this page.

Additionally, members interested in reviewing any of the Working Council’s past strategic research are encouraged to request a complete listing of our work.

Study Requested: IT Balanced Scorecards: End-to-End Performance Measurement for the Corporate IT Function
CATALOG NO.: CIO1L9VDH
Quantity: ____________ (You may order an unlimited number of copies without additional charge.)

Name & Title __________________________________________

Institution __________________________________________

Address __________________________________________

__________________________________________

__________________________________________

Telephone _______________________________________________

COPY AND FAX TO: 202-777-5270
Working Council for Chief Information Officers
2000 Pennsylvania Avenue NW
Washington, DC 20006
Telephone: 202-777-5000
www.cio.executiveboard.com

In-Person Research Presentations


An in-person research briefing is an interactive session that members can use to spark internal discussion and debate around strategic issues
that the Working Council has researched. A senior research director will travel to a member’s location to present and discuss research findings
that are most relevant to member issues and challenges. This in-person research briefing service allows member CIOs to share Council case
studies, practices, and insights with their staff and larger audiences within their organization.

Frequently Requested Presentations


1. Structural IT Cost Efficiency: What tools and processes are CIOs using to achieve structural IT cost efficiency?

2. Strategic Vendor Management: How are pioneering organizations managing IT vendor performance to maximize value and flexibility?

3. Responsive IT Portfolio Prioritization: How are leading companies structuring and reprioritizing IT portfolios to ensure continued alignment with changing business strategies?

4. End-to-End Data Visibility: What technical and organizational innovations are exemplars using to enable end-to-end data visibility?

5. Rapid Resource Redeployment: How are CIOs enabling rapid resource redeployment in response to changing priorities?

6. Aligning IT with Corporate Strategy—Case Studies in Enterprise Architecture Migration: How are premier companies creating self-funding IT architectures that mirror and advance corporate strategy?

7. IT-Enabled Collaboration: What are the most effective uses of IT-enabled collaboration? How are exemplars deploying these tools to ensure efficiency and maximize collaboration?

8. Deploying Enterprise Portals: What are the functional capabilities and IT requirements of “world-class” portals and enterprise information systems?

9. IT Strategic Planning Excellence: What are the leading practices for aligning corporate and IT priorities?

10. Customer Relationship Management: How are exemplars rescoping CRM initiatives for high-impact management of internal and external customers?

Common Formats
Formal Research Presentation • Facilitated Research Discussion • Interactive Working Session • One-to-One Briefing

Photo Credits: Digital Imagery® copyright 1999 PhotoDisc, Inc.



Working Council Project Support Desk
Custom Research for Strategic IT Staff, Available (Free) for the Asking

As a complement to our ongoing research, members are encouraged to take advantage of the Project Support Desk. A free resource
to assist senior IT staff with “work in the moment,” the Project Support Desk offers fast-turnaround research for business case
preparation, strategic planning exercises, and budgeting. Members describe their project goals and time constraints and then
decide what combination of literature search, fact retrieval, and networking contacts best suits their needs. In keeping with the
Working Council’s charter, the Project Support Desk does not conduct competitive research, benchmarking, or vendor assessments.

Example member questions:
• IT HR: “What are job descriptions and career paths for ‘IT Centers of Excellence’?”
• IT Finance: “What are the features of world-class, IT finance–reporting intranet sites?”
• IT Procurement: “What are the key vendor contracting trends impacting my peers?”
• BU CIO: “What is the typical payback period for call center technology investments?”

Project Support Desk
• Research resource for strategic IT staff, available free of charge
• Customized, member-initiated research projects
• Fast turnaround (overnight to three weeks)
• For ideation in strategic planning, data support, and qualitative benchmarking for business cases and budgets
• Not intended for technology assessment, benchmarking, or competitive research

To request a project, contact your relationship manager, send an e-mail to ciopsd@executiveboard.com, or visit www.cio.executiveboard.com.
