IT Balanced Scorecards
End-to-End Performance Measurement for the Corporate IT Function

Working Council for Chief Information Officers

2000 Pennsylvania Avenue NW
Washington, DC 20006
Telephone: 202-777-5000
Facsimile: 202-777-5100

166 Piccadilly
London, W1J 9EF
United Kingdom
Telephone: +44-(0)20-7499-8700
Facsimile: +44-(0)20-7499-9700

www.cio.executiveboard.com

Managing Director: Jaime M. Capellá
Publications Editor: Dave Engle
Practice Manager: Kris van Riper
Project Managers: Andrew Horne • Matt McWha
Consultants: James Bilodeau • Carsten Schmidt
Senior Analyst: Eric Tinson
Analysts: Kiran Mishra • Michael Scutari • Rich Flanagan • Brett Neely
Senior Director: Brian Foster
Directors: Sheldon Himelfarb • Matt Kelly
Associate Directors: Stuart Roberts • Ken Rona • Audrey Taylor
Executive Summary • iv
Member Self-Diagnostic • vii
An Exhaustive Compendium of IT Balanced Scorecard Metrics • 1
Introduction: Principles of Balanced Scorecard Design and Metrics Selection • 9
Best-in-Class IT Balanced Scorecard Metrics • 21
Financial Performance • 22
Project Performance • 26
Operational Performance • 30
Talent Management • 34
User Satisfaction • 40
Information Security • 46
Enterprise Initiatives • 50
Scorecard Development and Life-Cycle Management • 55
Scorecard Rollout • 56
Data Collection and Quality Assurance • 58
Scorecard Review and Revision • 60
Facilitating Scorecard Adoption • 62
Appendix I: Collected IT Balanced Scorecards • 67
Appendix II: IT Balanced Scorecard Tools and Vendors • 91
Order Form • 97
In-Person Research Presentations • 98
Working Council Project Support Desk • 99
Executive Summary

Member Self-Diagnostic
Answer Yes or No to each of the following questions:

1. Can I clearly articulate the link between IT operational and project activities and the organization's stated strategic business goals?
2. Is there a process or mechanism in place to track the impact on service levels and satisfaction of ongoing cost-efficiency efforts?
3. Can I describe the performance of the IT function in a concise, non-technical, business-friendly fashion?
4. Can I effectively communicate the value that IT creates for the business?
5. Can I communicate a holistic perspective of IT performance consistently across various geographies and business units?
6. Can I easily compare the performance of my IT function to that of industry competitors or companies with similar geographic dispersion or scale?
7. Do IT performance management meetings focus almost solely on discussions of metric comparability and validity rather than on making resource allocation decisions?
8. Do I have a sufficient understanding of the progress and status of ongoing IT project work to allow for corrective action if major projects are at risk for scope creep, budget overruns, or schedule delays?
Diagnostic Evaluation
If four or more “No” answers, then adoption of an IT balanced scorecard
may facilitate IT performance management at your organization.
[Timeline] 1992: Kaplan and Norton introduce the balanced scorecard concept in a Harvard Business Review article.
[Chart] 50 percent of Fortune 1,000 companies have adopted a corporate balanced scorecard.
[Chart] Balanced Scorecard Use in IT Departments (survey of more than 345 senior IT executives): 30% and 24% across 2002 and 2003.
[Chart] IT Balanced Scorecard Adoption Among Working Council Members (survey of 124 CIOs): 39% of members using an IT balanced scorecard; 61% not using one.
© 2003 Corporate Executive Board Introduction: Principles of Balanced Scorecard Design and Metrics Selection 11
IT Balanced Scorecards 12
2. Monitoring Service Levels While Cutting Expenses
• Problem: IT budgets cut while growing share of business processes enabled by IT
• Balanced scorecard provides ability to balance cost and service quality in an informed manner
[Chart] Decline in IT Capital Expenditures (reported capital expenditures, 752 corporations): 2002: (10%); 2003 (E): (15%)
Source: Goldman Sachs Global Equity Research.

3. Demonstrating IT Value to the Business
• Problem: Difficulty of tracing IT's impact on business performance
• Balanced scorecard measures broad range of granular, business-focused metrics
[Chart] Q: Has Pressure to Demonstrate ROI Increased or Decreased in Past 12 Months? (survey of 365 CIOs, CTOs, and VPs of IT, December 2002): Increased 60.0%; Stayed the Same 36.7%; Decreased 3.3%
Source: CIO Insight.

5. Baselining IT's Performance with Respect to External Providers
• Problem: IT organizations lack data to compare their performance against that of outside service providers
• Balanced scorecard provides baseline performance, which can be used to assess feasibility of outsourcing
[Chart] Share of IT Budget Allocated to External Service Providers: 1994: 10%; 2003 (E): 19%
Source: Gartner.

6. Depoliticizing Resource Allocation Decisions
• Problem: Resource allocation decisions made lacking project cost, benefit, and risk information
• Balanced scorecard provides standard metrics to inform resource/quality trade-off discussions
[Chart] Politicization of Prioritization Process (survey of 1,077 CIOs, CTOs, and VPs of IT, July 2001): 62% believe prioritization process is politicized; 38% believe it is depoliticized
Source: CIO Insight.
4. Data Collection Process
• Automated data collection, in many cases integrated into monitored systems. Sample metrics: Web server uptime; SAP availability; help-desk first-call resolution rate.
• Portions of required data collected and aggregated manually. Sample metrics: percentage of projects delivering new business functionality; global desktop availability; percentage of applications meeting security standards.
[Exhibit: sample IT balanced scorecard (seven numbered categories) with current and target values. Panel captions visible in the extraction: "Connecting service cost with strategy implementation and project progress to facilitate principled trade-offs"; "Identifying service delivery problems"; "Providing a customer-focused view of IT operations"; "Focusing IT's efforts on security spending and training". A project Gantt chart (quarter and month markers, with tasks such as "Explore Market Need" and "Final Quality Assurance Testing") accompanies the project performance category.]

Financial Performance
• Cost of data communications per seat: $497 (current); $450 (target)
• Relative spending per portfolio category: N.A.

Project Performance
• Percentage of new development investment resulting in new revenue streams: 65%; 70%
• Percentage of IT R&D investment leading to IT service improvements: 80%; 90%

User Satisfaction
• Entire user population: 3.7/5; 3.8/5
• Focused executive feedback: 96%; 98%
• Comprehensive perspective: 3.3/5; 3.5/5

Information Security
• Percentage of staff receiving security training: 40%; 70%
• Percentage of external partners in compliance with security standards: 25%; 50%

(The Operational Performance, Talent Management, and Enterprise Initiatives panels appear in the exhibit without recoverable metric values.)
Financial Performance p. 24
Project Performance p. 28
Operational Performance p. 32
Information Security p. 48
Enterprise Initiatives p. 52
Financial Performance
Institutionalizing Financial Rigor in IT
Financial Discipline at the Heart of the IT Balanced Scorecard
Although the concept of the balanced scorecard was developed in order to overcome the traditional reliance on financial-only metrics for performance measurement, a core set of financial metrics is a key component of exemplar IT balanced scorecards. These metrics are especially critical in the current climate of stagnating or shrinking IT budgets, as accurately tracking and reporting cost and budget performance is a prerequisite for corporate IT organizations. The financial IT balanced scorecard category typically exhibits three progressive levels of metrics sophistication:

An Aggregate Spending Overview—The baseline financial metric for most IT balanced scorecards is some absolute measure of company total IT expenditure (for example, total IT spending or IT spending as a percentage of revenue). While important to track on an ongoing basis and an essential measure of IT's financial health, this type of metric fails to provide decision makers with actionable information and context about IT spending's link to the business's strategic imperatives.

Directional Granularity—Advanced IT balanced scorecard practitioners adopt a more nuanced view of IT's financial performance, delineating IT expenditures by geography, by business unit, by technology category, or by technology life-cycle stage. This provides decision makers with at least the directional ability to target the most significant cost areas. Typical metrics include total infrastructure spending, operating cost, or total IT spending by business unit.

Measuring by Portfolio Mix—As a strategic performance management tool, the IT balanced scorecard needs to present the strategic impact of IT investments. Exemplar companies supplement a foundation of granular financial metrics, such as total IT cost and cost of data communication per seat, with an understanding of how these costs compare to industry peers, usually using external benchmarking data. In addition, exemplars also track portfolio mix—spending per portfolio category—to anchor IT's financial performance in a broader strategic context and help decision makers assess spending in each category relative to that in other categories, as well as past spending levels.
[Figure: three levels of financial metrics sophistication, plotted by granularity of measure (low to high) against strength of metric link to business outcomes (low to high)]
• Baseline: An Aggregate Spending Overview. Sample metrics: total annual IT spending; IT spending as a percentage of company revenue.
• Progressive Practitioner: Directional Granularity. Sample metrics: total IT spending by geography; total IT spending by business unit; total infrastructure spending.
• Exemplar: Measuring by Portfolio Mix. Sample metrics: cost of data communications per seat; relative spending per portfolio category.
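The portfolio-mix metric at the exemplar level is arithmetically simple: each category's spending divided by total IT spend, compared against a prior period. A minimal sketch in Python (the category names and dollar figures are hypothetical illustrations, not data from the report):

```python
# Portfolio mix: share of total IT spend per portfolio category, with a
# period-over-period shift. Categories and spend figures are invented.

def portfolio_mix(spend_by_category):
    """Return each category's share of total IT spend, as a fraction."""
    total = sum(spend_by_category.values())
    return {cat: spend / total for cat, spend in spend_by_category.items()}

current = {"finance": 4.0, "hr": 2.0, "customer_care": 1.0, "it_efficiency": 3.0}  # $M
prior   = {"finance": 3.0, "hr": 2.5, "customer_care": 1.5, "it_efficiency": 3.0}  # $M

mix_now, mix_prior = portfolio_mix(current), portfolio_mix(prior)
for cat in current:
    shift = mix_now[cat] - mix_prior[cat]
    print(f"{cat:15s} {mix_now[cat]:6.1%} (prior {mix_prior[cat]:6.1%}, shift {shift:+.1%})")
```

A scorecard owner would read the shift column the way Schlumberger's decision makers read historical mix data: a category whose share drops sharply is a candidate for a closer look rather than an automatic funding correction.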
Financial Performance
Establishing a Financial Frame of Reference
Company Background
Schlumberger Limited is a $13.5 billion global technology services company with 76,000 employees of more than 140 nationalities. The company has operations in 140 countries and includes three primary business segments: Schlumberger Oilfield Services, the world's largest oilfield services company and the leading supplier of technology services and solutions to the international oil and gas industry; WesternGeco, jointly owned with Baker-Hughes, the world's largest surface seismic company; and SchlumbergerSema, an IT services company providing consulting, systems integration, and managed services to the energy, public sector, telecommunications, and finance markets. Other Schlumberger businesses include Smart Cards & Terminals, the NPTest semiconductor testing solutions unit, Verification Systems, and Water Services.

Scorecard Background
Schlumberger introduced its IT balanced scorecard in 2002 as part of a larger IT initiative to better understand the comparative IT spending levels and align that spending more closely with business needs.

Providing Granular Cost Transparency and Comparative Context
As a foundation for the financial performance category of its IT balanced scorecard, Schlumberger tracks granular financial metrics such as the cost of data communication per seat. In addition, the scorecard provides readers with an understanding of how these costs compare to those of industry peers using benchmarking data from various sources, including third-party vendors and its own IT services arm. To supplement these granular financial measures, Schlumberger also tracks the percentage of IT spending dedicated to a set of eight portfolio categories, which include business productivity–oriented categories such as finance and human resources, as well as enterprise-centric categories such as competitive positioning, employee efficiency, and IT efficiency and support. This metric of current portfolio "mix," along with historical mix data, allows decision makers at Schlumberger to ensure that each of the portfolio categories receives sufficient funding and to reallocate funding from one category to another to better align with shifting business strategy.

More Effective Identification of Standardization and Portfolio Optimization Opportunities
One benefit Schlumberger reports from its use of the IT balanced scorecard is that it has been able to more programmatically identify missed standardization opportunities. In one instance, a business unit manager looking at the detailed financial metrics on the scorecard realized that the unit's IT cost per seat was higher than the company average. An in-depth review of the unit's IT cost structure revealed that it had not kept pace with the corporate infrastructure standardization initiative, resulting in higher operational costs. A second benefit is that Schlumberger is able to make more informed portfolio decisions. For example, by reviewing the scorecard, the IT department observed that the amount of IT investment directed to customer care applications was unusually low compared to spending in other portfolio categories. This triggered a closer look at customer care spending, which revealed that the spending level appeared low for two reasons. First, the company had just completed the rollout of a major customer care application, and second, some customer care spending was being handled by the company's business units outside of the IT budget. As a result, Schlumberger decided not to adjust spending on customer care upward, despite its initial inclination to do so. Following this early success, Schlumberger's next step in its IT balanced scorecard development is to include business metrics taken from the corporate scorecard.
[Exhibit: excerpt of Schlumberger's IT balanced scorecard. Callouts: spending tracked by portfolio category to inform resource allocation decisions; historical data allow decision makers to track improvements or spotlight areas of potential underspending.]

Information Security
• Number of security breaches
• Incident rate
• Percentage of infrastructure protected
• Percentage of sites with valid information security audit

Source: Schlumberger; Working Council research.
Project Performance
Avoiding Project Disaster
Monitoring Ongoing Project Performance to Avoid Surprises
A 2002 study by the Standish Group showed that out of 35,000 IT projects conducted between 1994 and 2002, only 28 percent were successfully completed, 23 percent failed, and the majority—49 percent—either ran over budget or past schedule or ended up with fewer features and functions than planned. In addition, larger projects exhibited a higher rate of failure—68 percent of projects smaller than $500,000 were finished successfully, while only 10 percent of projects larger than $3 million were deemed a success at completion. In addition to consuming IT resources that could be used more effectively elsewhere, these project failures—especially those of larger projects—can have serious repercussions outside of the IT organization. One example of the potential corporate impact of large IT projects that fail to deliver is NIKE's deployment of i2's supply chain management software. The well-publicized failure led to a one-quarter sales decrease of $100 million as a result of problems with inventory levels. Another is Hershey, whose $112 million SAP R3 implementation resulted in shipment disruptions just ahead of the 1999 holiday season and was blamed for a 19 percent reduction in third-quarter profits. One of the key failure points cited was a lack of disciplined project planning and management. The examples of NIKE and Hershey are extreme, but by using the IT balanced scorecard, CIOs can potentially prevent project disasters by tracking metrics at three levels:

Project Progress and Status—At a minimum, companies measuring project performance on their IT balanced scorecards include three basic metrics: the percentage of projects delivered on time, the percentage of projects delivered on budget, and the percentage of projects delivered within scope.

Post-Project Review—In addition to tracking project budget, schedule, and scope adherence, more progressive IT balanced scorecard practitioners also add metrics that highlight the performance of completed projects. These companies gather feedback from project staff on the project execution process and from the business sponsor or unit using the final product on the solution's usefulness and the service received from the project team. In addition, some IT organizations also track the failure rate of deployed technologies as a proxy for the quality of project work.

Project Contribution to Business Goals—Exemplar companies use the IT balanced scorecard to track project alignment with key business goals, supplementing in-progress and postmortem project execution information with an articulation of how specific IT projects enable one or more IT corporate strategic imperatives (for example, systems simplification efforts or business process standardization).
[Figure: three levels of project metrics, plotted by metric altitude (self-referential to linked to corporate strategy) against metric time horizon (point-in-time to life cycle)]
• Baseline: Project Progress and Status. Sample metrics: percentage of projects delivered on time; percentage of projects delivered on budget; percentage of projects delivered within scope.
• Progressive Practitioner: Post-Project Review. Sample metrics: sponsor satisfaction score; early-life failure rate.
• Exemplar: Project Contribution to Business Goals. Sample metrics: percentage of new development investment resulting in new revenue streams; percentage of IT R&D investment leading to IT service improvements.
Source: Working Council research.
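The baseline progress-and-status metrics reduce to simple ratios over the set of delivered projects. A minimal sketch (the project records below are invented for illustration):

```python
# Baseline project-progress metrics: percentage of projects delivered
# on time, on budget, and within scope. Project data are hypothetical.

from dataclasses import dataclass

@dataclass
class Project:
    name: str
    on_time: bool
    on_budget: bool
    within_scope: bool

def pct(projects, attr):
    """Share of delivered projects meeting one criterion, in percent."""
    return 100.0 * sum(getattr(p, attr) for p in projects) / len(projects)

delivered = [
    Project("billing rewrite", True, False, True),
    Project("portal upgrade", True, True, True),
    Project("crm rollout", False, False, True),
    Project("dw migration", True, True, False),
]

for attr in ("on_time", "on_budget", "within_scope"):
    print(f"{attr}: {pct(delivered, attr):.0f}%")
```

The harder part in practice is not the arithmetic but the definitions behind each boolean: what counts as "on time" or "within scope" must be fixed before data collection, or the metric invites the comparability debates the scorecard is meant to end.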
Project Performance
Highlighting IT’s Role as a Corporate Enabler
Company Background
Bowne & Co., a New York-based document, information management, and printing solutions provider, has $1 billion in revenues and 8,400 employees.

Scorecard Background
In 2000, Bowne's CEO engaged the services of a strategic planning consultant who advocated the creation of corporate and functional balanced scorecards. The IT balanced scorecard, which is directly mapped to Bowne's IT strategy, has five main categories: financial performance, project performance, operational performance, talent management, and user satisfaction. The Excel-based scorecard includes a list of metrics and owners for each category, a clear definition for each metric, a description of how it will be measured, and most importantly, a list of initiatives that the IT organization must undertake in order to be able to measure performance in each category.

Spotlighting Project Impact on the Business
The project category of Bowne's IT balanced scorecard includes several metrics that seek to spotlight the direct impact of IT projects on the business. These include measures of:

External Partnership Building—Tracks IT's collaboration with technology vendors and industry experts to identify business uses for specific technologies, as measured by the percentage of vendors on Bowne's partner "ecosystem" map that the IT organization has partnered with. This map documents existing vendor relationships, current spending levels, and contacts at other organizations that Bowne wishes to partner with.

IT Value Generation—Assesses IT's performance in realizing the potential business impact of new technologies, as measured by the percentage of technology R&D investment that leads to new IT operational services.

A Higher Corporate Profile Pays Off
By highlighting IT's enablement of business strategies and proactive approach to more efficient operations, Bowne's CIO has been able to protect the IT budget during a time of spending cuts across the company. During the 2002 budgeting process, IT was spared any cuts, while IT capital expenditures actually increased by 75 percent.

* Names are for illustrative purposes only and do not represent actual employees.
Source: Bowne; Working Council research.
Operational Performance
Drilling Down into the Operations Performance
Optimizing Operational Performance
Most IT organizations track the performance of individual operational systems, including servers, applications, and desktops, using dashboards and alerting functionality to inform operational managers of service quality problems and system failures. However, the IT department's customers, the end users, are most likely to judge IT's operational performance based on their personal experience, asking "Does my PC work?" or "Do I have Internet access?" As a result, the typical IT balanced scorecard provides little intelligence to CIOs and business decision makers as to how changes in IT's performance impact users. Metrics in the operational category of the IT balanced scorecard range in maturity across the following three levels:

Measuring the Performance of Individual Systems—The majority of companies include specific operational data such as network uptime or application availability in the operational category of their IT balanced scorecard. This information is critical for the ongoing management of performance levels, but of limited value as a scorecard metric for two reasons. First, a report produced quarterly will provide operational managers with data only when it is too late to act; and second, granular operational metrics provide little insight into where decision makers can cut costs without adversely affecting service levels that matter to end users.

An Aggregated View of Operational Performance—More advanced balanced scorecard practitioners are aggregating subsets of IT dashboard metrics to present an overview of the performance for particular units of IT service (for example, desktop availability or Web front-end uptime). This approach takes an initial step toward denominating IT operational performance in terms relevant to end users, providing the CIO and senior management with a more informed basis for cost and service-level trade-offs. In addition, IT organizations are also using the operational category to track system compliance with architecture standards.

Operational Performance from the End-User's Perspective—Exemplar companies take this customer-centric approach a step further, focusing their operational scorecard metrics on how the end user actually experiences IT service and managing IT operations with the end user in mind. While a network outage might severely impact IT's uptime performance when viewed through the lens of an operational dashboard, it has little impact on end users when it happens in the middle of the night or only impacts backup systems. On the other hand, if a critical application is not available during peak work hours, the impact on end users can be comparable to a total system crash. Consequently, to help improve operational performance, exemplars utilize scorecard metrics that track an aggregate measure of uptime for all systems that impact end users and sensitize those metrics according to the impact of performance problems on critical business processes.
[Figure: three levels of operational metrics, plotted by level of metric detail (individual systems to aggregated services) against metric relevance to end user (low to high)]
• Baseline: Measuring the Performance of Individual Systems. Sample metrics: CRM availability; LAN uptime.
• Progressive Practitioner: An Aggregated View of Operational Performance. Sample metrics: Web front-end uptime; desktop availability; percentage of systems compliant with architecture standards.
• Exemplar: Operational Performance from the End-User's Perspective. Sample metrics: peak time availability; critical process uptime.

Putting the Business First
"In the past, we've tended to take a more internally focused view of IT with our metrics to determine things like data center or network availability. However, customers don't necessarily care about what is causing a network outage, but more about the fact that the system is down. …Now we're attempting to measure ourselves on things business people think are important."
Mike Hyzy, Manager of Global IT Metrics, NCR Corp.

Source: "Formula for ROI," PC Week (28 September 1998); Working Council research.
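The "sensitized" uptime measure described at the exemplar level can be sketched as downtime weighted by business impact: a peak-hours outage on a critical system counts in full, while an overnight outage or a backup-system failure counts for little. The weights and outage records below are hypothetical, one plausible reading of the approach rather than any company's actual formula:

```python
# End-user-sensitized availability: outage minutes are weighted by the
# business impact of the affected system and time window. All systems,
# outages, and impact weights are invented for illustration.

PERIOD_MIN = 30 * 24 * 60  # one 30-day reporting month, in minutes

def weighted_availability(outages):
    """outages: list of (duration_minutes, impact_weight in [0, 1])."""
    weighted_downtime = sum(minutes * weight for minutes, weight in outages)
    return 1.0 - weighted_downtime / PERIOD_MIN

outages = [
    (120, 1.0),  # critical application down during peak work hours
    (240, 0.1),  # network outage in the middle of the night
    (60,  0.0),  # backup system only; end users unaffected
]

print(f"end-user availability: {weighted_availability(outages):.3%}")
```

Under this weighting, the four-hour overnight outage barely moves the scorecard number, while the two-hour peak-time outage dominates it, which is exactly the inversion of dashboard-style raw uptime that the exemplar approach is after.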
Operational Performance
Balancing Business and IT Perspectives on Operations
Company Background
T. Rowe Price Group, Inc., the Baltimore-based asset management firm, has $925 million in revenues and 3,700 employees. The company administers 80 funds and manages assets worth more than $155 billion.

Scorecard Background
IT management reporting practices at T. Rowe Price historically suffered from two flaws. First, reports focused on operational performance in minute detail while neglecting non-technical performance metrics. Second, even for operational metrics, the reports lacked established targets for adequate performance. To provide a holistic view of corporate IT's performance with respect to financial, project, operational, talent management, and user satisfaction measures, T. Rowe Price created an IT balanced scorecard in early 2001. T. Rowe Price updates the scorecard quarterly and uses it to highlight a continued commitment to streamlining total IT expenditures and to showcase IT's partnership with the business.

The Scorecard as a Compliance Lever
While most companies track some form of system reliability as part of their scorecard efforts, fewer organizations use their scorecards to track compliance with architectural standards and project management processes or gauge the effectiveness of vendor management efforts. The operational category of T. Rowe Price's IT balanced scorecard tracks compliance of new project work with existing architecture standards and the IT organization's standard system development methodology. By highlighting areas where business sponsors or project managers have deviated from T. Rowe Price's established standards, the scorecard provides a measure of "name and shame" reporting, forcing discussion of non-standard work in the presence of senior decision makers. In addition, T. Rowe Price also includes a "competitive bids" metric as part of the operations category. This metric is aimed at driving down overall IT spending by encouraging sponsors to solicit external bids for new projects.
T. Rowe Price's IT Balanced Scorecard Sample Metrics

[Exhibit callouts: the scorecard fits onto one page; high-level metrics focus on systems failures with negative business impact; metric goals for the current and upcoming quarter allow more informed assessment of current performance and the timeline for closing the gap to goal.]

Financial Performance
• Percentage change in EBITDA
• Weighted IT expense ratio versus industry
• Total IT expenditures
• Total expenditure on new functionality
• NPV delivered during payback period

Project Performance
• Percentage of projects delivered on time, on budget, with acceptable customer satisfaction
• Percentage of active projects with approved project plan

Operational Performance (Illustrative)
Objective: Integrate Solutions Using Defined Architectures
Metric: Percentage of projects complying with existing architectural standards and complying with standard development methodology
Current quarter performance: 10% | Current quarter target: 30% | Next quarter target: 35%
Comments: Includes projects in progress before new standards created
Talent Management
Keeping and Attracting Top Performers
Talent at the Heart of the IT Enterprise
The size of many IT departments has been decreasing across the past several years in response to cost pressures, and as a result, CIOs have been forced to focus on talent management in an effort to maximize the productivity of existing staff. This involves both upskilling the entire IT workforce and focusing on the retention of staff with critical skills. IT balanced scorecard practitioners use their scorecards to better manage IT talent in several ways:

Becoming a Talent Magnet—Exemplar IT organizations use the IT balanced scorecard to focus senior executive attention on IT's success in creating a "destination" for available talent, tracking metrics like the retention of high-potential staff, the number of press citations of the IT organization and company-developed technologies, or the number of industry awards the company has won for its use of technology.
[Figure: talent metrics, plotted by staff covered (all staff to individual HIPOs) against focus of IT brand building (internal to external)]
• Exemplar: Becoming a Talent Magnet. Sample metrics: retention of high-potential staff; external citations of IT achievement.
Talent Management
Investing in the Future
Company Background
AstraZeneca PLC, the global pharmaceuticals company, is one of the top five pharmaceuticals companies in the world, with more than $17.8 billion in revenues and 58,000 employees.

Scorecard Background
AstraZeneca introduced the balanced scorecard in 1998 as a tool to help communicate corporate strategy and priorities, ensure alignment of functional initiatives with those strategies and priorities, and monitor execution of specific strategies. While adoption of the balanced scorecard is widespread at AstraZeneca, there are no corporate templates that individual functions must follow. The content of the IS balanced scorecard is dictated by two principal drivers, the strategic priorities of the company and the vision for the IS organization, which are reflected in three key priorities for the IS function:

1. Deliver business value
2. Transform IS efficiency and effectiveness
3. Develop IS leadership and capabilities

The PowerPoint-based scorecard is updated on a quarterly basis. Each scorecard objective is assigned to an owner on the CIO's global IS management team who updates the metrics that underpin that objective and submits the metrics and commentary to the assembled global IS management team for review and discussion. The final scorecard comprises the strategic agenda for the IS senior management team and is used to track ongoing execution of IS strategy. Once finalized, the scorecard is published on the company intranet. IS line managers are strongly encouraged to discuss the balanced scorecard with their staff and define their team goals in a fashion that is consistent with the corporate IS vision that is encapsulated in the scorecard.

Leadership Development as a Foundation for Value Creation
AstraZeneca's stated goal is to develop the leadership performance and skills base of its existing IS staff at all levels of the organization. To monitor its progress toward that goal, AstraZeneca's IS organization tracks the progress of two key objectives in the talent category of its IS balanced scorecard:

Developing and Demonstrating Leadership—The first set of talent management metrics on AstraZeneca's IS balanced scorecard is designed to measure the development of leadership capabilities within the technology organization. These metrics include the percentage of IS staff, IS leaders, and identified high-potential staff with established development plans, and the percentage of key IS roles filled.

Developing and Demonstrating IT Skills and Disciplines—AstraZeneca's IS balanced scorecard also tracks performance in creating skills and development opportunities for existing IS staff by including metrics such as training attendance versus capacity and the percentage reduction in skills shortages filled by external consultants. More specifically, AstraZeneca's IS balanced scorecard reflects the IS organization's emphasis on cultivating project management skills and developing global service delivery capabilities, tracking metrics that measure compliance with project management methodologies and project manager skills development, as well as those measuring collaborative tool usage, virtual team productivity, and the number of rotational assignments undertaken by IS staff.

A Multipurpose Performance Management Tool
"The IS balanced scorecard acts as an extremely powerful vehicle for communicating AstraZeneca's IS strategy and priorities to the company, helps ensure that IS initiatives are clearly aligned to IS business priorities and strategy, and allows effective tracking of IS organization success in strategy execution."
Balvinder Dhillon
AstraZeneca
AstraZeneca’s
IS Balanced Scorecard Develop IS Leadership and Capabilities
Sample Metrics Illustrative
Current Previous
Project Performance Objective Metric
Performance Performance
Status Comments
• Ongoing benefits captured versus plan Develop and Percentage of IS staff development plan reviewed and actions implemented 60% 75%
• Project delivery progress versus budget, Demonstrate
schedule, and quality milestones Percentage of incumbent IS leaders with development plan in place 90% 92%
Leadership
Capability Percentage of global high-potential staff with development plan in place 80% 75%
Operational Performance Change management and business acumen programs in place No No
• Number of business interruptions caused Number of annual IS rotational assignments 10 11
by security intrusion
Diversity performance (scale of 1 to 5) 4 4
• Percentage of internal portals adopting
architecture guidelines Percentage of key roles filled 99% 92%
• Number of material adverse regulatory Develop and Percentage of projects following project management framework 70% 70%
compliance observations by FDA or Demonstrate
Percentage of projects with KPIs for visible value demonstration 80% 50%
other IS audits IS Professional
Skills and Percentage of project managers with development plan and career path 60% 40%
Disciplines
Percentage increase in project manager capability 20% 10%
Talent Management
Attendance versus capacity in existing learning programs 60% 72%
Percentage increase in collaborative tool usage 40% 30%
Improvement of “virtual” team productivity +5 -1
IS capabilities versus benchmarks (scale of 1 to 5) 3 2
Percentage reduction in skills shortages filled by consultants 20% 15%
Metrics concentrate on career planning Multiple metrics underscore Metrics also focus on desire
and capability development of existing IS organization’s emphasis on to strengthen global IS
staff, with a focus on high performers developing project management skills service delivery capabilities
Source: AstraZeneca; Working Council research.
Talent Management
Tracking IT’s Brand Equity
Company Background
Schneider National, Inc., the Green Bay, Wisconsin–based transportation and logistics provider, has $2.6 billion in revenues and 21,000 employees. The company operates a fleet of 14,000 trucks and more than 40,000 trailers.

Scorecard Background
Schneider National's IT balanced scorecard was implemented in 1998 and is composed of four major types of metrics: financial performance, project performance, operational performance, and talent management. The IT organization views the scorecard as a tool to help align IT efforts with the needs of the business and reduce the time senior IT management needs to identify and find solutions to service problems. The scorecard is updated on a monthly basis and reviewed by the CIO, the IT vice presidents, and IT directors. The scorecard includes the target, current month, and year-to-date performance for each metric.

Developing an IT Brand
Schneider's talent management metrics include typical internally focused measures like staff turnover rates and employee tenure with the company and in IT. However, in service to a corporate initiative to boost brand awareness of Schneider and its products and services, the IT organization also tracks two externally facing metrics: the number of press citations of Schneider's IT organization and Schneider-developed technologies, and the number of industry awards Schneider has won for its use of technology.

By using the balanced scorecard to track the improvement of the internal employment brand for IT, companies can reduce staff turnover, especially for high-potential staff who would be difficult to replace on the internal talent market and costly to replace on the external market. In fact, Schneider credits the ability to regularly track the metrics in the talent management scorecard category with helping it achieve a significant reduction in annual staff turnover, which has declined from 26 percent before the scorecard was implemented to 7 percent, across a period of five years. Some of this turnover has likely been stemmed as a result of the waning fortunes of the overall economy across the past several years, but Schneider claims that by providing decision makers with a consolidated view of talent metrics, the IT organization is able to more effectively deploy its resources toward improving the internal IT employment offer, with reduced turnover as the result. In addition, by using the scorecard to track press citations and awards, Schneider's IT organization has also begun developing an external IT brand, becoming a destination for available talent in the industry.
Baseline of internally focused metrics supplemented with metrics of external IT "brand"

Schneider National's IT Balanced Scorecard: Sample Metrics

Financial Performance
• Central IT budget as a percentage of revenue
• IT maintenance cost per load, per movement
• Desktop total cost of ownership
• Data communications cost as a percentage of revenue

Project Performance
• Percentage of projects on time, on budget, with acceptable satisfaction score
• Percentage of budget allocated to unplanned projects

Operational Performance
• Number of server outages per month
• Reduction of desktop applications
• Number of UNIX servers

Talent Management (Illustrative; Current Month Performance* / Current Month Target* / Year-to-Date Performance*)

Objective: Create an Attractive IT Environment
1. Annual turnover rate: 7% / 7% / 8%
2. Average years of employee experience at Schneider National: 8 / 8 / 7.9
3. Percentage of planned staffing levels: 97% / 95% / 94%

Objective: Build a World-Class IT Organization
4. Citations of Schneider National technology organization in the press: 6 / 10 / 14
5. Schneider National technology and corporate awards: 1 / 2 / 1
6. Average years of employee experience in IT: 4.8 / 5 / 5.4

Callout: Tracking of external citations of Schneider National's IT organization boosts brand awareness, positively impacting retention rate and increasing the visibility of IT to talent in play.

* Numbers are for illustrative purposes only, and do not represent actual performance or targets.
Source: Schneider National; Working Council research.
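The current-month/target/year-to-date layout above lends itself to simple automated status flagging. A minimal sketch, assuming hypothetical function names, thresholds, and a per-metric directionality flag (this is illustrative, not Schneider's actual tooling):

```python
# Hypothetical sketch: flag scorecard metrics whose current-month value
# misses its target. All names and data are illustrative.

def metric_status(value, target, higher_is_better=True):
    """Return 'on track' when the current value meets or beats the target."""
    ok = value >= target if higher_is_better else value <= target
    return "on track" if ok else "off track"

# Rows mirroring the exhibit: (metric, current-month value, target, direction)
rows = [
    ("Annual turnover rate", 7, 7, False),                 # lower is better
    ("Percentage of planned staffing levels", 97, 95, True),
    ("Citations of IT organization in the press", 6, 10, True),
]

for name, current, target, better in rows:
    print(f"{name}: {metric_status(current, target, better)}")
```

Note that direction matters: a turnover rate must be compared downward, while staffing levels and press citations are compared upward, which is why a per-metric flag is needed.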
© 2003 Corporate Executive Board
User Satisfaction
Measuring IT’s Public Perception
User Satisfaction as an Early-Warning Indicator
To combat the historical financial bias of corporate reporting and create a truly "balanced" perspective of corporate performance, Kaplan and Norton advocated the inclusion of customer satisfaction metrics on the balanced scorecard. For IT organizations, these "customers" are, in most cases, end users and business sponsors. Faced with limited performance measurement resources, CIOs have integrated satisfaction-surveying efforts into their IT performance measurement toolkit to track the perception of IT service delivery with three varying degrees of rigor:

Ad Hoc Assessment—In many cases, companies choose to measure end-user satisfaction anecdotally, either because they see no need or are unwilling to incur the potential costs involved in regularly collecting the information. As a result, these scorecards do not truly represent a "balanced" view of IT performance, relying almost exclusively on financial metrics as a proxy for IT value.

A Census-Level View of IT Performance—Best-in-class IT balanced scorecards incorporate the results of formal satisfaction surveys covering the entire user base. Exemplars conduct surveys on at least an annual basis, either surveying the entire user population at once or surveying a smaller but representative sample of the population each quarter. Leading companies also take steps to document executive-level perspectives of IT performance rather than aggregating all user feedback into an undifferentiated average. The results of these two types of satisfaction surveys allow CIOs to more quickly and effectively identify changes in user perception of IT which may be connected to a decline in service levels or problems with vendor contracts.
A Census-Level View of IT Performance
(Axes: Penetration of User Base, from Self-Selected Users to Census; Seniority of Respondents, from Undifferentiated End Users to Business Sponsor/Executive)

Baseline: Ad Hoc Assessment
Sample Metrics:
• None; scorecard includes user testimonials (anecdotal evidence)

Progressive Practitioner: Reactive Information Gathering
Sample Metrics:
• "Customer satisfaction" drawn from a random selection of users (with potential sample bias)

Exemplar: A Census-Level View of IT Performance
Sample Metrics:
• "Customer satisfaction" drawn from entire user population (comprehensive perspective)
• "Senior executive satisfaction" drawn from focused executive feedback
User Satisfaction
Listening to the Voice of the Customer
Company Background
J.D. Edwards & Company, the Denver-based ERP and supply chain software maker, has $904 million in revenues and 4,954 employees. The company serves 6,500 customers in more than 100 countries, and 75 percent of revenues come from professional services.

Scorecard Background
Although not formally called a balanced scorecard, J.D. Edwards' six corporate initiatives, which include "revenue and growth," "customer satisfaction," and the creation of a "knowledgeable and committed workforce," approximate several of the major categories of an IT balanced scorecard.

An Uncoordinated Approach to Customer Satisfaction
While growing from $300 million in revenues in 1998 to $1 billion in 2000, J.D. Edwards' IT organization struggled to capture actionable end-user feedback that would allow it to better enable the company's growth strategies. The IT organization faced two hurdles to these ambitions. First, the company often relied on external benchmarking data to choose IT products and services without input from internal customers. Second, when the IT organization did administer end-user surveys, it did so on an ad hoc basis, with users often receiving multiple surveys from multiple groups within IT. Bombarding end users with too many surveys generated sub-optimal response rates, leading to a non-representative view of IT's performance.

Combating Survey Fatigue
To solve this problem of survey "fatigue," J.D. Edwards adopted a quarterly satisfaction survey, with each survey covering 25 percent of the company's approximately 5,000 users. In place since 1999, the survey is composed of 26 questions that gauge user satisfaction with existing IT products and services and requires only 15 minutes to complete. The survey instrument is an internally developed Web-based tool. The tool delivers the survey to IT customers and compiles survey responses and results at a cost of less than $1,000 per year. Survey results are presented as a series of "dials" on the IT intranet. These dials include a scale from one to seven and show average scores for satisfaction with specific IT services. Viewers can double-click and view individual comments. A single IT account manager administers the survey, manages the survey process, and analyzes the results in addition to his or her regular account management duties. In addition, the account manager collaborates with IT services managers to ensure that action plans are created to address the problems highlighted by survey results, and also works with the other account managers and IT service managers to select additional IT services to measure from a customer satisfaction perspective.

Giving Customers What They Want
The IT department uses the feedback captured to iteratively test service improvements. For example, unfavorable survey feedback about the help desk led to the termination of a contractor's services and the creation of an internal help desk that was able to meet the established service-level goal. In another instance, quarterly user survey feedback revealed dissatisfaction with the company's remote access service provider based on significant downtime and unsatisfactory cost. In response to continued user dissatisfaction, the IT organization contracted with an alternative service provider and upgraded its connectivity architecture, resulting in an improved user satisfaction score.
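The rotating quarterly sample and the per-service 1-to-7 "dial" averages described in the case can be sketched as follows. The rotation scheme, function names, and data shapes are assumptions for illustration, not J.D. Edwards' actual tool:

```python
# Hypothetical sketch of a rotating-sample survey with per-service "dials".
from collections import defaultdict
from statistics import mean

def quarter_sample(users, quarter):
    """Pick roughly 25% of users per quarter so each user is surveyed
    about once per year (illustrative rotation, not J.D. Edwards' scheme)."""
    return [u for i, u in enumerate(sorted(users)) if i % 4 == quarter % 4]

def dial_scores(responses):
    """responses: iterable of (service, score on a 1-7 scale).
    Returns the average 'dial' score per IT service."""
    by_service = defaultdict(list)
    for service, score in responses:
        by_service[service].append(score)
    return {s: round(mean(v), 1) for s, v in by_service.items()}

responses = [("help desk", 4), ("help desk", 6), ("remote access", 3)]
print(dial_scores(responses))
```

Rotating a fixed quarter of the user base keeps per-user survey load low (one 15-minute survey per year), which is the fatigue-reduction mechanism the case describes.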
User Satisfaction
Collecting the Executive View of IT Performance
Company Background
Eli Lilly and Company, the Indianapolis-based pharmaceutical manufacturer, has revenues of $11 billion and 43,700 employees.

Scorecard Background
In order to better track IT's contributions at the enterprise level, the IT organization at Eli Lilly created an IT balanced scorecard in 1999.

Enfranchising Senior Business Decision Makers
Recognizing that internal customer satisfaction surveys often fail to make the important distinction between the feedback of end users and truly empowered business decision makers, Eli Lilly's CIO supplements the IT organization's end-user satisfaction survey by conducting an annual feedback exercise with 12 senior business executives. Feedback is gathered from each executive by the CIO in a 30- to 60-minute interview, using six qualitative questions designed to gauge senior-level perceptions of IT value creation. The questions ask business leaders to assess:

• IT contribution to business value creation
• IT contribution to business process improvement
• IT contribution to company competitive advantage
• IT contribution to corporate business strategy
• IT operational performance
• Overall satisfaction with IT service

Each respondent is asked to rate the corporate IT function on a scale of 1 to 5, with 5 being very satisfied. The aggregate results of the survey are then presented as part of the IT balanced scorecard, with individual responses kept confidential to encourage an honest appraisal of IT's performance. The intranet-based scorecard allows decision makers to drill down into top-level scorecard metrics, including underlying data, commentary from the metric "owner," and an overview of action steps that will be taken to improve performance.

Heeding the Business's Advice
In addition to the numerical feedback provided by executives, Lilly also solicits qualitative feedback. This not only helps capture executive perceptions of current IT service but also provides an opportunity to gather future service requirements. By capitalizing on this occasion for dialogue, Eli Lilly's CIO gets both direction for the present and a glimpse into what business sponsors want IT to be in the future. Eli Lilly has used this formal feedback from senior business executives to guide and improve its IT efforts in several concrete ways. In response to feedback that the business divisions required more clarity on IT's funding decisions, Eli Lilly's IT organization redesigned its portfolio prioritization process to provide more transparency to business heads. In addition, executive survey feedback prompted Lilly's IT organization to expand the size and scope of its "reverse mentoring" program, in which junior IT staff serve as mentors to senior business executives, helping them understand IT terms and concepts.
Survey conducted annually by CIO in individual sessions with 12 senior business executives. Only aggregate results of executive surveys are published to the scorecard to encourage honest feedback.

Eli Lilly's IT Balanced Scorecard: Sample Metrics

Financial Performance
• Economic Value Added (EVA)
• Unit service cost

Project Performance
• Project Portfolio

Operational Performance
• Base investment
• Service levels versus external benchmarks
• Compliance with enterprise methodologies and standards

Talent Management
• Diversity
• Progress on people
• Key capabilities

Executive Management Feedback Collection Form (Illustrative)
Executive Name: Joe Smith; Business Unit: Marketing Dir.; Date: 4/20/2002
(Ratings run from 1, "IT is not at all meeting the needs of the corporation," to 5, "IT is exactly meeting the needs of the corporation.")

1. IT Contribution to Business Value Creation: 2. Expectations: Not enough clarity around IT cost. Future Requirements: Update portfolio prioritization process.
2. IT Contribution to Business Process Improvement: 3. Expectations: Close relationship in redesigning process. Future Requirements: Creation of reverse mentoring program.
3. IT Contribution to Company Competitive Advantage: 2. Expectations: More proactive technology scanning. Future Requirements: Charter advanced technology group.
4. IT Contribution to Corporate Business Strategy: 4
5. IT Operational Performance: 2
6. Overall Satisfaction with IT Service: 3

Callout: Qualitative feedback gathered includes executive expectations, perception of IT's achievement with regard to those expectations, and future service requirements.
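The aggregate-only publication Lilly uses, where individual 1-to-5 ratings stay confidential and only per-question averages reach the scorecard, can be sketched as follows. Question labels, function names, and data shapes are hypothetical:

```python
# Hypothetical sketch: per-question averages are computed from the 12
# executives' ratings; the raw individual answers are never returned.
from statistics import mean

QUESTIONS = [
    "IT contribution to business value creation",
    "Overall satisfaction with IT service",
]

def aggregate(responses):
    """responses: one {question: rating 1-5} dict per executive.
    Returns only per-question averages, preserving confidentiality."""
    return {q: round(mean(r[q] for r in responses), 2) for q in QUESTIONS}

execs = [
    {QUESTIONS[0]: 2, QUESTIONS[1]: 3},
    {QUESTIONS[0]: 4, QUESTIONS[1]: 3},
    {QUESTIONS[0]: 3, QUESTIONS[1]: 5},
]
print(aggregate(execs))
```

Publishing only the aggregate mirrors the design choice in the case: executives give franker ratings when no individual response can be traced back to them.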
User Satisfaction
An Occasion for Dialogue
“The executive management feedback surveys are a useful tool for
understanding and responding to business perceptions of IT’s performance.
The results are not as important as the dialogue between IT and the business.”
Roy Dunbar, CIO, Eli Lilly
Information Security
A Proactive Approach to Security
Extended Enterprise Increases Security Risks
As organizations' reliance on Web-based transactions, remote and wireless computing, and connectivity to external partners continues to grow, the number of potential information security vulnerabilities has also increased. This trend toward greater enterprise permeability, coupled with the tragic events of September 11 and high-profile security incidents such as 2003's Slammer virus, has caused senior decision makers to focus their attention on the unpredictable and costly information security threats posed to the extended enterprise. IT organizations exhibit three levels of sophistication in managing these potential information security threats using a balanced scorecard.

Reactively Tracking Security Incidents—Most companies have not yet added an information security category to their IT balanced scorecard, instead tracking a handful of technical security measures in the scorecard's operational category. These measures are aimed primarily at retroactively documenting the number of security incidents detected per period and the time required to respond to those incidents.

Proactively Assessing Potential Vulnerabilities—More advanced balanced scorecard practitioners include metrics to proactively assess potential vulnerabilities. In most cases, these metrics are still included in the operational performance category of the scorecard, not in a stand-alone information security category. They include measures like the percentage of systems compliant with security standards, the percentage of network access points covered by intrusion detection systems, and the percentage of IT security audit issues remedied by the agreed-upon deadline.

Hardwiring Security Compliance—Exemplar IT organizations create a separate information security category on their IT balanced scorecards. By elevating security metrics to the category level, exemplars seek to clearly communicate the urgency of these measures to senior IT staff and business sponsors. Exemplars also supplement retroactive reporting of security incidents by tracking the implementation of and compliance with security policies and preventive measures. Typical metrics include the percentage of staff receiving security training and the percentage of external partners in compliance with security standards.
(Axes: Focus of Measurement, starting from Operational; Metric Posture, from Reactive to Proactive)

Baseline: Reactively Tracking Security Incidents
Sample Metrics:
• Number of security incidents detected per period
• Average time to respond to incidents

Progressive Practitioner: Proactively Assessing Potential Vulnerabilities
Sample Metrics:
• Percentage of systems in compliance with IT security standards
• Percentage of network access points covered by intrusion detection systems
• Percentage of IT security audit issues fixed by agreed deadline

Source: Working Council research.
Information Security
Safeguarding the Enterprise
Company Background
Schlumberger Limited is a $13.5 billion global technology services company with 76,000 employees of more than 140 nationalities. The company has operations in 140 countries and includes three primary business segments: Schlumberger Oilfield Services, the world's largest oilfield services company and the leading supplier of technology services and solutions to the international oil and gas industry; WesternGeco, jointly owned with Baker-Hughes, the world's largest surface seismic company; and SchlumbergerSema, an IT services company providing consulting, systems integration, and managed services to the energy, public sector, telecommunications, and finance markets.

Scorecard Background
Schlumberger introduced its IT balanced scorecard in 2002 as part of a larger IT initiative to better understand the comparative IT spending levels and align that spending more closely with business needs.

Incident Rate—Documents the number of security incidents per 1,000 employees per year. Reported incidents are logged in a central database and categorized by severity as serious, major, and catastrophic to allow for year-over-year comparison and trend analysis.

Percentage of Infrastructure Protected—Focuses on the protection of PCs, servers, and network infrastructure from malicious attacks. Protection regimes include deployment of anti-virus software on PCs; maintaining servers to close all known vulnerabilities and applying security patches; and configuring networks according to established security guidelines.

Percentage of Sites Audited for Security—Tracks the percentage of Schlumberger sites that have had an information security audit within the past 12 months. Audits are conducted by Schlumberger's operational security group, and the results are cataloged in the central database to allow tracking of remediation progress for any identified vulnerabilities.

Information security elevated to category level
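The incident-rate metric described above, incidents per 1,000 employees per year with a severity breakdown, can be sketched as follows. The severity labels and the 76,000 headcount follow the case; function names and incident counts are illustrative:

```python
# Hypothetical sketch of Schlumberger-style incident metrics. Severity
# labels come from the case; all other names and figures are illustrative.
from collections import Counter

SEVERITIES = ("serious", "major", "catastrophic")

def incident_rate(incidents, headcount):
    """incidents: list of severity labels logged this year.
    Returns incidents per 1,000 employees per year."""
    return round(len(incidents) / headcount * 1000, 2)

def severity_breakdown(incidents):
    """Count incidents per severity category for trend analysis."""
    counts = Counter(incidents)
    return {s: counts.get(s, 0) for s in SEVERITIES}

incidents = ["serious"] * 30 + ["major"] * 7 + ["catastrophic"]
print(incident_rate(incidents, headcount=76_000))  # per 1,000 employees
print(severity_breakdown(incidents))
```

Normalizing by headcount is what makes the metric comparable year over year as the workforce grows or shrinks, which is the stated purpose of the central incident database.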
Enterprise Initiatives
Giving Corporate Business Initiatives Their Due
Detailing IT’s Contribution to Enterprise Initiatives Spotlighting Enterprise Business Initiatives—Exemplar
In addition to its day-to-day operational role, IT often plays a companies create a separate category on the IT balanced
critical role in enabling discontinuous enterprise-level business scorecard that tracks metrics of IT’s contribution to major
initiatives, such as post-merger integration or business process corporate projects or initiatives, often over a multiyear period,
reengineering. By setting aside one category of the IT balanced for initiatives such as merger integration. These metrics act as
scorecard to track IT’s contribution to the execution of these a vehicle for communicating IT’s value creation and are usually
major business initiatives, CIOs are able to better manage the removed from the scorecard once the initiative has been
function’s enablement efforts and communicate those efforts to completed.
senior business decision makers and initiative sponsors. These
communications efforts range from the nonexistent
to the programmatic:
An IT-Centric View of Performance—In most cases, IT
balanced scorecards focus solely on IT’s efforts to better
manage its own activities. The link between corporate
strategic initiatives and the metrics tracked on the IT balanced
scorecard is tenuous at best, as the scorecard has not been
developed in conjunction with the corporate strategy, but
rather as a stand-alone IT performance measurement tool.
An Uncoordinated View of IT’s Contribution—More advanced
practitioners use the IT balanced scorecards to manage the
effective execution of IT work related to enterprise initiatives.
However, relevant metrics are typically dispersed across
the scorecard’s operational and project categories, making
it difficult for the CIO to articulate and business owners of
enterprise initiatives to understand IT’s full contributions to
the business’s efforts.
(Axes: Altitude of Link to Strategy, from IT Strategy to Corporate Strategy; Metric Location on Scorecard, from Dispersed to Consolidated)

Baseline: An IT-Centric View of Performance
Sample Metrics:
• No metrics linked to execution of enterprise business initiatives

Progressive Practitioner: Uncoordinated View of IT's Contribution
Sample Metrics:
• Percentage of acquired systems retired in Operational Performance category
• Number of business process enablement projects completed in Project Performance category

Exemplar: Spotlighting Enterprise Business Initiatives
Sample Metrics:
• Percentage of acquired company systems integrated in M&A category
• Number of business process steps enabled by technology in Process Reengineering category

Source: Working Council research.
Enterprise Initiatives
Putting IT on the Enterprise Stage
Organization Background
The Federal Aviation Administration has an annual budget of $14 billion and is responsible for oversight of air traffic control, airport regulation and safety, and air regulation and certification.

Scorecard Background
In early 2001, the FAA's IT department deployed an IT balanced scorecard that tracks strategy implementation in six key areas: financial performance, operational performance, talent management, user satisfaction, security and safety, and eGovernment. The FAA CIO and the CIOs of each of the FAA's divisions review the scorecard on a weekly basis, and the same group reviews the scorecard's categories and metrics yearly to ensure that they are aligned with the organization's stated strategy. The scorecard includes the current status, two-year targets, and interim milestones for each metric, and documents potential risks that could jeopardize the achievement of those targets. It is published on the FAA's intranet site and is visible to the entire IT organization.

Regulation Pushes eGovernment onto the IT Balanced Scorecard
The 1996 Clinger-Cohen Information Technology Management Reform Act requires that all United States federal departments and agencies create and implement an enterprise IT architecture plan. More recently, the 2002 Presidential Management Agenda calls for all federal departments and agencies to establish an enterprise architecture by 2004 and to move services to the Web as part of the eGovernment initiative. This urgency in developing and implementing an enterprise architecture and migrating to electronic service delivery has led the FAA to create a stand-alone eGovernment category on the IT balanced scorecard, distinguishing these efforts from the other project-related metrics tracked in the scorecard's operational performance category.

Improving Customer Service and Ease of Use
The eGovernment category of the FAA's IT balanced scorecard includes three main objectives. The first is to make interaction with the FAA easier and faster through the migration to Web-based technologies. The second is to ensure the consistent delivery of services by adopting standard guidelines for Web architecture, interface design, and content presentation. Finally, the FAA aims to ensure that data shared internally and externally are timely and accurate.

Generating Cost Savings Through Higher Visibility
By elevating the IT organization's eGovernment efforts to the category level on the IT balanced scorecard, the FAA has not only increased IT staff awareness of its importance, but has also made those efforts more visible to both the organization's IT and non-IT leadership. This increased visibility allowed the CIOs of the individual lines of business to realize more than $2.5 million in savings per year by collectively purchasing standard infrastructure.
Operational Performance
• Percentage of program components compliant with the "to be" enterprise architecture
• Percentage of shared components, standards, and infrastructures across multiple applications

Talent Management
• Percentage of program managers with project management certification

User Satisfaction
• Number of information quality complaints received

Security and Safety*

eGovernment (detailed below)

Objective: Ensure Effective Service Delivery Capabilities
Metric: Percentage of FAA Web sites that comply with Web architecture, requirements, style, and content guidelines
Current status: 7%; Target: 10%
Milestones: To be collected and updated by CIO office
Risks: Possible change in due date

Objective: Ensure that Data and Information That Are Used to Conduct Critical Agency Business or Publicly Disseminated Are Timely, Accurate, Accessible, Understandable, and Secure
Metric: Percentage of FAA information systems critical to agency business or used to disseminate information publicly that use approved data elements
Current status: 60% new, 15% legacy; Target: 50% new, 10% legacy
Milestones: Systems using approved data elements baselined; additional 250 data elements approved; full governance process developed
Risks: Ability to monitor compliance will be challenging

Callouts: Key agency-level "eGovernment" project receives own scorecard category to allow IT organization to monitor "Webification" and enterprise architecture ambitions. Potential risks to success are listed on scorecard in advance to allow creation of mitigation strategies.
Scorecard Rollout
A Process, Not a Document
Company Background
Bowne & Co., a New York-based document, information management, and printing solutions provider, has $1 billion in revenues and 8,000 employees.

The Balanced Scorecard as a Closed-Loop Process
Bowne's IT organization implemented its balanced scorecard in 2000 as part of a corporate-wide scorecard initiative, spurred by the arrival of a new CEO. This scorecard implementation and life-cycle process involves seven key steps:

1. Kick Off Training for IT Staff—Bowne kicks off its balanced scorecard initiative with a three-day, off-site training session for all of the company's divisional and functional senior managers, including those from IT. The training includes a basic overview of balanced scorecard concepts and terminology, but the session focuses on an exercise to help participants initially align their unit's strategy with the corporate strategy and then develop a set of metrics to measure contribution to each strategy. This process also allows each unit leader to communicate the unit's own substrategies to his or her peers.

2. Ongoing Strategy Mapping—Each year, Bowne creates a top-down corporate strategy which becomes the basis for a corporate balanced scorecard. Each business unit and function, including IT, then devolves its strategies from the corporate strategic plan.

3. Metrics Selection—Building off the IT strategy, a team including the corporate CTO and his six direct reports—senior directors for global operations, client integration, enterprise architecture, and corporate systems, and vice presidents of IT, HR, and Finance—creates a set of recommended metrics to track IT's progress against each strategic pillar. These recommendations are then presented to the CIO for final approval.

4. Metrics Definition—The same CTO-led team then creates a definition for each of the metrics on the IT balanced scorecard. This definition includes a one-sentence description, an overview of the measurement technique used to track the metric—for example, a user survey or a dashboard—and a list of initiatives that must be undertaken to enable the tracking of the metric using the outlined technique (for instance, the development or purchase of new reporting software).

5. Assigning Metric Ownership—Each of the metrics on the scorecard is assigned an owner who reports directly to the CTO. The metric owners are directly involved in the scorecard creation and update process, and a percentage of their bonus is contingent on their scorecard-related duties.

6. Data Collection and Quality Assurance—The frequency of data collection varies by metric according to several factors, including the cost of data collection, the corporate financial reporting cycle, and the volatility of the business climate.

7. Scorecard Review and Revision—The CIO and the other 11 Bowne corporate officers meet every six months to review the scorecards for every area, including IT. The applicability of all metrics is reviewed annually by the CIO, the CTO, and their direct reports, and approximately 20 percent of metrics turn over each year.

Beyond the initial consulting fees for the balanced scorecard training, Bowne's IT organization invested approximately 120 person-days in managing the balanced scorecard process in 2002. This effort was distributed largely across the 20-person IT management team.
© 2003 Corporate Executive Board
The IT Balanced Scorecard Life Cycle
Bowne creates a "closed-loop" IT balanced scorecard process

[Figure: Bowne's IT Balanced Scorecard Adoption Process (illustrative)—the seven steps described above, from kick-off training through scorecard review and revision, arranged as a continuous cycle. Noted process costs: initial consulting fees for the kick-off training, plus a $10,000 license fee for scorecard software and $3,500/year in maintenance and support costs.]

Source: Bowne; Working Council research.
1. Data Request—To populate the scorecard in advance of its quarterly review, a data collection coordinator (a staff member of a corporate CIO direct report) requests information from a network of regional collection agents.

2. Data Collection—The four regional collection agents (Asia, North America, South & Central America, and Europe) coordinate with a network of local metric experts.

The collection and quality assurance process, from initial request by the collection coordinator to delivery of the requested information to objective owners, takes approximately two weeks and involves 30 IT staff, who each spend a small portion of their time on balanced scorecard activities. Time spent by individuals varies slightly by the number of metrics collected in each region, and Cemex's IT organization claims that the total time investment of all staff involved is equivalent to 1 to 2 FTEs for two weeks.
Data Collection and Quality Assurance
Routinizing Scorecard Data Aggregation
Cemex’s IT organization clearly defines a data collection process and roles to ensure timely, accurate scorecard data
[Figure: Cemex's IT Balanced Scorecard Data Collection Process (illustrative). The flow runs through four stages—(1) Data Request, (2) Data Collection, (3) Data Proofing, (4) Data Verification—and ends in a Quarterly Scorecard Presentation. Along the flow: the collection coordinator aggregates data and populates the balanced scorecard template; local metric experts verify data and communicate it back to the collection coordinator; the coordinator passes the final data to objective owners, who analyze the data for their assigned objectives; objective owners then present IT's balanced scorecard quarterly to the CIO and IT management, after which the scorecard is made available to the IT department.]

Where the Going Gets Tough
"It is easy to define a great model on paper for a balanced scorecard. However, a consistent and reliable process for data collection and analysis is key in order to make it work."
Sergio J. Escobedo
IT Planning, Cemex
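The four-stage flow above can be sketched schematically. The sketch below is illustrative only: the function names, data shapes, and the single sample metric are our own assumptions, not Cemex's implementation.

```python
# Illustrative sketch of a four-stage scorecard collection flow
# (request -> collect -> proof -> verify), modeled on the process
# described above. Names and the sample metric are hypothetical.

def data_request(regions):
    """1. Data Request: the coordinator asks each regional agent for data."""
    return {region: "requested" for region in regions}

def data_collection(requests):
    """2. Data Collection: regional agents gather raw metric values."""
    return {region: {"uptime_pct": 99.0} for region in requests}

def data_proofing(raw):
    """3. Data Proofing: the coordinator aggregates regional values and
    populates the scorecard template (here, a simple average)."""
    values = [metrics["uptime_pct"] for metrics in raw.values()]
    return {"uptime_pct": sum(values) / len(values)}

def data_verification(scorecard):
    """4. Data Verification: local metric experts confirm the figures
    before the data reaches objective owners for the quarterly review."""
    return {"verified": True, **scorecard}

regions = ["Asia", "North America", "South & Central America", "Europe"]
result = data_verification(data_proofing(data_collection(data_request(regions))))
print(result)
```

Each stage hands a single structure to the next, mirroring the single hand-off points (coordinator, agents, experts, owners) that make the Cemex process auditable.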
As corporate business strategies change in response to external economic factors, metrics that are retained in the scorecard change weighting based on changing business priorities, and new metrics are added to reflect new demands on IT, while individual metrics migrate from lagging to leading measures.

[Figure: Operational Performance Metrics of Corning's IT Balanced Scorecard, 2000–2003. The emphasis shifts across four phases: metrics focused on the performance of systems that have already been deployed; forward-looking operational expectations added to existing metrics; measures of customer feedback added to proactively improve the service offering; and metrics focused on pre-deployment systems planning to avoid potential future problems.]
Scorecard category weights shift with business priorities—for example:
• Project Performance: 30% → 35%
• Operational Performance: 20% → 15%
• Talent Management: 20% → 25%

A set of core metrics is required at the business unit/functional level, but units and functions can add supplemental metrics. Business units and functions use the corporate template but can weight each category as they see fit.

Each unit CIO will count the total number of learning plan items on 2003 Learning Plans and measure achievement of those plans. All will use the following definitions of success:
a. 88 percent of learning plan items met → 90 miles
b. 94 percent of learning plan items met → 120 miles
c. 100 percent of learning plan items met → 150 miles

Total Actual/Possible: 230/250 miles

Bonus compensation of 200 IT employees depends to varying degrees on the "odometer mileage" of the corporate balanced scorecard.
* For a more detailed version of Corning’s odometer please see p. 84. Source: Corning; Working Council research.
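The learning-plan scale above amounts to a simple threshold scoring rule. The sketch below is illustrative: the function name is ours, and since Corning does not document how results between the published thresholds are scored, this version awards the highest threshold reached and zero miles below the first one.

```python
# Hypothetical sketch of the "odometer" scoring rule for the
# learning-plan metric. Thresholds come from the published scale
# (88% -> 90 miles, 94% -> 120 miles, 100% -> 150 miles); treatment
# of in-between results is our assumption.

def learning_plan_miles(items_met: int, items_total: int) -> int:
    """Map learning-plan achievement to odometer miles."""
    rate = items_met / items_total
    if rate >= 1.00:
        return 150
    if rate >= 0.94:
        return 120
    if rate >= 0.88:
        return 90
    return 0

print(learning_plan_miles(100, 100))  # 150
print(learning_plan_miles(94, 100))   # 120
print(learning_plan_miles(80, 100))   # 0
```

Because bonus compensation depends on the mileage, codifying the scale this explicitly leaves no room for post hoc argument about what a given achievement level is worth.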
2. Metrics in the Ascent—In addition to the cardinal IT balanced scorecard categories of financial
performance, project performance, operational performance, customer satisfaction, and talent
management, progressive companies are elevating metrics tracking information security and
enterprise initiatives to the category level to ensure they receive the required IT and business sponsor
visibility.
Collected scorecards: Bowne (p. 69), Eli Lilly (p. 77), Cemex (p. 80)

Client

• Objective: Understand Bowne strategies and processes
  Metric: General client survey results
  Initiatives: Develop client survey

• Objective: Provide undisputed IT expertise
  Metrics: General client survey results; competitive analysis
  Initiatives: Partner with Marketing to showcase IT skills (e.g., create technical sheets for products, tech papers); develop client survey

• Objective: Deliver superior service at a competitive price
  Metrics: Total cost of ownership of identified products and services compared to industry standards; percentage technology availability (includes product delivery, unplanned business interruptions)
  Initiatives: Conduct benchmarking study comparing IT costs versus market (companies with similar services and levels); pilot SLAs; pilot technology availability measurement process

• Objective: Enable productivity workflow improvements
  Metrics: Percentage variance of actual ROI versus ROI proposed in business case; actual versus planned business productivity for implementation of key initiatives
  Initiatives: Audit business cases; implement estimating system; develop project plan for composition work; develop technology road map; implement NetFax; integrate BowneLink for eDistribution

• Objective: Provide Bowne with competitive advantages
  Metric: General client survey results
  Initiatives: Create a technical competitive analysis sheet for products

• Objective: Optimize IT processes
  Metric: Percentage of systems adhering to enterprise systems standards
  Initiatives: Define, publish, and enforce enterprise systems standards based on best practices; develop enterprise technology inventory

Source: Bowne; Working Council research.
People

• Objective: Recruit, train, and develop a diverse high-performance workforce
  Metrics: Regretted turnover; percentage of successful targeted selection; percentage of individual training objectives met; employee survey results
  Initiatives: Assess management needs for new-hire selection skills; provide targeted new-hire selection training as required; implement plans for pilot group

• Objective: Develop employee product knowledge alignment with the business
  Metric: Results of technology employee "business knowledge" survey (product knowledge and customer alignment tools)
  Initiatives: Establish focus groups with other business units to provide product knowledge exchange; conduct customer-supplier mapping with business segments; develop technology "Business Knowledge Survey"

• Objective: Align compensation and incentives to the business strategy
  Metrics: Employee survey results; percentage of objectives met (relative percentage of objective—e.g., threshold, target, stretch)
  Initiatives: Align and enhance existing compensation and incentives system; implement performance management system; implement the S.M.A.R.T. goal-setting program

• Objective: Provide opportunities for career growth
  Metrics: Percentage of employees with individual development plans; percentage of goals achieved as documented in the S.M.A.R.T. system; employee survey results
  Initiatives: Implement training; refine and implement description and policies for technical and managerial career tracks (the career ladder); develop five-star skills matrix assessment development program for the support center

Source: Bowne; Working Council research.
Customer

• Objective: Ensure consistently available and reliable services to IT users
  Metric: Percentage compliance with service level agreements
• Objective: Ensure that IT solutions meet the business needs
  Metric: Business alignment index (executive management survey)
• Objective: Meet employee IT service needs
  Metric: Level of service-mindedness and professionalism (end-user survey)
Process Excellence—Deliver the projects and services which our customers require

• Objective: Meet project commitments (potential score: 125 miles)
  Metric: Project performance index measure for 20 representative projects selected by unit CIO/program office (evaluation dimensions include business value and capabilities delivered, financial and schedule performance, and project management quality)
  Performance scale: average score 1.8 = 75 miles; average score 2.0 = 100 miles; average score 2.2 = 125 miles

• Objective: Meet service commitments (potential score: 100 miles)
  Metric: SLA compliance for 10 key shared services (shared service managers/unit CIOs select services)
  Performance scale: minimal threshold of commitments met = 6 miles/service; all commitments met = 8 miles/service; all commitments met, with plans in place to reduce costs in 2004 = 10 miles/service

• Metric: Average score across all units on general manager satisfaction survey (five-point scale where 1 = does not meet expectations, 3 = meets expectations, 5 = exceeds expectations) (potential score: 25 miles)
  Performance scale: average score 3.0 = 15 miles; average score 3.5 = 20 miles; average score 4.0 = 25 miles

People—Effectively transition to, and begin operating in, a new organizational model

• Objective: Effectively transition the workforce to new model (potential score: 100 miles)
  Metric: Number of unit CIOs and IT shared services leaders creating a transition plan for their employees within the designated time frame
  Performance scale: all units meet their plan, some late = 60 miles; all units meet their plan = 80 miles; all units meet their plan, some early = 100 miles

• Objective: Meet employees' learning plans (potential score: 150 miles)
  Metric: Percentage of staff development plan objectives met
  Performance scale: 88% of development plan objectives met = 90 miles; 94% = 120 miles; 100% = 150 miles

Total possible: 1,000 miles
The J.D. Edwards Information Technology department would like you to take a few minutes and complete the following satisfaction survey. The valuable feedback you provide will be used to improve the tools and services IT provides the JDE Enterprise. If any question does not seem to apply to you, please skip to the next question. When you are finished, please click the Submit Survey button at the end of the survey.
Thank you for your time and help.
Remote Access
16. Please rate your satisfaction with IT remote
access (or dial up) services. If you do not
utilize remote access in your position at JDE
please skip to question #18.
a. Ability to use broadband services to
connect to JDE
b. Ability to use wireless access in any
location that has it available
c. Communication about available remote
access services
d. Reliability of dial up access
e. Help Desk support of broadband services
17. How could IT improve remote access to
better meet your needs?
18. We welcome any other general comments
you may have about J.D. Edwards IT.
Submit
Kaplan and Norton's balanced scorecard concept has spawned the creation of a new complement of tools and vendors aimed at automating balanced scorecard data collection and presentation. Some of these vendors provide tools that enable the presentation of data, offering Web-based solutions that provide structured data input and graphical presentation. In addition, leading database and ERP vendors are also creating scorecard modules for their product suites, providing their customers with an easily integrated tool that aggregates and displays basic scorecard data.

Despite this wealth of off-the-shelf offerings, Working Council research reveals that most companies do not seek outside help or buy application packages to produce scorecard reports, but are instead using Excel or similar homegrown solutions that require manual data input.
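The homegrown approach the research describes—manually entered metric values rolled up into a simple report—can be approximated in a few dozen lines. This is a sketch under assumed conventions: the CSV layout, sample metrics, and green/yellow/red thresholds below are illustrative, not drawn from any member company.

```python
# Minimal sketch of a homegrown scorecard aggregator of the kind the
# research says most companies build. Metric rows (as they might be
# exported from Excel) are read from CSV and given a traffic-light
# status. Layout and thresholds are illustrative assumptions.
import csv
import io

SAMPLE = """category,metric,target,actual
Project Performance,On-budget delivery (%),90,86
Operational Performance,Availability (%),99.5,99.7
User Satisfaction,Survey score (1-5),4.0,3.5
"""

def status(target: float, actual: float) -> str:
    """Green at or above target, yellow within 10% of it, red below that."""
    if actual >= target:
        return "green"
    return "yellow" if actual >= 0.9 * target else "red"

def build_scorecard(csv_text: str) -> list[dict]:
    """Parse metric rows and attach a status to each."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        target, actual = float(row["target"]), float(row["actual"])
        rows.append({"category": row["category"],
                     "metric": row["metric"],
                     "status": status(target, actual)})
    return rows

for r in build_scorecard(SAMPLE):
    print(f'{r["category"]:25} {r["metric"]:28} {r["status"]}')
```

The manual-input burden the research notes lives in the CSV: someone still has to key each quarter's actuals into the file, which is precisely the gap the vendor tools below aim to close.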
The following vendor briefs are designed to provide members who are considering the purchase of balanced scorecard software tools with an overview of the major players in the balanced scorecard space. Each brief includes vendor contact information, an outline of selected functionality, and a selection of major customers.
pbviews — Panorama Business Views
50 John Street, Suite 600, Toronto, Ontario M5V 3E3, Canada
Tel: 800-449-3804; Web: http://www.pbviews.com
• Scorecard presentation tool
• Modular, Web-based system providing drill-down capability
• Application training and integration services available
Selected customers: Bank of America, Canadian Dept. of National Defence, GlaxoSmithKline, Verizon Communications

ProDacapo Balanced Scorecard — ProDacapo
Barnhusgatan 4, 4 tr, SE-111 23 Stockholm, Sweden
Tel: +46 (0)8-622-25-00; Web: http://www.prodacapo.com
• Web-based balanced scorecard, dashboard, and cost management tools
• Training, consulting, and tool implementation and integration services available
• Pre-built with connectors to most existing IT systems
Selected customers: ABB, ICA, Scania

RapidScorecard — CorVu
3400 West 66th Street, Suite 445, Edina, MN 55435, U.S.A.
Web: http://www.corvu.com
• Scorecard presentation tool that allows for drill-down into both the scorecard data and strategy maps
• Consulting and training services available
Selected customer: Bowne

Crystal Performance Scorecards — Crystal Decisions
895 Emerson Street, Palo Alto, CA 94301-2413, U.S.A.
Tel: 800-877-2340; Web: http://www.crystaldecisions.com
• Web-based reporting and analysis tool
• Allows for ad hoc customization of reports
• Out-of-the-box connectivity for major ERP systems
• Consulting and training services available
Selected customers: Aetna, Canfor Corporation, Fox Filmed Entertainment
Enterprise Scorecard — PeopleSoft
4460 Hacienda Drive, Pleasanton, CA 94588-8618, U.S.A.
Tel: 800-380-SOFT (7638); Web: http://www.peoplesoft.com
• Presentation tool automatically extracts data from relevant modules of the PeopleSoft suite
Selected customers: Danske Bank, Entergy

SAS Balanced Scorecard — SAS
100 SAS Campus Drive, Cary, NC 27513-2414, U.S.A.
Tel: 919-677-8000; Web: http://www.sas.com
• Aggregates and presents data from SAS applications in a balanced scorecard format
Selected customers: Generali-Providencia, ING Bank, Quaker Chemical
Additionally, members interested in reviewing any of the Working Council’s past strategic research are encouraged to request a complete listing of our work.
Institution __________________________________________
Address __________________________________________
__________________________________________
__________________________________________
Telephone _______________________________________________
COPY AND FAX TO:
Working Council for Chief Information Officers
Facsimile: 202-777-5270

Working Council for Chief Information Officers
2000 Pennsylvania Avenue NW
Washington, DC 20006
Telephone: 202-777-5000
www.cio.executiveboard.com
Common Formats
• Formal Research Presentation
• Facilitated Discussion
• Interactive Working Session
• One-to-One Briefing
As a complement to our ongoing research, members are encouraged to take advantage of the Project Support Desk. A free resource
to assist senior IT staff with “work in the moment,” the Project Support Desk offers fast-turnaround research for business case
preparation, strategic planning exercises, and budgeting. Members describe their project goals and time constraints and then
decide what combination of literature search, fact retrieval, and networking contacts best suits their needs. In keeping with the
Working Council’s charter, the Project Support Desk does not conduct competitive research, benchmarking, or vendor assessments.