and Improving Productivity
Part 1
What is Software Productivity?
Part 2
How to Measure Productivity?
Part 3
How to Improve Productivity?
Conclusions
Why Bother About Productivity?
What is Productivity?
Productivity is defined as output over input.
Output: the value delivered.
Input: the resources (e.g., effort) spent to generate the output, plus the influence of environmental factors (e.g., complexity, quality constraints, time constraints, process capability, team distribution, interrupts, feature churn, tools, design).
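As a minimal numeric sketch of this definition (the effort figures and the environmental multiplier are invented for illustration, not taken from the slides):

```python
def productivity(output_value, effort, env_factor=1.0):
    """Output over input: value delivered divided by resources spent.
    env_factor > 1 models aggravating environmental influences
    (complexity, churn, interrupts, ...) that inflate the input side."""
    return output_value / (effort * env_factor)

# 120 function points delivered with 600 person-hours, no adjustment:
print(productivity(120, 600))        # 0.2 FP per hour
# The same output under difficult conditions (env_factor = 1.5) yields
# a lower measured productivity:
print(productivity(120, 600, 1.5))
```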
CTO, R&D Effectiveness
C.Ebert, May 2006, productivity-metrics.ppt All rights reserved © 2006, Alcatel
The Value Perspective
Creates a Systems View
Productivity Directly Influences Feasibility
Example:
Planning MUST Consider Productivity
Scenarios (chart: cost versus duration, with curves Effort = Size / Productivity for productivity values 10, 50, 100, and 150, bounded by cost constraints and time constraints):
- (c) Optimized solution (smallest distance to curve)
- (d) Effort optimized for schedule target (here: the target value is not reached)
- (e) Time optimized for cost target
- (f) Within time and cost constraints (reduced size)
- (g) Within time and cost constraints (increased productivity)
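The trade-off behind these scenarios can be sketched numerically (all numbers are invented; the model Effort = Size / Productivity and Duration = Effort / TeamSize is a deliberate simplification that ignores communication overhead):

```python
def effort(size, prod):
    # Effort = Size / Productivity (e.g., person-months for a given size)
    return size / prod

def duration(size, prod, team):
    # Duration = Effort / TeamSize, ignoring communication overhead
    return effort(size, prod) / team

size = 1000  # e.g., function points (hypothetical project)
for prod in (10, 50, 100, 150):  # the productivity curves from the chart
    print(f"prod={prod:3d}: effort={effort(size, prod):6.1f}, "
          f"duration={duration(size, prod, 5):5.1f}")
```

Reading the output along one row shows why an infeasible schedule target (scenario d) can only be rescued by reducing size (f) or increasing productivity (g).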
What Do We Address Here?
Part 1
What is Software Productivity?
Part 2
How to Measure Productivity?
Part 3
How to Improve Productivity?
Conclusions
Reduce cost
e.g., develop more valuable products at lower cost
Support estimation
e.g., provide consolidated and parameterized history data
Optimize processes
e.g., early defect correction
Using GQM:
Goals and Questions around Productivity
Reduce cost
- How can we improve productivity?
- What is the value-add of getting to higher maturity levels?
Support estimation
- Are there simple rule-of-thumb estimation mechanisms?
- Project schedules: do we overcommit?
- Outsourcing contracts: will we get what we ask for?
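A GQM tree for these goals and questions could be captured as plain data (a sketch; the metric names on the leaves are assumptions, not from the slides):

```python
# Goal -> Question -> candidate metrics (metric names are hypothetical)
gqm = {
    "Reduce cost": {
        "How can we improve productivity?": ["output per person-month"],
        "What is the value-add of higher maturity levels?": ["cost of quality by maturity level"],
    },
    "Support estimation": {
        "Are there simple rule-of-thumb estimation mechanisms?": ["historical hours per function point"],
        "Do we overcommit on project schedules?": ["planned vs. actual duration"],
        "Will outsourcing deliver what we ask for?": ["delivery rate vs. contract baseline"],
    },
}

for goal, questions in gqm.items():
    print(goal)
    for question, metrics in questions.items():
        print("  ", question, "->", ", ".join(metrics))
```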
Attention:
Mere GQM Considered Harmful
Combine goal-orientation with decision-support and other operational management techniques.
The Essential Difficulty of Productivity
Measurement
Output
Typical Productivity Measurements –1
Typical Productivity Measurements –3
Embedded techniques
- Productivity = adjusted size / effort. Adjusted size is the estimated effort based on history and constraints; productivity is then a normalization comparing estimated to actual effort.
- Earned value: weighted effort of completed work products versus planned/realized effort.
Truncated code fragment from the slide (the beginning is not recoverable from the extraction):

    ... functionality THEN
          SEND '11'B1 TO LENKUNG;
        ELSE /* Y = 0: DEFAULT */
          SEND '00'B1 TO LENKUNG;
        FIN;
      FIN;
    ELSE /* X = 0: DEFAULT */
      SEND '00'B1 TO LENKUNG;
    FIN;
    FIN;
    END; /* BSP1 */
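The two embedded techniques above can be sketched in a few lines (all figures are invented for illustration; the function names are not from the slides):

```python
def normalized_productivity(adjusted_size_hours, actual_effort_hours):
    """Adjusted size is the estimated effort from history and constraints;
    a ratio above 1 means the team beat the estimate."""
    return adjusted_size_hours / actual_effort_hours

def earned_value(completed, planned_effort):
    """Weighted effort of completed work products.
    completed: work product -> fraction done (0..1)
    planned_effort: work product -> planned hours"""
    return sum(planned_effort[w] * completed.get(w, 0.0) for w in planned_effort)

# Design finished, coding half done:
print(earned_value({"design": 1.0, "code": 0.5},
                   {"design": 100, "code": 200}))   # 200.0 earned hours
print(normalized_productivity(300, 250))            # 1.2: faster than estimated
```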
Why “Output” Is Difficult –2
Challenges:
- Compare apples with apples; "all things being equal" hardly ever applies.
- Avoid the traps of scientific hype ("my theory is better") and industry hype ("we can do it better").
- Relate productivity gains to the actual change, not to a mixture of changes overlaid with Hawthorne-like effects, scope changes, restructuring, cost reduction, etc.
Case Study:
Benchmarking with Productivity Data –1
Case Study:
Benchmarking with Productivity Data –2
Process-related factors (more easily controlled):
- multiple dependencies (e.g., HW-SW co-development)
- engineering environments
- requirements churn
- processes and tools
- staff, team composition
Product-related factors (not easily controlled):
- nonfunctional constraints
- solution complexity
- customer participation
- size of product
Case Study:
Benchmarking with Productivity Data –3
Case Study:
Benchmarking with Productivity Data –4
Usage: project parameters
- Development Platform = Mid-range
- Methodology = Developed in-house
- Language Type = 4GL
- Maximum Team Size > 8
- Your Project Delivery Rate (PDR) was 10.0 hours per function point
For the two factors with the most significant impact on productivity, Development Platform and Language Type, the accompanying chart shows how your Project Delivery Rate compares to projects with the same Development Platform and Language Type.
Source: ISBSG, Reality Checker, 2005
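Project Delivery Rate is simply effort hours per function point, with lower values being better. A comparison against a peer-group figure might look like this (the benchmark value of 12.0 is invented for illustration, not an ISBSG number):

```python
def pdr(effort_hours, function_points):
    """Project Delivery Rate: hours per function point (lower is better)."""
    return effort_hours / function_points

our_pdr = pdr(3000, 300)   # 10.0 h/FP, matching the Reality Checker example
benchmark_pdr = 12.0       # hypothetical peer-group median
verdict = "better" if our_pdr < benchmark_pdr else "worse"
print(f"our PDR: {our_pdr} h/FP, {verdict} than the peer group")
```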
Case Study:
Benchmarking with Productivity Data –5
Case Study:
Benchmarking with Productivity Data –6
[Multi-panel QSM chart, log-log axes: Time (months), MTTD (days), and People, each plotted against ESLOC (thousands). Series: Current Plan, Current Forecast, INTRO History, QSM Telecommunications Database Average, +/- 1 SD. Total planned defect tuning factor: 100%.]
What Do We Address Here?
Part 1
What is Software Productivity?
Part 2
How to Measure Productivity?
Part 3
How to Improve Productivity?
Conclusions
Software:
There Ain’t No Silver Bullet
Some Personal Advice
Examples:
- Try to model the distributed intelligence of termites, which build huge settlements without a central architect or plan.
- Try to extract the conformance requirements in swarms, which interact and interface in multiple dimensions.
- Try to describe the change management of viruses such as cholera, which adapt to all types of environmental conditions.
All these attempts must fail because we impose accidental models, and yet in nature it works perfectly.
Proof of Concept:
Productivity Can Be Improved!
What Do We Address Here?
Part 1
What is Software Productivity?
Part 2
How to Measure Productivity?
Part 3
How to Improve Productivity?
Conclusions
Software Productivity Laws (continued)
Conclusions
Measurement Reading
THANK YOU!
www.alcatel.com