
Software Metrics In Use

Don Harvey

Agenda
- Project Software Statistics
- Why Software Metrics (Goals)
- Software Implementation Analysis
- Tracking System Status
- Defect Analysis
- Test Coverage Metrics
- Cost/Benefits
- Words of Caution/Warning

Project Software Statistics


- Air Defence Command and Information System for the British Army, designed to improve the safety of friendly aircraft during hostilities
- Installed in 50+ command cells and over 750 weapons systems, interfacing via a secure battlefield communications system
- Worth ~$250M
- Primarily Ada
- 70 engineers in the software development teams
- 5 years development

Why Software Metrics (Goals)


Assist project management by providing metrics for project planning and tracking, e.g.:

- Improving the accuracy of estimation
- Determining software productivity
- Determining the cost to complete
- Determining forward load
- Monitoring performance of software teams
- Identification of potential problems
- Measuring the effectiveness of the test strategy
- Determining software quality
- Defect analysis
- Determining system software reliability

Software Implementation Analysis

Ada Package Development Data Collected


- Software size (estimated)
- Actual number of lines of code (tracked)
- Number of errors found by inspection (tracked)
- Number of errors found during testing (tracked)
- Time for test and inspection (tracked)
- Number of man-hours to date (tracked)
- Number of errors found in supporting documentation (tracked)

Ada Package Testing - Analysis


- Average man-hours/hard error at inspection
- Average man-hours/hard error by testing
- Average man-hours/tested line of code
- Hard error density
- Total man-hours for fully tested code to date
- Forecast of number of errors remaining
- Forecast cost of clearing the remaining errors
- Forecast of man-hours to completion

(A sketch of how the first four figures can be derived appears below.)
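The slides do not show how these figures were produced (the Costs slide notes that the project used INGRES reports and spreadsheets). The following is a minimal illustrative sketch only; the record fields and function names are hypothetical, not the project's actual implementation:

```python
# Illustrative sketch only: derives the averages and the hard-error
# density from the per-package data collected on the metrics form.
from dataclasses import dataclass

@dataclass
class PackageRecord:
    tested_loc: int              # Ada source lines in spec & body, fully tested
    hard_errors_inspection: int  # hard errors found by code inspection
    hard_errors_testing: int     # hard errors found during the test phase
    inspection_hours: float      # man-hours spent on code inspection
    testing_hours: float         # man-hours spent testing the package
    total_hours: float           # all man-hours booked against the package

def derived_metrics(packages: list[PackageRecord]) -> dict[str, float]:
    loc = sum(p.tested_loc for p in packages)
    insp = sum(p.hard_errors_inspection for p in packages)
    test = sum(p.hard_errors_testing for p in packages)
    return {
        # average cost of finding a hard error, by technique
        "mh_per_hard_error_inspection":
            sum(p.inspection_hours for p in packages) / max(insp, 1),
        "mh_per_hard_error_testing":
            sum(p.testing_hours for p in packages) / max(test, 1),
        # average cost per fully tested line of code
        "mh_per_tested_loc": sum(p.total_hours for p in packages) / max(loc, 1),
        # hard errors per tested line of code
        "hard_error_density": (insp + test) / max(loc, 1),
    }
```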

Ada Package Metrics


Each package's data was captured on a standard form recording (times in man-hours):

- Component and package identity
- No. of Ada source lines in spec & body (i.e. lines ending in ";")
- Time taken to produce an error-free compilation prior to test (excluding the review period and machine time during compilation, but including any rework from the review)
- No. of errors found by code inspection (soft errors / hard errors)
- Time taken for code inspection (preparation / review meeting)
- Time taken to produce the package build test specification
- Time taken to prepare the test environment (e.g. test script, test harness, stubs)
- Time taken to test the package (including time taken for debugging and reworking code)
- Time taken to produce the package build test report
- No. of errors found in the Ada source during the test phase
- No. of errors in package test or design documentation during the test phase
- Time taken for formal review (preparation / review meeting) of: the package build test spec, the package build test report, and the Ada source (if changed since inspection)
- No. of other errors found throughout development of the package (specified on the form)
- Name and date

NOTES:
1. Do NOT include unrelated diversion times (e.g. coffee breaks, lunch) in any of the man-hours fields.
2. The description and use of the form are given in the Procedure for the Collection of ADCIS Metrics, EA/210/AD/3905.

Component Summary Sheet


Collation of Ada Package Metrics Forms

[Table: one column per package, one row per field of the Ada Package Metrics Form, with an actual total for the component. The cell values on the slide are illustrative only ("not true data").]

Derived figures for the example component:

- Total man-hours/component: 614.32
- Average man-hours/LOC: 0.58
- Man-hours/hard error (inspection): 1.46
- Man-hours/hard error (testing): 9.11

System Summary Sheet


Collation of Component Summary Sheets

[Table: one column per component, collating the Component Summary Sheets. The cell values on the slide are illustrative only ("not true data").]

The fields collated for each component:

- Component identity
- Actual number of Ada source lines written
- Actual number of Ada source lines fully tested
- Estimated number of Ada lines in the component
- Soft, hard and total errors found by inspection
- Errors found during testing
- Errors in design
- Other errors found in development
- Number of man-hours to date
- Number of man-hours for fully tested Ada source lines
- Estimated number of man-hours for the fully tested component
- Estimated man-hours to completion

Illustrative totals for the system:

- Lines written 46,817; lines fully tested 32,549; lines estimated 74,268
- Errors by inspection 1,020 soft + 298 hard = 1,318; errors during testing 443; design errors 405; other errors 55
- Man-hours to date 11,993.14; man-hours for fully tested lines 10,540.22
- Estimated man-hours for the fully tested system 22,435.34; estimated man-hours to completion 10,442.20
- Average man-hours/line achieved for fully tested Ada: 0.40
- Man-hours/hard error (inspection) 3.65; man-hours/hard error (testing) 17.11
- Tested lines of code/hard error: 74.39 (an error density of ~1 per 74 tested LOC)

(A sketch of the completion arithmetic follows.)
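The sheet's completion arithmetic is not spelled out. The subtraction below reproduces the totals exactly (22,435.34 − 11,993.14 = 10,442.20); the scaling step is an assumption about how the "estimated man-hours for fully tested" figure was obtained, and all names are hypothetical:

```python
# Sketch of the System Summary Sheet's forward-load arithmetic. The
# scaling assumption (achieved cost/line x estimated size, per component)
# is a plausible reading only, not confirmed by the slides.

def est_mh_for_fully_tested(mh_for_tested_lines: float,
                            lines_tested: int, lines_estimated: int) -> float:
    # project the achieved man-hours per fully tested line onto the
    # estimated total size of the component
    return (mh_for_tested_lines / lines_tested) * lines_estimated

def est_mh_to_completion(est_total_mh: float, mh_to_date: float) -> float:
    # forward load = projected total cost minus effort already booked
    return est_total_mh - mh_to_date

# e.g. a component with 494.45 MH booked against 792 tested lines,
# estimated at 1,500 lines in total:
projected = est_mh_for_fully_tested(494.45, 792, 1500)
# ~936 MH (the sheet's own figure, 915.00, implies a slightly different rate)

# With the (illustrative) system totals from the sheet:
print(round(est_mh_to_completion(22435.34, 11993.14), 2))  # -> 10442.2
```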

Tracking System Status


Component Process Graph


[Chart: lines of code (0-23,000) per project week (WK 126 to WK 152) for a named component, with three curves: Tested, Compiled, and Estimate.]

System Summary
[Chart: lines of code (0-32,000) per project week (WK 157 to WK 187) for the whole system, with three curves: Tested, Compiled, and Estimate.]

Component Estimated Completion Dates


[Chart: estimated completion week number (0-210), one bar per component.]

Percentage of Tested LOCs Against Estimates


[Chart: percentage of tested LOC against the estimate (0-105%), one bar per component.]

(A sketch of the underlying weekly tracking follows.)
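The four tracking charts above all derive from the same weekly bookkeeping per component: estimated, compiled, and tested LOC. A minimal sketch, with hypothetical names and the slides' illustrative figures:

```python
# Hypothetical weekly tracking record behind the process graphs.
from dataclasses import dataclass

@dataclass
class WeeklyStatus:
    week: int            # project week number
    estimated_loc: int   # current size estimate for the component
    compiled_loc: int    # lines compiled error-free to date
    tested_loc: int      # lines fully tested to date

def percent_tested(history: list[WeeklyStatus]) -> float:
    """Latest 'Percentage of Tested LOCs Against Estimates' figure."""
    latest = max(history, key=lambda s: s.week)
    return 100.0 * latest.tested_loc / latest.estimated_loc

history = [
    WeeklyStatus(126, 1500, 400, 0),
    WeeklyStatus(152, 1500, 1228, 792),
]
print(f"{percent_tested(history):.0f}% tested")  # -> 53% tested
```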

Defect Analysis


Distribution Of Defects By Project Phase


[Chart: number of defects (0-1,400) by project phase (Requirements, Logical Design, Physical Design, Code & Test, S/W Integration, System Integration, Stage C Test, Maintenance), with two series: Current Project Phase and Accredited Project Phase.]

Defect Identification - Inspection V Testing


[Chart: error count (0-270) per project week (139 to 187), comparing errors found By Inspection against errors found By Testing.]

Tracking Defects For Monitoring System Status


[Chart: number of defect reports (0-550) per month over roughly two and a half years (MAR through to JUN), with three cumulative series: Reported, QA Clear, and CR Approved.]

Annotations on the chart:

- Software not yet stable
- ~50 defects yet to be analysed
- ~200 modifications yet to be implemented/proven

(A sketch of how these backlogs fall out of the three series follows.)
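The slide does not define the three series precisely. A plausible reading, with hypothetical names, is that each defect report moves from Reported to CR Approved (a change request is agreed) to QA Clear (the fix is implemented and proven), so the gaps between the curves give the two annotated backlogs:

```python
# Hypothetical reading of the defect-tracking series: the gaps between
# the cumulative curves give the backlog at each stage of the pipeline.
def defect_backlogs(reported: int, cr_approved: int, qa_clear: int) -> dict[str, int]:
    return {
        # reported but no approved change request yet: awaiting analysis
        "awaiting_analysis": reported - cr_approved,
        # approved but not yet QA clear: awaiting implementation/proof
        "awaiting_implementation": cr_approved - qa_clear,
    }

# Illustrative month-end counts consistent with the slide's annotations:
print(defect_backlogs(reported=520, cr_approved=470, qa_clear=270))
# -> {'awaiting_analysis': 50, 'awaiting_implementation': 200}
```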

Forecast of Errors
[Chart: cumulative error count (0-320) per project week (213 to 290), showing Actuals to date and an Estimated curve extrapolated forward.]

Used to justify the estimate of forward load. (A sketch of a density-based forecast follows.)
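The analysis slide earlier listed a forecast of errors remaining and of the cost of clearing them. The project's actual model is not shown; a simple density-based sketch consistent with the figures it collected (hypothetical names, illustrative numbers from the System Summary Sheet):

```python
# Hedged sketch of the error forecast: apply the achieved hard-error
# density to the untested lines, and price each remaining error at the
# achieved man-hours per hard error found by testing. Not confirmed as
# the project's actual model.
def forecast_errors_remaining(loc_estimated: int, loc_tested: int,
                              tested_loc_per_hard_error: float) -> float:
    untested = loc_estimated - loc_tested
    return untested / tested_loc_per_hard_error

def forecast_clearing_cost(errors_remaining: float,
                           mh_per_hard_error_testing: float) -> float:
    return errors_remaining * mh_per_hard_error_testing

# With the illustrative system totals (74,268 lines estimated, 32,549
# tested, one hard error per 74.39 tested lines, 17.11 MH/hard error):
remaining = forecast_errors_remaining(74268, 32549, 74.39)
print(round(remaining))                                 # -> 561 errors remaining
print(round(forecast_clearing_cost(remaining, 17.11)))  # -> 9596 (~9,600 MH to clear)
```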

Test Coverage Metrics


Integration Team Test Cases


[Chart: test cases (0-220) per project week (218 to 261), with three cumulative series: Totals, Tried, and Passed.]

Annotation: testing behind schedule, but the software is performing to specification.

Acceptance Dry Run Test Cases


[Chart: test cases (0-10,500) per project week (216 to 263), with three series: Planned, Total Tested, and Passed.]

Annotations on the chart:

- Software performing to specification, with some schedule recovery
- Problems being experienced, plus schedule slippage
- Testing behind schedule, but software performing to specification

(A sketch of reading schedule and quality status from such series follows.)
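Both coverage charts separate schedule status (tried vs. planned) from quality status (passed vs. tried). A minimal sketch of that reading, with hypothetical names and an arbitrary pass-rate threshold:

```python
# Hypothetical status reading for the test-coverage charts: the gap
# between planned and tried measures schedule slip; the pass rate of
# what has been tried measures whether the software meets spec.
def coverage_status(planned: int, tried: int, passed: int) -> str:
    schedule = "on schedule" if tried >= planned else "behind schedule"
    # 95% is an arbitrary illustrative threshold, not from the slides
    to_spec = tried > 0 and passed / tried >= 0.95
    quality = "performing to specification" if to_spec else "problems being experienced"
    return f"{schedule}; {quality}"

# Illustrative week-end figures:
print(coverage_status(planned=8000, tried=6500, passed=6400))
# -> behind schedule; performing to specification
```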

Feedback Is Important
The results of the analysis of the metrics were provided:

- By presentation at the project monthly review
- By Group Managers receiving a comprehensive report
- By Team leaders receiving an abridged report for their area
- By publishing summary information on the notice board

Words of Caution
Do Not Launch Into Collecting Metrics Without First:

- Defining what information you require
- Determining what questions will elicit that information
- Determining whether metrication is the right way to answer those questions
- Defining the metrication that will provide the answers

Words of Warning
Expect a lack of enthusiasm from those supplying data unless:

- It is made clear that they are not being assessed
- They can be convinced that the programme is to improve the process
- They are kept fully informed of the results of the analysis

Automate the process as much as you can.

Costs
- There were no costs incurred by engineers in supplying data for defect analysis
- Completion of the Package Software Metric Form accounted for less than 1% of the cost of software implementation and test
- Completion of the Integration Metrics Form accounted for less than 0.2% of the cost of the build production and testing
- Extracting data from the test logs for the reliability models was not an overhead, as it was required by the contract
- The time taken to assemble the data and produce the report was on the order of 2 days each month, following the initial setting up of the required INGRES reports and spreadsheets to receive data and produce graphics

Benefits
- Much greater visibility of the status of the project
- Reduced risk, by identifying failure hotspots and poor performance early
- Provided a measure of the quality of the product at each stage of the development life-cycle
- Much greater awareness by engineers of the need to improve quality on the project
- Provided an alternative (more accurate?) view of progress, through everyone contributing
- Provided a measure of the effectiveness of Standards and Procedures

Contact Details
Email: don.harvey@gecm.com
Phone: +44 (0) 1276 696901
