Don Harvey
Agenda
- The Project
- Software Statistics
- Why Software Metrics (Goals)
- Software Implementation Analysis
- Tracking System Status
- Defect Analysis
- Test Coverage Metrics
- Costs/Benefits
- Words of Caution/Warning
The Project
- Air Defence Command and Information System for the British Army, designed to improve the safety of friendly aircraft during hostilities
- Installed in 50+ command cells and over 750 weapons systems, interfacing via a secure battlefield communications system
- Worth ~$250M
- Primarily Ada
- 70 engineers in the software development teams
- 5 years development
Why Software Metrics (Goals)
Assist project management by providing metrics for project planning and tracking, e.g.:
- Improving the accuracy of estimation
- Determining software productivity
- Determining the cost to complete
- Determining forward load
- Monitoring the performance of software teams
- Identifying potential problems
- Measuring the effectiveness of the test strategy
- Determining software quality
- Defect analysis
- Determining system software reliability
Metrics collected:
- Software size (estimated)
- Actual number of lines of code (tracked)
- Number of errors found by inspection (tracked)
- Number of errors found during testing (tracked)
- Time for test and inspection (tracked)
- Number of man-hours to date (tracked)
- Number of errors found in supporting documentation (tracked)
Metrics derived:
- Average man-hours/hard error at inspection
- Average man-hours/hard error by testing
- Average man-hours/tested line of code
- Hard error density
- Total man-hours for fully tested code to date
- Forecast of number of errors remaining
- Forecast cost of clearing the remaining errors
- Forecast of man-hours to completion
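The derived figures are simple ratios over the tracked raw data. A minimal sketch of that arithmetic, assuming one record per software package (the field names are illustrative, not the project's actual form layout):

```python
from dataclasses import dataclass

@dataclass
class PackageMetrics:
    # Raw data tracked per package (hypothetical field names)
    lines_of_code: int           # actual LOC
    man_hours: float             # total man-hours to date
    inspection_hours: float      # time spent in inspection
    test_hours: float            # time spent in test
    hard_errors_inspection: int  # hard errors found by inspection
    hard_errors_testing: int     # hard errors found during testing

def derive(packages):
    """Roll raw per-package data up into the derived tracking metrics."""
    loc = sum(p.lines_of_code for p in packages)
    insp_errors = sum(p.hard_errors_inspection for p in packages)
    test_errors = sum(p.hard_errors_testing for p in packages)
    return {
        "man_hours_per_loc": sum(p.man_hours for p in packages) / loc,
        "man_hours_per_hard_error_inspection":
            sum(p.inspection_hours for p in packages) / insp_errors,
        "man_hours_per_hard_error_testing":
            sum(p.test_hours for p in packages) / test_errors,
        "hard_error_density": (insp_errors + test_errors) / loc,
    }
```

The forecast metrics (errors remaining, cost to complete) then extrapolate from ratios like these together with the estimated size of the code still to be tested.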
Metrics form (extract), fields 6 to 13:
6. Time taken to produce package build test spec
7. Time taken to prepare test environment
8. Time taken to test package
9. Time taken to produce package build test report
10. Number of errors found in Ada source during test phase
11. Number of errors found in package test or design documentation during test phase
12. Time taken for formal review of: package build test spec preparation; package build test report review; Ada source (if changed since inspection)
13. Number of other errors found throughout development of package
- Total man-hours/component: 614.32
- Average man-hours/LOC: 0.58
- Man-hours/hard error (inspection): 1.46
- Man-hours/hard error (testing): 9.11
[Tracking spreadsheet extract: columns for date, total man-hours, man-hours/component, average man-hours/line, and average man-hours/hard error at inspection and at testing.]
Achieved for fully tested Ada:
- Man-hours/hard error (inspection): 3.65
- Man-hours/hard error (testing): 17.11
- Error density: 1 error per 74 lines
System Summary
[Chart: lines of code per component, y-axis 18,000 to 32,000.]
Defect Analysis
[Chart: error count against project phase.]
[Chart: errors found by inspection and by testing, weeks 143 to 187.]
[Chart: error count per month over the development (Mar to Jun, approximately 28 months), y-axis 0 to 400.]
Forecast of Errors
[Chart: actual and estimated error counts by week, weeks 213 to 290, y-axis 0 to 320.]
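The deck does not say which model produced the estimated curve. As an illustration only, if cumulative error discovery is assumed to saturate exponentially, N(t) = N_total * (1 - exp(-b*t)), the errors remaining can be estimated with a crude brute-force least-squares fit (the grid ranges here are arbitrary assumptions):

```python
import math

def forecast_errors(weeks, cumulative_errors):
    """Fit N(t) = n_total * (1 - exp(-b*t)) by brute-force least squares
    and forecast how many errors remain undiscovered."""
    t0 = weeks[0] - 1                 # shift so the first week is t = 1
    found = cumulative_errors[-1]     # errors discovered so far
    best = (float("inf"), found, 0.0)
    for n_total in range(found, 3 * found):     # candidate totals
        for i in range(1, 200):                 # candidate decay rates
            b = i / 1000.0
            sse = sum(
                (n_total * (1.0 - math.exp(-b * (w - t0))) - y) ** 2
                for w, y in zip(weeks, cumulative_errors)
            )
            if sse < best[0]:
                best = (sse, n_total, b)
    _, n_total, b = best
    return {"estimated_total": n_total, "errors_remaining": n_total - found}
```

A real programme would more likely use an established reliability-growth model fitted with proper regression; this sketch only shows the shape of the calculation.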
Chart annotations distinguished "software performing to specification with some schedule recovery" from "problems being experienced plus schedule slippage".
[Chart: planned, total, tested, and passed test cases by week, weeks 216 to 234, y-axis 0 to 5,000.]
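Tracking planned vs. tested vs. passed test cases is what lets a chart carry annotations like those. A toy classifier in that spirit (the 10% and 5% tolerances are my assumptions, not the project's actual criteria):

```python
def weekly_test_status(planned, tested, passed):
    """Classify one week's test-case counts into a coarse status line."""
    behind_schedule = tested < 0.9 * planned  # assumed 10% schedule tolerance
    high_failures = passed < 0.95 * tested    # assumed 5% failure tolerance
    if behind_schedule and high_failures:
        return "problems being experienced plus schedule slippage"
    if behind_schedule or high_failures:
        return "schedule recovery required"
    return "performing to specification"
```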
Feedback Is Important
The results of the metrics analysis were provided:
- By presentation at the monthly project review
- By Group Managers receiving a comprehensive report
- By Team Leaders receiving an abridged report for their area
- By publishing summary information on the notice board
Words of Caution
Do Not Launch Into Collecting Metrics Without First:
- Defining what information you require
- Determining what questions will elicit that information
- Determining if metrication is the right way to answer those questions
- Defining which metrics will provide the answers
Words of Warning
Engineers will support a metrics programme provided that:
- It is made clear that they are not being assessed
- They can be convinced that the programme is to improve the process
- They are kept fully informed of the results of the analysis
Costs
- There were no costs incurred by engineers in supplying data for defect analysis
- Completion of the Package Software Metric Form accounted for less than 1% of the cost of software implementation and test
- Completion of the Integration Metrics Form accounted for less than 0.2% of the cost of build production and testing
- Extracting data from the test logs for the reliability models was not an overhead, as it was required by the contract
- Assembling the data and producing the report took of the order of 2 days each month, once the INGRES reports and spreadsheets that receive the data and produce the graphics had been set up
Benefits
- Much greater visibility of the status of the project
- Reduced risk, by identifying failure hotspots and poor performance early
- Provided a measure of the quality of the product at each stage of the development life-cycle
- Much greater awareness among engineers of the need to improve quality on the project
- Provided an alternative (more accurate?) view of progress, through everyone contributing
- Provided a measure of the effectiveness of standards and procedures
Contact Details
Email: don.harvey@gecm.com
Phone: +44 (0) 1276 696901