
PMMT Methodologies

28th October 2011

Waterfall lifecycle
[Figure: stages in sequence, each feeding the next]
Feasibility study → Requirements specification → Overall design → Detailed design → Code and unit test → Integration → Implementation → Operations and maintenance

Figure 6.1 The waterfall model of system development lifecycles
Source: Derived from Computer, May 1988, © 1988 IEEE

The b model
[Figure: the lifecycle runs Inception → Analysis → Design → Production → Acceptance → Operation; from Operation, a maintenance cycle of Evaluation → Analysis → Design → Production loops back into Operation, giving the diagram its 'b' shape]

Figure 6.2 The b model
Source: Derived from N D Birrell & M A Ould, A Practical Handbook for Software Development, Cambridge University Press, 1985

The V model
[Figure: V-shaped lifecycle. Down the left-hand leg, each activity produces a product and a matching test plan; up the right-hand leg, each testing level checks the system against the plan prepared opposite it]
- Requirements definition: customer requirements → requirements specification; develop/agree acceptance criteria
- High-level design: → technical specification; generate system test plan; produce integration test plan
- Low-level design: produce module test plan
- Module specifications → module production → developed modules
- Right-hand leg: module testing → tested modules; integration and test → integrated system; system testing → tested system; customer acceptance (against the acceptance criteria) → accepted system

Source: Adapted and reproduced with the permission of the National Computing Centre Limited from the STARTS Guide, 1987, which was supported by the Department of Trade and Industry

The incremental approach


[Figure: a common front end of Feasibility study, Requirements definition & development planning and High-level design, followed by three increments. Each increment runs Detailed design → Code & unit test → System integration → Installation & system test → Operation, with feedback, validation and verification flowing between increments]

Figure 6.4 The incremental model




The spiral model (evolutionary development)

[Figure: each loop of the spiral passes through four quadrants: determine objectives, alternatives and constraints; evaluate alternatives, identify and resolve risks; develop and verify the next-level product; plan the next phases]

Figure 6.5 Boehm's spiral model
Source: Based on Computer, May 1988, © 1988 IEEE




Traditional approach to analysis


[Figure: exchange between Users and Developers. Users supply terms of reference and information, review the draft requirements specification and return review comments. Developers analyse the requirements (producing analysis notes etc), specify the requirements (draft requirements specification), revise it into the agreed requirements specification, then produce the high-level system design, which feeds program specification, coding etc]

Limited user involvement; driven by developers

Figure 6.6 The traditional approach




Structured systems development


[Figure: exchange between Users and Developers, with information, decisions, questions and suggestions flowing both ways. Users provide data and review analysis results, assist in and agree the logical view and logical design, and review the physical design. Developers analyse the current physical system (agreed physical description), derive the logical description, produce the logical design and then the physical design; the agreed physical design feeds program specification, coding etc]

Greater user involvement at all stages; driven by users

Other approaches
Rapid Application Development (RAD)
Object-Oriented development
Component-based development
Extreme programming
Package-based IS projects
Soft systems methodology
Socio-technical approach
Business Process Reengineering (BPR)

Generic process model


[Figure: six phases in sequence: Pre-project work, Project start-up, Development stage (system development; data acquisition and take-on), Completion stage (delivery to customer; preparation of documentation; site surveys and preparation; staff training and familiarisation; acceptance testing; system commissioning; customer takeover), Operational stage (in-service live running; warranty, support & maintenance; enhancements) and Post-project review. Key products of each phase:]
- Project start-up: Project Initiation Document (PID), project plan, quality plan, risk mgmt plan, project org'n, project admin.
- Development stage: requirements specification, technical specification, module specifications, prototypes, H/W and S/W modules, system test results, factory acceptance
- Completion stage: site acceptance, trained staff, commissioned system
- Operational stage: fixes, enhancements
- Post-project review: post-project review report

Figure 7.1 Generic process model for software development



Pre-project work
Establish scope and objectives
Invite tenders for supply
Evaluate tenders, select suppliers
Agree contractual framework
Set up subcontracts if necessary


Project start-up
Finalise scope; agree Project Initiation Document (PID):
- Objectives: what the project's about
- Scope: what's in (and out)
- Constraints: timescale, budget, methods, etc
- Authority: who represents the customer?
- Resources: people, equipment, finance etc

Agree roles and responsibilities
Develop initial plans


Development stage
Requirements definition
Design
Implementation (doing the development work)
Integration and testing
System testing


Completion stage
Delivery to the customer
Training and familiarisation
Acceptance testing
Customer acceptance
System commissioning
Final customer takeover


Operational stage
Live running of system
Fixing problems not found in tests
Enhancements and modifications


Post-project review
Technical methods and standards: how effective, problems and resolution
Project risks: how handled
Contractual issues: how resolved
Customer/supplier relationship issues
Stakeholder management issues
Team resourcing issues
Project performance against plans


Work breakdown structure (WBS)


[Figure: three-level work breakdown structure]
Top level: Project
Second level: Conduct investigation; Prepare report
Third level (under Conduct investigation): Conduct interviews; Investigate other systems; Analyse requirements; Investigate packages; Investigate hardware
Conduct interviews breaks down further into Interview Managing Director, Interview Finance Director, Interview Stores Manager, Interview Sales Manager, etc.

Figures 8.1 - 8.3 Work breakdown structures




Product breakdown structure (PBS)


[Figure: three-level product breakdown structure]
Top level: Project products
Second level: Specialist products; Management products
Third level (analysis products, under Specialist products): Feasibility report; Interview notes; Requirements catalogue; Data flow diagrams; Package reports

Figures 8.5 - 8.8 Product breakdown structures




Product flow diagram


[Figure: interview notes feed two streams: Create requirements catalogue → draft requirements catalogue → Review requirements catalogue → reviewed requirements catalogue, and Create data flow diagrams → draft data flow diagrams → Review data flow diagrams → agreed data flow diagrams → Add extra requirements, which feeds back into the requirements catalogue]

Figure 8.9 PRINCE2 product flow diagram




Product description
Purpose
Composition
Derivation
Quality/completion criteria
Can add:
- Format
- Related products
- Review methods


Work packages
[Figure: a training course broken down into trainer's materials (notes, exercises, handouts, visual aids), which are regrouped into work packages]
Work package 1: Notes for session 1; Exercises 1A and 1B; Handout on Planning; Slides for session 1
Work package 2: Notes for session 2; Exercise 2A; Handout on Scheduling; Slides for session 2

Figure 8.10 Work packages for a training course



Linear responsibility chart


[Figure: matrix crossing the organisation breakdown (Project Sponsor, Project Manager, Chief designer, Analysis team leader, Test manager, Development manager, Project support assistant, Senior user) against the product/work package breakdown (interview notes, requirements catalogue, use case diagram, package review, report text, report illustrations, report appendices). Each cell carries a code: R = Responsible, A = Accountable, C = Consultation, I = Information. Alternatively, the codes I = Initiation, E = Execution, A = Approval, C = Consultation, S = Supervision could be used]

Figure 8.11 Linear responsibility chart



Network diagram (activity-on-arrow)


[Figure: from Start, Conduct interviews leads to Analyse requirements, while Investigate other systems, Investigate packages and Investigate hardware run in parallel; all paths converge on Produce report, which leads to Finish]

Figure 8.12 Network diagram (activity-on-arrow format)



Network with durations & critical path


[Figure: the same network with activity durations added (Analyse requirements 3, Investigate other systems 4, Investigate packages 8, etc) and the critical path marked]

Figure 8.13 Network diagram with durations & critical path added

Network diagram (activity-on-node)


[Figure: activity-on-node network. Durations in days: Conduct interviews (8), Analyse requirements (3), Investigate other systems (4), Investigate packages (8), Investigate hardware (5), Produce report (5). Each node carries its earliest and latest start and finish times; the project completes at day 21]

Figure 8.15 Network diagram (activity-on-node format)
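The earliest/latest times shown on such a network come from a standard forward and backward pass. A minimal sketch in Python, using the durations from the figure; the dependency links are my reading of the diagram, so treat them as assumptions:

```python
# Forward/backward pass over the example network (durations from the figure).
# The predecessor lists are assumed from the diagram, not stated in the text.
activities = {
    "Conduct interviews": 8, "Analyse requirements": 3,
    "Investigate other systems": 4, "Investigate packages": 8,
    "Investigate hardware": 5, "Produce report": 5,
}
preds = {  # assumed predecessor lists
    "Conduct interviews": [], "Investigate other systems": [],
    "Analyse requirements": ["Conduct interviews"],
    "Investigate packages": ["Conduct interviews"],
    "Investigate hardware": ["Investigate other systems"],
    "Produce report": ["Analyse requirements", "Investigate packages",
                       "Investigate hardware"],
}

# Forward pass: earliest start (es) and earliest finish (ef).
es, ef = {}, {}
for a in preds:  # dict order is already topological here
    es[a] = max((ef[p] for p in preds[a]), default=0)
    ef[a] = es[a] + activities[a]

# Backward pass: latest finish (lf) and latest start (ls).
project_end = max(ef.values())
succs = {a: [b for b in preds if a in preds[b]] for a in preds}
lf, ls = {}, {}
for a in reversed(list(preds)):
    lf[a] = min((ls[s] for s in succs[a]), default=project_end)
    ls[a] = lf[a] - activities[a]

critical = [a for a in preds if es[a] == ls[a]]  # zero-float activities
print(project_end)  # 21
print(critical)
```

Under these assumed links, the zero-float path runs Conduct interviews → Investigate packages → Produce report, matching the 21-day finish shown in the figure.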



IS estimating issues
Unique projects with much innovation
Estimates often produced early, before the specification is agreed
No professional estimators
Few published metrics available


Analogy method
Find a similar project:
- Type of business
- Size of applications
- Scope of systems
- Technical methods and standards

Must adjust for:
- Organisational culture
- Users' level of computer literacy
- Degree of management support for project


Analysis and programming approaches


Explicitly estimate for one stage:
- Analysis method → analysis stage
- Programming method → code/unit test stage

Extrapolate the whole-project outcome from the stage estimate
Must adjust for:
- Project size
- Familiarity with business and technical environment
- Technical complexity


Direct estimate (from project breakdown)


Break down the project (using either the WBS or the PBS approach)
Estimate for each task/product
Sum tasks/products to get stages
Sum stages to get the project
Takes time and requires expert estimators
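The roll-up described above is simple arithmetic; a sketch with illustrative task names and figures (not from the text):

```python
# Bottom-up (direct) estimating sketch: task estimates roll up into stage
# totals, and stage totals into a project total. Figures are illustrative.
stages = {
    "Analysis": {"Conduct interviews": 8, "Analyse requirements": 3},
    "Reporting": {"Produce report": 5},
}
stage_totals = {name: sum(tasks.values()) for name, tasks in stages.items()}
project_total = sum(stage_totals.values())
print(stage_totals, project_total)  # {'Analysis': 11, 'Reporting': 5} 16
```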


Delphi technique
Several estimators are given the specification of the work and asked for estimates
The estimates are summarised anonymously and the results circulated to the estimators
Estimators can revise their estimates in the light of others' ideas
The method reduces personal disagreements and ego-based issues


CoCoMo
Formulae based on thousands of delivered source instructions (KDSI)
Basic, intermediate and detailed versions
CoCoMo II now developed for a wider range of development approaches
Useful elapsed-time formula: 2.5 × (estimated effort in person-months)^0.33
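A small sketch of the formula above. The elapsed-time expression is the one quoted in the slide; the organic-mode effort formula (2.4 × KDSI^1.05) is Boehm's published basic CoCoMo figure, added here for context:

```python
# Basic CoCoMo sketch: effort from size, then elapsed time from effort.
def effort_person_months(kdsi: float) -> float:
    return 2.4 * kdsi ** 1.05        # organic-mode basic CoCoMo

def elapsed_months(effort_pm: float) -> float:
    return 2.5 * effort_pm ** 0.33   # elapsed-time formula from the slide

effort = effort_person_months(10)    # a 10-KDSI system
print(round(effort, 1), round(elapsed_months(effort), 1))  # 26.9 7.4
```

Note how weakly elapsed time grows with effort: roughly tripling the effort adds only about 40% to the schedule, which is why adding people late rarely rescues a slipping project.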


Function point analysis


Based on analysis of the inputs, outputs and files accessed in a system
Starts with unadjusted function points
Then adjusts for:
- technical complexity
- performance-influencing factors
- risks
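A sketch of the adjustment step, using the common IFPUG-style technical complexity factor of 0.65 + 0.01 × (sum of 14 characteristic ratings); the ratings and unadjusted count below are illustrative, not from the text:

```python
# Function point adjustment sketch: unadjusted function points (UFP) scaled
# by a technical complexity factor built from 14 ratings, each 0-5.
def adjusted_fp(ufp: float, ratings: list[int]) -> float:
    assert len(ratings) == 14 and all(0 <= r <= 5 for r in ratings)
    tcf = 0.65 + 0.01 * sum(ratings)  # ranges from 0.65 to 1.35
    return ufp * tcf

# 200 UFP with every characteristic rated "average" (3):
print(round(adjusted_fp(200, [3] * 14), 1))  # 214.0
```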


Supporting activities
Can considerably inflate base estimates
Includes:
Proportional activities:
- Team leading/supervision
- Documentation
- Quality control
- Customer reviews
- Project management
- Systems management
- Configuration management
- Project office work

Elapsed-time activities:


Effort and elapsed time


Effort = total volume of work
Elapsed time depends on effort and also on:
- How many resources are available
- What proportion of their time is available to the project
- Delays outside the team's control (eg lead times for hardware)
- Dependencies on others
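A rough way to express the first two factors (a sketch with illustrative figures; real schedules also absorb the external delays and dependencies listed above):

```python
# Elapsed time grows as effort is spread over fewer people or partial
# availability. Purely illustrative arithmetic.
def elapsed_weeks(effort_weeks: float, people: int, availability: float) -> float:
    return effort_weeks / (people * availability)

# 12 person-weeks of effort, two people each half-time on the project:
print(elapsed_weeks(12, 2, 0.5))  # 12.0
```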


Network diagram
[Figure: activity-on-node network. Durations in days: Conduct interviews (8), Analyse requirements (3), Investigate other systems (4), Investigate packages (8), Investigate hardware (5), Produce report (5). Each node carries its earliest and latest start and finish times; the project completes at day 21]

Figure 10.1 Dependency network with activity durations




Bar chart
Activities: Conduct interviews; Investigate other systems; Analyse requirements; Investigate packages; Investigate hardware; Produce report

[Figure: Gantt-style bar chart of these activities against a timescale in days running to 35]

Figure 10.3 Schedule for two-person team showing parallel activities




Bar chart with milestones added


[Figure: the same bar chart of the six activities with milestone markers added on the timescale]

Figure 10.6 Bar chart showing project milestones




Bar chart with overhead task added


[Figure: the same bar chart with Project management added as a continuous overhead task spanning the full 35 days, plus milestone markers]

Figure 10.7 Bar chart showing project management as continuous activity over project


Bar chart and resource histogram


[Figure: the bar chart with a resource histogram beneath it, showing staff numbers (1 to 3) over the timescale in days]

Figure 10.8 Bar chart with resource histogram



The project and other plans


[Figure: the PROJECT PLAN answers Why? What? When? Where? Who?; the QUALITY PLAN answers How?; the RISK MANAGEMENT PLAN answers Why not?]

One document or two? One plan or three?


PRINCE2 plans
[Figure: plan hierarchy with the Programme Plan at the top, the Project Plan beneath it, then the Stage Plan, with Exception Plans and Team Plans at the lowest level]

Figure 10.10 PRINCE2 plans



Contents of PRINCE2 project/ stage plan


Product breakdown structure
Product flow diagram
Activity network
Financial budget
Resource requirements
Risk assessment
Quality plan
Gantt charts
Product descriptions for major products

Figure 10.11 Contents of PRINCE2 project and stage plans



Project budget
BUDGET FOR: NEW CUSTOMER CONTACT SYSTEM
[Table: monthly figures from March to September against ten expenditure codes (A Direct labour, B Sub-contract work, C Hardware, D Software, E Telecommunications, F Travel, G Accommodation and subsistence, H Project-specific training, I Support services, J Consultancy support), with a 10% contingency applied to items B-J and monthly totals]

Figure 10.13 Example budget for an IT project



Effort monitoring timesheet


[Figure: weekly timesheet for Dave Sims, week ending 19 May, recording hours per day (Monday to Sunday) against task codes, each with a running total and an effort-to-go figure]
- A/01 Code program CV004: total 19.0 hours, nil to go
- A/02 Test program CV004: total 10.5 hours, nil to go
- A/03 Code program EN025: total 4.0 hours, 15.0 to go
- M/03 Team meeting: 1.0 hour
- M/02 Complete timesheet: 0.5 hours

Figure 11.1 Effort monitoring timesheet




Bar chart illustrating progress

Figure 11.2 Bar chart used to illustrate project progress



Monitoring quality
Self-checking by author
Peer review
Team leader review
Walkthrough
Fagan inspection
External review


Milestone slip chart

Figure 11.4 Milestone slip chart




Earned value analysis original plan

[Figure: planned cumulative spend on Programs A to E over a 12-week timescale, rising to the original budget of 10,000]

Figure 11.5 Earned value analysis original plan




EVA situation at second progress check


[Figure: the same chart at the second progress check: 3,000 has been spent so far, shown against the planned spend for Programs A to E and the original budget of 10,000]

Figure 11.6 Earned value analysis situation at second progress check




EVA formulae
BCWS = budgeted cost of work scheduled
ACWP = actual cost of work performed
BCWP = budgeted cost of work performed
BCWP - ACWP = cost variance
BCWP - BCWS = schedule variance
BCWP / ACWP = cost performance index
BCWP / BCWS = schedule performance index
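The four measures follow directly from the three base figures; a small sketch with illustrative sample values (not from the charts above):

```python
# Earned value measures computed from the three base figures.
def eva(bcws: float, acwp: float, bcwp: float) -> dict:
    return {
        "cost variance": bcwp - acwp,       # positive = under cost
        "schedule variance": bcwp - bcws,   # positive = ahead of schedule
        "CPI": bcwp / acwp,                 # > 1 = cost-efficient
        "SPI": bcwp / bcws,                 # > 1 = ahead of schedule
    }

# Illustrative figures: 3,000 spent against 4,000 scheduled, with 3,600
# of budgeted work actually completed.
print(eva(bcws=4000, acwp=3000, bcwp=3600))
```

With these figures the project is under cost (CPI 1.2) but behind schedule (SPI 0.9): less work was done than planned, but it cost less than budgeted.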

Monitoring and control cycle


[Figure: cycle of Monitor progress, then Evaluate progress, then the question Satisfactory? If yes, return to monitoring; if no, assess possible control actions, select and implement a control action, and return to monitoring]

Figure 12.1 Monitoring and control cycle




Trade-offs: the triple constraint


[Figure: triangle with Time, Cost and Quality at its corners; a 'competitive edge' project sits closest to Time, a low-budget project closest to Cost, and a safety-critical project closest to Quality]

Figure 12.2 Time/cost/quality triangle for projects


Change control
Devise change control procedure

Identify change

Assess impact of change

Decide what to do


Recipients of progress reports


[Figure: the progress report is distributed to IT management, customer management, quality assurance, the users and the project team]


Report contents (typical)


Period covered
Narrative summary of progress
Milestones achieved / deliverables completed
Problems encountered (and solutions)
Projected completion date
Costs to date and predicted
Changes identified and implemented


Reporting in PRINCE2
Project Initiation Document (PID)
End stage assessment
Mid-stage assessment
Highlight report
Checkpoint report
Project closure report


Some definitions of Quality


Fitness for purpose; conformance to requirements (common business definitions)
'The totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs.' (ISO 8402, 1991)


EFQM business excellence model

Figure 14.1 The business excellence model of EFQM




Five levels of software process maturity

Figure 14.2 The five levels of software process maturity



Structure of a quality management system

Figure 14.3 Quality management system structure




Quality control methods


[Figure: quality control methods plotted by cost against rigour (thoroughness), rising from self-checking through peer review, management review and walkthrough/inspection to external review]
Figure 14.4 Quality control methods compared

The cost of quality against time

Figure 14.5 Cost of quality against time



The software development lifecycle


[Figure: V-shaped lifecycle. Down the left-hand leg, each activity produces a product and a matching test plan; up the right-hand leg, each testing level checks the system against the plan prepared opposite it]
- Requirements definition: customer requirements → requirements specification; develop/agree acceptance criteria
- High-level design: → technical specification; generate system test plan; produce integration test plan
- Low-level design: produce module test plan
- Module specifications → module production → developed modules
- Right-hand leg: module testing → tested modules; integration and test → integrated system; system testing → tested system; customer acceptance (against the acceptance criteria) → accepted system

Source: Adapted and reproduced with the permission of the National Computing Centre Limited from the STARTS Guide, 1987, which was supported by the Department of Trade and Industry


Configuration management
Involves:
Identifying the components of the system
Giving each a unique version number
Controlling changes to each component
Tracking the interaction of components
Controlling release of components into the live environment


Risk management process


[Figure: cycle of Plan risk management approach (producing the risk management plan), Identify risks, Assess risks (recorded in the risk register, or risk log), Assign actions and owners, and Review risks, repeating until the project's objectives are achieved]

Figure 15.1 The risk management process



Risk breakdown structure


Project risks break down into six categories:

Commercial risks:
- No or poor business case
- More than one customer
- Inappropriate contract type
- Penalties for non-performance
- Ill-defined scope
- Unclear payment schedule
- Payments not linked to deliverables

Relationship risks:
- Unclear customer structure
- Poor access to stakeholders
- Internal customer politics
- Multiple stakeholders
- Users not committed to project
- Unwillingness to change
- Management and users disagree

Requirements risks:
- Requirements not agreed
- Requirements incomplete
- Requirements not detailed enough
- Ambiguity in requirements
- No single document of requirements
- Stringent non-functional requirements
- Acceptance criteria not agreed

Planning and resource risks:
- PM not involved in initial planning
- Project very large with quick build-up
- Estimates not based on metrics
- Excessive reliance on key staff
- Developers lack key skills
- Inexperience of business area
- Inexperience of technology

Technical risks:
- Environment new to developers
- Development and live environments differ
- Restricted access to environment
- Unfamiliar system software
- Lack of technical support
- Unfamiliar tools/methods/standards
- New/unproven technology used

Subcontract risks:
- No/little experience of suppliers
- Suppliers in poor financial state
- Difficulty staging tests of items
- No choice of supplier
- Use of proprietary products
- Subcontracts not 'back-to-back' with main contract

Figure 15.2 Risk breakdown structure



Risk map
[Figure: 3 x 3 grid plotting likelihood of occurrence (high, medium, low) against potential scale of impact (small, moderate, large)]

Figure 15.3 Risk map
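Placing risks on the map can be sketched as a simple scoring rule; the numeric scheme below is illustrative, not from the text:

```python
# Combine likelihood and impact ratings into a priority band.
# The 1-3 scale and the band thresholds are illustrative choices.
LIKELIHOOD = {"Low": 1, "Medium": 2, "High": 3}
IMPACT = {"Small": 1, "Moderate": 2, "Large": 3}

def priority(likelihood: str, impact: str) -> str:
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    return "high" if score >= 6 else "medium" if score >= 3 else "low"

print(priority("High", "Large"))    # high
print(priority("Low", "Moderate"))  # low
```

High-priority risks (top-right of the map) get active mitigation and an owner; low-priority ones may simply be watched at each review.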


Risk register (risk log)


For each risk:
- Reference
- Title and description
- Current status
- Potential impact(s): description, size, likelihood
- Risk owner
- Action(s)
- Action log
