
Crime Data Collection, Aggregation and Assimilation System (CDCAAS)

Version 1.0

Team Crime Busters
Sonal Verma, Brian H. Park, Akhil Pathania, John Kraus
SWE 625 - Software Project Management
Professor Ken Nidiffer
Submission Date: December 8, 2008

Crime Data Collection, Aggregation and Assimilation System (CDCAAS) Team Crime Busters SWE 625 - Professor Kenneth Nidiffer

Version 1.0 12/08/2008

Revisions
Revision Number | Date | Section Name | Description | Section Updated | Remarks
0.1 | 9/11/2008 | Preface and 1.1 Project Overview | First draft | Preface and 1.1 | Decided project name and team name
0.1 | 9/13/2008 | Preface and 1.1 Project Overview | Update document based on team review | Preface, 1.1.1 and 1.1.2 |
0.1 | 9/14/2008 | Project Description | Added Figures | All |
0.12 | 9/18/2008 | Project Description | Updated the package description | 1.1.1 |
0.12 | 9/21/2008 | Requirement Estimate | First Draft | 5.3 |
0.12 | 9/28/2008 | Process Model | First Draft | 2.1 |
0.13 | 10/25/2008 | Productivity | First Draft | 3.4.4 |
0.14 | 11/09/2008 | Dependencies | First Draft | 5.2 |
0.15 | 11/16/2008 | Reference Materials | First Draft | 1.4 |
1.0 | 12/08/08 | Complete | Final Draft | |

Preface
The following preface is a short narrative on the purpose of the Crime Data Collection, Aggregation and Assimilation System (CDCAAS) project. The CDCAAS will provide police officers and investigators with the ability to conveniently and quickly acquire and assimilate, from a variety of diverse electronic sources, the information needed to help solve or prevent crimes and to hunt down and apprehend criminals. CDCAAS will be a valuable tool for investigators seeking to connect the dots in their efforts to protect the public and bring wrongdoers to justice.

CDCAAS is the first in a new series of application software suites for System XYZ. System XYZ will be the next major product line for our company, making this project and this Software Project Management Plan (SPMP) extremely important to the continued success of the organization. Furthermore, the CDCAAS project embodies our company's ambitious set of strategic redirection actions to strengthen our responsiveness to customers, increase competitiveness by reducing costs, build market share, and field quality products that meet or exceed customer requirements. Our recent market study concluded that the features offered by the proposed set of software packages for the new System XYZ computer system will be crucial in achieving our goals for market penetration. This conclusion has been borne out by the 700 advance orders received for the CDCAAS.


Abstract
The Software Project Management Plan (SPMP) contains detailed documentation of the technical and management aspects of the various stages in the development of the Crime Data Collection, Aggregation and Assimilation System (CDCAAS). The document adheres to the specification provided by IEEE Standard 1058.1. It serves as the controlling document for organizing and managing resources so that they deliver all the work required to complete the project within the defined scope, time and cost constraints. The document is intended to address the key challenges involved in managing the project, such as meeting project deadlines and integrating the inputs needed to meet the predefined objective of providing a cost-effective, efficient and automated way of assimilating data about criminals from one place rather than from many different sources. This document will be subject to revisions as the project progresses.


Project Charter
Project Sponsor Information
Name: Susan Smith
Title: Vice President, New Software and Systems Engineering Development

Business Need: The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) will provide enhanced functionality for more efficient searches of the crime data stored in different kinds of biometric databases. The system shall be used by law enforcement agencies nationwide. The primary goal is to provide users with a unified tool that can work on different platforms, understand all secure formats, and communicate over a broad range of media without compromising the stringent level of security that is specific to government agencies and other corporations. Secondary goals include reducing operational cost and increasing profit margin by providing stable and reliable software.

Business Benefits:
- System XYZ will allow the corporation to become the leading provider of crime data collection, aggregation and assimilation software at all levels of government and in the commercial sector.
- Continued improvement in the quality of life.
- Improved overall data integrity and speed of data access to optimize decision making.
- Faster and higher case-closing rates through more efficient gathering and analysis of crime-related data.

Table 1: Project Sponsor Information
Major Deliverables
Product(s) or Service(s):
- Data Management
- Analysis and Statistics
- Knowledge Distribution and Delivery
- Information Retrieval, Reports and Notification

Schedule Constraints and Assumptions
Planned Start Date: January 28, 2008
Planned End Date: January 28, 2010
Latest End Date:


Schedule Assumptions: CDCAAS will be developed during the next 24 months, with many intermediate product releases during that period. The alpha test is scheduled on or before January 28, 2010.

Schedule Constraints: Schedule allocation is as follows:
- 24 months: CDCAAS development
- 2 months: alpha test at Fairfax, VA (in-house)
- 2 months: beta test at Chelmsford, MA (customer site)
- 2 months: rollout to customer sites, including installation and training
- 6 months: maintenance

Table 2: Major Deliverables
Key Staffing Requirements
Project Manager: | Title: | Date Available:
Project Manager: | Title: | Date Available:

Table 3: Key Staffing Requirements
Other Constraints and Assumptions
Constraints:
- CDCAAS software will extend the System XYZ software.
- CDCAAS will extend the 11 existing System XYZ packages. Three modules will be extended through COTS packages, three will extend reusable portions of software, and the remaining five modules will be custom written.
- One of the five custom modules will be outsourced to Ivan Industries.

Table 4: Other Constraints and Assumptions


Initial Cost Estimate for CDCAAS


Input parameters:
- Labor cost per hour: $200
- Hardware cost per unit: $1,500
- Number of units: 700
- Kernel size: 20 KSLOC

Application Software
Cost Equation 1: (Number of Staff Hours) x Labor Cost per Hour
Total Effort (Staff Months) = 1,030.05
Total Effort (Staff Hours) = 164,807.34
Staff Hours per Month = 160.00
Labor Cost per Hour = $200.00
Application Software Cost = $32,961,467.66

Cost per Package:
Package Name | Staff Months | Cost
Database Management System | 93.31 | $2,985,839.13
Spreadsheet | 41.14 | $1,316,356.25
Reqm'ts & Config. Mgmt. | 174.39 | $5,580,610.84
Secure Communication | 185.41 | $5,933,252.92
Graphics Presentation | 41.11 | $1,315,393.59
Word Processor | 98.27 | $3,144,759.78
Project Management | 41.13 | $1,316,050.46
GPS Navigation | 135.15 | $4,324,790.21
Compiler | 41.09 | $1,315,009.16
Debugger & Test | 102.50 | $3,279,969.71
Electronic Inventory & Tracking | 76.54 | $2,449,435.62
Total | 1,030.05 | $32,961,467.66

System Kernel Software (assuming kernel size is 20 KSLOC)
COCOMO man-month effort = 7.39 * (KSLOC)^1.2
Total Effort (Staff Months) = 269.08
Total Effort (Staff Hours) = 43,052.70
Labor Cost per Hour = $200.00
Kernel Cost = $8,610,540.45


System XYZ Hardware
Cost Equation 2: Cost per Unit x Number of Units
Cost per Unit = $1,500
Number of Units = 700
Hardware Cost = $1,050,000.00

Subtotal = $42,622,008.12
Table 5: Initial Cost Estimate for CDCAAS
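The arithmetic behind Table 5 can be sketched in a few lines of Python. The sketch assumes only the values stated in the table (COCOMO coefficient 7.39 and exponent 1.2, 160 staff hours per month, $200 per labor hour); the computed totals differ from the table by a few hundred dollars because the table carries unrounded intermediate staff-month figures.

```python
# Sketch of the Table 5 cost arithmetic, using the figures stated above.
LABOR_COST_PER_HOUR = 200.00
STAFF_HOURS_PER_MONTH = 160.00

def labor_cost(staff_months: float) -> float:
    """Cost Equation 1: staff hours times labor cost per hour."""
    return staff_months * STAFF_HOURS_PER_MONTH * LABOR_COST_PER_HOUR

# Application software: 1,030.05 staff months across the 11 packages.
application_cost = labor_cost(1030.05)                 # ~ $32.96M

# Kernel: COCOMO effort = 7.39 * KSLOC ** 1.2, with a 20 KSLOC kernel.
kernel_staff_months = 7.39 * 20 ** 1.2                 # ~ 269.08 staff months
kernel_cost = labor_cost(kernel_staff_months)          # ~ $8.61M

# Cost Equation 2: hardware = cost per unit times number of units.
hardware_cost = 1_500 * 700                            # $1,050,000

subtotal = application_cost + kernel_cost + hardware_cost  # ~ $42.62M
```

Running the same equations with the per-package staff-month column reproduces the individual package costs in the table the same way.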


Project Scope
The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) project provides police agencies such as the Federal Bureau of Investigation (FBI), Interpol, and the Department of Homeland Security, as well as other law enforcement agencies and corporations, with a secure, efficient and reliable mechanism to acquire detailed crime-solving information in a cost-effective and time-saving manner. All CDCAAS software will be designed, implemented, purchased, or subcontracted during the life of this project. The project will extend the existing System XYZ software packages using several methods of implementation: three packages will be Commercial-Off-The-Shelf (COTS) products, three packages will leverage reusable components as building blocks, and five packages will be custom designed and developed. One of the five custom packages will be outsourced to Ivan Industries. All sub-phases of the project, including requirements, preliminary and detailed design, code, and unit testing, will be conducted during the CDCAAS project. Subsequently, the CDCAAS project will conduct alpha and beta tests and lead the rollout of the software into production. Installation and user training will be provided by the CDCAAS project. Additionally, the project will provide 24-hour help-desk service and support for a six-month maintenance period after completing installation at each site. Exclusions from the project include any connectivity or supporting hardware installations and delivery necessary for the CDCAAS software to function. Other notable exclusions include multi-language translation packages, the ability to edit CDCAAS documents using other software, staffing support for Ivan Industries, and extended support and maintenance plans.
CDCAAS Hardware (Notebook Configuration):
- Two Universal 2009-B microprocessors, 3.4 GHz
- 20.1-inch display
- A three-button mouse pointing device
- 4 GB of main memory (SDRAM)
- 16 MB video RAM
- 320 GB primary hard drive
- DVD/R/RW and CD-RW combo drive
- A 100 MB ZIP drive
- Integrated 2.0 MP camera
- Printer port
- Asynchronous port
- Integrated 802.11g wireless LAN
- Four USB ports
- 1394/FireWire connector
- A LAN interface card


CDCAAS Notebook Bundled Devices:
- A 56,000 bps capable fax/modem
- Anti-virus/security suite
- Video card
- Speakers
- Headphones
- Laser printer
- 9-cell lithium ion battery
- MP3 30 GB audio/video player
- A bar code scanner
- Personal assistant device (416 MHz, 1 GB RAM)
- 48-bit color flatbed scanner (2400 dpi optical resolution)
- System ABC enhanced keyboard
- A CRT monitor (1,280 x 1,024 non-interlaced; high resolution; bit-mapped; 21-inch color display)
- Port replicator
- A stand for the CRT monitor
- Power connector
- Digital camcorder
- Internal speakers
- Wireless digital phone with voice mail messaging and internet service

CDCAAS Application Packages:
- CDCAAS Database Management System
- CDCAAS Spreadsheet
- CDCAAS Requirements and Configuration Management
- CDCAAS Secure Communication
- CDCAAS Graphics Presentation
- CDCAAS Word Processor
- CDCAAS Project Management
- CDCAAS GPS Navigation
- CDCAAS Compiler
- CDCAAS Debugger & Test
- CDCAAS Electronic Inventory and Tracking


Table of Contents
Revisions .......... ii
Preface .......... iii
Abstract .......... iv
Project Charter .......... v
Initial Cost Estimate for CDCAAS .......... vii
Project Scope .......... ix
Table of Contents .......... xi
Table of Figures .......... xiv
Index of Tables .......... xvii
1. Introduction .......... 1
1.1. Project Overview .......... 1
1.1.1. Project Description .......... 1
1.1.2. Product Summary .......... 10
1.2 Project Deliverables .......... 15
1.2.1 Software Applications, Computer Software Configuration Item (CSCI) .......... 15
1.2.2 Delivery Locations and Quantities .......... 16
1.2.3 Documentation .......... 17
1.2.4 Delivery Customer Acceptance .......... 17
1.3 Evolution of the Software Project Management Plan .......... 18
1.4 Reference Materials .......... 19
1.5 Definitions and Acronyms .......... 19
1.5.1 Definitions .......... 19
1.5.2 Acronyms .......... 22
2. Project Organization .......... 27
2.1 Process Model .......... 27
2.1.1 Process Milestones .......... 27
2.1.2 Baselines .......... 30
2.1.3 Reviews .......... 30
2.2 Organizational Structure .......... 33
2.3 Organization Boundaries and Interfaces .......... 40
2.4 Project Responsibilities .......... 42
2.4.1 Project Manager .......... 42
2.4.2 Assistant Project Manager .......... 42
2.4.3 Chief Software Developer .......... 42
2.4.4 Administrative Assistant .......... 42
2.4.5 System Engineer/Analyst .......... 42
2.4.6 Requirements Analysts .......... 42
2.4.7 Technical Team Leader .......... 42
2.4.8 Software Developers .......... 42
2.4.9 Testers .......... 43
2.4.10 Help Desk Technician .......... 43
2.4.11 Project Specialists .......... 43


3. Managerial Process .......... 57
3.1 Management Objectives & Priorities .......... 57
3.1.1 Goals & Objectives .......... 57
3.1 Management Objectives & Priorities .......... 58
3.1.1 Goals & Objectives .......... 58
3.1.2 Management Priorities .......... 60
3.1.2 Management Priorities .......... 61
3.2 Assumptions, Dependencies and Constraints .......... 62
3.2.1 Assumptions .......... 63
3.2.2 Dependencies .......... 63
3.2.3 Constraints .......... 63
3.2 Assumptions, Dependencies and Constraints .......... 64
3.2.1 Assumptions .......... 64
3.2.2 Dependencies .......... 64
3.2.3 Constraints .......... 65
3.3 Risk Management .......... 65
3.4 Monitoring and Controlling Mechanism .......... 67
3.4.1 Schedule .......... 67
3.4.2 Budget .......... 69
3.4.3 Quality Assurance .......... 70
3.4.4 Productivity .......... 72
3.4.6 Measures .......... 76
3.5 Staffing Plan .......... 79
3.5.1 Obtaining .......... 79
3.5.2 Training .......... 79
3.5.3 Retaining .......... 79
3.5.4 Phasing out of personnel .......... 80
3.5.5 Staff Expertise .......... 96
4. Technical Process .......... 98
4.1 Methods, Tools & Techniques .......... 98
4.2 Software Documentation .......... 101
4.3 Project Support Functions .......... 105
4.3.1 Configuration Management .......... 105
4.3.2 Quality Assurance .......... 107
4.3.3 Verification and Validation .......... 108
4.3.4 Test Evaluation .......... 109
4.3.4 Test and Evaluation .......... 109
5. Work Packages, Schedule, and Budget .......... 110
5.1 Work Packages .......... 110
5.1.1 Work Breakdown Structure .......... 110
5.1.2 Work Package Specifications .......... 116
5.2 Dependencies .......... 118
5.3 Resource Requirement .......... 136
5.3 Resource Estimate .......... 145


5.4 Budget and Resource Allocation .......... 159
5.5 Schedule .......... 162
Additional Components .......... 172
1.1. Subcontracting Process .......... 172
1.1.1 Selection of Subcontractors .......... 172
1.1.2. Coordinating with Subcontractors .......... 174
1.1.3. Integrating with Subcontractors .......... 174
1.1.4. Controlling Subcontractors .......... 175
2.1. Security Considerations .......... 175
3.1. Training Plans .......... 176
4.1. Alpha & Beta Test Plan .......... 176
4.1.1. Alpha Testing .......... 176
4.1.2. Beta Testing .......... 178
5.1. Installation & Training Plans .......... 178
6.1. Post Deployment Support Procedures .......... 179
Index .......... 180
Appendix I .......... 1
Detailed Resource Estimate Spreadsheet (23 columns plus intermediate calculations), attached .......... 1
Appendix II .......... 1
Detailed Resource Estimate Spreadsheet with formulas revealed, attached .......... 1
Appendix III .......... 1
Work Package Specifications .......... 1
Configuration Management .......... 1
Communication .......... 2
Graphic Presentation .......... 3
Word Processing .......... 4
Project Management .......... 5
Global Positioning System GPS .......... 6
Compile/Link/Runtime .......... 7
Language Independent Debugging And Testing .......... 8
Electronic Inventory and Tracking .......... 9
Binder Back Cover .......... 10


Table of Figures
Figure 1: Operational Architecture .......... 3
Figure 2: Network Architecture .......... 4
Figure 3: Product Technical Architecture .......... 5
Figure 4: Software Architecture .......... 6
Figure 5: Notebook Hardware Architecture .......... 7
Figure 6: Standalone Environment .......... 8
Figure 7: Mobile Client/Server Architecture Only .......... 8
Figure 8: Client/Server Architecture with Central Location .......... 9
Figure 9: Activities, Benchmarks and Success Indicators .......... 30
Figure 10: CDCAAS Organization Chart .......... 34
Figure 11: Software Division Organization Chart .......... 35
Figure 12: CDCAAS Project Organization Chart .......... 36
Figure 13: Analysis Design and Development Team .......... 38
Figure 14: CDCAAS Project Team Structure .......... 39
Figure 15: Program Manager Organizational Interfaces .......... 40
Figure 16: Project Manager Organizational Interfaces .......... 41
Figure 17: Responsibility Matrix Summary .......... 44
Figure 18: Total Project Responsibility Matrix .......... 45
Figure 19: Database Management System Package Responsibility Matrix .......... 46
Figure 20: Compiler Package Staffing .......... 81
Figure 21: GPS Navigation Package Staffing .......... 83
Figure 22: Graphics Package Staffing .......... 87
Figure 23: Total Project Staffing Chart .......... 89
Figure 24: Total Project Staffing by Package .......... 91
Figure 26: Document Production Process Flow Chart .......... 104
Figure 27: Work Breakdown Structure (Overall) .......... 110
Figure 28: Work Breakdown Structure (Secure Communication) .......... 111
Figure 29: WBS GPS Navigation .......... 111
Figure 30: WBS Database Management System .......... 112
Figure 31: WBS Spreadsheet .......... 113
Figure 32: Dependencies (Part 1) .......... 118
Figure 33: Dependencies (Part 2) .......... 119
Figure 34: Dependencies (Part 3) .......... 120
Figure 35: Dependencies (Part 4) .......... 121
Figure 36: Dependencies (Part 5) .......... 122
Figure 37: Dependencies (Part 6) .......... 123
Figure 38: Dependencies (Part 7) .......... 124
Figure 39: Dependencies (Part 8) .......... 125
Figure 40: Dependencies (Part 9) .......... 126
Figure 41: Phase Distribution (Project Overall) .......... 127
Figure 42: Phase Distribution (Project Plans & Requirements) .......... 127
Figure 43: Phase Distribution (Project Programming) .......... 128


Figure 44: Phase Distribution (Project Product Design) .......... 128
Figure 45: Phase Distribution (Project Integration & Test) .......... 129
Figure 46: Project Maintenance (Part 1) .......... 129
Figure 47: Project Maintenance (Part 2) .......... 130
Figure 48: Project Maintenance (Part 3) .......... 130
Figure 49: Project Maintenance (Part 4) .......... 131
Figure 50: Module Overall (Database Management) .......... 131
Figure 51: Module Plans & Requirements (Database Management) .......... 132
Figure 52: Module Programming (Database Management) .......... 132
Figure 53: Module Product Design (Database Management) .......... 133
Figure 54: Module Integration & Test (Database Management) .......... 133
Figure 55: Module Maintenance (Database Management Part 1) .......... 134
Figure 56: Module Maintenance (Database Management Part 2) .......... 134
Figure 57: Module Maintenance (Database Management Part 3) .......... 135
Figure 58: Module Maintenance (Database Management Part 4) .......... 135
Figure 59: Resource Loading Chart 1 .......... 136
Figure 60: Resource Loading Chart 2 .......... 136
Figure 61: Resource Loading Chart 3 .......... 137
Figure 62: Resource Loading Chart 4 .......... 137
Figure 63: Resource Loading Chart 5 .......... 138
Figure 64: Resource Loading Chart 6 .......... 138
Figure 65: Resource Loading Chart 7 .......... 139
Figure 66: Resource Loading Chart 8 .......... 139
Figure 67: Resource Loading Chart 9 .......... 140
Figure 68: Resource Loading Chart 10 .......... 140
Figure 69: Resource Loading Chart 11 .......... 141
Figure 70: Resource Loading Chart 12 .......... 141
Figure 71: Resource Loading Chart 13 .......... 142
Figure 72: Resource Loading Chart 14 .......... 142
Figure 73: Resource Loading Chart 15 .......... 143
Figure 74: Resource Work Summary Report Chart .......... 143
Figure 75: Resource Work Availability Report Chart .......... 144
Figure 76: Software Productivity (SLOC/SM) by Application Domains .......... 146
Figure 77: COCOMO Summary Screen .......... 149
Figure 78: COCOMO Product Parameters .......... 150
Figure 79: COCOMO Platform Parameters .......... 150
Figure 80: COCOMO Personnel Parameters .......... 151
Figure 81: COCOMO Project Parameters .......... 151
Figure 82: COCOMO Scale Parameters .......... 152
Figure 83: COCOMO Equation Parameters .......... 152
Figure 84: COCOMO Phase Distribution-Project Overall .......... 153
Figure 85: COCOMO Phase Distribution-Project Plans & Requirements .......... 153
Figure 86: COCOMO Phase Distribution-Project Programming .......... 154
Figure 87: COCOMO Phase Distribution-Project Product Design .......... 154

Crime Data Collection, Aggregation and Assimilation System (CDCAAS) Team Crime Busters SWE 625 - Professor Kenneth Nidiffer

Version 1.0 12/08/2008

Figure 88: COCOMO Phase Distribution-Project Integration & Test ... 155
Figure 89: Resource Estimate Spreadsheet Part 1 ... 157
Figure 90: Resource Estimate Spreadsheet Part 2 ... 158
Figure 91: CDCAAS System Master Schedule ... 162
Figure 92: Debugger & Test Schedule ... 163
Figure 93: CDCAAS Electronic Inventory & Tracking/Custom Development Detailed Schedule ... 164
Figure 94: CDCAAS COTS Development Detailed Schedule ... 165



Index of Tables
Table 1: Project Sponsor Information ... v
Table 2: Major Deliverables ... vi
Table 3: Key Staffing Requirements ... vi
Table 4: Other Constraints and Assumptions ... vi
Table 5: Initial Cost Estimate for CDCAAS ... viii
Table 6: Profile of Typical Product Users ... 13
Table 7: Software Applications, Computer Software Configuration Item (CSCI) ... 16
Table 8: Delivery Locations and Quantities ... 17
Table 9: Delivered Documentation ... 17
Table 10: Spreadsheet Package Responsibility Matrix ... 47
Table 11: Requirements & Configurations Package Responsibility Matrix ... 48
Table 12: Secure Communications Package Responsibility Matrix ... 49
Table 13: Graphics Extension Package Responsibility Matrix ... 50
Table 14: Word Processor Package Responsibility Matrix ... 51
Table 15: Project Management Package Responsibility Matrix ... 52
Table 16: GPS Navigation Package Responsibility Matrix ... 53
Table 17: Compiler Package Responsibility Matrix ... 54
Table 18: Debugger and Test Package Responsibility Matrix ... 55
Table 19: Electronic Inventory and Tracking Package Responsibility Matrix ... 56
Table 20: Risk Management Description and Mitigation Strategies ... 67
Table 21: Compiler Package Staffing ... 82
Table 22: GPS Navigation Package Staffing ... 84
Table 23: Secure Communication Package Staffing ... 86
Table 24: Graphics Package Staffing ... 88
Table 25: Total Project Staffing ... 90
Table 26: Total Project Staffing by Package ... 92
Table 27: Document Table ... 104
Table 28: Work Breakdown Structure ... 115
Table 29: Resource Estimate-Method 1 ... 147
Table 30: Resource Estimate-Method 2 ... 148
Table 31: Resource Estimate-Method 3 ... 148
Table 32: Budget and Resource Allocation ... 161
Table 33: Current Price-Breakeven Table ... 168
Table 34: Decreased Price-Breakeven Table ... 169
Table 35: Increased Price-Breakeven Table ... 170
Table 36: Optimum Price-Breakeven Table ... 171
Table 37: Training Plans ... 176



1. Introduction
1.1. Project Overview

1.1.1. Project Description
Government law enforcement agencies face new challenges every day, and their reliance on technology is growing accordingly. The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) is a tool designed with enhanced functionality to give these agencies an upper hand against criminals. Our objective is to deliver a unified tool that can work on different platforms, understand all secure formats, and communicate over a broad range of media without compromising the stringent level of security required by government agencies and other corporations. CDCAAS will provide the corporate user with a pool of utilities for taking control of various technical and managerial aspects of the system: functionality, information control, data abstraction, easy access to information, and scheduling. The system can also be customized as corporate needs change.

The outcome of the project will be a unified system that provides government agencies with all the information ever collected on the entity under investigation. Delivering the project on time with the committed capabilities will write a new chapter of success for us. Capabilities delivered will include access control, identification, and data mining. Delivering the system to the 700 advance orders, and to subsequent orders, will give the corporation a foothold with financial stability. In addition, for each of the sites that have preordered systems we must install at least 50% of the systems and provide training on our application packages for 50 users.

The enhancement for this application will involve producing additions to the following packages:

1. CDCAAS Database Management System - Custom Developed
2. CDCAAS Spreadsheet - COTS
3. CDCAAS Requirements and Configuration Management - Reuse
4. CDCAAS Secure Communication - Custom Developed
5. CDCAAS Graphics Presentation - COTS
6. CDCAAS Word Processor - Reuse
7. CDCAAS Project Management - COTS
8. CDCAAS GPS Navigation - Outsourced to Ivan Industries
9. CDCAAS Compiler - Reuse
10. CDCAAS Debugger & Test - Custom Developed
11. CDCAAS Electronic Inventory and Tracking - Custom Developed

This system is designed to be marketable to a wide variety of clients. It can be used at all levels of government, from federal organizations such as DHS, FBI, CIA, and DOD all the way down to local law enforcement. It can also be customized for implementation in any corporation, large or small, as a one-stop resource for managing information access and control. This wide customization capability, in addition to the control, abstraction, access, and management of information the system provides, makes it a global performer and should make us a preferred choice the world over in the years to come.

The Department of Defense publishes a set of guidelines known as Security Technical Implementation Guides (STIGs) that define the security configuration baseline systems must meet to be used within the Department of Defense. By following these guidelines, less time and money will be required for integration by our team and by our federal customers, increasing the competitiveness and marketability of our product.


Figure 1: Operational Architecture


Figure 2: Network Architecture


Figure 3: Product Technical Architecture


Figure 4: Software Architecture


Figure 5: Notebook Hardware Architecture


Figure 6: Standalone Environment

Figure 7: Mobile Client/Server Architecture Only


Figure 8: Client/Server Architecture with Central Location


1.1.2. Product Summary

1.1.2.1. Product Overview


The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) is a criminal data assimilation system flexible enough to serve Homeland Security, TSA, corporate America, and law enforcement from the local to the international level. The tool collects information about a criminal from the user, aggregates and assimilates information from various databases and resources, and presents accurate information back to the user. This gives organizations an efficient way to build a more holistic view of a criminal's information and to act on it more rapidly. The most important aspect of the system is to enable faster and higher rates of case closing through more efficient gathering and analysis of crime-related data.

The system will have the capability to study a criminal's pattern based on the available data and present the user with highly graphical reports at the click of a button. The system will also facilitate reporting on queries by date, time, and location, and it can generate alerts when new data for queries already processed is fed to the database. These alerts can be sent through multiple channels, including email and text messages to mobile devices.

The goal of CDCAAS is to provide a convenient and robust user interface that allows users to enter information pertaining to a criminal; the interface will then retrieve information from various databases (e.g., fingerprint, DNA, and DMV databases). The system will accommodate increased information-processing and sharing demands in support of anti-terrorism, and this initiative will provide new services to state, local, and federal partners.

The implementation of CDCAAS should revolutionize the security and general crime data collection industry. The system should reduce monetary costs and greatly reduce the time involved in identifying individuals for security or other purposes. The system also has the potential to reduce risks to innocent civilians by allowing faster identification of threats, and to reduce the risk to corporations of losing company secrets to unauthorized individuals entering sensitive areas of the company.

The CDCAAS system will be developed by extending the System XYZ software packages to support:

Distributed Data Access with Access Control
Information Exchange Management
Interactive Input System
Pattern Study and Reporting
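The alerting capability described above (notifying subscribers when new data arrives for queries already processed) can be illustrated with a short sketch. This is not the actual CDCAAS design; the class, function, and channel names are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SavedQuery:
    """A previously processed query whose subscribers want alerts on new data."""
    query_id: str
    predicate: Callable[[dict], bool]                  # True when a new record matches
    channels: List[str] = field(default_factory=list)  # e.g. ["email", "sms"]

def dispatch_alerts(saved_queries, new_records, send):
    """Re-evaluate saved queries against newly ingested records and notify
    subscribers over each configured channel via the supplied send() callback."""
    alerts_sent = 0
    for record in new_records:
        for query in saved_queries:
            if query.predicate(record):
                for channel in query.channels:
                    send(channel, query.query_id, record)
                    alerts_sent += 1
    return alerts_sent
```

In a real deployment, `send` would be backed by the Secure Communications Package rather than a simple callback.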

Each of these functions will be built using the following packages:

General Purpose Database Package


Spreadsheet Package
Requirements and Configuration Management Package
Secure Communication Package
Graphics Presentation Package
Word Processing Package
Project Management Package
GPS Navigation Package
Compile/Link/Runtime Packages for BASIC, C, and C++
Language Independent Debugging and Testing Package
Electronic Inventory and Tracking Package

Database Package: This package will provide the user not only with bare-bones table design functionality but will also allow advanced users to import and export data in different files and formats. Supported by a high-end graphical interface, the package will let users manage the database in a fraction of the time taken by present-day systems. The package will handle concurrent user connections, whether local or remote, and is OLE, ODBC, and InnoDB compliant.

Spreadsheet Package: Provides a custom 'spreadsheet wizard' that allows data analysts and other users to customize reports for information updates, identification, and alerts. This package must be able to handle a minimum of one hundred concurrent user connections and must be OLE and ODBC compliant.

Requirements and Configuration Management Package: Its ability to manage information scheduling and to provide contact information for users such as security personnel and points of contact eliminates the need for any other application to be used in conjunction with it. This package must be OLE, ODBC, and InnoDB compliant.

Secure Communications Package: This package is designed to allow secure transfer of data packets from the input devices to the system and between systems. Because new security vulnerabilities appear every day, the package includes a self-updating mechanism that monitors new threats and updates itself to counter them.

Information Exchange Management Package: This package gives the user a single point of access to all available information. It acts as a bridge between all the databases and allows conversion from one data format to another. This package must be OLE, ODBC, and InnoDB compliant.

Graphics Presentation Package: Its high-end graphical interface lets the user spend less time on configuration and handling. The package can also present generated reports in a variety of graphical formats.

Word Processing Package: Provides generic macros and a custom 'word processor wizard' that allow the user to create custom and preformatted reports based on data and logs in the system.
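The kind of preformatted report the 'word processor wizard' would produce can be sketched with a simple template fill. The template name and fields below are hypothetical, not the package's actual report layout.

```python
from string import Template

# Hypothetical preformatted report template; the real templates would be
# defined by the Word Processing Package, not hard-coded like this.
INCIDENT_TEMPLATE = Template(
    "INCIDENT REPORT\n"
    "Case:     $case_id\n"
    "Location: $location\n"
    "Summary:  $summary\n"
)

def render_report(template, record):
    """Fill a preformatted report template from a data record; substitute()
    raises KeyError if a required field is missing, so gaps are caught early."""
    return template.substitute(record)
```

Using `Template.substitute` rather than `safe_substitute` makes a missing field fail loudly instead of shipping an incomplete report.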



Project Management Package: Provides project managers with the ability to generate real-time information in the form of resource estimation and usage reports. It also generates the other planning artifacts used to develop CDCAAS, such as Work Breakdown Structures (WBS), Critical Path Method (CPM) networks, Program Evaluation and Review Technique (PERT) charts, and Gantt charts. This package must be OLE, ODBC, and InnoDB compliant.

GPS Navigation Package: Enables the dependent modules to study the patterns generated from fetched information and present them in various graphical formats. This package must be OLE, ODBC, and InnoDB compliant.

Compile, Link, and Runtime Package: Provides the additional software libraries needed to build executables from C, C++, and BASIC, as well as Java and Flash, and provides additional libraries for using the secure communications package to communicate over the network.

Language Independent Debugging and Testing Package: Provides wizards to perform usability and functionality testing through automation scripts, and gives system administrators the ability to debug problems on the fly in production systems deployed in the field.

Electronic Inventory and Tracking Package: Lets the user track hardware and any other movable equipment. The user can log issues and comments and can create any number of tracking statuses.
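The Critical Path Method mentioned above reduces to a forward pass over the task network: each task's earliest finish is its duration plus the latest finish among its predecessors, and the project length is the maximum over all tasks. A minimal sketch, with hypothetical task names and durations:

```python
def critical_path_length(tasks):
    """Forward pass of the Critical Path Method.
    tasks maps name -> (duration, [predecessor names]).
    Returns (project length, {task: earliest finish time})."""
    finish = {}

    def earliest_finish(name):
        if name not in finish:
            duration, preds = tasks[name]
            finish[name] = duration + max(
                (earliest_finish(p) for p in preds), default=0)
        return finish[name]

    for name in tasks:
        earliest_finish(name)
    return max(finish.values()), finish
```

For example, with A(3) feeding B(2) and C(4), both feeding D(1), the critical path runs A-C-D for a project length of 8.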

1.1.2.2 Profile of Typical Product Users


The following table lists the categories of system users, a brief description of each, and the knowledge and skill level expected of them.

Profile of Typical Product Users

System Administrators: Monitor and maintain the system. An individual who configures the CDCAAS system by setting system data validation and searching policies, assigning and revoking user authorizations, and adding and removing law enforcement agencies, among other duties.

Data Entry Person: Has the ability to enter new and update existing information. For example, these could be security or human resources personnel adding individual profiles to the system. It is assumed that these users will have at least an intermediate level of computer knowledge and skill.

Information Assurance Monitor: Monitors user access and system modification logs. It is assumed that these users will have at least a basic level of computer knowledge and skill.

Data Analysts: Individuals who have restricted ability to change existing data. An analyst enters and modifies biometrics and DNA data. This role is usually performed by a forensic biologist.

Law Enforcement Agency: Performs real-time searches to solve criminal cases.
Table 6: Profile of Typical Product Users

1.1.2.3 Development Environment


The CDCAAS system will be developed on System XYZ notebook computers with the following configuration:

CDCAAS Hardware (Notebook Configuration):
Two Universal 2009-B microprocessors, 3.4 GHz
20.1 inch display
A three-button mouse pointing device
4 GB of main memory (SDRAM)
16 MB video RAM
320 GB primary hard drive
DVD/R/RW and CD-RW combo drive
A 100 MB ZIP drive
Integrated 2.0 MP camera
Printer port
Asynchronous port
Integrated 802.11g wireless LAN
Four USB ports
1394/FireWire connector
A LAN interface card
A 56,000 bps capable fax/modem
Anti-virus/security suite
Video card

CDCAAS Notebook Bundled Devices:
Speakers
Head phones
Laser printer
9-cell lithium ion battery
MP-3 30 GB audio/video player
A bar code scanner
Personal Assistant Device (416 megahertz, 1 GB RAM)
48-bit color flatbed scanner (2400 dpi optical resolution)
System ABC enhanced keyboard
A CRT monitor (1,280 x 1,024 non-interlaced; high resolution; bit-mapped; 21 inch color display)
Port replicator
A stand for the CRT monitor
Power connector
Digital camcorder
Internal speakers
Wireless digital phone with voice mail messaging and Internet service

1.1.2.4 Priorities and Constraints


Most Important Product Features:

Collect information from the user, search all available databases that contain criminals' biometric information, and retrieve the information for the user quickly and efficiently.
Serve as a centralized place for data retrieval.
Provide access to authorized users while ensuring security, data integrity, and user authentication.
Provide a highly user-friendly interface with the available hardware and software resources.
Operate on the available network.
Provide working prototypes for timely customer demonstrations of project progress.

Environmental Conditions and Constraints:

1.1.2.5. Risk Factors


In managing the areas of risk from product planning to closeout, the Product Manager ensures that the following activities are performed:

Areas of risk are identified and maintained.
Factors that contribute to the potential occurrence of each risk are identified.
Specific tasks for monitoring identified risk factors and for reducing the potential occurrence of each risk are documented.
Appropriate mitigation plans for each area of risk are identified.
Risk items, risk status, and risk mitigation plans are documented and reviewed by product and senior management on a regular basis.


The following risk factors are only those assumed at the time of the current version's development. The list will be continually updated:

Inadequate requirements (e.g., unstable, conflicting, poorly defined)
Scope creep (e.g., a continuing stream of additional requirements)
Unrealistic or dynamic schedules and budgets
Shortfalls in qualified personnel (e.g., unavailable when needed, novice when expert needed)
Delay in the procurement of software resources critical to project success
Budget overrun
Shortfalls in externally performed tasks (e.g., subcontractor failure to deliver, late delivery)
Shortfalls in externally furnished components (e.g., COTS software, hardware components)
Shortfalls in performance capability (e.g., inability of databases to scale up)

Risks for each task are documented in the monthly Program Management Reviews (PMRs). The risk tables include the identified risk; a qualitative assessment of whether the risk is considered low, medium, or high based on its likelihood of occurrence and impact on the project; mitigation activities; and whether the risk has changed status since it was reviewed at the previous PMR. All participants in the PMR are responsible for reviewing the risks in their functional areas and updating them as necessary.
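One common way to derive the low/medium/high rating described above is to score likelihood and impact on a small ordinal scale and combine them. The 1-3 scale and thresholds below are illustrative assumptions, not this project's mandated scoring scheme.

```python
def rate_risk(likelihood, impact):
    """Combine likelihood and impact (each scored 1=low, 2=medium, 3=high)
    into a qualitative rating suitable for a PMR risk table."""
    score = likelihood * impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"
```

For instance, a risk that is unlikely (1) but high-impact (3) scores 3 and is rated "medium", flagging it for continued monitoring.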

1.2 Project Deliverables

1.2.1 Software Applications, Computer Software Configuration Item (CSCI)
CDCAAS Software Product Name | Code | Function | Acquisition Type | Documentation delivery date
Database Management System | CSCI-08-N01 | Stores data on individuals and hardware | Custom | 28 January 2010
Spreadsheet | CSCI-08-N02 | View and create spreadsheets | COTS | 28 January 2010
Requirements and Configuration Management | CSCI-08-N03 | Email and appointment management | Reuse | 28 January 2010
Secure Communication | CSCI-08-N04 | Facilitate secure connection to remote networks | Custom | 28 January 2010
Graphics Presentation | CSCI-08-N05 | Generate and view graphical reports | COTS | 28 January 2010
Word Processor | CSCI-08-N06 | Generate and view written reports | Reuse | 28 January 2010
Project Management | CSCI-08-N07 | Generate and control program plans | COTS | 28 January 2010
GPS Navigation | CSCI-08-N08 | Monitor and track hardware and individuals | Outsourced to Ivan Industries | 28 January 2010
Compiler | CSCI-08-N09 | Develop CDCAAS add-ons and plug-ins | Reuse | 28 January 2010
Debugger and Test | CSCI-08-N10 | Debug and test CDCAAS add-ons and plug-ins | Custom | 28 January 2010
Electronic Inventory and Tracking | CSCI-08-N11 | Inventory and track hardware and individuals | Custom | 28 January 2010

Table 7: Software Applications, Computer Software Configuration Item (CSCI)

1.2.2 Delivery Locations and Quantities


We have received advance orders for 700 systems to be deployed to the sites listed in Table 8 below. At each site, we are required to install at least 50% of the systems and provide training on our application packages for 50 users.


Delivery Locations and Quantities

Location | Quantity
Norfolk, VA | 100
Jacksonville, FL | 100
Memphis, TN | 100
Dallas, TX | 50
San Diego, CA | 100
Mishawaka, IN | 150
Boston, MA | 50
Mobile, AL | 50
Total | 700

Success metric (all locations): Install at least 50% of systems ordered and train 50 users.

Table 8: Delivery Locations and Quantities

1.2.3 Documentation
Document Requirements Specification Detailed Design Documents Documented Source Code Test Plan and Test Cases Test Results (including performance benchmarks) Traceability Matrices User Manuals Training Manuals (including a Getting Started User's Guide) Installation Instructions Maintenance Guide Version Description Document COTS Custom X X X X X X X X X X X X X X X X X X X X X X X X Re-Use Out-Source X X X X X X X X X X X

Table 9: Delivered Documentation

1.2.4 Delivery Customer Acceptance


Applications with corresponding documentation will be developed iteratively with a series of internal product releases, to be reviewed by the customer (all local and state government agencies) during the next 24 months, with a full set of packages


available in final version for alpha testing on August 29, 2008. COTS documentation will be provided in the form it is available from the vendor(s). The Requirements Specification will detail how the COTS applications are expected to work, based on openly documented vendor claims, and how they will be integrated into the extension to CDCAAS. Change pages will be made for reuse documentation only for required changes in the following applications: CSCI-08-N03, CSCI-08-N06, and CSCI-08-N09. Outsourced application software and documentation for CSCI-08-N08 will be provided as available during the series of internal product releases.

During the two-month beta test, all applications and documentation will be subject to signed-off customer acceptance based on the set of test performance benchmarks corresponding to the system requirements. The procedures for handling unmet performance benchmarks are detailed in this Software Project Management Plan so that customer acceptance can be achieved by the subsequent two-month customer roll-out at the remaining customer sites. Any critical performance benchmarks still unmet after customer roll-out will be handled on a customer-assigned priority basis with the appropriate expertise (our company working on custom and reuse software and integrating the COTS and outsourced software) during the six-month maintenance period.
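The benchmark-based acceptance check described above can be mechanized as a comparison of measured results against required thresholds, reporting the unmet ones for follow-up. The metric names and the lower-is-better convention below are assumptions for illustration only.

```python
def unmet_benchmarks(required, measured):
    """Compare measured test results against required maxima (lower is
    better, e.g. response time in seconds). Returns {benchmark: (limit,
    measured value or None)} for every benchmark that misses its target
    or was never measured."""
    return {
        name: (limit, measured.get(name))
        for name, limit in required.items()
        if name not in measured or measured[name] > limit
    }
```

An empty result would correspond to full benchmark sign-off; a non-empty one lists the items to be worked on a customer-assigned priority basis.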

1.3 Evolution of the Software Project Management Plan


Date | Deliverable
February 11, 2008 | Stage 1 - Project Definition, Project Charter, Product & Project Summaries
February 18, 2008 | Stage 2 - Management Objectives and Priorities; Assumptions, Dependencies and Constraints; Resource Estimate; Schedule Estimate
February 25, 2008 | Stage 3 - Process Model, Organizational Structure, Organizational Interfaces, Project Responsibilities
March 17, 2008 | Stage 4 - Technical Methods, Tools, and Techniques; Software Documentation; Project Support Functions; Staffing Plan
March 31, 2008 | Stage 5 - Monitoring and Controlling Mechanisms
April 14, 2008 | Stage 6 - Work Packages, Dependencies, Resource Requirements, Budget and Resource Allocation, Schedule
April 21, 2008 | Stage 7 - Risk Management, Reference Materials, Definitions and Acronyms
May 5, 2008 | Stage 8 - Additional Components, Index, Appendices, CDCAAS End

Table 10: Evolution of the Software Project Management Plan



1.4 Reference Materials


The following documents were used as references when creating this software project management plan.

"COCOMO II Model Definition Manual", Version 1.4, University of Southern California.
"CMMI for Development, Version 1.2", CMMI-DEV, V1.2, CMU/SEI-2006-TR-008, ESC-TR-2006-008, Carnegie Mellon University Software Engineering Institute.
"Database Security Technical Implementation Guide, Version 8.1", September 19, 2007, Defense Information Systems Agency.
"IEEE Standard for Software Project Management Plans", IEEE Std 1058.1-1987, IEEE Computer Society.
"Introduction to Software Risk Management", Carnegie Mellon University Software Engineering Institute.
John/Jane Doe, Manager, "Applications Software for System XYZ and Life-Cycle System Delivery Expectations", January 28, 2008, Applications Software Development.
Kenneth E. Nidiffer, "SWE 625 Course Notes", January 2008, George Mason University.
Richard E. Fairley, "A Guide for Preparing Software Project Management Plans", November 1986, Wang Institute Tech Report TR-86-14.
"Secure Remote Computing Security Technical Implementation Guide, Version 1.2", August 10, 2005, Defense Information Systems Agency.
"Sharing Peripherals Across the Network Security Technical Implementation Guide, Version 1.1", July 28, 2005, Defense Information Systems Agency.
"USC COCOMOII.1997 Reference Manual", 1997, University of Southern California.
"USC COCOMO II.1999.0 Software", [http://sunset.usc.edu/research/COCOMOII], USC CSE Center for Software Engineering.
"Wireless Security Technical Implementation Guide, Version 5.2", November 15, 2007, Defense Information Systems Agency.

1.5 Definitions and Acronyms

1.5.1 Definitions


Alpha Test In-house testing of pre-production products to detect and eliminate the most obvious design defects or deficiencies, either in a laboratory setting or in some part of the developing firm's regular operations.


Baseline A specification or product that has been formally reviewed and agreed upon, that thereafter serves as the basis for further development, and that can be changed only through a formal change control procedure.

Beta Test In software development, a beta test is the second phase of software testing, in which a sampling of the intended audience tries the product out. Beta testing is considered "pre-release testing."

Capability Maturity Model (CMM) In software engineering, a model of the maturity of the capability of certain business processes. A maturity model can be described as a structured collection of elements that describe certain aspects of maturity in an organization, and it aids in the definition and understanding of an organization's processes.

Capability Maturity Model Integration (CMMI) In software engineering and organizational development, a process improvement approach that provides organizations with the essential elements of effective process improvement. It can be used to guide process improvement across a project, a division, or an entire organization.

Constructive Cost Model (COCOMO) An algorithmic software cost estimation model developed by Barry Boehm. The model uses a basic regression formula, with parameters derived from historical project data and current project characteristics.

Configuration Management The process of identifying and defining the deliverable product set in a system, controlling the release and change of these items throughout the system life cycle, and recording and reporting the status of product items and change requests. Such information typically includes the versions and updates that have been applied to installed software packages and the locations and network addresses of hardware devices.

Cost Calculated, for an internal project, by multiplying the time variable by the cost of the team members involved. When hiring an independent consultant for a project, cost will typically be determined by the consultant's or firm's hourly rate multiplied by an estimated time to complete.

Commercial Off-The-Shelf (COTS) Software or hardware products that are ready-made and available for sale to the general public.

Detailed Design The process of refining and expanding the preliminary design of a system or component to the extent that the design is sufficiently complete to be implemented. See also: software development process.

Global Positioning System (GPS) A Global Navigation Satellite System (GNSS) developed by the United States Department of Defense. It is the only fully functional GNSS in the world. It uses a constellation of between 24 and 32 Medium Earth Orbit satellites that transmit precise microwave signals, which enable GPS receivers to determine their current location, the time, and their velocity.
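As a concrete illustration of the COCOMO regression defined earlier in this glossary, Basic COCOMO estimates effort (in person-months) as Effort = a * KLOC^b. The defaults below are Boehm's published coefficients for organic-mode projects; the project's actual estimates use COCOMO II with calibrated parameters, so this is a simplified sketch only.

```python
def basic_cocomo_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort estimate in person-months: Effort = a * KLOC**b.
    Defaults are Boehm's published coefficients for organic-mode projects."""
    return a * kloc ** b
```

For a 10 KLOC organic project this gives roughly 2.4 * 10^1.05, about 27 person-months.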


Integration Testing - Testing performed to expose faults in the interfaces and in the interaction between integrated components.

Kernel - The core of any operating system (OS). The system loads the kernel into main memory, where it stays while other pieces of the OS move in and out of memory. The kernel controls all requests for disks, processors, or other resources.

Lifecycle - Lifecycle refers to the process used to build the deliverables produced by the project. There are many models for a project lifecycle.

Milestone - A milestone is a scheduling event that signifies the completion of a major deliverable or a set of related deliverables. A milestone, by definition, has a duration of zero and no effort; there is no work associated with a milestone. It is a flag in the work plan to signify that some other work has completed. Usually, a milestone is used as a project checkpoint to validate how the project is progressing. In many cases a decision, such as validating that the project is ready to proceed further, needs to be made at a milestone.

Outsource - Refers to a company that contracts with another company to provide services that might otherwise be performed by in-house employees.

Peer Review - A review of a software work product, following defined procedures, by peers of the producers of the product for the purpose of identifying defects and improvements.

Preliminary Design - Process of analyzing design alternatives and defining the architecture, components, interfaces, and timing and sizing estimates for a system or component.

Quality Assurance - A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements.

Requirements - A condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed documents.

Scope - Scope is the way you describe the boundaries of the project. It defines what the project will deliver and what it will not deliver. High-level scope is set in your project definition (charter) and includes all of your deliverables and the boundaries of your project. The detailed scope is identified through your business requirements.

Security Technical Implementation Guides (STIG) - A Department of Defense security guideline for configuration of COTS software and hardware.

Software Life Cycle Process - A method and standards for improving and mastering development processes, supporting processes, and management processes throughout the software lifecycle.

Spiral Model - The spiral model is a software development process combining elements of both design and prototyping-in-stages, in an effort to combine advantages of top-down and bottom-up concepts. The spiral model is intended for large, expensive, and complicated projects.

Stakeholder - Specific people or groups who have a stake in the outcome of the project. Normally stakeholders are from within the company and may include internal clients, management, employees, administrators, etc. A project can also have external stakeholders, including suppliers, investors, community groups, and government organizations.

System Testing - Testing conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements.

Unit Testing - A method of testing that verifies that the individual units of source code are working properly. A unit is the smallest testable part of an application.

Virtual Private Network (VPN) - A computer network in which some of the links between nodes are carried by open connections or virtual circuits in some larger network (e.g., the Internet) instead of by physical wires. The link-layer protocols of the virtual network are said to be tunneled through the larger network when this is the case. One common application is secure communications through the public Internet, but a VPN need not have explicit security features, such as authentication or content encryption. VPNs, for example, can be used to separate the traffic of different user communities over an underlying network with strong security features.

Waterfall Model - The waterfall model is a sequential development process, in which development is seen as flowing steadily downwards (like a waterfall) through the phases of requirements analysis, design, implementation, testing (validation), integration, and maintenance.

Work Breakdown Structure - A work breakdown structure (WBS) is a tree structure that permits summing of subordinate costs for tasks, materials, etc., into their successively higher-level parent tasks, materials, etc. It is a fundamental tool commonly used in project management and systems engineering.

Work Package - Like a project plan in miniature, a work package is a subset of a project that can be assigned to a specific party for execution.
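The Work Breakdown Structure entry above describes summing subordinate costs into their parent tasks. The sketch below illustrates that roll-up; the task names and cost figures are hypothetical, for illustration only.

```python
# Minimal sketch of WBS cost roll-up: a parent task's cost is its own
# direct cost plus the rolled-up costs of all its children. Task names
# and cost figures here are hypothetical, for illustration only.

from dataclasses import dataclass, field

@dataclass
class WbsTask:
    name: str
    direct_cost: float = 0.0
    children: list["WbsTask"] = field(default_factory=list)

    def rolled_up_cost(self) -> float:
        """Sum this task's direct cost with all descendant costs."""
        return self.direct_cost + sum(c.rolled_up_cost() for c in self.children)

design = WbsTask("Design", children=[
    WbsTask("Preliminary Design", direct_cost=40_000.0),
    WbsTask("Critical Design", direct_cost=60_000.0),
])
print(design.rolled_up_cost())  # 100000.0
```

Because each parent simply sums its subtree, the same function yields costs at any level of the hierarchy, which is the summing property the definition describes.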

1.5.2 Acronyms
1M - Percentage of integration redone during reuse
3GL - Third-Generation Programming Language
AAF - Adaptation Adjustment Factor
AAM - Adaptation Adjustment Multiplier
ACAP - Analyst Capability (COCOMO)
ACM - Association for Computing Machinery
ACT - Annual Change Traffic
AEXP - Applications Experience (COCOMO)


ASLOC - Adapted Source Lines of Code
AT - Automated Translation
BPS - Bits per Second
BRAK - Breakage: the amount of controlled change allowed in a software development before requirements are unfrozen
C4I - Command/Control/Communications/Computer and Intelligence
CASE - Computer Aided Systems Engineering
CDCAAS - Crime Data Collection, Aggregation, and Assimilation System
CDRL - Contract Data Requirements Lists
CD-RW - Compact Disc Rewritable
CEO - Chief Executive Officer
CIO - Chief Information Officer
CM - Percentage of code modified during reuse
CM - Configuration Management
CMM - Capability Maturity Model
CMMI - Capability Maturity Model Integration
COCOMO - COnstructive COst MOdel
Conops - Concept of Operations Document
CORBA - Common Object Request Broker Architecture
COTS - Commercial Off-the-Shelf
CPI - Cost Performance Index
CPLX - Product Complexity (COCOMO)
CSCI - Computer Software Configuration Item
CSTB - Computer Science and Telecommunications Board
DATA - Database Size (COCOMO)
DBMS - Database Management System
DI - Degree of Influence
DM - Percentage of design modified during reuse
DOCU - Documentation to match lifecycle needs
DRC - Direct Resource Code
DSI - Delivered Source Instructions
DVD - Digital Versatile Disc
DVD-R - DVD non-rewriteable (read-only) format
DVD-RW - DVD rewriteable (read and write) format
EAF - Effort Adjustment Factor (COCOMO)
ECP - Engineering Change Proposals
ECR - Exploration Commitment Review
EDS - Electronic Data Systems
EIA - Electronic Industry Alliance
EM - Engineering Manager
ESLOC - Equivalent Source Lines of Code
EV - Earned Value
FCIL - Facilities


FCR - Foundation Commitment Review
FLEX - Development Flexibility (COCOMO)
FOUL - Follow-Ons Unlimited
FP - Function Points
GB - Gigabyte
GFS - Government Furnished Software
GHz - Gigahertz
GNU - GNU's Not Unix
GOTS - Government Off the Shelf
GPS - Global Positioning System
GUI - Graphical User Interface
H/W - Hardware
HWCI - Hardware Configuration Item
ICASE - Integrated Computer Aided Software Environment
IDE - Integrated Development Environment
IEEE - Institute of Electrical and Electronics Engineers, Inc.
INCOSE - International Council on Systems Engineering
ISMS - Information Security Management System
ISO - International Organization for Standardization
IT - Information Technology
ITIL - Information Technology Infrastructure Library
KASLOC - Thousands of Adapted Source Lines of Code
KEDSI - Thousands of Estimated Delivered Source Instructions (COCOMO)
KESLOC - Thousands of Equivalent Source Lines of Code
KSLOC - Thousands of Source Lines of Code
LAN - Local Area Network
LEXP - Programming Language Experience (COCOMO)
LTEX - Language and Tool Experience (COCOMO)
MHz - Megahertz
MIS - Management Information System
MLS - Multi-Level Secure
MM - Effort in Programmer-Months
MODP - Modern Programming Practices
MS - Master of Science Degree
NIST - National Institute of Standards and Technology
NDI - Non-Development Item
NOP - New Object Points
OCR - Operations Commitment Review
ODBC - Open Database Connectivity
OLE - Object Linking and Embedding
OPF - Open Process Framework
OS - Operating System
PCAP - Programmer Capability (COCOMO)


PCON - Personnel Continuity (COCOMO)
PDA - Personal Digital Assistant
PDIF - Platform Difficulty
PERS - Personnel Capability
PERT - Program Evaluation and Review Technique
PEXP - Platform Experience
PL - Product Line
PM - Person-months, the unit of effort COCOMO II uses to express effort
PMAT - Process Maturity (COCOMO)
PMBOK - Project Management Body of Knowledge
PMI - Project Management Institute
PMP - Project Management Plan
PREC - Precedentedness (COCOMO)
PREX - Personnel Experience
PROD - Productivity rate
PSM - Practical Software and Systems Measurement
PVOL - Platform Volatility
QA - Quality Assurance
RAM - Random Access Memory
RCPX - Product Reliability and Complexity
RELY - Required Software Reliability
RESL - Architecture / Risk Resolution (COCOMO)
RFID - Radio Frequency Identification
RFP - Request for Proposal
RUSE - Required Reusability (COCOMO)
RVOL - Requirements Volatility
SADT - Structured Analysis and Design Technique
SCAMPI - Standard CMMI Appraisal Method for Process Improvement
SCED - Required Development Schedule (COCOMO)
SDLC - Software Development Life Cycle
SECU - Classified Security Application
SEI - Software Engineering Institute at Carnegie Mellon University
SITE - Multi-site Development (COCOMO)
SLA - Service Level Agreement
SLIM - Software Lifecycle Management (Larry Putnam)
SLOC - Source Line of Code
SME - Subject Matter Expert
SNAFU - Situation Normal: All Fouled Up
SOA - Service Oriented Architecture
SOO - Statement of Operational Objectives
SOW - Statement of Work


SPI - Software Performance Index
SPICE - Software Process Improvement and Capability Determination (the ISO 15504 software capability assessment standard)
SPMP - Software Project Management Plan
SQA - Software Quality Assurance
STOR - Main Storage Constraint
SU - Percentage of reuse effort due to software understanding
SwSE - Software System Engineering
T&E - Test and Evaluation
TEAM - Team Cohesion (COCOMO)
TIME - Execution Time Constraint
TOOL - Use of Software Tools (COCOMO)
TR - Technical Report
TURN - Computer Turnaround Time
UML - Unified Modeling Language
UNFM - Programmer Unfamiliarity
USAF/ESD - U.S. Air Force Electronic Systems Division
USB - Universal Serial Bus
USC - University of Southern California
V&V - Verification and Validation
VCR - Valuation Commitment Review
VEXP - Virtual Machine Experience
VIRT - Virtual Machine Volatility
VMVH - Virtual Machine Volatility: Host
VMVT - Virtual Machine Volatility: Target
WAN - Wide Area Network
WBS - Work Breakdown Structure
WITS - Worldwide Identity Tracking System
WP - Work Package
WWW - World Wide Web


2. Project Organization
2.1 Process Model

Figure 8: CDCAAS Process Model

2.1.1 Process Milestones


This section identifies milestones for all activities for managing and performing the work specified in the product's contractual requirements, including tasks, reviews, and deliverables. These milestones include baselining requirements, generating baselines, formal and informal peer reviews, customer reviews, and obtaining customer feedback. The following milestones are planned for the CDCAAS release:

Project Kickoff Meeting
Completion of Requirements Specification


Completion of Design Phase
Completion of Sub-System Test Phase
Completion of System Test Phase
Conclusion of Alpha Deployment Gate (includes customer sign-off)
Conclusion of Beta Deployment Gate (includes customer sign-off)
Completion of Deployment Phase

The activities, procedures, and exit conditions for each phase are summarized below.

Activity: Project Kickoff Meeting
Procedures: Administer and allocate resources between project members; draft the Software Project Management Plan (SPMP).
Exit Condition: Staff assigned with office space, computers, and communications; management team agrees on draft of SPMP.

Activity: Systems Requirements Review
Procedures: Define requirements for custom, COTS, reuse/NDI, and outsourced packages; estimate resources.
Exit Condition: Management signs off on estimate and resource baseline and approves final SPMP.

Activity: Systems Preliminary Design Review
Procedures: Hold series of internal preliminary design reviews; get user agreement on functional baseline.
Exit Condition: Team leaders agree on design templates, documentation templates, and the common functionality features to be documented for version 1.0.

Activity: Requirement Specification
Procedures: Create and analyze use cases; analyze and define requirements for custom, COTS, reuse/NDI, and outsourced packages; review requirements for feasibility and testability; establish and document the allocation baseline; produce final SPMP; start developing test plan.
Exit Condition: Review closed and requirements baselined; customer agrees to and signs off the baseline requirements; management provides necessary resources.

Activity: Preliminary Design Review
Procedures: Complete preliminary design for COTS, custom, and reuse packages; conduct series of preliminary design reviews; get user agreement on functional baseline.
Exit Condition: Preliminary Design Document (PDD) baselined; PDD review closed; functional specification baselined; user agreement formally signed and archived.

Activity: Critical Design
Procedures: Design detailed system features, including user interface, data structures, and communication methods between modules; conduct review of Critical Design artifacts; establish baseline for Critical Design Document; get user agreement on Critical Design baseline.
Exit Condition: Critical Design Document created; detailed design review closed; user agreement formally signed and archived.

Activity: Coding and Unit Test
Procedures: Code and document custom, COTS, and reuse packages; test COTS, custom, and reuse code, including mature reuse prototype.
Exit Condition: Software developed and source controlled; test results documented and signed off by project management.

Activity: Sub-System Test
Procedures: Test outsourced package and all sub-system functionality; validate components and sub-system functionality.
Exit Condition: Test results documented and signed off by project management; user agreement formally signed and archived.

Activity: System Test
Procedures: Test full functionality based on user workflow for all 11 packages; fix bugs found during integration testing and finalize baselines.
Exit Condition: Test results documented and signed off by project management; user manual and maintenance manual documented; product baseline established; user agreement formally signed and archived.

Activity: Deployment and Support
Procedures: Conduct alpha and beta tests; start help desk operation; deploy system at user sites; begin user training; begin software maintenance activities.
Exit Condition: Customers sign off on completed deployment; system deployed at user sites; user training scheduled; help desk and maintenance operational.

Activity: Phase Out
Procedures: Document best practices, lessons learned, and future marketing opportunities; discontinue automatic license upgrades; close out supporting contracts.
Exit Condition: Documentation submitted to management and to corporate knowledge database; close-out licensing and contract information sent to the Finance Office, Contracting Office, and management personnel and acknowledged.

Figure 9: Activities, Benchmarks and Success Indicators

2.1.2 Baselines
The essential idea of baselines is that in order to reach a destination it is necessary to know your starting point. In the CDCAAS project, the following three baselines will be used for each component during its waterfall development lifecycle:

Functional Baseline: Describes a system's or item's functional characteristics and the verifications required to demonstrate the achievement of those specified functional characteristics.

Allocated Baseline: Describes the functional and interface characteristics for a Configuration Item (CI) that are allocated from those of a higher-level CI, and the verification required to demonstrate achievement of those specified characteristics.

Product Baseline: Defines the releasable contents of the project, including the application, test cases, test results, and system and user documentation, during the production, fielding/deployment, and operational support phases of its life cycle.

2.1.3 Reviews
Various internal reviews and customer reviews will be conducted as scheduled in Section 5.5, and review comments and sign-offs will be documented for each review. Additional reviews may be conducted if necessary, or at the customer's request, to identify possible problems at an early stage. These reviews are the Software Requirements Specification Review, Preliminary Design Review, Critical Design Review, the various Test Readiness Reviews, and the Deployment Readiness Review. Other activities that require customer review and sign-off must obtain them before they are executed.

2.1.3.1 Software Requirements Review


A Requirements Review is one of a number of such reviews to verify and approve sets of system-level requirements as they are developed. The main purpose of this review is to ensure that the design is progressing in the correct direction and to give the team confidence in the design process through the progressive monitoring and approval of the system-level requirements developed between the User Requirements Document and the Functional Baseline. The Requirements Review will be conducted as a formal review; at the end of the review, the customer will sign off on and approve the baseline requirements. A re-review will be conducted if necessary.

2.1.3.2 Preliminary Design Review


Preliminary Design Reviews will be held at the conclusion of Preliminary Design in the systems engineering process to evaluate the Preliminary Design effort and to review and approve the Allocated Baseline containing the Development Specifications. A system-level Preliminary Design Review will be held for each package to establish its functional baseline.

2.1.3.3 Critical Design Review


The Critical Design Review is one of the major systems engineering reviews, conducted towards the end of Critical Design & Development; it will normally mark the end of that activity and the beginning of Construction and/or Production. The Critical Design Review will evaluate the detailed design effort, approve the Product Specifications and related products, and establish the Product Baseline. The Critical Design Review will also review discrepancies from the Preliminary Design Review and will assess the progress of technical performance measures. There will be an individual Critical Design Review for each package. A system-level review will also be conducted to ratify results and address interface and integration issues between packages. All major documentation and plans will be reviewed. Baselines for the product, functionality, and allocation will be agreed to by the customer and will be disseminated to all project members.

2.1.3.4 Subsystem Test Review


The Sub-System and System Test Readiness Reviews will be conducted to avoid needlessly committing test and evaluation resources. The Sub-System Test Readiness Review will be used to demonstrate the readiness of each package to enter test and evaluation after coding and unit testing are done. The System Test Readiness Review will be used to demonstrate the readiness of the system after sub-system testing is completed successfully. To assess the readiness of a configuration item for test, the following items will be reviewed: the Test and Evaluation Master Plan (TEMP); relevant test and evaluation procedures; formal and informal test results to date; supporting documentation such as operator and maintenance manuals; Development Specifications and Product Specifications; appropriate support and test equipment; and test facilities. Additionally, the documentation will be maintained for future software maintenance and to improve the test process.


2.1.3.5 System Alpha Test


The Alpha Test will be performed for two months at Falls Church, VA, by independent testers as a form of internal acceptance testing before the software goes to the beta site. During alpha testing we will create the user training manual, establish help desk procedures, and set up software maintenance activities. At the end of the Alpha Test period, all documented errors will be fixed and requirements will be updated if necessary.

2.1.3.6 System Beta Test


A two-month Beta Test follows alpha testing. The system will be released to a limited audience outside of the CDCAAS team so that further testing can ensure the product has few faults or bugs. During this period, the help desk will be put into place to support customers with any issues related to hardware or software. All help desk calls will be documented so that bugs can be fixed before the final release or in future system enhancements. At the end of the Beta Test period, the customer will create a Beta Test report and submit it to the CDCAAS team, which will analyze the reports to identify any modifications needed before final deployment. The Beta Test reports will include the features users like and any system enhancement requests they desire.

2.1.3.7 Deployment Readiness Review


The Deployment Readiness Review will be conducted after the beta test to ensure the readiness of the release for deployment, including the operability and supportability of the release. This review will decide whether or not to deploy the system at customer sites. At this review, the customer, the manager, and key persons on the CDCAAS team will sign off on the review form.

2.1.3.8 System Deployment


After the customer signs off on the final Deployment Readiness Review form, the final version of the CDCAAS system will be installed at all customer sites. The schedule for user training will also be set and announced to the users. The configuration manager will baseline and label all artifacts as version 1.0.

2.1.3.9 System Maintenance & Evolution


A series of software upgrades is expected throughout the CDCAAS system's lifecycle, based on user group requests. These upgrades will include bug fixes for errors and problems identified during operation. They will also include enhancements to the system as the user group requests new requirements or changes to existing requirements, or to adapt to new technical environments.


2.1.3.10 Phase Out


When the CDCAAS system will be withdrawn from service will be decided based on user group surveys and company policy. During this stage, lessons learned and various data related to the project process will be archived in the company's knowledge resource repository for future projects. All resources will be reassigned within the company according to company policy. Contracting and licensing agreements will be discontinued.

2.1.3.11 Disposal
Disposal will be carried out according to company regulations, and all sensitive data will be destroyed securely based on approval from upper-level management.

2.2 Organizational Structure


The Crime Data Collection, Aggregation and Assimilation System (CDCAAS) organization is structured into three divisions headed by the Chief Executive Officer and Chairman, followed by a Chief Technology Officer (CTO), a Chief Financial Officer (CFO), and a Chief Operating Officer (COO). A corporate organization chart is shown below:


Figure 10: CDCAAS Organization Chart

The CDCAAS project is conducted by the Application Software Development branch under the Vice President of the Software Division. The organization's structure is shown below:


Figure 11: Software Division Organization Chart

CDCAAS is under the Application Software Development branch and has its own structure, shown below:


Figure 12: CDCAAS Project Organization Chart


The basic CDCAAS project structure consists of a Project Manager plus an Administrative Support team that serves the entire project. These are considered overhead (i.e., not computed in the calculations of development resources). There are separate organizations for Software Quality and Control, Technical Support, and the Analysis, Design and Development Teams. Programmers are distributed over the development of 10 of the 11 application packages; the development of the GPA Package is outsourced to Ivan Industries. The Analysis, Design and Development Team structure is given below. Variations in team structure appear based on required effort: some teams require more developers than others. Variations also appear based on the kind of development. A Team Lead and a Senior Programmer are assigned to each of the three kinds of development (Custom, COTS, and Component), and the COTS and Component teams each also get an expert. Team Structure of the Analysis, Design and Development Team: 3 Team Leaders 1 Requirements Analyst 1 Architect/Designer 1 System Engineer 3 Senior Programmers 15 Programmers 1 COTS Expert 1 Component Expert

Team Structure of the Software Quality and Control Team: 1 CM Specialist 1 QA Engineer 4 Testers


[Organization chart: the Analysis, Design and Development Team comprises Team Leaders, Senior Programmers, a Requirements Analyst, Programmers, an Architecture Designer, COTS Experts, a System Engineer, and Component Experts.]

Figure 13: Analysis Design and Development Team


Figure 14: CDCAAS Project Team Structure


2.3 Organization Boundaries and Interfaces


This section describes the managerial boundaries between the CDCAAS Project, other organizations within the corporation, and the customer. The diagram below depicts the organizational interfaces related to the CDCAAS Program Manager.

Figure 15: Program Manager Organizational Interfaces

The diagram below depicts the organizational interfaces related to the CDCAAS Project Manager.


Figure 16: Project Manager Organizational Interfaces

[Organization chart: roles interfacing with the Project Manager include the Requirements Analyst, Quality Assurance Specialist, Testers, Software Developers, Configuration Management Specialist, Training Specialist, Installation Specialist, Documentation Specialist, Help Desk Technician, Administrative Assistant, HR Specialist, Procurement Specialist, and Network Support Specialist.]


2.4 Project Responsibilities 2.4.1 Project Manager


The Project Manager is the individual responsible for managing the project. The project manager will have control over the scope of the project and will assure the quality of the project management process. The project manager will also report on project status and forecasting, will ensure that the project is delivered on schedule, and directs the efforts of software engineers, analysts, programmers, and other project personnel.

2.4.2 Assistant Project Manager


The assistant project manager shall assist the project manager with those responsibilities.

2.4.3 Chief Software Developer


The chief software developer specifies technology platforms and solutions and works closely with the project manager on product specification. He is responsible for staffing the software team.

2.4.4 Administrative Assistant


The administrative assistant performs a variety of administrative tasks as directed by the project manager or his/her designees and provides administration of the project library.

2.4.5 System Engineer/Analyst


The system engineer/analyst is responsible for coordinating the construction, maintenance, and growth of the project systems.

2.4.6 Requirements Analysts


The requirements analysts are responsible for the continuous elicitation, interpretation, and documentation of the software requirements.

2.4.7 Technical Team Leader


The technical team leader shall assign technical responsibilities to individual team members and track the status of those tasks.

2.4.8 Software Developers


The software developers shall have the primary responsibility of coding and unit testing the software.


2.4.9 Testers
The testers will be responsible throughout the development cycle for developing and executing test procedures to ensure that the customer's requirements are met by the project packages.

2.4.10 Help Desk Technician


The help desk technician's responsibility will be to provide technical support during system testing as well as during the six-month help desk service period at the end of the project.

2.4.11 Project Specialists


The project specialists are individuals who specialize in a particular area of the software development process. These roles are:

Training Specialist - Produces and distributes training materials for the software to the appropriate end-users. Organizes and conducts training sessions for end-users.

Installation Specialist - Installs the hardware at the defined locations, possibly in conjunction with the network support specialist during deployment and installation.

Documentation Specialist - Produces software system User and Maintenance Documentation, possibly working in conjunction with the training specialist to produce training materials.

Human Resources Specialist - Interviews and screens applicants for the project and assists team members with human resources-related issues.

Network Support Specialist - Performs physical system connections as described in the Network Operational Context diagram, possibly in conjunction with the installation specialist.

Procurement Specialist - Procures all necessary equipment and software for the project.

Configuration Management Specialist - Manages the software versions for deployment. Required to document all changes to the software for each build and provide a process to move each software version through unit testing, integration testing, user acceptance testing, and production environments. Manages and tracks all other configuration items.


- Quality Assurance Specialist: Assures that the software satisfies its test cases; provides testing matrices that map each requirement to a test case; ensures the software is ready to transition through the test cycles and into production.

Figure 17: Responsibility Matrix Summary


Total Project Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 2.64 | 3.63 | 1.47 | 2.22 | 7.32 | 1.47 | 11.43
Asst Project Manager | 2.64 | 3.63 | 1.47 | 2.22 | 7.32 | 1.47 | 11.43
Chief Programmer | 6.97 | 43.57 | 29.05 | 21.78 | 94.40 | 10.18 | 111.55
Secretary | 3.07 | 3.63 | 1.47 | 2.22 | 7.32 | 1.47 | 11.86
Systems Engineer/Analyst | 9.59 | 76.22 | 14.53 | 21.78 | 112.53 | 13.08 | 135.19
Requirements Analyst | 14.81 | 29.05 | 5.80 | 8.69 | 43.54 | 5.80 | 64.15
Technical Team Lead | 5.20 | 32.65 | 18.87 | 23.94 | 75.46 | 14.53 | 95.19
Programmer | 5.20 | 32.65 | 43.57 | 26.12 | 102.35 | 14.53 | 122.08
Tester | 7.84 | 14.53 | 7.27 | 65.35 | 87.15 | 43.57 | 138.57
Help Desk Technician | - | - | - | 2.22 | 2.22 | 2.90 | 5.12
Training Specialist | - | - | - | - | - | 2.90 | 2.90
Installation Specialist | - | - | - | 4.37 | 4.37 | 4.37 | 8.74
Documentation Specialist | 13.08 | 72.61 | 2.90 | 6.52 | 82.03 | 7.27 | 102.38
HR Specialist | 4.78 | 7.27 | 1.47 | 2.22 | 10.96 | 2.90 | 18.64
Network Support Specialist | - | 3.63 | 1.47 | 4.37 | 9.47 | 2.90 | 12.37
Procurement Specialist | 1.73 | 3.63 | 1.47 | 2.22 | 7.32 | 1.47 | 10.52
CM Specialist | 4.78 | 18.16 | 7.27 | 10.88 | 36.30 | 7.27 | 48.36
QA Specialist | 4.78 | 18.16 | 7.27 | 10.88 | 36.30 | 7.27 | 48.36
Total Staff Months | 87.11 | 363.03 | 145.36 | 218.00 | 726.39 | 145.34 | 958.83
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Figure 18: Total Project Responsibility Matrix
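The phase columns in these matrices follow a fixed allocation of each package's COCOMO staff-month estimate, as shown by the "% of COCOMO Allocation" rows: 12% for requirements and 20% for system testing are added on top of the baseline, while design (50%), coding and unit testing (20%), and integration testing (30%) together make up the 100% "Subtotal COCOMO" column, giving a grand total of 132% of the COCOMO estimate. A minimal sketch of that arithmetic follows; the function and dictionary names are ours, for illustration only.

```python
# Phase allocation percentages taken from the "% of COCOMO Allocation" rows
# of the responsibility matrices. The code itself is an illustrative sketch,
# not part of the plan.
PHASE_SHARE = {
    "requirements": 0.12,
    "design": 0.50,
    "coding_unit_test": 0.20,
    "integration_test": 0.30,
    "system_test": 0.20,
}

def phase_breakdown(cocomo_staff_months: float) -> dict:
    """Split a COCOMO staff-month estimate across life-cycle phases."""
    b = {phase: round(cocomo_staff_months * share, 2)
         for phase, share in PHASE_SHARE.items()}
    # "Subtotal COCOMO" = design + coding/unit test + integration test (100%).
    b["subtotal_cocomo"] = round(
        b["design"] + b["coding_unit_test"] + b["integration_test"], 2)
    # Grand total = requirements + subtotal + system test (132% of baseline).
    b["total"] = round(
        b["requirements"] + b["subtotal_cocomo"] + b["system_test"], 2)
    return b

# Whole-project COCOMO baseline from Figure 18: 726.39 staff months.
project = phase_breakdown(726.39)
```

Applied to the 726.39 staff-month project baseline, this reproduces Figure 18's bottom row to within rounding (a grand total near the 958.83 staff months shown there); the per-role rows in the matrices are simply this same split applied to each role's share of the work.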


1. Database Management System Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.28 | 0.35 | 0.14 | 0.21 | 0.71 | 0.14 | 1.13
Asst Project Manager | 0.28 | 0.35 | 0.14 | 0.21 | 0.71 | 0.14 | 1.13
Chief Programmer | 0.68 | 4.24 | 2.83 | 2.12 | 9.19 | 0.99 | 10.86
Secretary | 0.30 | 0.35 | 0.14 | 0.21 | 0.71 | 0.14 | 1.14
Systems Engineer/Analyst | 0.94 | 7.42 | 1.41 | 2.12 | 10.95 | 1.27 | 13.16
Requirements Analyst | 1.44 | 2.83 | 0.57 | 0.85 | 4.24 | 0.57 | 6.24
Technical Team Lead | 0.51 | 3.18 | 1.84 | 2.33 | 7.35 | 1.41 | 9.27
Programmer | 0.51 | 3.18 | 4.24 | 2.54 | 9.96 | 1.41 | 11.88
Tester | 0.76 | 1.41 | 0.71 | 6.36 | 8.48 | 4.24 | 13.48
Help Desk Technician | - | - | - | 0.21 | 0.21 | 0.28 | 0.49
Training Specialist | - | - | - | - | - | 0.28 | 0.28
Installation Specialist | - | - | - | 0.43 | 0.43 | 0.43 | 0.85
Documentation Specialist | 1.27 | 7.07 | 0.28 | 0.64 | 7.99 | 0.71 | 9.96
HR Specialist | 0.46 | 0.71 | 0.14 | 0.21 | 1.06 | 0.28 | 1.80
Network Support Specialist | - | 0.35 | 0.14 | 0.43 | 0.92 | 0.28 | 1.20
Procurement Specialist | 0.17 | 0.35 | 0.14 | 0.21 | 0.71 | 0.14 | 1.01
CM Specialist | 0.46 | 1.77 | 0.71 | 1.06 | 3.53 | 0.71 | 4.70
QA Specialist | 0.46 | 1.77 | 0.71 | 1.06 | 3.53 | 0.71 | 4.70
Total Staff Months | 8.52 | 35.34 | 14.12 | 21.21 | 70.67 | 14.11 | 93.31
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Figure 19: Database Management System Package Responsibility Matrix


2. Spreadsheet Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.05 | 0.07 | 0.03 | 0.04 | 0.13 | 0.03 | 0.21
Asst Project Manager | 0.05 | 0.07 | 0.03 | 0.04 | 0.13 | 0.03 | 0.21
Chief Programmer | 0.13 | 0.79 | 0.53 | 0.40 | 1.71 | 0.19 | 2.03
Secretary | 0.06 | 0.07 | 0.03 | 0.04 | 0.13 | 0.03 | 0.21
Systems Engineer/Analyst | 0.17 | 1.38 | 0.26 | 0.40 | 2.04 | 0.24 | 2.45
Requirements Analyst | 0.27 | 0.53 | 0.11 | 0.16 | 0.79 | 0.11 | 1.17
Technical Team Lead | 0.10 | 0.59 | 0.34 | 0.43 | 1.37 | 0.26 | 1.73
Programmer | 0.10 | 0.59 | 0.79 | 0.48 | 1.86 | 0.26 | 2.21
Tester | 0.14 | 0.26 | 0.13 | 1.18 | 1.58 | 0.79 | 2.51
Help Desk Technician | - | - | - | 0.04 | 0.04 | 0.05 | 0.09
Training Specialist | - | - | - | - | - | 0.05 | 0.05
Installation Specialist | - | - | - | 0.08 | 0.08 | 0.08 | 0.16
Documentation Specialist | 0.24 | 1.32 | 0.05 | 0.12 | 1.48 | 0.13 | 1.85
HR Specialist | 0.09 | 0.13 | 0.03 | 0.04 | 0.20 | 0.05 | 0.33
Network Support Specialist | - | 0.07 | 0.03 | 0.08 | 0.17 | 0.05 | 0.22
Procurement Specialist | 0.03 | 0.07 | 0.03 | 0.04 | 0.13 | 0.03 | 0.19
CM Specialist | 0.09 | 0.33 | 0.13 | 0.20 | 0.66 | 0.13 | 0.87
QA Specialist | 0.09 | 0.33 | 0.13 | 0.20 | 0.66 | 0.13 | 0.87
Total Staff Months | 1.58 | 6.58 | 2.64 | 3.94 | 13.16 | 2.63 | 17.37
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 10: Spreadsheet Package Responsibility Matrix


3. Requirements & Configurations Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.47 | 0.66 | 0.26 | 0.44 | 1.36 | 0.26 | 2.10
Asst Project Manager | 0.47 | 0.66 | 0.26 | 0.44 | 1.36 | 0.26 | 2.10
Chief Programmer | 1.27 | 7.92 | 5.28 | 3.96 | 17.15 | 1.85 | 20.26
Secretary | 0.55 | 0.66 | 0.26 | 0.44 | 1.36 | 0.26 | 2.18
Systems Engineer/Analyst | 1.74 | 13.85 | 2.64 | 3.96 | 20.45 | 2.37 | 24.56
Requirements Analyst | 2.69 | 5.28 | 1.06 | 1.58 | 7.92 | 1.06 | 11.66
Technical Team Lead | 0.95 | 5.94 | 3.43 | 4.35 | 13.72 | 2.64 | 17.31
Programmer | 0.95 | 5.94 | 7.92 | 4.75 | 18.60 | 2.64 | 22.19
Tester | 1.42 | 2.64 | 1.32 | 11.87 | 15.83 | 7.92 | 25.17
Help Desk Technician | - | - | - | 0.44 | 0.44 | 0.53 | 0.97
Training Specialist | - | - | - | - | - | 0.53 | 0.53
Installation Specialist | - | - | - | 0.79 | 0.79 | 0.79 | 1.58
Documentation Specialist | 2.37 | 13.19 | 0.53 | 1.19 | 14.91 | 1.32 | 18.60
HR Specialist | 0.87 | 1.32 | 0.26 | 0.44 | 2.02 | 0.53 | 3.42
Network Support Specialist | - | 0.66 | 0.26 | 0.79 | 1.71 | 0.53 | 2.24
Procurement Specialist | 0.32 | 0.66 | 0.26 | 0.44 | 1.36 | 0.26 | 1.94
CM Specialist | 0.87 | 3.30 | 1.32 | 1.98 | 6.60 | 1.32 | 8.79
QA Specialist | 0.87 | 3.30 | 1.32 | 1.98 | 6.60 | 1.32 | 8.79
Total Staff Months | 15.83 | 65.96 | 26.38 | 39.84 | 132.18 | 26.38 | 174.39
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 11: Requirements & Configurations Package Responsibility Matrix


4. Secure Communications Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.51 | 0.71 | 0.29 | 0.41 | 1.41 | 0.29 | 2.22
Asst Project Manager | 0.51 | 0.71 | 0.29 | 0.41 | 1.41 | 0.29 | 2.22
Chief Programmer | 1.34 | 8.43 | 5.63 | 4.21 | 18.27 | 1.97 | 21.58
Secretary | 0.58 | 0.71 | 0.29 | 0.41 | 1.41 | 0.29 | 2.29
Systems Engineer/Analyst | 1.85 | 14.73 | 2.80 | 4.21 | 21.75 | 2.53 | 26.13
Requirements Analyst | 2.87 | 5.63 | 1.12 | 1.68 | 8.43 | 1.12 | 12.42
Technical Team Lead | 1.00 | 6.31 | 3.65 | 4.63 | 14.59 | 2.80 | 18.39
Programmer | 1.00 | 6.31 | 8.43 | 5.04 | 19.78 | 2.80 | 23.58
Tester | 1.51 | 2.80 | 1.41 | 12.64 | 16.85 | 8.43 | 26.79
Help Desk Technician | - | - | - | 0.41 | 0.41 | 0.56 | 0.97
Training Specialist | - | - | - | - | - | 0.56 | 0.56
Installation Specialist | - | - | - | 0.85 | 0.85 | 0.85 | 1.70
Documentation Specialist | 2.53 | 14.03 | 0.56 | 1.27 | 15.86 | 1.41 | 19.80
HR Specialist | 0.93 | 1.41 | 0.29 | 0.41 | 2.12 | 0.56 | 3.60
Network Support Specialist | - | 0.71 | 0.29 | 0.85 | 1.85 | 0.56 | 2.41
Procurement Specialist | 0.34 | 0.71 | 0.29 | 0.41 | 1.41 | 0.29 | 2.05
CM Specialist | 0.93 | 3.51 | 1.41 | 2.09 | 7.01 | 1.41 | 9.35
QA Specialist | 0.93 | 3.51 | 1.41 | 2.09 | 7.01 | 1.41 | 9.35
Total Staff Months | 16.83 | 70.19 | 28.18 | 42.06 | 140.43 | 28.15 | 185.41
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 12: Secure Communications Package Responsibility Matrix


5. Graphics Extension Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.08 | 0.11 | 0.04 | 0.07 | 0.22 | 0.04 | 0.35
Asst Project Manager | 0.08 | 0.11 | 0.04 | 0.07 | 0.22 | 0.04 | 0.35
Chief Programmer | 0.21 | 1.31 | 0.87 | 0.66 | 2.84 | 0.30 | 3.36
Secretary | 0.09 | 0.11 | 0.04 | 0.07 | 0.22 | 0.04 | 0.36
Systems Engineer/Analyst | 0.29 | 2.30 | 0.44 | 0.66 | 3.39 | 0.40 | 4.08
Requirements Analyst | 0.45 | 0.87 | 0.17 | 0.26 | 1.31 | 0.17 | 1.93
Technical Team Lead | 0.15 | 0.98 | 0.57 | 0.72 | 2.27 | 0.44 | 2.87
Programmer | 0.15 | 0.98 | 1.31 | 0.79 | 3.08 | 0.44 | 3.68
Tester | 0.24 | 0.44 | 0.22 | 1.97 | 2.62 | 1.31 | 4.17
Help Desk Technician | - | - | - | 0.07 | 0.07 | 0.09 | 0.15
Training Specialist | - | - | - | - | - | 0.09 | 0.09
Installation Specialist | - | - | - | 0.13 | 0.13 | 0.13 | 0.26
Documentation Specialist | 0.40 | 2.19 | 0.09 | 0.20 | 2.47 | 0.22 | 3.08
HR Specialist | 0.14 | 0.22 | 0.04 | 0.07 | 0.33 | 0.09 | 0.56
Network Support Specialist | - | 0.11 | 0.04 | 0.13 | 0.28 | 0.09 | 0.37
Procurement Specialist | 0.05 | 0.11 | 0.04 | 0.07 | 0.22 | 0.04 | 0.32
CM Specialist | 0.14 | 0.54 | 0.22 | 0.33 | 1.09 | 0.22 | 1.45
QA Specialist | 0.14 | 0.54 | 0.22 | 0.33 | 1.09 | 0.22 | 1.45
Total Staff Months | 2.62 | 10.94 | 4.36 | 6.57 | 21.87 | 4.36 | 28.85
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 13: Graphics Extension Package Responsibility Matrix


6. Word Processor Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.26 | 0.36 | 0.16 | 0.23 | 0.75 | 0.16 | 1.18
Asst Project Manager | 0.26 | 0.36 | 0.16 | 0.23 | 0.75 | 0.16 | 1.18
Chief Programmer | 0.72 | 4.47 | 2.97 | 2.22 | 9.66 | 1.04 | 11.43
Secretary | 0.33 | 0.36 | 0.16 | 0.23 | 0.75 | 0.16 | 1.24
Systems Engineer/Analyst | 0.98 | 7.80 | 1.50 | 2.22 | 11.53 | 1.34 | 13.84
Requirements Analyst | 1.50 | 2.97 | 0.59 | 0.88 | 4.44 | 0.59 | 6.53
Technical Team Lead | 0.52 | 3.33 | 1.93 | 2.45 | 7.71 | 1.50 | 9.73
Programmer | 0.52 | 3.33 | 4.47 | 2.68 | 10.48 | 1.50 | 12.50
Tester | 0.82 | 1.50 | 0.75 | 6.69 | 8.95 | 4.47 | 14.24
Help Desk Technician | - | - | - | 0.23 | 0.23 | 0.29 | 0.52
Training Specialist | - | - | - | - | - | 0.29 | 0.29
Installation Specialist | - | - | - | 0.46 | 0.46 | 0.46 | 0.91
Documentation Specialist | 1.34 | 7.44 | 0.29 | 0.65 | 8.39 | 0.75 | 10.48
HR Specialist | 0.49 | 0.75 | 0.16 | 0.23 | 1.14 | 0.29 | 1.93
Network Support Specialist | - | 0.36 | 0.16 | 0.46 | 0.98 | 0.29 | 1.27
Procurement Specialist | 0.16 | 0.36 | 0.16 | 0.23 | 0.75 | 0.16 | 1.08
CM Specialist | 0.49 | 1.86 | 0.75 | 1.11 | 3.72 | 0.75 | 4.96
QA Specialist | 0.49 | 1.86 | 0.75 | 1.11 | 3.72 | 0.75 | 4.96
Total Staff Months | 8.88 | 37.12 | 14.99 | 22.30 | 74.41 | 14.99 | 98.27
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 14: Word Processor Package Responsibility Matrix


7. Project Management Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.04 | 0.05 | 0.02 | 0.03 | 0.11 | 0.02 | 0.17
Asst Project Manager | 0.04 | 0.05 | 0.02 | 0.03 | 0.11 | 0.02 | 0.17
Chief Programmer | 0.10 | 0.66 | 0.44 | 0.33 | 1.42 | 0.15 | 1.68
Secretary | 0.05 | 0.05 | 0.02 | 0.03 | 0.11 | 0.02 | 0.18
Systems Engineer/Analyst | 0.14 | 1.15 | 0.22 | 0.33 | 1.69 | 0.20 | 2.03
Requirements Analyst | 0.22 | 0.44 | 0.09 | 0.13 | 0.66 | 0.09 | 0.97
Technical Team Lead | 0.08 | 0.49 | 0.28 | 0.36 | 1.14 | 0.22 | 1.43
Programmer | 0.08 | 0.49 | 0.66 | 0.39 | 1.54 | 0.22 | 1.84
Tester | 0.12 | 0.22 | 0.11 | 0.98 | 1.31 | 0.66 | 2.08
Help Desk Technician | - | - | - | 0.03 | 0.03 | 0.04 | 0.08
Training Specialist | - | - | - | - | - | 0.04 | 0.04
Installation Specialist | - | - | - | 0.07 | 0.07 | 0.07 | 0.13
Documentation Specialist | 0.20 | 1.09 | 0.04 | 0.10 | 1.23 | 0.11 | 1.54
HR Specialist | 0.07 | 0.11 | 0.02 | 0.03 | 0.16 | 0.04 | 0.28
Network Support Specialist | - | 0.05 | 0.02 | 0.07 | 0.14 | 0.04 | 0.19
Procurement Specialist | 0.03 | 0.05 | 0.02 | 0.03 | 0.11 | 0.02 | 0.16
CM Specialist | 0.07 | 0.27 | 0.11 | 0.16 | 0.55 | 0.11 | 0.73
QA Specialist | 0.07 | 0.27 | 0.11 | 0.16 | 0.55 | 0.11 | 0.73
Total Staff Months | 1.31 | 5.46 | 2.18 | 3.28 | 10.93 | 2.19 | 14.42
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 15: Project Management Package Responsibility Matrix


8. GPS Navigation Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.36 | 0.52 | 0.20 | 0.31 | 1.02 | 0.20 | 1.59
Asst Project Manager | 0.36 | 0.52 | 0.20 | 0.31 | 1.02 | 0.20 | 1.59
Chief Programmer | 0.99 | 6.14 | 4.09 | 3.07 | 13.30 | 1.44 | 15.73
Secretary | 0.44 | 0.52 | 0.20 | 0.31 | 1.02 | 0.20 | 1.66
Systems Engineer/Analyst | 1.35 | 10.75 | 2.05 | 3.07 | 15.87 | 1.85 | 19.07
Requirements Analyst | 2.09 | 4.09 | 0.82 | 1.22 | 6.14 | 0.82 | 9.06
Technical Team Lead | 0.74 | 4.61 | 2.66 | 3.38 | 10.65 | 2.05 | 13.43
Programmer | 0.74 | 4.61 | 6.14 | 3.68 | 14.43 | 2.05 | 17.22
Tester | 1.11 | 2.05 | 1.02 | 9.22 | 12.29 | 6.14 | 19.54
Help Desk Technician | - | - | - | 0.31 | 0.31 | 0.41 | 0.72
Training Specialist | - | - | - | - | - | 0.41 | 0.41
Installation Specialist | - | - | - | 0.61 | 0.61 | 0.61 | 1.22
Documentation Specialist | 1.85 | 10.25 | 0.41 | 0.92 | 11.58 | 1.02 | 14.45
HR Specialist | 0.67 | 1.02 | 0.20 | 0.31 | 1.53 | 0.41 | 2.61
Network Support Specialist | - | 0.52 | 0.20 | 0.61 | 1.33 | 0.41 | 1.74
Procurement Specialist | 0.25 | 0.52 | 0.20 | 0.31 | 1.02 | 0.20 | 1.47
CM Specialist | 0.67 | 2.56 | 1.02 | 1.54 | 5.13 | 1.02 | 6.82
QA Specialist | 0.67 | 2.56 | 1.02 | 1.54 | 5.13 | 1.02 | 6.82
Total Staff Months | 12.29 | 51.24 | 20.44 | 30.70 | 102.39 | 20.47 | 135.15
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 16: GPS Navigation Package Responsibility Matrix


9. Compiler Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.09 | 0.13 | 0.05 | 0.08 | 0.25 | 0.05 | 0.39
Asst Project Manager | 0.09 | 0.13 | 0.05 | 0.08 | 0.25 | 0.05 | 0.39
Chief Programmer | 0.24 | 1.48 | 0.99 | 0.74 | 3.21 | 0.34 | 3.80
Secretary | 0.11 | 0.13 | 0.05 | 0.08 | 0.25 | 0.05 | 0.41
Systems Engineer/Analyst | 0.33 | 2.59 | 0.50 | 0.74 | 3.83 | 0.44 | 4.60
Requirements Analyst | 0.50 | 0.99 | 0.20 | 0.29 | 1.48 | 0.20 | 2.18
Technical Team Lead | 0.18 | 1.11 | 0.64 | 0.81 | 2.57 | 0.50 | 3.24
Programmer | 0.18 | 1.11 | 1.48 | 0.89 | 3.49 | 0.50 | 4.16
Tester | 0.27 | 0.50 | 0.25 | 2.22 | 2.96 | 1.48 | 4.71
Help Desk Technician | - | - | - | 0.08 | 0.08 | 0.10 | 0.18
Training Specialist | - | - | - | - | - | 0.10 | 0.10
Installation Specialist | - | - | - | 0.15 | 0.15 | 0.15 | 0.29
Documentation Specialist | 0.44 | 2.47 | 0.10 | 0.22 | 2.79 | 0.25 | 3.48
HR Specialist | 0.16 | 0.25 | 0.05 | 0.08 | 0.37 | 0.10 | 0.63
Network Support Specialist | - | 0.13 | 0.05 | 0.15 | 0.32 | 0.10 | 0.42
Procurement Specialist | 0.06 | 0.13 | 0.05 | 0.08 | 0.25 | 0.05 | 0.36
CM Specialist | 0.16 | 0.62 | 0.25 | 0.37 | 1.23 | 0.25 | 1.64
QA Specialist | 0.16 | 0.62 | 0.25 | 0.37 | 1.23 | 0.25 | 1.64
Total Staff Months | 2.95 | 12.37 | 4.94 | 7.42 | 24.72 | 4.93 | 32.61
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 17: Compiler Package Responsibility Matrix


10. Debugger and Test Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.28 | 0.39 | 0.15 | 0.23 | 0.77 | 0.15 | 1.21
Asst Project Manager | 0.28 | 0.39 | 0.15 | 0.23 | 0.77 | 0.15 | 1.21
Chief Programmer | 0.75 | 4.66 | 3.11 | 2.33 | 10.10 | 1.09 | 11.93
Secretary | 0.33 | 0.39 | 0.15 | 0.23 | 0.77 | 0.15 | 1.25
Systems Engineer/Analyst | 1.03 | 8.16 | 1.55 | 2.33 | 12.04 | 1.40 | 14.46
Requirements Analyst | 1.58 | 3.11 | 0.62 | 0.93 | 4.66 | 0.62 | 6.87
Technical Team Lead | 0.56 | 3.49 | 2.02 | 2.56 | 8.08 | 1.55 | 10.18
Programmer | 0.56 | 3.49 | 4.66 | 2.80 | 10.95 | 1.55 | 13.06
Tester | 0.84 | 1.55 | 0.78 | 6.99 | 9.32 | 4.66 | 14.82
Help Desk Technician | - | - | - | 0.23 | 0.23 | 0.31 | 0.54
Training Specialist | - | - | - | - | - | 0.31 | 0.31
Installation Specialist | - | - | - | 0.46 | 0.46 | 0.46 | 0.93
Documentation Specialist | 1.40 | 7.77 | 0.31 | 0.70 | 8.78 | 0.78 | 10.95
HR Specialist | 0.51 | 0.78 | 0.15 | 0.23 | 1.16 | 0.31 | 1.99
Network Support Specialist | - | 0.39 | 0.15 | 0.46 | 1.00 | 0.31 | 1.31
Procurement Specialist | 0.19 | 0.39 | 0.15 | 0.23 | 0.77 | 0.15 | 1.12
CM Specialist | 0.51 | 1.94 | 0.78 | 1.16 | 3.89 | 0.78 | 5.18
QA Specialist | 0.51 | 1.94 | 0.78 | 1.16 | 3.89 | 0.78 | 5.18
Total Staff Months | 9.32 | 38.83 | 15.53 | 23.29 | 77.65 | 15.52 | 102.50
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 18: Debugger and Test Package Responsibility Matrix


11. Electronic Inventory and Tracking Package Responsibility Matrix (Staff Months)

Function | Reqts | Design | Code & Unit Test | Integ. Test | Subtotal COCOMO | Sys. Test | Total
Project Manager | 0.21 | 0.29 | 0.12 | 0.17 | 0.58 | 0.12 | 0.90
Asst Project Manager | 0.21 | 0.29 | 0.12 | 0.17 | 0.58 | 0.12 | 0.90
Chief Programmer | 0.56 | 3.48 | 2.32 | 1.74 | 7.54 | 0.81 | 8.91
Secretary | 0.24 | 0.29 | 0.12 | 0.17 | 0.58 | 0.12 | 0.94
Systems Engineer/Analyst | 0.77 | 6.09 | 1.16 | 1.74 | 8.99 | 1.04 | 10.80
Requirements Analyst | 1.18 | 2.32 | 0.46 | 0.70 | 3.48 | 0.46 | 5.13
Technical Team Lead | 0.42 | 2.61 | 1.51 | 1.92 | 6.03 | 1.16 | 7.61
Programmer | 0.42 | 2.61 | 3.48 | 2.09 | 8.18 | 1.16 | 9.75
Tester | 0.63 | 1.16 | 0.58 | 5.22 | 6.96 | 3.48 | 11.06
Help Desk Technician | - | - | - | 0.17 | 0.17 | 0.23 | 0.41
Training Specialist | - | - | - | - | - | 0.23 | 0.23
Installation Specialist | - | - | - | 0.35 | 0.35 | 0.35 | 0.70
Documentation Specialist | 1.04 | 5.80 | 0.23 | 0.52 | 6.56 | 0.58 | 8.18
HR Specialist | 0.38 | 0.58 | 0.12 | 0.17 | 0.87 | 0.23 | 1.48
Network Support Specialist | - | 0.29 | 0.12 | 0.35 | 0.75 | 0.23 | 0.99
Procurement Specialist | 0.14 | 0.29 | 0.12 | 0.17 | 0.58 | 0.12 | 0.83
CM Specialist | 0.38 | 1.45 | 0.58 | 0.87 | 2.90 | 0.58 | 3.86
QA Specialist | 0.38 | 1.45 | 0.58 | 0.87 | 2.90 | 0.58 | 3.86
Total Staff Months | 6.96 | 29.00 | 11.59 | 17.39 | 57.99 | 11.60 | 76.54
% of COCOMO Allocation | 12% | 50% | 20% | 30% | 100% | 20% | 132%

Table 19: Electronic Inventory and Tracking Package Responsibility Matrix


3. Managerial Process
3.1 Management Objectives & Priorities
Management's objectives and priorities for this project are to deliver a high-performance, reliable product that provides a unified tool for accessing crime data. They also include managing the entire process of extending the eleven software packages. Key management activities are schedule tracking, resource control, risk management, and budget control, which together ensure that we deliver a quality product on schedule and within budget. The primary priority is to deliver the project on schedule. As crime rates continue to rise, technology such as CDCAAS, with its advanced data aggregation and assimilation tools, will become increasingly critical for city, county, and state organizations seeking to close crime cases faster than before. To achieve this, management will organize, staff, lead, and control the human, financial, and technical resources required to accomplish the project within the specified time, technical specifications, and budget. Management will also be responsible for ensuring that all risks are identified and that mitigation plans are developed and implemented.

3.1.1 Goals & Objectives


The CDCAAS project goals are conceived to be consistent with, and supportive of, the organization's goals and objectives. They are divided into three subcategories: strategic corporate goals, information technology and related goals, and project-specific tactical goals.

3.1.1.1 Strategic Corporate Goals


- Reduce the time and cost of developing software; be more effective in providing timely and cost-effective delivery capabilities to the customer.
- Create methods for seeking and exchanging ideas and best practices.
- Ensure the availability, responsiveness, and professionalism that result in the delivery of high-quality services on a consistent basis.
- Build relationships with stakeholders, communicate effectively to gather and articulate a technical vision, and take leadership in producing a strategic plan for achieving it.
- Create and maintain the system vision and target architectures for systems in a domain; capture evolution priorities and provide recommendations.
- Be flexible in adapting to newly emerging technologies.
- Understand risk and use it as a tool to manage priorities, issues, and deliverables; be aware of risk and strive to mitigate it to appropriate levels.
- Keep resources up to date and maintain a vision of creating products with new, distinct features to stay ahead of competitors.
- Leverage the web in support of geographically distributed development.


- Invest time and resources in research and development to identify new markets and users to penetrate with innovative technologies.


3.1.1.2 Information Technology & Related Goals


The following are the goals related to IT and other related areas:
- Build a technology-enhanced organization.
- Provide interfaces to most technologies; the interfaces will allow users to access web-based and stand-alone tools.
- Follow widely approved commercial and industrial standards and practices throughout the software development life cycle.
- Maintain a Capability Maturity Model (CMM) Level 2 development process while working toward CMM Level 3.
- Take advantage of software component and model-based development via reuse and COTS.
- Develop controls to ensure the confidentiality, integrity, and availability of all processes and information.
- Ensure effective communication and business relationship management with all stakeholders.
- Develop the CDCAAS application to follow open-architecture standards by conforming to the compatibility, interface, and portability standards it adheres to.

3.1.1.3 Project Specific Tactical Goals


- Extend the existing eleven packages with more flexible capabilities, as stated in the proposal.
- Develop and promote CDCAAS as the company's next major product line.
- Within five years of completing CDCAAS development, provide CDCAAS as the solution of choice to the state governments of at least 10 of the 50 U.S. states.
- Develop a system with high reliability and availability and no environmental constraints.
- Take advantage of tailor-made COTS products in the CDCAAS system suite for faster development and reduced time to market.
- Move local law enforcement, the federal government, and the military toward adopting CDCAAS as their worldwide crime data collection, aggregation, and assimilation system of choice.
- Increase penetration rates into new markets abroad.


3.1.2 Management Priorities


A product that is robust enough to counter ever-increasing security vulnerabilities, and flexible enough to be customized to an organization's changing needs without compromising hardware utilization, speed, or accuracy, deserves to be called dependable. The will to deliver such a revolutionary product on schedule and within budget is what drives management. This resolve is further solidified by the strong advance order of 700 systems. The XYZ notebook computer with the CDCAAS software application suite will set a new benchmark for how application software can enhance hardware capabilities and functionality.

3.1.2.1 Crisis Management


A set of guidelines for handling the various kinds of crises that may be encountered is in place:
- A backup center storing a copy of all source code and other resources will be maintained at a separate location in case of a catastrophe at the main development site.
- For situations in which access to the office is blocked by snow, ice, or other obstacles, a Virtual Private Network will be in place so that employees can connect to the network and work from home.
- HR will maintain a phone list with team members' contact information, along with a central call-in number for employees to check in with in case of a local catastrophe such as a natural disaster or terrorist attack.

3.1.2.2 Professional Staff Development


Funds will be set aside to support college tuition for 25% of the employees. In addition, funds will be allocated for each staff member to take the equivalent of five days per year of training relevant to their current assignment or job function. Tuition for college classes will be reimbursed at 100% for an "A" and 80% for a "B"; no reimbursement will be given for grades lower than a B. There will be a $5,000 cap on tuition paid per employee per calendar year, and a $3,000 limit on training for each employee, assuming that 75% of the staff will actually take the full amount of training. A certification and knowledge-management program will be put in place so that staff members interested in a domain outside their current work can still fulfill its requirements. Staff losses due to attrition or other causes can be countered by keeping a certain percentage of bench staff available. Clear statements of the requirements and expectations for each position on the team will be made available to all employees, so that every employee understands what is necessary at each position and level of the organization in order to advance.
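The tuition rules above reduce to simple arithmetic: a grade-based rate (100% for an "A", 80% for a "B", nothing below that) applied to the class cost, capped at $5,000 per employee per calendar year. A minimal sketch of that calculation follows; the function and variable names are ours for illustration and are not part of the plan.

```python
# Illustrative sketch of the tuition-reimbursement rules in Section 3.1.2.2.
# The names below (reimbursement, TUITION_CAP, etc.) are hypothetical helpers.

TUITION_CAP = 5000  # USD, per employee, per calendar year
REIMBURSEMENT_RATE = {"A": 1.00, "B": 0.80}  # grades below a B: no reimbursement

def reimbursement(grade: str, tuition: float, paid_so_far: float = 0.0) -> float:
    """Return the amount reimbursed for one class, respecting the annual cap."""
    rate = REIMBURSEMENT_RATE.get(grade, 0.0)
    amount = tuition * rate
    remaining_cap = max(0.0, TUITION_CAP - paid_so_far)
    return min(amount, remaining_cap)
```

For example, a $3,000 class completed with a "B" would be reimbursed at 80%, and an "A" student who had already received $3,000 in the same calendar year would have a $4,000 class reimbursed only up to the $2,000 of cap remaining.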


3.2 Assumptions, Dependencies and Constraints


This section describes the assumptions, dependencies, and constraints for the CDCAAS project. Assumptions are circumstances and events that need to occur for the project to be successful but are outside the control of the CDCAAS project team. Constraints are factors that restrict, limit, or regulate the project.


3.2.1 Assumptions
The following assumptions are accepted as true for the CDCAAS project:
- The Corporation will provide all necessary resources, including staff, software development tools, hardware, budget, and staff training.
- The Universal 2009-B microprocessor will be developed by Universal's microprocessor division and delivered by February 2009 as scheduled.
- The alpha test facility in Fairfax, Virginia will be ready for alpha testing by July 25, 2010.
- The kernel developed by our Universal Microprocessor Division will be backward compatible.
- Each software package will grow by approximately 20% in new features and 20% in size.
- The outsourced package will be delivered on schedule with acceptable quality.
- The COTS products will be delivered on time with the expected functionality and quality.
- Senior management will support the CDCAAS project fully and in a timely manner.
- Capability Maturity Model Integration (CMMI-DEV) level 3 training will be provided by the Corporation to any staff members who need it.

3.2.2 Dependencies
The following dependencies exist for the CDCAAS project and must be monitored in a timely manner to avoid schedule delays:
- The Universal 2009-B microprocessor will be developed and delivered by February 2009 as scheduled.
- The alpha test facility in Fairfax, Virginia will be ready for alpha testing by July 25, 2010.
- The COTS products will be delivered on time with the expected functionality and quality.
- The kernel software developed by our systems engineering department will be ready in time for scheduled integration testing.
- Ivan Industries will deliver the GPS Navigation System with the expected functionality in time for the scheduled integration system test.

3.2.3 Constraints
The following constraints have been identified for producing an effective CDCAAS system:
- The CDCAAS system must operate on the new System XYZ computer system with acceptable performance, security, reliability, and functionality.
- The CDCAAS system must be fully compliant with federal regulations and standards for securing data and protecting personal information.
- The CDCAAS system must offer the specified level of auditability to protect data and ensure its accuracy.
- The CDCAAS system must provide a high level of concurrency and load balancing among all its deployments.


3.3 Risk Management


This section describes the major risks associated with CDCAAS and, for each, a corresponding mitigation strategy. Because projects are undertaken to create a unique product or service, risk is an intrinsic part of project work. The goals of CDCAAS project risk management are to:
- Identify factors, or risks, that can negatively impact the project.
- Classify each possible risk into a risk category.
- Assess each risk and predict its possible impact on the project.
- Maximize positive outcomes and minimize negative outcomes by closely monitoring all accepted risks.

CDCAAS risks fall into the following categories:
- Personnel Risks
- Application Development Risks
- Technical Risks
- Contractual Risks


The table below lists the significant CDCAAS project risks, their categories, and the corresponding mitigation strategies.

R001 - Personnel
Risk: Shortage of technical personnel (programmers).
Mitigation: Programmers have been assigned to tasks with gaps between subtasks, so they can be reassigned to applications where a shortage of personnel develops.

R002 - Personnel
Risk: Shortage of senior personnel (team leads).
Mitigation: Programmers with matching management skill sets are expected to take on the duties of team leads; other programmers will be shuffled to cover the senior programmers' tasks.

R003 - Application development
Risk: Team members have limited experience with COTS and reusable component software.
Mitigation: Team leads and experts will be sent for special training with the vendors or with experienced developers of COTS and reusable software. Other programmers will then be trained by the team leads and experts, and reference manuals will be created for programmers to consult.

R004 - Application development
Risk: The kernel is built on the new Universal 2006-G processor, and CDCAAS will be among the first systems to operate on it.
Mitigation: Request that the kernel be fully tested on the Universal 2006-G before it is furnished to CDCAAS; request full training on the kernel and the Universal 2006-G for the technical staff; and establish a kernel and Universal 2006-G support service for team members.

R005 - Technical
Risk: Many CDCAAS packages will use a visualization tool, and it is possible that not all business logic components can be readily identified.
Mitigation: Applications that use visualization tools will be evaluated at regular intervals to determine whether any new or existing functionality can be categorized as a business logic component and whether the identified component can be offered as a service.

R006 - Technical
Risk: Not all presentation components of the CDCAAS packages may be identified before integration.
Mitigation: Any presentation components identified after the start of integration will be evaluated to determine their criticality. Critical components will be added during integration; non-critical components will be omitted or deferred to later versions of CDCAAS.

R007 - Contractual
Risk: Product scope of work.
Mitigation: The better defined the requirements, the lower the risk and the greater the opportunity to establish fixed time and cost parameters.

R008 - Contractual
Risk: Product safety and liability.
Mitigation: The higher the concern about the risk of failure, the higher the level of integrity required; this will be reflected in cost and schedule increases.

R009 - Contractual
Risk: Acquirer's level of control.
Mitigation: Fixed-price contracts place the risk of software development on the shoulders of the supplier. To mitigate this risk, either ensure that the requirements are well defined or insist on a time-and-materials basis for compensation.

Table 20: Risk Management Description and Mitigation Strategies

3.4 Monitoring and Controlling Mechanism


This subsection describes the monitoring and controlling mechanisms to be used in developing CDCAAS, including the measurement, reporting, and review techniques used in collecting, analyzing, and reporting information on project schedule, budget, quality, productivity, and progress. The subsection is divided into parts, one for each of the project attributes being measured.

3.4.1 Schedule
The strategy for monitoring and controlling the CDCAAS schedule is as follows:
- Create a planned schedule for use as a baseline, using COCOMO and bottom-up estimates from the WBS work packages.
- Conduct a biweekly audit session to assess the health of the project.


- Identify inch-stones to help indicate progress toward the milestones created by the Project Manager.
- Conduct weekly internal development team meetings to assess progress and have the teams take self-correcting actions to address any variances.
- Conduct monthly official team meetings with the CDCAAS Project Manager, Programming and Analysis Chief, and SQA Chief, during which actual work accomplished is determined and compared against planned work.
- Based on the comparisons made during these meetings, the CDCAAS PM takes appropriate action to address variances.
- Monitor outside vendor activities more closely than internal activities, and make corrections more swiftly.

A master schedule for the CDCAAS project is presented in section 5.5. Work will be tracked at two levels: progress made in completing WBS work packages (micro level) and accomplishment of major milestones (macro level).

3.4.1.1 Progress of the Team (Micro Level)


The micro-level strategy for tracking team progress is as follows:
- Assign start and end dates for all tasks.
- Collect timesheets recording each team member's status on activities and events, and require timely reporting of this data; provide positive reinforcement to team members who deliver timesheet data on time.
- Conduct internal, informal team meetings of approximately one hour to review and assess progress, with a prepared agenda and minutes recording the concerns addressed and questions answered.
- Track the status of inch-stones and milestones.
- Conduct biweekly audits of the progress made by each team member.
- Discuss issues such as progress versus cost estimates, requirements measurement for scope control, and overall quality measurement in productivity.
- Prepare criteria for measuring the progress of the entire project using the project charter.
- Prepare for audits by creating a written assessment of the team's progress in completing the work packages using the 0-50-100% system (not started: 0%; started but not yet complete: 50%; complete: 100%).
- The CDCAAS SQA Chief is responsible for reviewing team members' timesheets against the accounting and payroll reports, examining the teams' detailed work, and verifying the accuracy of the teams' assessments.
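The 0-50-100% assessment above can be rolled up mechanically across a set of work packages. The following is a minimal sketch of that rollup; the function name and status labels are illustrative assumptions, not part of the plan.

```python
def work_package_progress(statuses):
    """Roll up work-package status using the 0-50-100 rule:
    'not started' earns 0%, 'in progress' earns 50%, and
    'complete' earns 100%. Returns the overall percent complete."""
    credit = {"not started": 0, "in progress": 50, "complete": 100}
    if not statuses:
        return 0.0
    # Average the earned credit across all work packages.
    return sum(credit[s] for s in statuses) / len(statuses)
```

A team with two complete packages, one in progress, and one not started would report 62.5% complete under this rule.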

3.4.1.2 CDCAAS Project's Progress (Macro Level)


The results of the biweekly status audits will serve as the success indicators for the CDCAAS project. Progress will be assessed by:


- Computing the earned value schedule variance (BCWP - BCWS) by taking the difference between the budgeted cost of work performed (BCWP) and the budgeted cost of work scheduled (BCWS).
- Recording which major milestones have been accomplished or missed to date by the internal development teams.
- Recording which major milestones have been accomplished or missed to date by Ivan Industries, as compared with the milestone schedule specified in the subcontract.

Schedule status is classified as follows:
- On schedule: earned value schedule variance within 5%. No corrective action will be taken.
- Behind schedule: negative variance in excess of 5%. This will be recorded and discussed in the PM's monthly meetings.
- Ahead of schedule: positive variance in excess of 5%. This will be recorded and discussed in the monthly meetings.

Plan of action in case of a missed major milestone:
- Treat the missed milestone as a negative indicator.
- Aim to achieve the milestone within the next 30 days while keeping the remaining tasks on schedule; in case of failure, an additional 30 days will be provided.
- Shift resources or take other corrective actions as necessary.

Decisions to pursue this plan of action will be made by the Project Manager in consultation with the SQA Chief, the Admin Chief, and the Procurement Specialist (for Ivan Industries).
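The three-way schedule classification above can be expressed directly in code. This is an illustrative sketch only; it assumes, as the measures tables later in this plan do, that the 5% tolerance is taken as a fraction of BCWP.

```python
def schedule_status(bcwp, bcws):
    """Classify schedule health from earned value figures using the
    project's 5% tolerance band: within 5% of BCWP is on schedule,
    a larger negative variance is behind, a larger positive is ahead."""
    sv = bcwp - bcws              # earned value schedule variance
    threshold = 0.05 * bcwp       # assumed 5%-of-BCWP tolerance
    if abs(sv) <= threshold:
        return "on schedule"
    return "ahead of schedule" if sv > 0 else "behind schedule"
```

Only the "behind schedule" outcome triggers discussion in the PM's monthly meetings; the other two are simply recorded.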

3.4.2 Budget
The Project Manager is responsible for establishing the project budget, which must be approved by the Program Manager of New Systems Development, the VP of the SW Division, and the Director of Finance. The Project Manager may modify the budget as needed to accomplish the project, and budget revisions must be approved by the same three parties. The CDCAAS budget will be monitored and controlled as follows:
- Personnel effort, financial performance, and equipment resources are the measures of the budget.
- Staff skills and salaries will be reviewed against project requirements and the planned budget.
- Estimates of project spending, facilities, equipment, and materials feed the planned budget.
- Microsoft Project and Microsoft Office will be used to plan the budget, capture actual effort, and estimate project completion.
- Staff cost is reported by the Payroll department; equipment and facilities costs are computed from actual costs or forecasts of future needs.


- Earned value variances of 5% serve as the baseline for comparing actual progress to the budget: a variance within 5% is considered on budget, a positive variance in excess of 5% is under budget, and a negative variance in excess of 5% is over budget.
- COCOMO and bottom-up estimates from the WBS work packages are used in Gantt and milestone charts to establish a planned budget as a baseline and to set the inch-stones and milestones of the project.
- Budget is allocated to each project function and task to establish cost accounting for the project.
- Weekly internal development team meetings and semi-monthly official team meetings will review costs, expenditures, and inch-stone and milestone progress against the budget.
- All costs relating to the project's activities (time cards, overhead reports, and purchase orders for equipment, software, and materials) will be collected, compared with the budget, and discussed with the development team members (the details of these discussions and reports are the same as in the schedule section 3.4.1 above).

In detail, the weekly internal development team meeting will review and assess progress on work packages, inch-stones, and milestones to determine whether the project is on, over, or under budget, and will take corrective action if needed. In the semi-monthly meeting, attended by the CDCAAS Project Manager, each team will present its budget status in detail, indicating the percentage of each work package completed against the budget plan. The earned value variance is used to evaluate whether actual progress is on budget. If the project is behind, the Project Manager will make the development team aware of the status; review staffing, schedule, and equipment costs; and take appropriate corrective action such as reducing purchases, increasing working time, or rescheduling the completion date. If the negative variance exceeds 15%, the PM must report to the Program Director to seek other options.

In addition to the meetings, time frames and budget diagrams will be visible to the core team for reviewing project status. The daily kick-off team meeting (15 to 30 minutes) is an opportunity to identify issues, risks, and other relevant information, to communicate progress, and to adjust and balance workloads among team members to meet the budget plan.
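The budget decision rules above, including the 15% escalation threshold, can be sketched as a single classification function. This is illustrative only; it assumes both thresholds are taken as fractions of BCWP, which the plan does not state explicitly.

```python
def budget_status(bcwp, acwp):
    """Classify cost performance per the plan's rules: a cost variance
    within 5% is on budget, a positive variance beyond 5% is under
    budget, a negative variance beyond 5% is over budget, and a
    negative variance beyond 15% must be escalated to the Program
    Director. Thresholds as fractions of BCWP are an assumption."""
    cv = bcwp - acwp                   # earned value cost variance
    if cv < -0.15 * bcwp:
        return "over budget - escalate to Program Director"
    if cv < -0.05 * bcwp:
        return "over budget"
    if cv > 0.05 * bcwp:
        return "under budget"
    return "on budget"
```

Note that the escalation check must come before the plain over-budget check, since any variance worse than 15% is also worse than 5%.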

3.4.3 Quality Assurance


The following list reiterates and describes some of the mechanisms used to monitor and control Quality in CDCAAS.

3.4.3.1 Design Walk-through


- The Software Quality Assurance Team participates in all design walk-throughs throughout the entire software development life cycle.
- The Software Quality Assurance Team is responsible for ensuring that peer and management reviews of the software design are conducted.
- The Project Manager is responsible for ensuring that a verifiable process is used to identify all action items generated during the review process.
- The Software Quality Assurance Team conducts an audit to ensure that all action items have been addressed.

3.4.3.2 Baseline Quality Reviews


Prior to any baseline release of executable code identified with an alphabetic revision ID, reviews will be conducted by the Software Quality Assurance Team. These reviews ensure that:

1. The code has been tested and meets module specifications, except as noted.
2. Any changes to applicable software module design documents have been identified.
3. Appropriate validation tests have been run.
4. The functionality of the baseline is documented.
5. All software design documentation complies with the Software Quality Assurance Plan.

3.4.3.3 Code Reviews


The Team Leads are responsible for assigning another engineer to review the code with the author. Such reviews are considered informal; their purpose is to pinpoint flaws in accuracy, style, and maintainability early on, as well as to promote sharing of knowledge and coding style among the staff.

3.4.3.4 External Reviews


The client and senior management will review the deliverables before final submission. The goal is to bring the documents to a version that is acceptable to the client, the external advisors monitoring the project, and senior management.

3.4.3.5 Inspections
- Conduct inspections to review requirements analysis results, requirements traceability matrices, preliminary designs, code, and test plans, cases, and procedures.
- Inspection participants will be the team lead, a QA engineer, and testers.
- Teams must familiarize themselves with the item to be inspected prior to the inspection.
- A defect log with a full description of the errors and problems found will be produced and inserted in the appropriate software development folder.
- The Software Quality Assurance Team will audit the defect log and the solutions suggested.
- Documented inspection records will be kept in a secure software development folder.

3.4.3.6 Corrective Action Process


CDCAAS will maintain a Corrective Action Process providing for:
- Logging bugs, flaws, and discrepancies
- Identifying and classifying the cause
- Defining a corrective action
- Tracking and verifying the corrective action
- Disposition of flaws
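One way to realize the log-classify-correct-verify cycle above is as a tracked record per defect. The sketch below is a hypothetical data structure, not the plan's actual design; all field and method names are our own.

```python
from dataclasses import dataclass, field

@dataclass
class CorrectiveAction:
    """Hypothetical corrective-action record: each defect is logged,
    classified by cause, assigned a corrective action, and tracked
    until it is verified and closed."""
    defect_id: str
    description: str
    cause_category: str = "unclassified"   # e.g. requirements, design, code
    corrective_action: str = ""
    verified: bool = False
    status: str = "open"
    history: list = field(default_factory=list)

    def transition(self, new_status):
        """Record each state change so an audit trail exists for SQA."""
        self.history.append((self.status, new_status))
        self.status = new_status

    def close(self):
        """Closure is only permitted once the corrective action is verified."""
        if not self.verified:
            raise ValueError("corrective action must be verified before closure")
        self.transition("closed")
```

The `history` list gives SQA the tracking evidence the process calls for, and the guard in `close()` enforces the verification step before disposition.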

3.4.4 Productivity
- Tasks will be broken down into subtasks and work packages (when appropriate) using the work breakdown structure for each team's defined piece of work.
- Data will be collected on the effort and time the in-house programming teams expend on each component.
- Data from the outsourcing company's billing documentation will be reviewed against the work breakdown structure and the earned value metrics reported.
- Data presented by each team during periodic status reviews will be discussed and evaluated for its effectiveness in helping gauge productivity.

3.4.4.1 Product Evaluations


Productivity for the CDCAAS will be evaluated through:
- Periodic status reviews
- Integration and interface commonalities
- Verification and validation techniques
- Documentation progress
- Unit and product test evaluation

Periodic status reviews will take place weekly or monthly depending on the requirements understanding, coding, debugging, and prototyping progress of the software unit. The status of the software units will be monitored by periodic (bimonthly or monthly) audits and inspections.

3.4.4.2 Corrective Action Plans


- The CDCAAS application will log unexpected bugs and errors that might arise from various factors, both hardware and software.
- The CDCAAS application ensures smooth functioning even in the event of a resource failure.
- The CDCAAS application provides appropriate alert messages and notifications to the user, with a step-by-step walk-through in the event of a problem.
- Productivity metrics for the CDCAAS project will be recorded for each component.
- Metrics for documenting interfaces and integration will be the responsibility of the in-house programming teams, since the chosen COTS and outsourced code will need to interoperate with the rest of the system in a way that appears seamless to the end user.
- Testing and defect metrics will be collected as part of the productivity metrics. Testing of the interfaces between the software units will be emphasized, since these plans should cascade naturally from the documentation on interfaces and the interface metrics.

Component | Requirements Review | Preliminary Design Review | Critical Design Review | Subsystem Test Review | Deployment Review
Database Management System | 10/28/08 | 12/30/08 | 03/03/09 | 07/14/09 | 09/15/09
Spreadsheet | 07/04/09 | 09/05/09 | 11/07/09 | 03/13/10 | 05/15/10
Requirements & Configuration Management | 12/27/08 | 02/28/09 | 05/02/09 | 09/05/09 | 11/07/09
Secure Communication | 10/25/08 | 12/27/08 | 02/28/09 | 05/02/09 | 08/21/09
Graphics Presentation | 04/02/09 | 06/04/09 | 08/06/09 | 11/25/09 | 01/09/10
Word Processor | 09/05/09 | 11/07/09 | 01/09/10 | 05/15/10 | 07/02/10
Project Management | 01/10/09 | 03/14/09 | 05/16/09 | 09/19/09 | 11/21/09
GPS Navigation | 06/04/09 | 08/06/09 | 10/08/09 | 02/11/10 | 04/15/10
Compiler | 10/08/09 | 12/10/09 | 02/11/10 | 06/17/10 | 08/19/10
Debugger & Test | 02/03/10 | 04/07/10 | 06/09/10 | 10/13/10 | 12/15/10
Electronic Inventory & Tracking | 02/03/10 | 04/07/10 | 06/09/10 | 10/13/10 | 12/15/10

Table 28: Productivity Review Schedule

Group | Appropriate Metrics
Individual Developers | Work effort distribution; estimated vs. actual task duration and effort; code covered by unit testing; number of defects found by unit testing; code and design complexity
Project Teams | Product size; work effort distribution; requirements status (number approved, implemented, and verified); percentage of test cases passed; estimated vs. actual duration between major milestones; estimated vs. actual staffing levels; number of defects found by integration and system testing; number of defects found by inspections; defect status; requirements stability; number of tasks planned and completed
Development Organization | Released defect levels; product development cycle time; schedule and effort estimating accuracy; reuse effectiveness; planned and actual cost

Table 29: Appropriate Metrics

3.4.5 Progress
The schedule, milestones, and reviews are good reference points for measuring progress. The customer will pay attention to the planned milestones and gates to be completed by their scheduled dates; the schedule reviews during status meetings will indicate whether these dates are being met. Management wants to know whether CDCAAS keeps budgeted expenditures within the expected variance range of actual expenditures. The team will conduct weekly progress status meetings to assess progress and problems. Once CDCAAS is in the system test phase, daily status meetings will be conducted to give team members up-to-date project status and to confirm that the target dates can be met at the current rate of progress. The project manager uses the metrics discussed in the subsections above to measure the project's progress. Schedules and budgets both play important roles, as does QA: the team must deliver not only on time and within budget, but also the right system built the right way. By keeping an eye on productivity as well, and taking corrective actions when necessary, the team will ensure that CDCAAS delivers the correct system on time and within budget.

3.4.5.1 Development Team Progress


The CDCAAS development team's progress will be monitored and controlled as follows:
- Conduct weekly status meetings with developers and testers.
- Conduct monthly CCB meetings with the customer to discuss issues raised during status meetings.
- Conduct daily status meetings beginning with the system test phase.
- Conduct daily test CCB meetings beginning with the system test phase.

The project schedule will be updated after each status meeting to reflect the most current development progress.

3.4.5.2 Project Progress


The following earned value measurements will be used to monitor project schedule progress:

Compute the earned value schedule variance:
SV = BCWP - BCWS
(SV: schedule variance; BCWP: budgeted cost of work performed; BCWS: budgeted cost of work scheduled)

Compute the earned value cost variance:
CV = BCWP - ACWP
(CV: cost variance; ACWP: actual cost of work performed)


Budgeted Cost of Work Performed (BCWP) will be calculated automatically by Microsoft Project as percent-complete activity status is collected and entered into the appropriate fields. Budgeted Cost of Work Scheduled (BCWS) is maintained automatically by Microsoft Project as time passes after a plan baseline is saved. Schedule Variance (SV) is calculated automatically by Microsoft Project as the BCWS and BCWP values change. Each measurement will be retrieved from Microsoft Project and published for the weekly and monthly status meetings. During the project status meeting, the project manager will monitor current progress on the following items and afterward update the schedule with the latest status:
- Which major milestones have been accomplished or missed to date by the internal development teams.
- Which milestones have been accomplished or missed to date by Ivan Industries, as compared with the milestone schedule specified in the subcontract.
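The mechanics Microsoft Project applies here can be sketched by hand: BCWS sums the budget of work that should be finished by the status date, and BCWP sums budget weighted by percent complete. The task schema below (dict keys, period numbering) is our illustrative assumption, not the project's actual data model.

```python
def earned_value(tasks, as_of):
    """Compute BCWS, BCWP, SV, and CV from a task list, mirroring what
    Microsoft Project does as percent-complete status is entered.
    Each task dict holds: budget (planned cost), scheduled_finish
    (planning period), pct_complete (0-100), and actual_cost."""
    bcws = sum(t["budget"] for t in tasks if t["scheduled_finish"] <= as_of)
    bcwp = sum(t["budget"] * t["pct_complete"] / 100 for t in tasks)
    acwp = sum(t["actual_cost"] for t in tasks)
    return {"BCWS": bcws, "BCWP": bcwp,
            "SV": bcwp - bcws, "CV": bcwp - acwp}
```

For instance, a $100 task that is done and a $200 task at 50% complete, both scheduled to finish by the status date, yield BCWS = 300, BCWP = 200, and SV = -100 (behind schedule).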

3.4.6 Measures
The following earned value measurements will be used to monitor project progress:
- SV: schedule variance
- CV: cost variance
- BCWP: budgeted cost of work performed
- BCWS: budgeted cost of work scheduled
- ACWP: actual cost of work performed

The earned value schedule variance is computed as SV = BCWP - BCWS, and the earned value cost variance as CV = BCWP - ACWP.

Life Cycle Phase


Requirements
Schedule Attribute Base Work completed WP done to date

Design
Work completed WP done to date

Code & Unit Test


Work completed WP done to date

Int. Test
Work completed WP done to date

System Test
Work completed WP done to date

Post Deploy Support


Installs completed Installs done to date

User Operations
Help desk use Hours support used to date

76

Crime Data Collection, Aggregation and Assimilation System (CDCAAS) Team Crime Busters SWE 625 - Professor Ken Nidiffer

Version 1.0 12/08/2008

Table 30 defines, for each life-cycle phase, the Cost, Quality, Productivity, and Progress measurement classes, each expressed as an attribute, a base measure, a derived measure, and a decision indicator.

Cost (all phases):
- Attribute: Expended dollars. Base: Actual cost to date. Derived: BCWP - ACWP. Decision indicator: CV over 5% of BCWP. Schedule is tracked alongside cost as BCWP - BCWS, with a decision indicator of SV over 5% of BCWP; for Post-Deploy Support the schedule measure is completed installs versus planned installs (indicator: over 5% fewer installs than expected), and for User Operations it is actual versus expected use (indicator: over 5% more use than expected).

Quality:
- Requirements: Attribute: Requirement volatility. Base: Requirements changed to date. Derived: New requirements / original requirements. Indicator: over 5% new requirements.
- Design: Attribute: Complexity of design. Base: Design complete to date. Derived: Design complete / design expected. Indicator: over 5% less than expected.
- Code & Unit Test, Integration Test, System Test, Post-Deploy Support, and User Operations: Attribute: Defects detected/reported. Base: Defects reported to date. Derived: Defects / SLOC. Indicator: over 5 defects per 1000 SLOC.

Productivity:
- Requirements: Attribute: Requirements completed. Base: Requirements completed to date. Derived: Requirements completed / requirements expected. Indicator: 5% less complete than expected.
- Design: Attribute: Design completed. Base: Design completed to date. Derived: Design completed / design expected. Indicator: 5% less complete than expected.
- Code & Unit Test: Attribute: SLOC completed. Base: SLOC complete to date. Derived: SLOC complete / SLOC expected. Indicator: 5% less complete than expected.
- Integration Test and System Test: Attribute: Rework required. Base: Recoding required to date. Derived: Amount of new code / original code. Indicator: over 10% redone.
- Post-Deploy Support: Attribute: Sites installed. Base: Sites installed to date. Derived: Sites installed / time. Indicator: over 10% fewer sites installed in the expected time.
- User Operations: Attribute: Trouble tickets received. Base: Tickets received to date. Derived: Tickets received / time. Indicator: over 10 tickets received per day.

Progress:
- Requirements, Design, and Code & Unit Test: Schedule projection met.
- Integration Test and System Test: Quality projection met.
- Post-Deploy Support: Attribute: Customer satisfaction. Base: Customer satisfaction survey results. Derived: Survey results / time. Indicator: customer satisfaction decreasing over time.
- User Operations: Attribute: Customer complaints. Base: Complaints reported to date. Derived: Complaints reported / time. Indicator: customer complaints increasing over time.

Table 30: Class vs. Phase Measurements
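The earned-value thresholds in Table 30 can be computed mechanically. The sketch below is an illustration, not project code (function and variable names are our own): it derives schedule variance (SV = BCWP - BCWS), cost variance (CV = BCWP - ACWP), and defect density, and flags the 5%-of-BCWP and 5-defects-per-1000-SLOC decision indicators from the table.

```python
# Illustrative earned-value decision indicators from Table 30.
# SV = BCWP - BCWS; CV = BCWP - ACWP. An indicator trips when the
# variance exceeds 5% of BCWP, or defect density exceeds 5 per KSLOC.

def schedule_indicator(bcwp: float, bcws: float) -> bool:
    """True when |SV| exceeds 5% of BCWP (schedule slip worth escalating)."""
    sv = bcwp - bcws
    return abs(sv) > 0.05 * bcwp

def cost_indicator(bcwp: float, acwp: float) -> bool:
    """True when |CV| exceeds 5% of BCWP (cost variance worth escalating)."""
    cv = bcwp - acwp
    return abs(cv) > 0.05 * bcwp

def quality_indicator(defects: int, sloc: int) -> bool:
    """True when defect density exceeds 5 defects per 1000 SLOC."""
    return defects / (sloc / 1000) > 5

if __name__ == "__main__":
    # Example: $95K earned against $100K planned and $103K actual cost.
    print(schedule_indicator(95_000, 100_000))  # SV = -5,000; threshold 4,750 -> True
    print(cost_indicator(95_000, 103_000))      # CV = -8,000 -> True
    print(quality_indicator(40, 10_000))        # 4 defects/KSLOC -> False
```

A status report would evaluate these indicators at each collection point and escalate any phase whose indicator trips.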

Schedule:
- Attribute collected: Work completed (work packages completed to date)
- Source of data: Development teams
- Collection schedule: Semimonthly
- How collected: Team status reports and timesheets
- Where stored: Microsoft Project and Timesheet system
- Verified by: QA Manager

Budget/Cost:
- Attribute collected: Expended dollars (expended dollars to date)
- Source of data: Development teams
- Collection schedule: Semimonthly
- How collected: Timesheets and Purchase Requests
- Where stored: CDCAAS Project Budget and Finance Spreadsheet
- Verified by: Assistant PM, Payroll and Accounting Department Manager

Quality:
- Attribute collected: Defects detected (defects detected to date)
- Source of data: Development and testing teams
- Collection schedule: Semimonthly
- How collected: Defect reporting tool, peer and quality reviews
- Where stored: Software Discrepancy Report, Unit Development Folder
- Verified by: Software Quality Assurance Team, Testing Team Manager

Productivity:
- Attribute collected: Rework required (rework required to date)
- Source of data: Development and testing teams
- Collection schedule: Weekly
- How collected: Defect reporting tool, source control system change reports
- Where stored: Unit Development Folder, Trouble Reports
- Verified by: Software Quality Assurance Team, Testing Team Manager, Development Team Lead

Progress:
- Attribute collected: Customer satisfaction
- Source of data: Customer surveys, Help desk
- Collection schedule: Monthly
- How collected: Customer satisfaction survey reports, complaint and resolution turnaround time reports
- Where stored: Customer Satisfaction Management Reports
- Verified by: Project Manager, Assistant PM, Support Team Manager

Table 31: Project Measures Matrix
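Each column of Table 31 can be captured as a simple record type so that a measure carries its collection and verification metadata together. A minimal sketch (class and field names are ours, not mandated by the plan):

```python
from dataclasses import dataclass

@dataclass
class MeasureDefinition:
    """One column of the Project Measures Matrix (Table 31)."""
    attribute: str       # what is collected
    source: str          # who supplies the raw data
    schedule: str        # how often it is collected
    collected_via: str   # mechanism used to collect it
    stored_in: str       # system of record
    verified_by: str     # role(s) that verify the value

# The Schedule column of Table 31, expressed as a record.
schedule_measure = MeasureDefinition(
    attribute="Work completed (work packages completed to date)",
    source="Development teams",
    schedule="Semimonthly",
    collected_via="Team status reports and timesheets",
    stored_in="Microsoft Project and Timesheet system",
    verified_by="QA Manager",
)

print(schedule_measure.verified_by)  # QA Manager
```

Keeping the matrix in a structured form like this makes it easy to generate the semimonthly and monthly collection calendar from one source of truth.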

3.5 Staffing Plan
3.5.1 Obtaining


When filling a position on the CDCAAS project, we use the following recruiting methods:
- Recruit capable people we know personally or by reputation.
- Solicit referrals as appropriate from stakeholders, domain experts, and team members.
- Draw the best available talent from the home-office pool.
- Offer referral bounties to team members.
- Advertise open positions in newspapers and consider the resulting applicants.
- Use a specialized recruiter with whom the company has an established relationship to aid in the search.

3.5.2 Training
Training includes all activities designed to enhance the competencies of the project team members. Training is both formal and informal; methods include classroom, online, and computer-based instruction, on-the-job training from another project team member, mentoring, and coaching. If project team members lack necessary management or technical skills, those skills can be developed as part of the project work. Scheduled training takes place as stated in the staffing management plan. Unplanned training takes place as a result of observation, conversation, and project performance appraisals conducted during the controlling process of managing the project team.

3.5.3 Retaining
Retention is as important as recruiting, and perhaps more so, given the time and training invested in our team. Having built the team, we now need a strategy to keep its members and improve their performance; loyal teams make the difference during tough times. Of the many techniques for retaining key employees, we apply the following, chosen with our company's strategic goals and the people working toward them in mind:
- Communicate: keep team members informed.
- Stay visible: team members feel more confident when they know the leader is available for support.
- Give feedback often and ask for it; keep an open mind and consider team members' suggestions.
- Show appreciation for good work: reward key players as often as possible. People generally will not work hard for people who do not care about them, so we demonstrate genuine concern for our staff, their careers, and their goals.
- Provide dual career ladders, with promotions accompanied by salary increases and better working conditions.

3.5.4 Phasing out of personnel


Team members who are phased out of the project will be sent back to the home office. To the extent possible, we will facilitate the placement of such persons on other projects, and we will recommend them for promotions and salary raises based on their performance on our project.


Figure 20: Compiler Package Staffing


Compiler Package Staffing

Table 21: Compiler Package Staffing


Figure 21: GPS Navigation Package Staffing


GPS Package Staffing

Table 22: GPS Navigation Package Staffing



Secure Communications Package Staffing

Table 23: Secure Communication Package Staffing


Figure 22: Graphics Package Staffing


Table 24: Graphics Package Staffing


Figure 23: Total Project Staffing Chart


Table 25: Total Project Staffing


Figure 24: Total Project Staffing by Package


Total Project Staffing by Package

Table 26: Total Project Staffing by Package


CDCAAS CORPORATION
Senior Programmers
CDCAAS Corp. is seeking Senior Programmers in C, C++, and Java with full-life-cycle experience for new product development in the Fairfax area.
Job Description: Responsible for performing requirements analysis, design, coding, unit testing, and integration testing of software, and for personal configuration management in a CMMI environment.
Required Qualifications:
- Minimum Education: BS in CS or Software Engineering.
- Minimum Experience: 5 years of experience developing in C, C++, Visual Basic, and Java, with GUI design experience; low-level Windows systems programming and device-driver development, with intimate knowledge of Windows internals; knowledge of relational database concepts (e.g., ERDs) and significant experience programming in .NET technology; experience with COTS tools and reusable software a plus.
Required Skills:
- GUI design and programming experience.
- Proficiency in C/C++, Java, UML, and HTML.
- Experience in database programming, MS Project, MS Office, and Visio.
- Experience with COTS products and reusable software is a plus.
- Understanding of network management and correlation products is highly desirable.
- Excellent written and oral communication skills and the ability to work with people at every level.
Join CDCAAS Corp. to get your career on the fast track. We'll work together to determine a suitable benefits package; options for our technical professionals may include a health plan, 401(k), vacation and holiday pay, and technical and professional training. Interested applicants should submit a resume and salary requirements to: Technical Recruiter, CDCAAS Corp., P.O. Box 1234, Fairfax, Virginia. Out-of-area candidates will be considered for this opportunity.


CDCAAS CORPORATION
Business Analyst
CDCAAS Corp. is looking for a Business Analyst to support our professional services and product management initiatives.
Responsibilities:
- Interact with business and technical teams to support a growing customer base.
- Apply in-depth knowledge of the software application to respond to inquiries from customers and internal staff.
- Interact with customers who are users of the application software.
- Track and monitor the status and progress of customer issues.
- Collect and analyze customer feedback.
- Assist with reviewing and providing input to business requirements, and analyze product features and functions.
- Work with engineering to understand technical issues relating to product and business requirements.
- Assist with the development and execution of application testing.
- Facilitate the management of client needs, tasks, and expectations.
- Provide input to and assist in maintaining user product documentation, including functional specifications, process flows, and business rules.
Qualifications:
- Excellent verbal and written communication skills.
- Strong organizational skills.
- Ability to learn quickly and be effective in a fast-paced environment.
- Strong analytical and problem-solving skills.
- A great deal of initiative and team spirit.
- Knowledge of rule engines and the mortgage or financial industry preferred.
- 1-3 years of relevant experience in IT product development; BA or BS degree (Accounting, Information Systems, Economics, or Business preferred).


CDCAAS CORPORATION
Reusable Components Expert
CDCAAS is seeking an expert in reusable components implementation, with full-life-cycle experience, for new product development in the Fairfax area.
Job Description: Works with the software development team and Project Manager to define design requirements and additional system requirements, design or prototype the application software interface, and design common modules and reusable components.
Required Qualifications:
- Minimum Education: BS in CS or Software Engineering.
- Minimum Experience: Must demonstrate 4 years of experience in the analysis and validation of reusable software and hardware components; experience with software design and implementation; expertise in .NET, C/C++, Java, and XML; and the ability to work comfortably and effectively with operational users, program managers, and software engineers.
Required Skills:
- GUI design and programming experience.
- Experience in database programming, MS Office, MS Project, and WBS Chart Pro.
- Experience with COTS products and reusable software.
- Significant experience programming .NET Programmable Interop Assemblies (NPI) is a plus.
Join CDCAAS to get your career on the fast track. We'll work together to determine a suitable benefits package; options for our technical professionals may include a health plan, 401(k), vacation and holiday pay, and technical and professional training. Interested applicants should submit a resume and salary requirements to: Technical Recruiter, CDCAAS Corp., P.O. Box 1234, Fairfax, Virginia. Out-of-area candidates will be considered for this opportunity.


CDCAAS CORPORATION
Junior Programmers
CDCAAS is seeking Junior Programmers for a new development effort in the Fairfax area.
Job Description: Works with the software development team and Project Manager to develop and deploy code across the full life cycle of a new product.
Required Qualifications: Associate degree in CS with 2 years of experience, or BS in CS or Software Engineering; project- and detail-oriented; self-reliant self-starter; team player; creative problem-solving skills.
Required Skills:
- Strong knowledge of MS Office, Visio, MS Outlook, HTML, and UML.
- Knowledge of .NET technology, Rational Rose, or Visual Basic is a plus.
Join CDCAAS Corp. to get your career on the fast track. We'll work together to determine a suitable benefits package; options for our technical professionals may include a health plan, 401(k), vacation and holiday pay, and technical and professional training. Interested applicants should submit a resume and salary requirements to: Technical Recruiter, CDCAAS Corp., P.O. Box 1234, Fairfax, VA. Out-of-area candidates will be considered for this opportunity.

3.5.5 Staff Expertise


Staff Expertise, Recruitment, and Utilization

- Project Manager: Level IV, Skill Index 4, MS Computer Science, 10 years experience, Salary $110,000, Bid Price $330,000, Labor Category 50, Source: Transfer
- Asst Project Manager: Level III, Skill Index 3, MS Software Engineering, 7 years, $87,000, $260,000, Labor Category 44, Transfer
- Business Analyst: Level II, Skill Index 3, MS MIS, 5 years, $68,000, $204,000, Labor Category 178, Advertise
- Architecture Designer: Level III, Skill Index 2, BS Computer Engineering, 5 years, $68,000, $204,000, Labor Category 172, Transfer


- Software Engineering Team Lead (3): Level III, Skill Index 3, MS Software System Engineering, 6 years, $84,000, $252,000, Labor Category 167, Transfer/Advertise
- Senior Programmer Analyst (3): Level II, Skill Index 3, BS Computer Science, 5 years, $76,000, $228,000, Labor Category 190, Advertise/Transfer
- COTS Expert: Level II, Skill Index 2, BS Computer Science, 4 years, $50,000, $150,000, Labor Category 120, Advertise
- Computer Expert: Level II, Skill Index 2, BS Computer Science, 4 years, $50,000, $150,000, Labor Category 121, Advertise
- Junior Programmers (15): Level I, Skill Index 1, BS Computer Science, 2 years, $45,000, $135,000, Labor Category 191, Advertise/Transfer
- QA Engineers (2): Level IV, Skill Index 2, MS Information Systems, 8 years, $110,000, $330,000, Labor Category 100, Transfer
- Tester (4): Level II, Skill Index 2, BS Computer Science, 2 years, $60,000, $180,000, Labor Category 152, Transfer
- CM Specialist: Level III, Skill Index 2, BS Information Systems, 6 years, $84,000, $252,000, Labor Category 155, Transfer
- Training Specialists: Level III, BS Information Systems, $50,000, $150,000, Labor Category 43, Transfer
- Installation Specialist: Level III, BS Electrical Engineering, $47,000, $141,000, Labor Category 49, Transfer
- Document Specialist: Level I, Skill Index 1, BS Management Information Systems, 2 years, $45,000, Labor Category 54, Transfer
- System Engineers: Level I, Skill Index 2, BS Computer Science, 5 years, $40,000, $120,000, Labor Category 123, Transfer
- Network Support Specialist: Level III, Skill Index 2, BS Electrical Engineering, 5 years, $68,000, $204,000, Labor Category 175, Transfer
- Quality Assurance Specialist: Level VII, Skill Index 4, Management Information Systems, 7 years, $73,000, $174,000, Labor Category 175, Advertise
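The bid prices in the staffing table track base salary times a loaded multiplier of roughly 3 (for example, $110,000 to $330,000 and $84,000 to $252,000). A quick sketch of that pricing rule; note that the multiplier is inferred from the table's figures, not stated anywhere in the plan:

```python
# Loaded bid price: base salary times an overhead/fee multiplier.
# A multiplier of ~3.0 reproduces almost every Bid Price in the table.

LOAD_MULTIPLIER = 3.0  # inferred from the table; not stated explicitly

def bid_price(base_salary: float, multiplier: float = LOAD_MULTIPLIER) -> float:
    """Fully loaded annual bid price for one staff position."""
    return base_salary * multiplier

print(bid_price(110_000))  # 330000.0 (Project Manager)
print(bid_price(84_000))   # 252000.0 (Software Engineering Team Lead)
```

The few entries that deviate slightly (e.g., the Assistant Project Manager's $260,000 against a computed $261,000) appear to be rounded.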


4. Technical Process
4.1 Methods, Tools & Techniques
The following outline identifies the applicable tools, methods, and standards, with the understanding that local interpretations of, and deviations from, those standards may apply during each phase and for each type of software application (custom, reuse, COTS, and outsourced). The actual methods and standards will vary as deliverables proceed through each phase of the development life cycle for each component over the entire software life cycle. The outline serves as a governing guide to currently understood best practices and standards.

1. Requirements Processes
1.1. Applicable Tools
1.1.1. Visual Paradigm for UML - Requirements Management Package
1.1.2. Visual Paradigm for UML - UML Support Package
1.2. Methods
1.2.1. UML
1.2.2. Use Cases
1.2.3. Throw-away prototypes
1.2.4. Interactive Storyboards
1.3. Standards
1.3.1. IEEE 830, Recommended Practice for Software Requirements Specifications
1.3.2. IEEE 1062, Recommended Practice for Software Acquisition
1.3.3. IEEE 1420, Standard for Information Technology - Software Reuse
2. Design
2.1. Tools
2.1.1. Visual Paradigm for UML - UML Support Package
2.1.2. Visual Paradigm for UML - Database Modeling Package
2.1.3. Visual Paradigm for UML - Object-Relational Mapping Package
2.1.4. CASE Tools
2.2. Methods
2.2.1. Object-Oriented Design
2.2.2. Information Hiding
2.2.3. Relational Database Design
2.2.4. UML
2.2.5. XML
2.3. Standards
2.3.1. IEEE 1016, Recommended Practice for Software Design Descriptions
2.3.2. European Computer Manufacturers Association (ECMA) TR/55, Reference Model for Frameworks of Software Engineering Environments
2.3.3. IEEE 1348, Recommended Practice for the Adoption of CASE Tools
2.3.4. IEEE 1420, Guide for Information Technology - Software Reuse - Concept of Operations for Interoperating Reuse Libraries
3. Code
3.1. Tools
3.1.1. Java
3.1.2. Eclipse IDE for Java EE Developers
3.1.3. Visual Paradigm Teamwork Server
3.2. Methods
3.2.1. Pair Programming
3.2.2. Evolutionary Prototyping
3.3. Standards
3.3.1. Code Conventions for the Java Programming Language (Sun Microsystems)
3.3.2. IEEE 1028, Standard for Software Reviews and Audits
3.3.3. IEEE 730, Standard for Software Quality Assurance Plans
3.3.4. IEEE 1298, Standard for Software Quality Management Systems
4. Unit Test
4.1. Tools
4.1.1. Cactus
4.1.2. Visual Paradigm Teamwork Server
4.1.3. Bugzilla
4.2. Methods
4.2.1. Bottom-up testing
4.2.2. Testing with sample data
4.2.3. Testing with expected and high-volume data sets
4.3. Standards
4.3.1. IEEE 1008, Standard for Software Unit Testing
4.3.2. IEEE 1012, Standard for Software Verification and Validation Plans
4.3.3. IEEE 829, Standard for Software Test Documentation
5. Integration Test
5.1. Tools
5.1.1. Visual Paradigm Teamwork Server
5.1.2. Bugzilla
5.2. Methods
5.2.1. Top-down testing
5.2.2. White-box testing
5.3. Standards
5.3.1. Open Process Framework (OPF) Testing
5.3.2. IEEE 829, Standard for Software Test Documentation
6. System Test
6.1. Tools
6.1.1. Visual Paradigm Teamwork Server
6.1.2. Bugzilla
6.2. Methods
6.2.1. Black-box testing
6.2.2. Thread testing
6.2.3. White-box testing
6.2.4. Limited-scale integration testing
6.2.5. Full-scale integration testing
6.2.6. Test plan
6.2.7. Test execution and issue resolution
6.2.8. Test documentation
6.3. Standards
6.3.1. IEEE 829, Standard for Software Test Documentation
7. Deployment
7.1. Tools
7.1.1. Symantec Endpoint Antivirus
7.1.2. Kaspersky Antivirus 2009
7.2. Methods
7.2.1. Beta testing
7.2.2. Software integration
7.2.3. Full deployment
7.2.4. Baseline
7.3. Standards
7.3.1. IEEE 828, Standard for Software Configuration Management Plans
7.3.2. IEEE 1063, Standard for Software User Documentation
8. Maintenance
8.1. Tools
8.1.1. Liberum CDCAAS
8.1.2. Bugzilla
8.2. Methods
8.2.1. Remote monitoring
8.2.2. Help Desk
8.2.3. Service Level Agreement (SLA)
8.2.4. Automatic updates
8.2.5. Error tracking
8.2.6. Baselines
8.2.7. Software integration
8.2.8. Security practices
8.2.9. Information Technology Infrastructure Library (ITIL)
8.3. Standards
8.3.1. IEEE 1219, Standard for Software Maintenance
8.3.2. IEEE 1042, Standard for Software Configuration Management
8.3.3. ISO 17799, Security Standard
8.3.4. BS 7799, Guidelines for Information Security Risk Management
8.3.5. ISO 27001, Information Security Management - Specification with Guidance for Use
9. Disposal
9.1. Tools
9.1.1. Life Span Technology Recycling
9.1.2. Retire-IT Computer Disposal
9.1.3. Intel Students Recycling Used Technology Program
9.1.4. Computer for Schools Affiliate
9.2. Methods
9.2.1. Certified data destruction
9.2.2. Hard drive erasure
9.2.3. Equipment inventory, inspection, and testing
9.2.4. Comprehensive reporting
9.2.5. Responsible recycling
9.2.6. Remarketing
9.3. Standards
9.3.1. Applicable state and local regulations (e.g., the State of California's Electronic Waste Recycling Act of 2003, any CRT landfill bans)
9.3.2. International Association of Electronics Recyclers (IAER)
9.3.3. Electronic Industries Alliance (EIA) Reuse and Recycle

4.2 Software Documentation


Throughout the project lifecycle, various documents shall be created. Each document will follow a common process flow for creation, review, and distribution. The documentation specialist shall define document templates that all documentation entities will use.
1. The document, or a section of it, is assigned to a subject matter expert.
2. The author prepares the document.
3. The document is reviewed by other subject matter experts and the approval authority.
4. If the document passes review, the documentation specialist assembles the final document by incorporating the review comments; if it does not, the author revises the document and it is re-reviewed.
5. If the document is to be delivered to the customer, customer reviews are conducted. If the customer rejects the document, it is returned to the author for revision; if the customer approves it with comments, the author incorporates the comments.
6. The document is stored, labeled, and version controlled.
7. The document is officially released for distribution.

The following table describes the software documentation necessary to effectively communicate requirements and responsibilities to the entire project team. Each document entity includes a list of activities and tasks, the organization of tasks by execution time, the skills required to produce the document, the personnel assigned to the task, and the document production process flow.

Each entry lists: name; format/Data Item Description; written by; pages; time to initiate; approval authority; distribution; publication type; price; milestone date.

- Project Proposal: Corporation Format; Marketing; 120 pages; before project initiation; Senior Corporate Management; distributed to Senior Corporate Management, Senior Project Manager, Customer; Essential to Project, Essential to Customer; $72,000; 8/1/2008.
- System Design Document: DI-MCCR-80534; Software Team; 150 pages; after SPMP; Chief Architect; Senior PM, PM, Chief Architect; Essential to Project; $22,500; 10/25/2008.
- Software Project Management Plan: IEEE Std 1058.1-1987; Project Manager; 190 pages; before project initiation; Senior Corporate Management; Senior PM, PM, Chief Architect, Team Leader; Essential to Project; $28,500; 09/05/2008.
- Software Requirements Specification: DI-MCCR-80025; Subject Matter Expert; 230 pages; after SPMP; Senior Project Management; Senior PM, PM, Chief Architect, Team Leader; Essential to Project; $34,500; 11/25/2008.
- Interface Requirements Document: DI-MCCR-80026; Software Team; 175 pages; after SPMP; Senior Project Management; Senior PM, PM, Chief Architect, Team Leader; Essential to Project; $26,250; 11/25/2008.
- Interface Design Document: DI-MCCR-80027; Software Team; 225 pages; after SRS; Senior Project Management; PM, Chief Architect, Team Leader; Essential to Project; $33,750; 3/25/2009.
- Software Design Document: DI-MCCR-80012; Software Team; 350 pages; after SRS; Project Manager; PM, Chief Architect, Team Leader; Essential to Project; $52,500; 09/25/2009.
- Software Test Plan: DI-MCCR-80014; Software Team; 275 pages; after SRS; Team Leader; Chief Architect, Team Leader; Essential to Project; $55,000; 10/25/2009.
- Version Description Document: DI-MCCR-80013; Software Team; 45 pages; after every software build; Team Leader; Chief Architect, Team Leader; Essential to Project; $6,750; 1/25/2011.
- Software Test Report: DI-MCCR-80017; Product Assurance; 125 pages; after software tests are performed; Team Leader; PM, Chief Architect, Team Leader; Company Important; $25,000; 1/25/2011.
- System Operator's Manual: DI-MCCR-80018; Subject Matter Expert; 290 pages; after detailed design; Senior Project Management; Senior PM, PM, Chief Architect, Customer; Essential to Customer; $174,000; 12/27/2010.
- Software User's Manual: DI-MCCR-80019; Subject Matter Expert; 180 pages; after detailed design; Senior Project Management; Senior PM, PM, Chief Architect, Team Leader, Customer; Essential to Customer; $108,000; 12/27/2010.
- Software Programmer's Manual: DI-MCCR-80021; Subject Matter Expert; 200 pages; after detailed design; Chief Architect; Senior PM, PM, Team Leader, Customer; Essential to Customer; $40,000; 12/27/2010.
- Engineering Change Proposal: DI-E-3128; Product Assurance; 12 pages; after SRS analysis; Chief Architect; PM, Chief Architect, Team Leader; Company Important; $2,400; 12/27/2010.
- Total: 2,555 pages; $681,150.

Table 27: Document Table

Figure 25: Document Production Process Flow Chart


4.3 Project Support Functions


The CDCAAS team will develop several additional plans supporting effective development of the CDCAAS project: Configuration Management (CM), Quality Assurance (QA), Verification & Validation (V&V), and Test and Evaluation (T&E). The following sections address each of these functions and discuss their responsibilities and planned activities across the entire software development lifecycle.

4.3.1 Configuration Management


The CDCAAS team uses Microsoft's Visual SourceSafe (VSS) to store and version-control the configuration items for the product. When an item has successfully passed its specified control gate(s), CM removes write access to the item and labels it as a baseline version. Requests for changes to baselined items are reviewed by the product's Change Control Board (CCB); write access is granted only if the request is approved. In this manner, rigorous configuration management and change control are maintained throughout the product lifecycle. As software modules are completed and software builds take place, CM coordinates the review of build directive materials and ensures the adequacy of build preparation before authorizing the engineering staff to begin the build. In a similar manner, CM coordinates the release of all product deliverables, using a Microsoft Excel spreadsheet to track and report on deliverable status during the course of the product. CM, supported by the product team, is ultimately responsible for ensuring the integrity of delivered software and related documentation. The planned CM activities are:
- Identify Configuration Items
Each item that is under CM control and protected from changes shall be uniquely identified for tracking and auditing purposes. Identifiers can be designated as soon as the product deliverables list is finalized, but no later than the point at which the item is placed under CM control.
- Baselines
CM, in conjunction with the Product Technical Lead, creates and maintains a folder structure within VSS for the product's configuration items. Configuration items are then developed in accordance with the product lifecycle and are submitted to CM after passing the appropriate reviews or control gates. CM creates a baseline of the item, using VSS labels. If a baselined item is approved for update, CM provides write access for team members to incorporate the approved changes. Following the appropriate approval activities, the item is re-baselined by CM.


- Configuration Change Control
The change control process is applied to specified configuration items, components, and baselines under CM control. CM will not accept changes to controlled documents without appropriate documentation identifying the changes to be made and the authorization for those changes. Proposed changes will be documented, analyzed for cost and schedule impact, and, if approved, prioritized and scheduled for implementation. The CCB, chaired by the Chief Scientist, reviews the completed DTI and approves it for closure or rework. Minutes from the CCB meetings are forwarded to the customer for review.
- Configuration Status Accounting
CM will use VSS, along with a Microsoft Excel spreadsheet database, to track and report information about product deliverables and CM activities. Standard reports are issued according to the product schedule, and may also be requested on an as-needed basis by management. The generated reports are stored as controlled items but are not under formal change control.
- Configuration Audits
In addition to QA audits of the CM process, the CMA periodically reviews the structure and facilities of the CM repositories, and the CM Information System, to verify that the contents are correct and complete. CM audits software baselines to verify that they conform to the documentation that defines them. The results of the audits are documented, and any identified action items are recorded and tracked to closure.
- Build Management
Working closely with the Product Technical Lead, the build release engineer develops and maintains the scripts necessary to create automated nightly builds. Records of the builds are maintained in sufficient detail that the status of the system is known at all times and recovery of previous versions is possible.
- Release Management
CM is responsible for the preparation and external release of controlled items in accordance with the approved CM procedures. Items cannot be externally released without the consent of CM. CM is also responsible for internal releases of items under its control. An internal release makes an item (e.g., documents, software) available to the appropriate product team member for update.
- Environment Maintenance

106

Crime Data Collection, Aggregation and Assimilation System (CDCAAS) Team Crime Busters SWE 625 - Professor Ken Nidiffer

Version 1.0 12/08/2008

CM, working with the Network support staff, is responsible for creating and maintaining the products base-lined environments (e.g., development, testing). Change control polices will be applied to all base-lined environments. - Disposition of Data at Close Out As permitted by contractual agreements, CM will maintain product materials for the purposes of reference and reuse after product closeout.

4.3.2 Quality Assurance


The primary objective of Quality Assurance (QA) is to provide management with adequate confidence that the defined software process is being implemented and to verify continual improvement in the quality of the products under development, the processes employed, and the staff. QA will ensure compliance with established standards, documented plans, and procedures, and will bring any deficiencies found in work products or processes to management's attention expeditiously. The goals of the QA program are to:
- Increase software product quality.
- Reduce overall software product risk and product cost.
- Maintain software product integrity through objective verification in each phase of the product lifecycle.

The following QA activities are planned:
- Peer Reviews: QA will participate in a subset of each category of review, formal and one-on-one, selected on the basis of the risk associated with the items being reviewed. Known risk areas will be given greater emphasis. At least one internal peer review will be conducted following the completion of each developed work product and prior to any required customer review.
- Readiness Reviews: Readiness Reviews are conducted according to approved procedures and serve as control gates for the product, ensuring that the product does not enter a lifecycle phase prematurely. These reviews are formal meetings and, as such, require signatures from specified participants to document approval.
- Customer Reviews: Customer Reviews also serve as control gates for the product. These reviews may be conducted in a formal meeting setting or electronically, with redlined documents and emailed comments. QA is invited to all formal Customer Reviews and ensures client comments are addressed appropriately and in a timely manner. If further discussion is required, additional meetings or electronic exchanges will be initiated. The CDCAAS team will conduct at least one peer review of the product prior to the required Customer Review.
- Action Item Management: Action items can be generated during any formal or informal meeting (e.g., Peer Review, Customer Review, Status, Product Management Review (PMR), In-Process Review (IPR), Technical Exchange, Staff, All-Hands) and are documented in a designated section of the meeting minutes form.
- Corrective Action Escalation Process: When problems persist and/or corrective actions continue to be inadequate, QA will escalate the issue to the appropriate level of management (e.g., Program Manager, Group QA Manager) via the independent reporting structure. QA may present escalated concerns verbally or in writing, and will maintain a Corrective Action Escalation Log to track escalated items to closure.
- Final Delivery Inspection: QA, in conjunction with Configuration Management, verifies that the product has successfully completed all development processes and specified requirements, that packaging and delivery preparations have been completed according to procedure, and that the delivery is complete and accurate, including identification of any open items or product shortages.

4.3.3 Verification and Validation


CDCAAS success depends on the consistency, completeness, and correctness of the software at each stage of the development cycle, and between stages. It also depends on the correctness of the final software produced from user needs and requirements. Therefore, the following will be verified and validated:
- Requirements Review
- Preliminary Design
- Critical Design
- Code
- Unit Test
- Integration and Regression Test
- System Test


4.3.4 Test and Evaluation


Once the system requirements are baselined, and prior to developing the software, the CDCAAS test team will write a detailed test plan, along with integration test plans and system test plans. During the testing of each module, the appropriate staff will keep a software test log recording all findings. After testing, the test plan, test data sets, and test results (with findings) will be stored in the software baseline library in an electronic format compatible with the computer and program generating the controlled item. If the software requirements change at a later time, the test team will update the test plans to reflect the current requirements. The configuration team will enforce version control.


5. Work Packages, Schedule, and Budget


5.1 Work Packages

5.1.1 Work Breakdown Structure

Figure 26: Work Breakdown Structure (Overall)


Figure 27: Work Breakdown Structure (Secure Communication)

Figure 28: Work Breakdown Structure (GPS Navigation)


Figure 29: Work Breakdown Structure (Database Management System)


Figure 30: Work Breakdown Structure (Spreadsheet)


WBS ID      Task Name
1           CDCAAS System
1.1         Project Management
1.2         Technical Management
1.3         Quality Assurance
1.4         Requirements Management
1.5         Configuration Management
1.6         Software System
1.6.1       CDCAAS Custom Development
1.6.1.1     Requirements and Configuration Management
1.6.1.1.1   Identify Candidates (Req)
1.6.1.1.2   Preliminary Design
1.6.1.1.3   Detail Design
1.6.1.1.4   Coding and Unit Testing
1.6.1.1.5   Integration Testing
1.6.1.2     Secure Communication
1.6.1.2.1   Identify Candidates (Req)
1.6.1.2.2   Preliminary Design
1.6.1.2.3   Detail Design
1.6.1.2.4   Coding and Unit Testing
1.6.1.2.5   Integration Testing
1.6.1.3     Word Processor
1.6.1.3.1   Identify Candidates (Req)
1.6.1.3.2   Preliminary Design
1.6.1.3.3   Detail Design
1.6.1.3.4   Coding and Unit Testing
1.6.1.3.5   Integration Testing
1.6.1.4     Debugger & Test
1.6.1.4.1   Identify Candidates (Req)
1.6.1.4.2   Preliminary Design
1.6.1.4.3   Detail Design
1.6.1.4.4   Coding and Unit Testing
1.6.1.4.5   Integration Testing
1.6.2       Outsourced Custom Development
1.6.2.1     GPS Navigation
1.6.2.1.1   Identify Candidate
1.6.2.1.2   Coding and Unit Testing
1.6.2.1.3   Integration Testing
1.6.3       Software Reuse Development
1.6.3.1     Database Management System
1.6.3.1.1   Identify Candidates (Req)
1.6.3.1.2   Design Customization
1.6.3.1.3   Coding and Unit Testing
1.6.3.1.4   Integration Testing
1.6.3.2     Compiler
1.6.3.2.1   Identify Candidates (Req)
1.6.3.2.2   Design Customization
1.6.3.2.3   Coding and Unit Testing
1.6.3.2.4   Integration Testing
1.6.3.3     Electronic Inventory and Tracking
1.6.3.3.1   Identify Candidates (Req)
1.6.3.3.2   Design Customization
1.6.3.3.3   Coding and Unit Testing
1.6.3.3.4   Integration Testing
1.6.4       COTS Development
1.6.4.1     Spreadsheet Application
1.6.4.1.1   Identify Candidates (Req)
1.6.4.1.2   Evaluate & Select Candidate COTS
1.6.4.1.3   Design Customization
1.6.4.1.4   Coding and Unit Testing
1.6.4.1.5   Integration Testing
1.6.4.2     Graphics Presentation
1.6.4.2.1   Identify Candidates (Req)
1.6.4.2.2   Evaluate & Select Candidate COTS
1.6.4.2.3   Design Customization
1.6.4.2.4   Coding and Unit Testing
1.6.4.2.5   Integration Testing
1.6.4.3     Project Management
1.6.4.3.1   Identify Candidates (Req)
1.6.4.3.2   Evaluate & Select Candidate COTS
1.6.4.3.3   Design Customization
1.6.4.3.4   Coding and Unit Testing
1.6.4.3.5   Integration Testing
1.7         Hardware
1.8         Kernel
1.9         Documentation
1.10        Software Support Environment
1.11        Alpha Test
1.12        Beta Test
1.13        System Deployment
1.14        Support

Table 28: Work Breakdown Structure


5.1.2 Work Package Specifications


Work Package Specification: Database - Detailed Design

Activity number: 1.6.3.1.2
Activity name: Database Detailed Design
Feature description: Create a wizard interface that allows the user to learn how to use the report-generation functionality.
Activity description: Create the detailed design of the functionality the wizards will provide, and integrate it with the user interface.
Estimated duration: 14 days
Resources needed:
  Personnel: 2 Senior Software Engineers, 2 System Analysts
  Skills: Voice navigation knowledge, GUI development tool, requirements specification knowledge
  Tools: GUI builder, CM tool
  Travel: None
Work product: Detailed design of wizard functionality
Risks: Lack of human factors skills
Predecessors: Detailed design of the wizard user interface complete
Completion criteria: Inspection of functionality

IMPLEMENTATION
Personnel assigned: Rita Grover, Hanuman Singh, Rakesh Roshan, Jim Rosenberg
Starting date: 05/18/09    Completion date: 06/28/09
Cost (budgeted/actual): $716,601.36
Legacy comments: None

Work Package Specification: Spreadsheet Application - Detailed Design - Customization

Activity number: 1.6.4.1.3
Activity name: Spreadsheet Detailed Design - Spreadsheet Customization
Feature description: The improved spreadsheet package will add new features to the existing MS Excel software, allowing better management of reading and writing text files and improved security features.
Activity description: Create and review the detailed design for integrating the new spreadsheet features into the existing spreadsheet application.
Estimated duration: 15 days
Resources needed:
  Personnel: 1 Team Lead, 1 Requirements Analyst, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 QA Engineer, 1 Component Expert
  Skills: Visual Basic, MS Access 2003
  Tools: Spreadsheet application, Visual Basic, Access 2003
  Travel: None
Work product: Detailed design of spreadsheet customization
Risks: Lack of human factors skills
Predecessors: Requirements and high-level design of spreadsheet customization
Completion criteria: Inspection of functionality and approval of the selected product decision document

IMPLEMENTATION
Personnel assigned: Kelly Romeo, Wanda Briggs, Teddy Brown, Charles Rosenburg, Jim Rosenberg
Starting date: 12/07/09    Completion date: 01/29/10
Cost (budgeted/actual): $473,887.44
Legacy comments: None


5.2 Dependencies

Figure 31: Dependencies (Part 1)


Figure 32: Dependencies (Part 2)


Figure 33: Dependencies (Part 3)


Figure 34: Dependencies (Part 4)


Figure 35: Dependencies (Part 5)


Figure 36: Dependencies (Part 6)


Figure 37: Dependencies (Part 7)


Figure 38: Dependencies (Part 8)


Figure 39: Dependencies (Part 9)


Figure 40: Phase Distribution (Project Overall)

Figure 41: Phase Distribution (Project Plans & Requirements)


Figure 42: Phase Distribution (Project Programming)

Figure 43: Phase Distribution (Project Product Design)


Figure 44: Phase Distribution (Project Integration & Test)

Figure 45: Project Maintenance (Part 1)


Figure 46: Project Maintenance (Part 2)

Figure 47: Project Maintenance (Part 3)


Figure 48: Project Maintenance (Part 4)

Figure 49: Module Overall (Database Management)


Figure 50: Module Plans & Requirements (Database Management)

Figure 51: Module Programming (Database Management)


Figure 52: Module Product Design (Database Management)

Figure 53: Module Integration & Test (Database Management)


Figure 54: Module Maintenance (Database Management Part 1)

Figure 55: Module Maintenance (Database Management Part 2)


Figure 56: Module Maintenance (Database Management Part 3)

Figure 57: Module Maintenance (Database Management Part 4)


5.3 Resource Requirements

Figure 58: Resource Loading Chart 1

Figure 59: Resource Loading Chart 2


Figure 60: Resource Loading Chart 3

Figure 61: Resource Loading Chart 4


Figure 62: Resource Loading Chart 5

Figure 63: Resource Loading Chart 6


Figure 64: Resource Loading Chart 7

Figure 65: Resource Loading Chart 8


Figure 66: Resource Loading Chart 9

Figure 67: Resource Loading Chart 10


Figure 68: Resource Loading Chart 11

Figure 69: Resource Loading Chart 12


Figure 70: Resource Loading Chart 13

Figure 71: Resource Loading Chart 14


Figure 72: Resource Loading Chart 15

Figure 73: Resource Work Summary Report Chart


Figure 74: Resource Work Availability Report Chart


5.3 Resource Estimate


The following tables contain estimates for resource requirements. Calculations are based on estimated source code lines for each software package. Three methods are used to produce each package's initial size estimate (ISE).

Method 1: Using the software productivity data by application domain shown in Figure 75 below, the functional group for each package is determined, and the ISEs are then approximated from the available data.


Figure 75: Software Productivity (SLOC/SM) by Application Domains


Package                                           Functional Group                        KEDSI/KSLOC
CDCAAS Database Management System                 Data Processing                         400
CDCAAS Spreadsheet                                Data Processing                         70
CDCAAS Requirements and Configuration Management  Data Processing                         350
CDCAAS Secure Communication                       Information Center                      358
CDCAAS Graphics Presentation                      Web                                     198
CDCAAS Word Processor                             Data Processing                         135
CDCAAS Project Management                         Data Processing                         43
CDCAAS GPS Navigation                             Guidance, Navigation & Control System   77
CDCAAS Compiler                                   Tools                                   205
CDCAAS Debugger & Test                            Environment                             125
CDCAAS Electronic Inventory and Tracking          Command & Control                       600

Table 29: Resource Estimate - Method 1

Method 2: Divide the estimated application storage size by four to obtain the size estimate.

Package                                           Size (MB)   KEDSI/KSLOC
CDCAAS Database Management System                 300         75
CDCAAS Spreadsheet                                176         44
CDCAAS Requirements and Configuration Management  156         39
CDCAAS Secure Communication                       260         65
CDCAAS Graphics Presentation                      188         47
CDCAAS Word Processor                             248         62
CDCAAS Project Management                         120         30
CDCAAS GPS Navigation                             408         102
CDCAAS Compiler                                   92          23
CDCAAS Debugger & Test                            220         55
CDCAAS Electronic Inventory and Tracking          380         95

Table 30: Resource Estimate - Method 2

Method 3: Modified Delphi Technique. The Delphi Technique was designed to gather input from participants without requiring them to work face-to-face; it is often used to find consensus among experts with differing views and perspectives, enabling group problem-solving through an iterative process of problem definition, discussion, feedback, and revision. The Modified Delphi Technique used here gathers information, provides feedback, and reports conclusions by mail or email. A panel of three experts, each with eight years of experience, was asked to provide an estimate based on their experience. All values are in KEDSI.

Package                                           Expert 1   Expert 2   Expert 3   Average
CDCAAS Database Management System                 210        350        310        290
CDCAAS Spreadsheet                                40         90         140        90
CDCAAS Requirements and Configuration Management  82         130        190        134
CDCAAS Secure Communication                       170        150        100        140
CDCAAS Graphics Presentation                      150        110        97         119
CDCAAS Word Processor                             80         50         110        80
CDCAAS Project Management                         110        180        112        134
CDCAAS GPS Navigation                             290        200        200        230
CDCAAS Compiler                                   60         100        80         80
CDCAAS Debugger & Test                            70         80         140        90
CDCAAS Electronic Inventory and Tracking          140        150        190        160

Table 31: Resource Estimate - Method 3
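The per-package averages in Table 31 are arithmetic means of the three expert estimates. A minimal sketch, using three packages from the table:

```python
# Modified Delphi (Method 3): average the three expert KEDSI estimates
# for each package. Values below come from Table 31.
expert_estimates = {
    "CDCAAS Database Management System": (210, 350, 310),
    "CDCAAS Spreadsheet": (40, 90, 140),
    "CDCAAS GPS Navigation": (290, 200, 200),
}

# Arithmetic mean of the three expert inputs, rounded to whole KEDSI.
averages = {pkg: round(sum(est) / len(est)) for pkg, est in expert_estimates.items()}
print(averages)  # 290, 90, and 230 KEDSI respectively
```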

COCOMO: Using the average of the three estimates above as the ISE input to the COCOMO II model, approximations for cost, effort, and schedule were generated:


Figure 76: COCOMO Summary Screen


Figure 77: COCOMO Product Parameters

Figure 78: COCOMO Platform Parameters


Figure 79: COCOMO Personnel Parameters

Figure 80: COCOMO Project Parameters


Figure 81: COCOMO Scale Parameters

Figure 82: COCOMO Equation Parameters


Figure 83: COCOMO Phase Distribution-Project Overall

Figure 84: COCOMO Phase Distribution-Project Plans & Requirements


Figure 85: COCOMO Phase Distribution-Project Programming

Figure 86: COCOMO Phase Distribution-Project Product Design


Figure 87: COCOMO Phase Distribution-Project Integration & Test

Computed Final Size (CFS), in thousands of estimated delivered source instructions (KEDSI), is derived from the ISE through a development-class-based adjustment, resulting in an estimate that more accurately represents the amount of work required of the project team to engineer the package. For example, a custom software package is expected to require more new code than a COTS product. The class-based formulas are:
- COTS: ISE * 4% (very little new coding is required)
- Reuse/non-developmental item (NDI): ISE * 6%
- Custom: ISE * 20%

A weighted average is then generated, with the modal figure given four times the weight of the high and low estimates:

CFS Expected Value = (Optimistic Value + 4 * Most Likely Value + Pessimistic Value) / 6

Computed Effort (CE), in staff months, is generated using three different methods: (1) the Constructive Cost Model (COCOMO), (2) the Walston-Felix model, and (3) the Boyston model. Again, the modal result is weighted four times heavier than the high and low values. The formulas are as follows:


COCOMO: Effort = 3.6 * (CFS Expected Value ^ 1.2)
Walston-Felix: Effort = 5.2 * (CFS Expected Value ^ 0.91)
Boyston: Effort = 5.78 + 3.11 * (CFS Expected Value ^ 1.0)
CE Expected Value = (Optimistic Value + 4 * Most Likely Value + Pessimistic Value) / 6

Computed Duration (CD), in months, is generated using the COCOMO, Walston-Felix, and Boyston models:

COCOMO: Duration = 2.5 * (CE Expected Value ^ 0.32)
Walston-Felix: Duration = 4.1 * (CE Expected Value ^ 0.36)
Boyston: Duration = 2.15 * (CE Expected Value ^ 0.33)
CD Expected Value = (Optimistic Value + 4 * Most Likely Value + Pessimistic Value) / 6

To meet scheduling targets, some packages will require extra effort. A schedule compression value (%SCV) is applied to the Computed Duration, which in turn generates the Effort Adjustment Factor (EAF), the Adjusted Effort (AE) in staff months, the productivity in EDSI per staff month, and the average staff for each software package. The required calculations are:

Schedule Compression Value: %SCV = a number less than or equal to 14
Compressed Duration (months): CD = (1 - %SCV / 100) * Duration Expected Value
Effort Adjustment Factor: EAF = %SCV / 100 + 1
Adjusted Effort (staff months): AE = CE Expected Value * EAF
Productivity (EDSI/staff month): P = CFS Expected Value / AE
Average Staff: AS = AE / CD
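Taken together, these formulas form a single estimation chain from size to staffing. The sketch below implements that chain; the model constants and the (O + 4M + P)/6 weighting come from the text, while the sample ISE values and the 10% compression are illustrative assumptions:

```python
# End-to-end sketch: ISE -> CFS -> Computed Effort -> Computed Duration
# -> schedule compression. Constants come from the formulas in the plan;
# sample inputs are illustrative only.
CLASS_MULTIPLIER = {"cots": 0.04, "reuse": 0.06, "custom": 0.20}

def weighted_expected(a: float, b: float, c: float) -> float:
    """(Optimistic + 4 * Most Likely + Pessimistic) / 6, treating the
    middle (modal) value as the most likely."""
    lo, mid, hi = sorted((a, b, c))
    return (lo + 4 * mid + hi) / 6

def computed_effort(cfs: float) -> float:
    """Staff months from the three effort models, combined."""
    cocomo = 3.6 * cfs ** 1.2
    walston_felix = 5.2 * cfs ** 0.91
    boyston = 5.78 + 3.11 * cfs
    return weighted_expected(cocomo, walston_felix, boyston)

def computed_duration(ce: float) -> float:
    """Months from the three duration models, combined."""
    return weighted_expected(2.5 * ce ** 0.32,
                             4.1 * ce ** 0.36,
                             2.15 * ce ** 0.33)

def compress(ce: float, cd: float, scv_pct: float):
    """Apply a schedule compression value (%SCV, capped at 14)."""
    assert 0 <= scv_pct <= 14, "compression capped at 14% per the plan"
    compressed_cd = (1 - scv_pct / 100) * cd   # compressed duration, months
    eaf = 1 + scv_pct / 100                    # effort adjustment factor
    ae = ce * eaf                              # adjusted effort, staff months
    return compressed_cd, ae, ae / compressed_cd  # AS = AE / CD

# Illustrative: a custom package with three-point ISEs of 210/290/350 KEDSI.
cfs = weighted_expected(*(ise * CLASS_MULTIPLIER["custom"] for ise in (210, 290, 350)))
ce = computed_effort(cfs)
cd = computed_duration(ce)
print(compress(ce, cd, 10.0))
```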

A spreadsheet detailing the entire series of calculations is attached as Appendix I. A second spreadsheet, which allows the formulas themselves to be examined, is attached as Appendix II.


Figure 88: Resource Estimate Spreadsheet Part 1


Figure 89: Resource Estimate Spreadsheet Part 2


5.4 Budget and Resource Allocation


ID          Task Name                                    Fixed Cost
1           CDCAAS System                                $49,215,627.96
1.1         Project Management                           $1,277,154.00
1.2         Technical Management                         $1,278,101.00
1.3         Quality Assurance                            $881,150.00
1.4         Requirements Management                      $678,000.00
1.5         Configuration Management                     $981,150.00
1.6         Software System                              $32,961,467.66
1.6.1       CDCAAS Custom Development                    $17,938,593.25
1.6.1.1     Requirements and Configuration Management    $5,580,610.84
1.6.1.1.1   Identify Candidates (Req)                    $334,836.65
1.6.1.1.2   Preliminary Design                           $1,004,509.95
1.6.1.1.3   Detail Design                                $1,339,346.60
1.6.1.1.4   Coding and Unit Testing                      $2,343,856.55
1.6.1.1.5   Integration Testing                          $558,061.08
1.6.1.2     Secure Communication                         $5,933,252.92
1.6.1.2.1   Identify Candidates (Req)                    $355,995.12
1.6.1.2.2   Preliminary Design                           $1,067,985.00
1.6.1.2.3   Detail Design                                $1,423,980.00
1.6.1.2.4   Coding and Unit Testing                      $2,491,965.00
1.6.1.2.5   Integration Testing                          $593,327.80
1.6.1.3     Word Processor                               $3,144,759.78
1.6.1.3.1   Identify Candidates (Req)                    $188,685.59
1.6.1.3.2   Preliminary Design                           $566,056.76
1.6.1.3.3   Detail Design                                $754,742.35
1.6.1.3.4   Coding and Unit Testing                      $1,320,799.11
1.6.1.3.5   Integration Testing                          $314,475.97
1.6.1.4     Debugger & Test                              $3,279,969.71
1.6.1.4.1   Identify Candidates (Req)                    $196,798.18
1.6.1.4.2   Preliminary Design                           $590,394.55
1.6.1.4.3   Detail Design                                $787,192.73
1.6.1.4.4   Coding and Unit Testing                      $1,377,587.28
1.6.1.4.5   Integration Testing                          $327,996.97
1.6.2       Outsourced Custom Development                $4,324,790.21
1.6.2.1     GPS Navigation                               $4,324,790.21
1.6.2.1.1   Identify Candidate                           $281,790.21
1.6.2.1.2   Coding and Unit Testing                      $3,762,145.00
1.6.2.1.3   Integration Testing                          $280,855.00
1.6.3       Software Reuse Development                   $6,750,283.91
1.6.3.1     Database Management System                   $2,985,839.13
1.6.3.1.1   Identify Candidates (Req)                    $358,300.63
1.6.3.1.2   Design Customization                         $716,601.36
1.6.3.1.3   Coding and Unit Testing                      $1,167,685.49
1.6.3.1.4   Integration Testing                          $743,251.65
1.6.3.2     Compiler                                     $1,315,009.16
1.6.3.2.1   Identify Candidates (Req)                    $157,801.10
1.6.3.2.2   Design Customization                         $315,602.20
1.6.3.2.3   Coding and Unit Testing                      $512,853.57
1.6.3.2.4   Integration Testing                          $328,752.29
1.6.3.3     Electronic Inventory and Tracking            $2,449,435.62
1.6.3.3.1   Identify Candidates (Req)                    $293,932.27
1.6.3.3.2   Design Customization                         $587,864.55
1.6.3.3.3   Coding and Unit Testing                      $955,279.89
1.6.3.3.4   Integration Testing                          $612,358.91
1.6.4       COTS Development                             $3,947,800.30
1.6.4.1     Spreadsheet Application                      $1,316,356.25
1.6.4.1.1   Identify Candidates (Req)                    $210,619.81
1.6.4.1.2   Evaluate & Select Candidate COTS             $157,962.00
1.6.4.1.3   Design Customization                         $315,925.00
1.6.4.1.4   Coding and Unit Testing                      $473,887.44
1.6.4.1.5   Integration Testing                          $157,962.00
1.6.4.2     Graphics Presentation                        $1,315,393.59
1.6.4.2.1   Identify Candidates (Req)                    $78,923.62
1.6.4.2.2   Evaluate & Select Candidate COTS             $236,770.85
1.6.4.2.3   Design Customization                         $315,694.46
1.6.4.2.4   Coding and Unit Testing                      $552,465.31
1.6.4.2.5   Integration Testing                          $131,539.36
1.6.4.3     Project Management                           $1,316,050.46
1.6.4.3.1   Identify Candidates (Req)                    $78,963.03
1.6.4.3.2   Evaluate & Select Candidate COTS             $236,889.08
1.6.4.3.3   Design Customization                         $315,852.11
1.6.4.3.4   Coding and Unit Testing                      $552,741.19
1.6.4.3.5   Integration Testing                          $131,605.05
1.7         Hardware                                     $1,050,000.00
1.8         Kernel                                       $8,610,540.45
1.9         Documentation                                $681,150.00
1.10        Software Support Environment                 $128,517.84
1.11        Alpha Test                                   $208,063.00
1.12        Beta Test                                    $160,178.00
1.13        System Deployment                            $91,314.00
1.14        Support                                      $228,842.00

Table 32: Budget and Resource Allocation


5.5 Schedule

Figure 90: CDCAAS System Master Schedule


Figure 91: Debugger & Test Schedule


Figure 92: CDCAAS Electronic Inventory & Tracking/Custom Development Detailed Schedule


Figure 93: CDCAAS COTS Development Detailed Schedule


Figure 73: Custom Development Schedule Estimate


Figure 74: Detailed Schedule Estimate


Current Price-Breakeven Table:

Table 33: Current Price-Breakeven Table

Figure 75: Current Price-Breakeven Chart
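All four breakeven tables and charts in this section rest on the same calculation: the number of units at which revenue first covers fixed plus variable costs. The sketch below uses illustrative figures only; the price and cost numbers are assumptions, not the data behind Tables 33 through 36:

```java
public class Breakeven {
    // Breakeven point: first unit count where revenue covers total cost.
    // price and variableCost are per unit; fixedCost is the total fixed cost.
    static long breakevenUnits(double price, double fixedCost, double variableCost) {
        // Each unit contributes (price - variableCost) toward fixed cost;
        // round up, since a fractional unit cannot be sold.
        return (long) Math.ceil(fixedCost / (price - variableCost));
    }

    public static void main(String[] args) {
        // Hypothetical: $40,000 per seat, $25M fixed cost, $2,500 variable cost/seat.
        System.out.println(breakevenUnits(40_000, 25_000_000, 2_500)); // 667
    }
}
```

Raising the unit price lowers the breakeven count and lowering it raises the count, which is what the decreased/increased-price variants in Tables 34 and 35 explore.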


Decreased Price-Breakeven Table:

Table 34: Decreased Price-Breakeven Table

Figure 76: Decreased Price-Breakeven Chart


Increased Price-Breakeven Table:

Table 35: Increased Price-Breakeven Table

Figure 77: Increased Price-Breakeven Chart


Optimum Price-Breakeven Table:

Table 36: Optimum Price-Breakeven Table

Figure 78: Optimum Price-Breakeven Chart


Additional Components
1.1. Subcontracting Process
Our approach to the subcontracting process is based upon:
- Creating a Management Integrated Product Team (IPT) that includes subcontract personnel to provide a streamlined, timely response to all Task Orders.
- Assigning a distinct Task Order IPT for each task order assigned to a subcontractor and implemented under this program.
- Identifying risk areas with subcontractors at the beginning or in the early stages of individual task orders, implementing Risk Mitigation procedures to minimize the risk, and inspecting the work in process to assure mitigation efforts are adequate.
- Providing timely corrective actions for problem areas and monitoring those actions to prevent recurrence.
- Using Lessons Learned from other programs to maximize the potential for success of each task order.

1.1.1 Selection of Subcontractors


Selection has been divided into three parts. The first part describes the process used for the selection of initial subcontractors and vendors. The second part describes the standard process for selection of core subcontractors/vendors based upon follow-on CDCAAS requirements. The third part describes our process for selecting additional, non-core subcontractors to support follow-on task orders.

1.1.1.1 Initial Subcontractors/Vendors Selection


During the CDCAAS procurement process, our strategic approach to meeting the requirements consisted of identifying and qualifying no more than four complementary subcontractors that together can perform 100% of the requirements. In addition, vendors may be added on a case-by-case basis to satisfy unique proprietary hardware or software requirements, or unique task order requirements. Team Crime Busters elected to go with a small core team in order to develop a team atmosphere rather than a loose mix of vendors with little or no allegiance, and to select team members that specialize rather than those with a range of general capabilities.

The initial subcontractor selection process began during the early stages of the procurement process. The corporate management team prepared and issued a series of capability surveys, and held face-to-face meetings and discussions with potential candidates. Information on the size of the business, experience of top management, major Customers and contracts, references, quality systems and procedures used, and relevant past performance was gathered and incorporated into a master selection folder. Past performance was evaluated using references or historical data, if available, and information obtained by contacting subcontractor Customers. From the information obtained, we categorized subcontractor candidates into two groups:

Core Subcontractors: Top-notch providers of training and simulation products and services to Government or Industry, with in-place quality and performance standards and an exemplary reputation with their Customers.

Vendors: Quality vendors of COTS training and simulation products and/or services that are directly applicable to the CDCAAS Team.

Once all proposed subcontractor candidates had been identified and evaluated, the core candidates were rated upon their core technical capability, past performance, and strengths as compared to the supplemental capability needed by the CDCAAS Program. Systems Engineering and Applied Management, Hardware/Software Development and Integration, Instruction, and Courseware experience were major elements of the technical capabilities evaluation decision.
Nondisclosure Agreements, which protect the exchange of sensitive information and expedite the technical interaction that is required at this phase, were issued. Upon selection of the core candidates, a two-tier Integrated Product Team (IPT) organization was established to respond to the CDCAAS requirements and manage the CDCAAS Program after contract award.
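For illustration only, the rating of core candidates on technical capability, past performance, and strengths can be expressed as a weighted score. The criteria weights and 1-to-10 scores below are hypothetical, not the actual CDCAAS selection data:

```java
import java.util.Map;

public class VendorScore {
    // Weighted candidate rating: each criterion carries a weight (summing to 1.0)
    // and a 1-10 score. Both are illustrative assumptions.
    static double weightedScore(Map<String, double[]> criteria) {
        double total = 0;
        for (double[] weightAndScore : criteria.values()) {
            total += weightAndScore[0] * weightAndScore[1]; // weight * score
        }
        return Math.round(total * 100) / 100.0; // round to two decimals
    }

    public static void main(String[] args) {
        double score = weightedScore(Map.of(
            "technical capability", new double[]{0.40, 8},
            "past performance",     new double[]{0.35, 9},
            "supplemental strengths", new double[]{0.25, 7}));
        System.out.println(score); // 8.1
    }
}
```

Ranking candidates by such a score makes the comparison against the "supplemental capability needed" explicit and repeatable across evaluators.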

1.1.1.2 Selection of Vendors for Follow-On Task Orders


The process for selection of vendors for follow-on Task Orders will be somewhat different. Since the core subcontractors will be in place, the process is greatly streamlined. Resource Managers have been identified by each core subcontractor, their responsibilities established, and charters developed to support the team. The Task Order IPT Leader initiates a kick-off meeting within three work days after a new Task Order release. This three-day period is used for a detailed evaluation of the new Task Order requirements and may be extended by the Task Order IPT Leader depending on the complexity of the Task Order. The membership of the Task Order IPT depends on the specific requirement and will consist of the Task Order IPT Leader, the Subcontracts Administrator, and a Resource Manager from each of the core subcontractors required for the Task Order. At the kick-off meeting, the IPT Leader solicits inputs from the Resource Managers and the Subcontracts Administrator. Each Resource Manager is given the opportunity to provide input on the new Task Order and responds for their respective company's area of expertise. New Task Orders that require specific expertise outside the core CDCAAS team require the Task Order IPT to identify and actively pursue additional subcontractors.

1.1.1.3 Selection of Additional Vendors


When the scope of work for a new Task Order includes a specific requirement that is not within the core team capability, the Task Order IPT will begin the process of finding additional resources. This process is very similar to the initial selection process; however, the responsibility for recommending selection of a specific vendor now rests with the Task Order IPT. The same basic criteria are required of a new subcontractor or vendor. The new vendor must:

- Have the required technical capability.
- Have strong management in place with the necessary experience and tools to effectively manage their specialty area.
- Have existing quality processes and procedures that are ISO compliant.
- Have excellent past performance in the relevant specialty area.
- Be responsive.
- Be financially able to meet the commitment of the new Task Order.

Once the Task Order IPT has completed its evaluation, it recommends to the CDCAAS Management IPT that the newly approved subcontractor/vendor be added to the team for the new Task Order.

1.1.2. Coordinating with Subcontractors


Task Order IPT Leaders have the responsibility and authority to coordinate subcontractor efforts. The Resource Managers of the core subcontractors are members of the Task Order IPT, as required. They are required and expected to participate in and support the initial kick-off meetings, regular discussions, technical exchanges and management reviews. The Task Order IPT Leader is responsible for the interaction of the members and encourages direct coordination of their efforts with each other. The Task Order IPT coordinates discussions, facilitates tasks, and is responsible for problem resolution. Each Resource Manager provides inputs at meetings and discussions, assists the Task Order IPT Leader in defining the charter against task order requirements, supports needed tasks, provides strategic inputs to processes and procedures, and assists in developing schedules and metrics. The Task Order IPT Leader coordinates the Task Order effort through the Management IPT as well as with the Customer.

1.1.3. Integrating with Subcontractors


The Task Order IPT Leader has the responsibility and authority for integrating the subcontractors. This task is accomplished by creating a standard set of guidelines for the Task Order. During the initial selection of core subcontractors, specific responsibilities were developed for each approved subcontractor based upon their capabilities. Resource Managers were assigned to the Management IPT or appropriate Task Order IPTs to help develop charters, schedules, plans, procedures, and performance standards. We integrated the subcontractors' processes and procedures into the master CDCAAS plan, creating performance metrics as needed and clearly defining the role each subcontractor plays in the CDCAAS Program.

With each new Task Order, the Task Order IPT will create and update these team plans and processes based upon the task order requirements. Subcontractor plans, processes, and procedures will integrate the subcontractor work efforts into the overall plans. Performance metrics and milestones are used to ensure schedules are met and integrated tasks are accomplished. Core subcontractors are included in all applicable Task Order IPT meetings, technical interchanges, discussions, quality reviews, and risk assessment and management meetings.

Risk management is a key issue in Task Order evaluation and analysis. Risks are identified and categorized as affecting cost, schedule, or performance; risk impact studies are conducted; and the probability of risk occurrence is evaluated. As part of the Risk Management process, the IPT develops workarounds and implements a risk mitigation plan to minimize the impact of each risk. Subcontractor capability, staffing, depth and availability of resources, financial stability, and past performance are elements included in the Risk Management process.


1.1.4. Controlling Subcontractors


The Task Order IPT Leader has the responsibility and authority for control of subcontractor performance. The subcontractor's performance is continually monitored and compared against negotiated standards and the performance metrics established for the subcontracted effort. Reviews are conducted by the IPT Leader and/or the corporate Quality Administrator, as needed, to ensure timely and effective completion of assigned subcontractor tasks. Scheduled reviews shall be conducted with individual Resource Managers, minutes of these reviews documented, and corrective actions taken for any problem areas.

Corrective actions are required for all work not performed in accordance with the established standards, or otherwise unsatisfactory work. If discovered through normal work review or quality surveillance, such non-conforming work will be documented and reported, and steps will be taken to correct the discrepancy either immediately or as soon as practicable. The approach is to correct the discrepancy on the spot, if possible, and take action to preclude future non-compliance. When immediate action cannot be taken to correct a nonconformity, or a condition causing a deficiency, the Resource Manager prepares a corrective action report on the deficiency and submits it to the IPT Leader, assigns suspense dates for correction, and takes action to preclude future non-compliance. Corrective actions continue until the discrepant condition is resolved. Any deviations from established standards must go through a formal review process, with the appropriate corrective action taken and documented.

2.1. Security Considerations


As with any collaborative application or portal that enables access to corporate data over the Internet, security considerations become a serious concern. There are many guidelines for creating a more secure system; CDCAAS employs three of them.

Traffic encryption
To prevent sensitive data from being intercepted as it travels over the wire, traffic must be encrypted. Secure Sockets Layer (SSL) has become the most common method for creating an encrypted connection between client and server, and for authenticating both the server and client machines. There are three communication protocols between the Collaborative Services and back-end servers that need to be configured for SSL: HTTP, DIIOP and LDAP.
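As one illustration of the LDAP side of this configuration, a JNDI client can request SSL by using an ldaps:// URL and the standard security-protocol property. The host name below is a placeholder, and this sketch only builds the connection environment rather than contacting a real directory:

```java
import java.util.Hashtable;
import javax.naming.Context;

public class LdapSslConfig {
    // Builds the JNDI environment for an SSL-protected LDAP (LDAPS) connection.
    // The host name is illustrative, not an actual CDCAAS server.
    static Hashtable<String, String> ldapsEnvironment(String host) {
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldaps://" + host + ":636"); // 636 = standard LDAPS port
        env.put(Context.SECURITY_PROTOCOL, "ssl");                 // request SSL on the wire
        return env;
    }

    public static void main(String[] args) {
        System.out.println(ldapsEnvironment("directory.example.com").get(Context.PROVIDER_URL));
    }
}
```

An environment like this would be passed to `new InitialDirContext(env)` in a real client; the HTTP and DIIOP channels are configured analogously in their respective server settings.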

User authentication
Obviously, we want to make sure that only authorized people are accessing our systems. Standard authentication methods include user ID and password, SSL certificates exchanged between client and server, and a user's listing in a corporate directory using LDAP, the industry standard for Internet and intranet-based directories.


SSL can be implemented via signed certificates from a certificate authority such as VeriSign. To enable SSL in CDCAAS, we have to configure SSL for the HTTP server, as well as for the Application Server plug-in for the Web server. SSL also must be enabled for the LDAP connections in Application Server. Application Server has additional authentication tools, including the Credential Vault for use by portlets that need to access back-end systems. The Credential Vault provides a place for portlets to access user credentials, such as passwords and SSL certificates, after the user has logged on, providing single sign-on for the user.

Function authorization
Not every user should have access to every resource or function. Good security requires that different user groups be granted different levels of access to corporate systems. Application Server accomplishes this via the J2EE security mechanisms, which include security roles. Developers can create generic security roles for various departments or types of employees, and provide those roles with access to the specific resources and functions they require. For instance, accounting and human resources employees may be given the ability to run a payroll query, whereas the marketing group cannot. Generic roles can then be mapped to actual users later.
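The role-to-function mapping described here can be sketched as a simple lookup. The role and function names below are illustrative, not the actual CDCAAS J2EE deployment descriptor:

```java
import java.util.Map;
import java.util.Set;

public class RoleAuthorization {
    // Hypothetical role-to-function grants, mirroring the payroll-query example:
    // accounting may run the payroll query, marketing may not.
    private static final Map<String, Set<String>> ROLE_FUNCTIONS = Map.of(
        "accounting", Set.of("runPayrollQuery", "viewInvoices"),
        "marketing",  Set.of("viewCampaigns"));

    // An unknown role gets an empty grant set, i.e., access is denied by default.
    static boolean isAuthorized(String role, String function) {
        return ROLE_FUNCTIONS.getOrDefault(role, Set.of()).contains(function);
    }

    public static void main(String[] args) {
        System.out.println(isAuthorized("accounting", "runPayrollQuery")); // true
        System.out.println(isAuthorized("marketing", "runPayrollQuery"));  // false
    }
}
```

In J2EE proper, these grants live declaratively in the deployment descriptor and the container enforces them; the point of the sketch is the deny-by-default mapping from generic roles to functions, with actual users bound to roles later.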

3.1. Training Plans


Training 1: Introduction to Work Environment and Machine Cross Training
  Business Need Solved: Make current users familiar with the work environment and make them productive within 60 days.
  Priority: A part of the strategic plan to improve productivity.
  Training Duration: 60 days
  Resource Planning: In-house trainer from Team Crime Busters
  Budget & Notes: $9,000 grant from the client

Training 2: CDCAAS Software Training
  Business Need Solved: Train current users to effectively use the CDCAAS Suite.
  Priority: An MIS goal.
  Training Duration: 60 days
  Resource Planning: In-house trainer from Team Crime Busters
  Budget & Notes: 120 hours from Team Crime Busters

Table 37: Training Plans

4.1. Alpha & Beta Test Plan

4.1.1. Alpha Testing


We have the tester, preferably not the developer, begin testing the features that the developers feel are complete, even though the full feature set defined by the Specifications has not yet been fully developed.


Test Plan
In order to know what to test, we need to have a Test Plan. Usually, this follows the functions defined in the Specifications. We will use a spreadsheet for building the Test Plan. The columns for the Test Plan are:

- ID: The feature number assigned in the Specifications.
- Description: A description of what the feature is supposed to do.
- Build #: The build number that the feature was implemented in.
- Developer: The name of the developer of the feature.
- Tested: A checkbox that the tester can check to show that they have tested it.
- Pass/Fail: Shows whether the feature passed testing and works as described.
- Notes: A brief summary used by the tester to describe why a feature did not pass.

Using this spreadsheet, we will be able to sit down with the tester(s) and developer(s) and review the status of the development phase. This provides a good indicator for determining whether the project is on schedule. Each build of the CDCAAS application that gets released for testing should have its Build # incremented. Used in conjunction with the Test Plan is an Issue Tracking System, which provides a shared environment where the developers, testers, managers, and even the clients can enter and track the progress of Issues (bugs, requests, etc.) related to the project. When reporting Issues, we use a Severity Category to determine how critical an Issue is. The general categories are:

- Category 1: The issue causes a severe crash of the application, a crash to desktop, an application freeze, or corruption of data. Nothing should be delivered to the client with Category 1 Issues.
- Category 2: The issue causes undesired functionality or application errors. This can also include data miscalculation (not corrupted, just not correct). A decision will be made case by case whether to deliver the application with a Category 2 Issue; perhaps it is an error that requires a sequence of steps that will rarely, if ever, occur in the production environment.
- Category 3: The issue is a suggestion, a minor UI change, or an issue that does not affect the functionality of the application.
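The categories translate directly into a delivery gate: nothing ships with an open Category 1 issue, while Category 2 issues are judgment calls. A minimal sketch (the Issue record and example descriptions are illustrative, not actual CDCAAS defect data):

```java
import java.util.List;

public class ReleaseGate {
    // Severity categories from the test plan: 1 = severe crash / data corruption,
    // 2 = undesired functionality or errors, 3 = cosmetic / suggestion.
    record Issue(int category, String description) {}

    // The hard rule: no delivery with an open Category 1 issue.
    static boolean deliverable(List<Issue> openIssues) {
        return openIssues.stream().noneMatch(i -> i.category() == 1);
    }

    public static void main(String[] args) {
        List<Issue> open = List.of(new Issue(2, "mislabeled report column"),
                                   new Issue(3, "button alignment"));
        System.out.println(deliverable(open)); // true: Category 2/3 only
        open = List.of(new Issue(1, "crash on save"));
        System.out.println(deliverable(open)); // false: open Category 1
    }
}
```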


Alpha Complete
This is a major milestone during the development process. This flags that all of the targeted features and functionality have been implemented. They just might not work correctly, but at least the code is in place.

4.1.2. Beta Testing


At this point we will get the client involved with the testing process. It is very important to get the customer/client involved! This not only shows the progress that has been made but also allows the client to provide feedback. The Beta Testing phase follows much the same process as the Alpha Test: follow the Test Plan and report any Issues that occur. It is at this point that we will do "Regression Testing", which involves going back over all the steps in the Test Plan to confirm that previously fixed Issues remain fixed.

Beta Complete
When both we and the client are comfortable with the status of all the features of the application, and have agreed on any minor anomalies that might still be in the application, we will have reached the Beta Complete phase of the project. At this point we can put the finishing touches on the project: Documentation, Implementation, Conversion, Training, and Post-Implementation Support.

5.1. Installation & Training Plans


After CDCAAS comes out of the testing phase, it will be deployed at Customer sites. The steps in which this process will be completed are:
- From each site, a two-member team will be identified and assembled at Team Crime Busters Headquarters, and these teams will undergo Installation & Support training. The people constituting these teams will be those with good experience with and understanding of computer applications.
- After successful completion of training, these teams will act as the first line of support.
- The CDCAAS software package will be uploaded to the secure company server, and a link to download the software will be circulated among the core teams in the different locations. A combination of username and password will be required to securely download the software.
- Based on the installation training, and adhering to the set rules and guidelines, the CDCAAS software will be installed.


6.1. Post Deployment Support Procedures


In addition to the first line of support, Team Crime Busters will provide 24x7 customer support after deployment. The aspects laid down for customer support are:
- Online support is available 24x7. The client will be provided with a front end where they can log their issues.
- On-site customer service is provided complimentary for 5 visits or 100 hours. After that, a fee of $150 will be charged per visit or per 8 hours of stay at the client premises.
- In case of minor issues, a Technical Associate will log onto the client machine remotely and provide a solution for the reported problem.
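The on-site fee rules can be sketched as a small calculation. The $150 rate and the 5-visit/100-hour complimentary allowance come from the plan above; how partial 8-hour blocks round up is our assumption:

```java
public class SupportFee {
    // Fee for one on-site visit. The first 5 visits (or 100 hours) are
    // complimentary; afterward, $150 per visit or per 8-hour block on site.
    static int feeForVisit(int visitsSoFar, int hoursSoFar, int visitHours) {
        if (visitsSoFar < 5 && hoursSoFar < 100) {
            return 0; // still within the complimentary allowance
        }
        int blocks = Math.max(1, (visitHours + 7) / 8); // 8-hour blocks, rounded up
        return 150 * blocks;
    }

    public static void main(String[] args) {
        System.out.println(feeForVisit(2, 30, 6));   // 0: within allowance
        System.out.println(feeForVisit(5, 40, 6));   // 150: allowance exhausted by visits
        System.out.println(feeForVisit(6, 120, 12)); // 300: two 8-hour blocks
    }
}
```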


Index
A Acronyms, 18, 19, 22 advance orders, iii Alpha Test, 19, 32, 116, 161 analysis, v, 10, 22, 71, 94, 96, 104 Application Software Cost, vii Assumptions, v, vi, 18, 62, 63, 64 B baseline, 2, 28, 29, 31, 32, 71, 76, 106, 110 Baseline, 19, 28, 30, 31, 71, 101 Baselines, 30, 31, 101, 106 beta tests, ix budget, 57, 58, 60, 61, 63, 64, 67, 69, 70, 75 Budget, 15, 18, 69, 70, 78, 111, 160 budgets, 15 C Capability Maturity Model, 20, 23, 59, 61, 63, 64 COCOMO, vii, 18, 19, 20, 22, 23, 24, 25, 26, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 67, 70, 149, 156, 157 Code Review, 71 Commercial-Off-The-Shelf, ix Compile Link and Runtime Package, 12 Compiler, vii, x, 1, 16, 54, 73, 82, 115, 148, 149, 161 Configuration Management, 1, 11, 16, 20, 23, 41, 43, 73, 101, 106, 109, 115, 160, 1 Constraints, v, vi, 14, 18, 62, 63, 64, 65 Corrective Action, 72, 73, 109 cost, iv, v, ix, 20, 57, 58, 67, 68, 69, 70, 74, 77, 107, 108, 149 COST, vii COTS, vi, ix, 1, 15, 16, 17, 18, 20, 21, 23, 28, 29, 37, 59, 61, 63, 64, 65, 66, 180 73, 94, 96, 99, 116, 156, 161, 1 Critical Design Review, 30, 31, 73 Critical Path Method, 12 D database, 10, 11, 30, 94, 96, 2 Database Management System, vii, x, 1, 15, 23, 46, 73, 115, 148, 149, 161 databases, v, 10, 11, 14, 15 Debugger & Test, vii, x, 2, 74, 115, 148, 149, 160 Definitions, 18, 19 Deliverables, v, vi Dependencies, ii, 18, 62, 63, 64, 119 Deployment and Support, 29 Design Walk-through, 70 detailed design, ix, 31, 117, 118, 1, 3, 4, 5, 6, 7, 8, 9 DEVELOPMENT ENVIRONMENT, 13 Documentation, 15, 17, 23, 30, 41, 43, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 72, 100, 101, 116, 161, 179 E Earned Value, 23, 70, 72, 75, 76 Electronic Inventory & Tracking, vii, 74, 9 Electronic Inventory and Tracking, x, 2, 11, 12, 16, 56, 116, 148, 149, 161 Electronic Inventory and Tracking Package, 11, 12, 56 encryption, 22, 176 Estimated duration, 117, 118, 1, 2, 3, 4, 5, 6, 7, 8, 9 Evolution of the Software Project Management Plan, 18 External Reviews, 71 F 
Federal Bureau of Investigation, ix


G Gantt charts, 12 GLOBAL POSITIONING SYSTEM GPS, 6 goals, iii, v, 57, 58, 59, 61, 80, 108 Goals, 57, 58, 59, 61 GPS Navigation, vii, x, 1, 11, 12, 16, 53, 63, 65, 73, 115, 148, 149, 160 GRAPHIC PRESENTATION, 3 Graphics Presentation, vii, x, 1, 11, 16, 73, 116, 148, 149, 161 Graphics Presentation Package, 11 H hardware, ix, 12, 14, 15, 16, 20, 21, 32, 43, 60, 61, 63, 64, 73, 96 Hardware, vii, viii, ix, 7, 13, 24, 116, 161 I Initial Cost Estimate, viii INITIAL COST ESTIMATE, vii inspection, 71, 72 Inspection, 102, 109, 117, 118, 1, 2, 3, 4, 5, 6, 7, 8, 9 inspections, 73, 74 Inspections, 71 integration testing, 29, 63, 65, 94, 110 Integration Testing, 20, 101, 115, 116, 160, 161 Interpol, ix Ivan Industries, vi, ix, 1, 16, 37, 69, 76 J Java, 12, 94, 96, 100, 1, 2, 4, 7, 8 JAVA, 94, 100 K Ken Nidiffer, i Kernel, vii, 21, 63, 64, 66, 116, 161 L life cycle, 20, 30, 59, 61, 70, 94, 96, 97, 99 Life Cycle, 21, 25, 76

M Management Objectives, 18, 57, 58 Management Priorities, 60, 61 management process, 42 management processes, 21 measures, 31 Measures, 76, 79 methods, ix, 79, 99, 146, 156 Methods, 18, 99, 100, 101, 102 Milestone, 21 milestones, 27, 68, 69, 74, 76 Milestones, 27 monitoring, 14, 30, 65, 67, 71, 101 Monitoring, 18, 67 N Navigation Package, 11, 12, 53 O Obtaining, 79 Organization Boundaries, 40 Organizational Structure, 18, 33 Outsource, 18, 21 outsourced, vi, ix, 18, 28, 29, 37 Outsourced, 1, 16, 63, 64, 73, 99, 115, 160 P peer review, 27, 108, 109 Peer Review, 21, 109 Peer Reviews, 108 Phase out, 30 Phase Out, 33 preliminary design, 20, 28 Preliminary Design, 21, 28, 30, 31, 73, 109, 115, 160 PRIORITIES AND CONSTRAINTS, 14 Process Model, ii, 18, 27 Product Evaluations, 72 PRODUCT OVERVIEW, 10 PRODUCT SUMMARY, 10 productivity, 67, 68, 72, 73, 75 Productivity, ii, 25, 72, 73, 74, 77, 78, 147, 157 181


Program Evaluation and Review Technique, 12 progress, 14, 31, 67, 68, 70, 72, 74, 75, 76, 95 Progress, 68, 72, 74, 75, 77, 78 PROJECT DELIVERABLES, 15 Project Description, ii, 1 Project Management, i, iii, iv, vii, x, 1, 11, 16, 18, 19, 25, 26, 28, 52, 73, 104, 115, 116, 148, 149, 160, 161 PROJECT MANAGEMENT, 5 Project Organization, 27, 36 Project Overview, ii, 1 Project Progress, 75 Project Responsibilities, 18, 42 Project Sponsor, v Project Support Functions, 18, 106 Q quality, iii, v, 42, 57, 58, 63, 64, 65, 67, 68, 78, 108 Quality, 21, 25, 26, 37, 41, 44, 70, 71, 72, 77, 78, 79, 100, 106, 108, 115, 160 Quality Assurance plan, 71 R Reqm'ts & Config. Mgmt., vii requirements, iii, ix, 15, 18, 21, 22, 23, 27, 28, 30, 32, 42, 43, 60, 62, 67, 68, 72, 77, 94, 95, 96, 97, 103, 109, 110, 146 Requirements, vi, x, 1, 11, 16, 17, 18, 21, 23, 25, 27, 28, 30, 31, 42, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 73, 74, 76, 99, 103, 115, 117, 118, 133, 148, 149, 160, 1, 3, 4, 5 Responsibility Matrix, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56 Retaining, 79 reusable, vi, ix, 96 reuse, 18, 22, 23, 26, 28, 29, 59, 61, 108, 156 Reuse, 1, 16, 74, 99, 100, 102, 115, 156, 160 182

risk, 10, 14, 15, 57, 58, 65, 66, 67, 108 Risk, 14, 18, 19, 25, 65, 66, 67, 102 S scope, iv, 21, 42, 68 Scope, 15, 21, 67 SCOPE, ix Secure Communication, vii, x, 1, 11, 16, 73, 115, 148, 149, 160 Secure Communication Package, 11 software documentation, 103 Software Documentation, 18, 102 Software Requirement Specification Review, 30 Spreadsheet, vii, x, 1, 11, 15, 47, 73, 78, 116, 117, 118, 148, 149, 1 Spreadsheet Package, 11, 47, 118 Staffing, vi, 18, 79, 81, 82, 83, 84, 86, 87, 88, 89, 91, 93 Staffing Plan, 18, 79 support environment, 95 Susan Smith, v system testing, 43, 74, 110 System Testing, 22, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56 System XYZ, iii, v, vi, viii, ix, 19, 63, 65 T Technical Process, 99 techniques, 67, 80 Techniques, 18, 72, 99 time, iv, ix, 1, 2, 10, 11, 12, 13, 15, 20, 57, 58, 59, 61, 63, 64, 65, 67, 68, 70, 72, 74, 75, 76, 77, 78, 79, 103, 110 tools, 57, 58, 59, 61, 63, 64, 66, 94, 99, 4 Total Effort, vii training, vi, ix, 1, 16, 29, 32, 43, 60, 62, 63, 64, 66, 79, 94, 96, 97 Training, 17, 41, 43, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 79, 98, 177, 179


U unit testing, ix, 31, 42, 74, 94 Unit testing, 43 Unit Testing, 22, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 100, 115, 116, 160, 161 User authentication, 176 V Validation Plans, 100 Verification, 26, 72, 100, 106, 109

W WORD PROCESSING, 4 Word Processing Package, 11 Word Processor, vii, x, 1, 16, 51, 73, 115, 148, 149, 160 work breakdown structure, 22, 72 Work Breakdown Structure, 22, 26, 111 Work breakdown Structures, 12 Work Package Specifications, 117 work packages, 68, 70, 72 Work packages, 78 Work Packages, 18, 111


Appendix I
Detailed Resource Estimate Spreadsheet (23 columns plus intermediate calculations) attached.

CDCAAS Appendix I (9/22/08), Page 1 of 3

Size estimates. KEDSI = thousands of estimated delivered source instructions; the KEDSI 1/2/3 columns are the initial size estimates (ISE), printed with factors 90, 100, and 125 KEDSI respectively; development class: 1 = custom, 2 = reuse, 3 = COTS; the compression percentages are for reference.

#   Package Name                       KEDSI 1  KEDSI 2  KEDSI 3  Class  Compr. %   KEDSI     DI, KEDSI  KEDSI      Max     Middle
                                                                         (ref.)     method 1  method 2   method 2
1   Database Management System             400       75      290      2         -      24.00       4.50      17.40   24.00    17.40
2   Spreadsheet                             70       44       90      3         -       2.80       1.76       3.60    3.60     2.80
3   Reqm'ts & Config. Mgmt.                350       39      134      1        12      70.00       7.80      26.80   70.00    26.80
4   Secure Communication                   358       65      140      1        14      71.60      13.00      28.00   71.60    28.00
5   Graphics Presentation                  198       47      119      3         -       7.92       1.88       4.76    7.92     4.76
6   Word Processor                         135       62       80      1         -      27.00      12.40      16.00   27.00    16.00
7   Project Management                      43       30      134      3         -       1.72       1.20       5.36    5.36     1.72
8   GPS Navigation                          77      102      230      1         4      15.40      20.40      46.00   46.00    20.40
9   Compiler                               205       23       80      2         -      12.30       1.38       4.80   12.30     4.80
10  Debugger & Test                        125       55       90      1         -      25.00      11.00      18.00   25.00    18.00
11  Electronic Inventory & Tracking        600       95      160      2         -      36.00       5.70       9.60   36.00     9.60
    Total                                2,561      637    1,547                      293.74      81.02     180.32  328.78   150.28

Allocation of CUSTOM/REUSE/COTS applications:

Development Class Code    Actual Allocation    Required Allocation    Variance
1 (Custom)                5                    5                      0
2 (Reuse)                 3                    3                      0
3 (COTS)                  3                    3                      0
Total                     11                   11

Page 1 of 3

CDCAAS Appendix I (9/22/08), Page 2 of 3

Size and effort computations (effort in staff months). The expected effort column follows (max + 4 x middle + min) / 6 over the three effort-model estimates; per-package size minima (minimum of the three size methods) total 76.02 KEDSI, and effort minima/maxima/middles total 579.52 / 1,106.47 / 668.09 staff months. Model coefficient pairs printed on this page: 3.6 / 1.20, 5.2 / 0.91, and 2.5 / 0.32 (matching the COCOMO embedded-mode effort, Walston-Felix effort, and COCOMO duration coefficients, respectively). A selector row "_1 = real values, _2 = WITS values" with entries 1, 2, 1 also appears.

#   Package Name                      Size, calc.  Size, WITS  Calc - WITS  Effort,   Effort,        Effort,   Expected  Duration,
                                      (KEDSI)      (KEDSI)                  COCOMO    Walston-Felix  Boyston   effort    COCOMO
1   Database Management System        16.35        29.43       (13.08)      102.93    66.12          56.63     70.67     9.77
2   Spreadsheet                       2.76         8.67        (5.91)       12.17     13.10          14.36     13.16     5.70
3   Reqm'ts & Config. Mgmt.           30.83        35.48       (4.65)       220.36    117.76         101.67    132.18    11.93
4   Secure Communication              32.77        13.27       19.50        237.04    124.46         107.68    140.43    12.17
5   Graphics Presentation             4.81         7.98        (3.17)       23.69     21.70          20.73     21.87     6.71
6   Word Processor                    17.23        5.02        12.21        109.63    69.36          59.38     74.41     9.93
7   Project Management                2.24         35.00       (32.76)      9.48      10.83          12.75     10.93     5.37
8   GPS Navigation                    23.83        20.22       3.61         161.78    93.16          79.90     102.39    11.00
9   Compiler                          5.48         7.97        (2.49)       27.72     24.45          22.82     24.72     6.98
10  Debugger & Test                   18.00        32.83       (14.83)      115.51    72.16          61.76     77.65     10.06
11  Electronic Inventory & Tracking   13.35        37.67       (24.32)      80.70     54.98          47.30     57.99     9.17
    Total                             167.65       233.54      (65.89)      1,101.01  668.09         584.98    726.39

Page 2 of 3

CDCAAS
15 16 17 18A For reference only

Appendix I
18 19 20 21 22

9/22/08
23

mputed value of duration in staff months

Using the Walston Felix model 18.99 10.37 23.79 24.31 12.45 19.34 9.70 21.70 13.01 19.64 17.68

Using the Boyston model 8.76 5.03 10.78 10.99 5.95 8.91 4.73 9.90 6.20 9.04 8.21

Computed value of expected value for duration, in staff months 15.75 8.70 19.64 20.07 10.41 16.04 8.15 17.95 10.87 16.28 14.69

Computed Value for Effort value of schedule Value for adjustment Adjusted compression schedule Value of factor (EAF) Effort (AE) percentage compression duration due to due to given target percentage, given percent schedule schedule below, cannot cannot be of schedule compres- compression be greater greater than compression sion, e.g., in Staff than 14% 14% in months 1.05 Months 0 0 15.75 1.00 70.67 0 0 8.70 1.00 13.16 8 12 17.29 1.12 148.04 10 14 17.26 1.14 160.09 0 0 10.41 1.00 21.87 0 0 16.04 1.00 74.41 0 0 8.15 1.00 10.93 0 4 17.23 1.04 106.48 0 0 10.87 1.00 24.72 0 0 16.28 1.00 77.65 0 0 14.69 1.00 57.99

Computed value of Productivity (P), e.g. 20,500/92.6 =221 EDIS/Staff Months 231.36 209.80 208.27 204.68 219.78 231.61 205.03 223.82 221.64 231.80 230.23

Computed value of Average Staff (AS), e.g. 92.6/11.4", 8.1 technical members; =(AE)/(19) 4.49 1.51 8.56 9.28 2.10 4.64 1.34 6.18 2.27 4.77 3.95

4.1 0.36

2.15 0.33

Maximum Duration= 18.2 months

Duration = (1-%compression/100) * (expected value of Duration)

P = Expected Value for Size/ computed value of adjusted effort

Page 3 of 3
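As a cross-check on the tabulated values, the three-point (PERT) roll-up and the effort models can be reproduced in a few lines. The following Python sketch is not part of the original workbook; it recomputes the Database Management System row from the scaled size estimates, with the Boyston effort model read from the sheet as 5.78 + 3.11 x S.

```python
# Sketch (not from the original workbook): recompute the DBMS row of the
# Appendix I estimate to cross-check the tabulated values.

def pert(values):
    """Three-point expected value: (4 x middle + max + min) / 6."""
    lo, mid, hi = sorted(values)
    return (4 * mid + lo + hi) / 6

# Scaled size estimates for the Database Management System (KEDSI, methods 1-3)
sizes = [24.00, 4.50, 17.40]
expected_size = pert(sizes)                    # 16.35 KEDSI

# Effort models in staff months, coefficients as transcribed from the sheet
cocomo  = 3.6 * expected_size ** 1.2           # ~102.93
walston = 5.2 * expected_size ** 0.91          # ~66.12
boyston = 5.78 + 3.11 * expected_size          # ~56.63
expected_effort = pert([cocomo, walston, boyston])  # ~70.67 staff months
```

The same helper reproduces every Expected column in the tables above, which is a quick way to audit the spreadsheet after any input changes.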

Appendix II
Detailed Resource Estimate Spreadsheet with formulas revealed (attached).

CDCAAS / Appendix II / 9/22/08

Columns 1-5a hold the package number, the package name, your three initial size estimates of the application in thousands of estimated delivered source instructions (ISE, at the 90, 100 and 125 KEDSI/KEDSI bases), and the development class (1 = custom, 2 = reuse/NDI, 3 = COTS). Each size column is totaled with =SUM(C4:C14), =SUM(D4:D14) and =SUM(E4:E14).

 #  Package                          ISE-1 (90)  ISE-2 (100)  ISE-3 (125)  Class
 1  Database Management System           400          75          290        2
 2  Spreadsheet                           70          44           90        3
 3  Reqm'ts & Config. Mgmt.              350          39          134        1
 4  Secure Communication                 358          65          140        1
 5  Graphics Presentation                198          47          119        3
 6  Word Processor                       135          62           80        1
 7  Project Management                    43          30          134        3
 8  GPS Navigation                        77         102          230        1
 9  Compiler                             205          23           80        2
10  Debugger & Test                      125          55           90        1
11  Electronic Inventory & Tracking      600          95          160        2

The compression % column on this sheet only mirrors the computed targets (=AC4 through =AC14). The allocation of custom/reuse/COTS applications is checked with COUNTIF against the development-class codes, e.g. =COUNTIF($F$4:$F$14,B26) for each class code, and compared with the required allocation of 5 custom / 3 reuse / 3 COTS (total =SUM(C26:C28)); the variance cells are =C26-E26, =C27-E27 and =C28-E28. The rows labeled Alpha / Factor / Exponent hold the a, b and c coefficients of the estimation models used further on.

Columns 6-8 (KEDSI methods 1-3) scale each initial size estimate by its development class (custom x0.20, reuse/NDI x0.06, COTS x0.04):
  =IF($F4=1, C4*0.2, IF($F4=2, C4*0.06, C4*0.04))
Methods 2 and 3 apply the same formula to columns D and E, and each method column is totaled with a SUM over rows 4-14.
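The IF/COUNTIF logic above translates directly into procedural form. The following Python sketch is not part of the workbook; it applies the class scaling factors and reproduces the allocation check.

```python
# Sketch: development-class size scaling and the allocation check,
# translated from the Excel formulas above.
from collections import Counter

SCALE = {1: 0.20, 2: 0.06, 3: 0.04}   # 1=custom, 2=reuse/NDI, 3=COTS

def scaled_size(ise, dev_class):
    """Equivalent of =IF($F4=1, C4*0.2, IF($F4=2, C4*0.06, C4*0.04))."""
    return ise * SCALE[dev_class]

# Development classes for the 11 packages, in worksheet order
classes = [2, 3, 1, 1, 3, 1, 3, 1, 2, 1, 2]
required = {1: 5, 2: 3, 3: 3}          # required custom/reuse/COTS split

actual = Counter(classes)               # the COUNTIF per class code
variance = {c: actual[c] - required[c] for c in required}  # all zero here
```

For example, scaled_size(400, 2) gives the 24.00 KEDSI that appears in the DBMS row of method 1.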

Columns 9a-9c: the maximum, middle and minimum of the three scaled sizes are =MAX(H4:J4), =SUM($H4:$J4)-$K4-$M4 (row sum minus max minus min) and =MIN(H4:J4), each totaled over rows 4-14. The computed expected value for size (column 9, e.g. 20.5 KEDSI) is
  =CHOOSE(N$23, (L4*4+K4+M4)/6, O4)
i.e. the PERT weighted mean (4 x middle + max + min) / 6, or the WITS-report value (column 9wits) when the selector cell is set to 2 (_1 = real values, _2 = WITS values). Column 9-9wits is the difference =N4-O4, for comparison of calculated vs WITS sizes.

Column 10, the computed value of effort using the COCOMO model in staff months:
  =Q$17+Q$18*$N4^Q$19
with the Factor and Exponent cells set to b = 3.6 and c = 1.2 (the Alpha cell is blank, i.e. a = 0), totaled with =SUM(Q4:Q14).
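The CHOOSE branch reads more clearly in procedural form. A sketch, not part of the workbook, assuming the selector picks either the PERT mean or the WITS figure exactly as the formula indicates:

```python
# Sketch: the expected-size column (col 9), mirroring
# =CHOOSE(N$23, (L4*4+K4+M4)/6, O4).

def expected_size(scaled_sizes, use_wits=False, wits_value=None):
    if use_wits:                        # selector = 2: take the WITS value
        return wits_value
    lo, mid, hi = sorted(scaled_sizes)  # selector = 1: PERT weighted mean
    return (4 * mid + hi + lo) / 6

# Reqm'ts & Config. Mgmt.: scaled sizes 70.00 / 7.80 / 26.80 KEDSI
size = expected_size([70.00, 7.80, 26.80])   # ~30.83 KEDSI
diff = size - 35.48                          # vs the WITS report value
```

The diff of about -4.65 KEDSI matches the (4.65) entry in the comparison column.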

Columns 11-12: the computed value of effort using the Walston-Felix model, =R$17+R$18*$N4^R$19 with b = 5.2 and c = 0.91, and using the Boyston model, =S$17+S$18*$N4^S$19 with a = 5.78, b = 3.11 and c = 1, both in staff months. The maximum, middle and minimum of the three effort values are computed exactly as for size, and the computed expected effort (column 13, e.g. 88.2 staff months) is the PERT weighted mean =(U4*4+T4+V4)/6, totaled with =SUM(W4:W14).

Columns 14-16 compute the value of duration in months from the expected effort W, using the same a + b x x^c form:
  COCOMO:        =X$17+X$18*$W4^X$19   (b = 2.5,  c = 0.32)
  Walston-Felix: =Y$17+Y$18*$W4^Y$19   (b = 4.1,  c = 0.36)
  Boyston:       =Z$17+Z$18*$W4^Z$19   (b = 2.15, c = 0.33)
Column 17, the computed expected value for duration, is the PERT weighted mean =(Y4*4+X4+Z4)/6, with the Walston-Felix value weighted as the middle estimate.
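All six models share the a + b x x^c shape of the spreadsheet formulas. The sketch below is not from the workbook; it takes the COCOMO and Walston-Felix intercepts as zero (their Alpha cells are blank) and evaluates the chain for the DBMS expected size.

```python
# Sketch: the effort and duration models as parameterized in the sheet.

def model(x, a, b, c):
    return a + b * x ** c              # the =a + b*x^c cell formula

EFFORT = {                             # staff months from size S (KEDSI)
    "COCOMO":        (0.0,  3.6,  1.2),
    "Walston-Felix": (0.0,  5.2,  0.91),
    "Boyston":       (5.78, 3.11, 1.0),
}
DURATION = {                           # months from effort E (staff months)
    "COCOMO":        (0.0, 2.5,  0.32),
    "Walston-Felix": (0.0, 4.1,  0.36),
    "Boyston":       (0.0, 2.15, 0.33),
}

S = 16.35                              # DBMS expected size from Appendix I
efforts = {name: model(S, *p) for name, p in EFFORT.items()}
lo, mid, hi = sorted(efforts.values())
E = (4 * mid + lo + hi) / 6            # expected effort, ~70.67 staff months
durations = {name: model(E, *p) for name, p in DURATION.items()}
```

The resulting efforts and durations agree with the DBMS row of the Appendix I printout to within rounding.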

Column 18A (for reference only) computes the schedule compression needed to meet the 18.2-month maximum duration target, and cannot be greater than 14%:
  =ROUNDUP(MAX(0, (1-($AB$18/$AA4))*100), 0)
Column 18 holds the compression percentages actually applied (0, 0, 12, 14, 0, 0, 0, 4, 0, 0, 0). From these:
  Column 20, duration given percent of schedule compression, in months: =(1-AC4/100)*AA4, i.e. Duration = (1 - %compression/100) x (expected value of duration)
  Column 19, effort adjustment factor (EAF) due to schedule compression, e.g. 1.05: =AC4/100+1
  Column 21, adjusted effort (AE) in staff months: =AE4*W4 (EAF x expected effort)
  Column 22, productivity (P), e.g. 20,500/92.6 = 221 EDSI/staff month: =(N4*10^3)/AF4, i.e. P = expected value for size / computed value of adjusted effort
  Column 23, average staff (AS), e.g. 92.6/11.4 = 8.1 technical members: =AF4/AD4 (adjusted effort / duration)
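The compression chain in columns 18A-23 can be exercised end to end. A sketch, not from the workbook, run against the Secure Communication row:

```python
# Sketch: schedule compression, effort adjustment, productivity and
# average staff, following the column formulas above.
import math

MAX_DURATION = 18.2    # month target (cell $AB$18)
CAP = 14               # compression may not exceed 14%

def needed_compression(expected_duration):
    """Col 18A: =ROUNDUP(MAX(0, (1 - target/duration)*100), 0), capped."""
    pct = math.ceil(max(0.0, (1 - MAX_DURATION / expected_duration) * 100))
    return min(pct, CAP)

def apply_compression(expected_duration, expected_effort, pct):
    duration = (1 - pct / 100) * expected_duration   # col 20
    eaf = 1 + pct / 100                              # col 19, EAF
    adjusted_effort = eaf * expected_effort          # col 21, AE
    return duration, eaf, adjusted_effort

# Secure Communication: expected duration 20.07 months, effort 140.43 SM,
# expected size 32.77 KEDSI; compression actually applied: 14%
d, eaf, ae = apply_compression(20.07, 140.43, 14)
productivity = 32.77 * 1000 / ae                     # col 22, EDSI/staff month
avg_staff = ae / d                                   # col 23
```

Note that the reference column would only call for 10% compression here; the 14% figure is the value the team chose to apply, as recorded in column 18.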

Crime Data Collection, Aggregation and Assimilation System (CDCAAS) Team Crime Busters SWE 625 - Professor Ken Nidiffer

Version 1.0 12/08/2008

Appendix III
Work Package Specifications

CONFIGURATION MANAGEMENT
WORK PACKAGE SPECIFICATION: Configuration Management App - Detailed Design - Wizards
Activity number: 1.6.1.1.3
Activity name: Configuration Management - Detailed Design - Wizards
Feature description: New features will be added to the existing MS Visual SourceSafe software to enhance reliability, scalability, performance, and remote-access capabilities using .NET Primary Interop Assemblies (PIA).
Activity description: Create and review the detailed design of the Configuration Management wizards software.
Estimated duration: 12 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 1 Senior Programmer, 3 Programmers, 1 QA Engineer, 1 COTS Expert
Skills: Visual Basic, Java
Tools: CM App, Visual Basic
Travel: None
Work product: Detailed design of the Configuration Management wizards
Risks: Lack of human-factors skills
Predecessors: Requirements and high-level design of the Configuration Management wizards
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Vicky Donofrio, Noel Kidman, Kate Rosenberg, Rizzo Rat, Hanuman Singh
Starting date: 12/24/08    Completion date: 04/14/09
Cost (budgeted/actual): $1,339,346.60
Legacy comments:


COMMUNICATION
WORK PACKAGE SPECIFICATION: Communication - Detailed Design
Activity number: 1.6.1.2.3
Activity name: Customer Database Link - Application Function
Feature description: Display customer-related information from the centralized database when a user calls in.
Activity description: Create functions to store customer information in, and display it from, the database.
Estimated duration: 4 days
Resources needed:
Personnel: 1 Team Leader, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 Tester
Skills: Visual Basic, GUI development, Java, C/C++, Access
Tools: GUI builder, MS Office, MS Visual Basic
Travel: None
Work product: Detailed functions and classes for the Communication application
Risks: Lack of human-factors skills
Predecessors: Detailed design of application complete, user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Scooter, Greenburg, Jae Motor, Mary King, Jerry
Starting date: 12/24/08    Completion date: 04/14/09
Cost (budgeted/actual): $1,423,980.00
Legacy comments:


GRAPHIC PRESENTATION
WORK PACKAGE SPECIFICATION: Graphic Representation App - Detailed Design - Presentations
Activity number: 1.6.4.2.3
Activity name: Graphic Representation - Detailed Design - Presentations
Feature description: Extensions to standard Microsoft PowerPoint will include new features such as extracting embedded PowerPoint slides from Word documents, improvements to the animation engine, references to ActiveX controls, countdown timers, and quizzes.
Activity description: Create and review the detailed design of the enhanced PowerPoint presentation features.
Estimated duration: 15 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 QA Engineer, 1 Component Expert
Skills: Visual Basic, multimedia tools
Tools: Graphic Representation App, WBS Chart Pro, Visual Basic
Travel: None
Work product: Detailed design of Graphic Representation - PowerPoint presentation
Risks: Lack of human-factors skills
Predecessors: Requirements and high-level design of the PowerPoint presentation features
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Pumpkin Chattaroy, Road Romeo, Street Caesar, Jerry, Tom
Starting date: 12/15/09    Completion date: 03/08/10
Cost (budgeted/actual): $315,694.46
Legacy comments:


WORD PROCESSING
WORK PACKAGE SPECIFICATION: Word Processing Application - Detailed Design - HTML
Activity number: 1.6.1.3.3
Activity name: Word Processing - Detailed Design - HTML
Feature description: The standard Microsoft Word software will be integrated with Semantic Word to offer features such as an automatic information-extraction system with tools for refining and augmenting its output, customizable tools for simultaneous generation of content and semantic annotations, and an annotation scheme that allows annotations to be reused when content is reused.
Activity description: Create and review the detailed design of Word Processing with HTML.
Estimated duration: 9 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 1 Senior Programmer, 3 Programmers, 1 QA Engineer, 1 Component Expert
Skills: Visual Basic, HTML, Java
Tools: HTML, Visual Basic
Travel: None
Work product: Detailed design of Word Processing with HTML
Risks: Lack of human-factors skills
Predecessors: Requirements and high-level design of Word Processing with HTML
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Sam Eagle, Stacey Vault, Carol Monkey, Sonica Eagle, Charles Rosenburg
Starting date: 12/15/08    Completion date: 03/06/09
Cost (budgeted/actual): $754,742.35
Legacy comments:


PROJECT MANAGEMENT
WORK PACKAGE SPECIFICATION: PM App - Detailed Design - WBS Integration
Activity number: 1.6.4.3.3
Activity name: PM - Detailed Design - WBS Integration
Feature description: The existing MS Project software will be enhanced by integrating WBS Chart Pro to add features such as the ability to create work breakdown structures.
Activity description: Create and review the detailed design of the integration of WBS Chart Pro into the existing Project Management App.
Estimated duration: 20 days
Resources needed:
Personnel: 1 Team Lead, 1 Requirements Analyst, 1 System Engineer, 1 Senior Programmer, 2 Programmers, 1 QA Engineer
Skills: Visual Basic, WBS Chart Pro
Tools: PM App, WBS Chart Pro, Visual Basic
Travel: None
Work product: Detailed design of the WBS Chart Pro integration
Risks: Lack of human-factors skills
Predecessors: Requirements and high-level design of the WBS integration
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Miss Piggy, Waldorf Bird, Miss Queen, Kate Nelson, Wanda Briggs
Starting date: 12/15/09    Completion date: 02/08/10
Cost (budgeted/actual): $315,852.11
Legacy comments:


GLOBAL POSITIONING SYSTEM GPS


WORK PACKAGE SPECIFICATION: GPS - Detailed Design
Activity number: 1.6.2.1.2
Activity name: GPS - Detailed Design
Feature description: Create a wizard interface that allows the user to learn the voice-navigation functionality.
Activity description: Create the detailed design of the functionality the wizards will provide and integrate it with the user interface.
Estimated duration: 20 days
Resources needed:
Personnel: 1 Team Leader, 1 Requirements Analyst, 1 Designer, 1 System Engineer, 2 Senior Programmers, 3 Programmers, 1 Tester
Skills: Voice-navigation knowledge, GUI development tools
Tools: GUI builder, CM tool
Travel: None
Work product: Detailed design of wizard functionality
Risks: Lack of human-factors skills
Predecessors: Detailed design of the wizards' user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Jerry, Tom, Carol Monkey
Starting date: 10/20/08    Completion date: 12/18/09
Cost (budgeted/actual): $3,762,145.00
Legacy comments:


COMPILE/LINK/RUNTIME
WORK PACKAGE SPECIFICATION: COMP - Detailed Design
Activity number: 1.6.3.2.2
Activity name: COMP - Detailed Design
Feature description: Extend the user interface to allow the user to compile Java code.
Activity description: Create the detailed design of the user-interface functions that integrate and compile Java code.
Estimated duration: 10 days
Resources needed:
Personnel: 1 Team Leader, 1 Requirements Analyst, 1 System Engineer, 1 Senior Software Engineer, 2 Programmers
Skills: GUI development, Java
Tools: GUI builder, Java
Travel: None
Work product: Detailed design of the Java-compilation user interface
Risks: Lack of human-factors skills
Predecessors: Detailed design of the compilation user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Neil Armstrong, Jeff Marcus, Mickey Mouse, Teddy Brown
Starting date: 05/18/09    Completion date: 08/07/09
Cost (budgeted/actual): $315,602.20
Legacy comments:


LANGUAGE INDEPENDENT DEBUGGING AND TESTING


WORK PACKAGE SPECIFICATION: LANGUAGE - Detailed Design
Activity number: 1.6.1.4.3
Activity name: Web Browser User Interface
Feature description: Online link to a web site that may offer help when an error is found during debugging and testing.
Activity description: Create the detailed design of a web-browser interface that integrates with the user interface and leads to a search of the web site.
Estimated duration: 3 days
Resources needed:
Personnel: 1 Team Leader, 1 Requirements Analyst, 1 Architecture Designer, 1 System Engineer, 1 Senior Software Engineer, 3 Programmers, 1 QA Engineer
Skills: GUI development, object-oriented development, .NET experience
Tools: GUI builder, Java, C, C++, Visual Basic, HTML, UML, .NET technology
Travel: None
Work product: Detailed design of functionality for the web-browser interface
Risks: Lack of human-factors skills
Predecessors: Detailed design of the web-browser user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Miss Piggy, Paris, Kate Williams, Bunsen Honeydew, Rizzo Rat
Starting date: 12/24/08    Completion date: 04/14/09
Cost (budgeted/actual): $787,192.73
Legacy comments:


ELECTRONIC INVENTORY AND TRACKING


WORK PACKAGE SPECIFICATION: Electronic Inventory & Tracking - Detailed Design
Activity number: 1.6.3.3.2
Activity name: Electronic Inventory & Tracking - Detailed Design
Feature description: Create a wizard interface that allows the user to track inventory.
Activity description: Create the detailed design of the functionality the wizards will provide and integrate it with the user interface.
Estimated duration: 20 days
Resources needed:
Personnel: 2 Senior Software Engineers, 2 System Analysts
Skills: Voice-navigation knowledge, GUI development tools
Tools: GUI builder, CM tool
Travel: None
Work product: Detailed design of wizard functionality
Risks: Lack of human-factors skills
Predecessors: Detailed design of the wizards' user interface complete
Completion criteria: Inspection of functionality
IMPLEMENTATION
Personnel assigned: Rakesh Roshan, Sam Eagle, Sonica Eagle, Jeff Marcus
Starting date: 05/26/09    Completion date: 08/17/09
Cost (budgeted/actual): $587,864.55
Legacy comments:

Binder Back Cover

Team Crime Busters

Team Crime Busters consists of (from left to right): Brian H. Park, John Kraus, Sonal Verma, and Akhil Pathania.