
Test Automation Strategy

<Project Name in Properties "Subject">

Author:

<Author>

Master Test Plan

Version 1.0 - 10/14/2009

Version History
Changes to the Test Automation Strategy must follow the update process to ensure that the most current version is being used by all project team members. This section outlines how updates to the Test Automation Strategy will be published and approved. The project team's versioning standards should be adhered to for the numbering convention. Circumstances requiring approval of changes to the Test Automation Strategy include:

- Changes in automation scope or requirements, resulting in corresponding updates to the Test Automation Strategy
- Risks, constraints, dependencies and assumptions concerning automation, as and when they are identified

Following any significant revision, the Test Automation Strategy will be distributed by email for review and sign-off by the individuals identified as approvers in section 1.2. Voting button options will include approve, approve with change (note in email response) or reject. Smaller or less significant changes do not require project team approval.

Review Log

Version | Date | Section(s) | Changed By | Approval
0.1 | | Original Release | | QA Lead

Document Status: Final

Page 2 Template Version: 1.1


This Test Automation Strategy template reflects the standard processes and procedures for the test automation planning phase. The format of this template is designed to facilitate the creation of a Test Automation Strategy by providing the essential sections that should be included in a comprehensive test automation plan, and by suggesting subject matter for each section. The author should be familiar with the Principal Financial Group development processes and related automation testing procedures.

Text displayed in blue italics (style=InfoBlue) is included to provide guidance to the author and should be deleted before publishing the document.

To customize automatic fields in Microsoft Word (which display a gray background when selected), select File > Properties and replace the Title, Subject, Project, and Version fields with the appropriate information for this document. After closing the dialog, automatic fields may be updated throughout the document by selecting Edit > Select All (or Ctrl-A) and pressing F9, or simply by clicking on a field and pressing F9. This must be done separately for headers and footers. Alt-F9 will toggle between displaying the field names and the field contents. See Word help for more information on working with fields.

All sections of this Test Automation Strategy document are to be preserved to maintain uniformity and to indicate areas of deliberate omission. The author should mark deliberately omitted sections with the notation "NA" rather than deleting them. Each section of this template provides guiding information for the author and/or sample text that the author may use as examples of the content of each section.

Action Item: Any text in blue serves as a placeholder and/or sample data. Substitute it with real information or delete it.

Action Item: Please make sure to update all fields and tables in this document. (The key sequence is <Ctrl A> then <F9>.)


Table of Contents

1.0 Introduction
1.1 Purpose
1.2 Target Audience
2.0 Establishing the Assignment
2.1 Identify the Scope
2.1.1 In Scope
2.1.2 Out of Scope
2.2 Identify the Preconditions
2.2.1 Dependencies/Constraints/Considerations
2.3 Identify the Assumptions
2.4 Identify the Risks
3.0 Determining the Strategy
3.1 Test Levels
3.1.1 System Test
3.1.2 Regression Testing
3.2 Risk Table
4.0 High Level Milestones
4.1 High Level Milestone Estimated Dates
5.0 Entrance / Exit Criteria
6.0 Suspension Criteria
7.0 Define the Test Products
7.1 Test Strategy
7.2 Test Scripts
7.3 Test Data
7.4 Progress Reports
7.5 Test Evaluation Reports
8.0 Define the Organization
8.1 Roles & Responsibilities
8.2 Training Needs
9.0 Communication Plan
9.1 Meetings and Reports
9.2 Teleconference Information
9.3 Web Conference Information
9.4 Project Distribution Lists
9.5 Outlook Public Folders
9.6 Document Storage
9.7 Query Log
10.0 Define the Infrastructure
10.1 Application Setup in System Test Environment
10.2 Test Users Setup
10.3 System Setup
11.0 Organize Test Management
11.1 Test Process Management
11.2 Test Case Coverage Ratios
11.3 Infrastructure Management
11.3.1 Meetings
11.3.2 Environment, Tools and Workplaces
11.4 Defect Management

1.0

INTRODUCTION

This Test Automation Strategy communicates the approach to implementing the automation testing effort using a test automation tool. The document serves as both a communication tool for the automation team and a governing document for the test automation initiative. It will be maintained to reflect all significant changes to the scope, approach, test methods or entry/exit criteria defined in this strategy. In addition, the only authorized deviations from standard practice controls and deliverables are those noted in the approved plan. All activities and their deliverables will be conducted in accordance with standard policy and procedures and are subject to a compliance audit.

1.1 PURPOSE

The purpose of this Test Automation Strategy is to communicate to the project stakeholders and the testing team how the strategy will be implemented, and to establish authorization to proceed with the test initiative using the referenced approach, standards and controls. This is a governing document for the test activities; therefore, the deliverables and key controls identified in this strategy are subject to review for compliance with standard policies and procedures.

1.2 TARGET AUDIENCE

Indicate who will approve the document. Sign-off is only required for approvers.

The target audience for this document comprises those who represent the following roles on this project:

Role | Name | Approver / Reviewer | Sign-off Date
QA Lead | | |
Test Manager (as per the RUP template) | | |
Test Analyst - Automation | | |
Business Analyst | | |


2.0

ESTABLISHING THE ASSIGNMENT


2.1 IDENTIFY THE SCOPE

2.1.1 In Scope

This Test Strategy for the <Project Acronym in Properties "Project"> testing initiative includes the following areas of scope:

In this section, identify the scope of the automation testing activities planned. Include the following topics:

- Explain the need for and goals of automation for the project and how it will help the overall testing effort.
- What types of testing are covered under this automation strategy, such as functional, regression, smoke or other tests?
- Why was the particular test tool selected?
- How was the proof of concept carried out, how much coverage is expected through automation, and which types of test cases are to be automated?
- Define dependencies, such as the impact of changes to the development plan, features, etc.
- Make recommendations on the timeline for automation.
- Provide a high-level design that is scalable and allows future addition of test scripts.
- Provide a plan to manage and maintain the automation scripts (post-delivery).
- What components of the overall application are to be tested?
- What is the relationship of the application to be automated to the overall system/environment?

2.1.2 Out of Scope

The following functions, systems and processes are specifically out of scope for the <Project Acronym in Properties "Project"> testing initiative:

In this section, identify specific items that are outside the scope of testing:

- What functional business areas will not be automated?
- Mention that highly complex functions will not be automated.
- Mention that tests requiring large calculations or significant user intervention will not be automated.
- Mention that files containing multiple records will not be tested.
- Mention whether tests involving back-end validations will not be done.
- Mention that non-repetitive test cases will not be automated (they may be automatable, but automating them is not feasible).
- Mention that tests that would take a long time to automate but very little time to run manually will not be automated.
- Mention that tests involving graphics might not be automated.
- Mention that test cases involving testing of printing features cannot be tested.
- Mention that test cases involving the Mozilla browser cannot be tested.

2.2 IDENTIFY THE PRECONDITIONS

2.2.1 Dependencies/Constraints/Considerations

List any dependencies, constraints or considerations that may impact the design, development or implementation of automation testing; e.g., dependencies on other organizations (such as DEV, the environment group, users and/or SMEs) and/or schedules, resource dependencies, etc. Some samples are provided below. Indicate other projects that may affect this project.

Description | Type (Dependency, Constraint, Offshore consideration, XBU consideration, Vendor consideration, other) | Impact/Consideration
Manual test cases not ready | Dependency | Automation may not start, as the framework may not be created without knowing all the test cases that need to be automated.
Resource availability | Constraint | Contractor's contract expires 12/01/2010
Time frame/Schedule | Constraint | There is a company freeze beginning 12/15/2009
Regulatory | Constraint | PFG will be assessed a fine if the change does not occur by 12/31/2009
SharePoint access | Offshore consideration | Who will need access to SharePoint?
Timing issues related to cycle | Offshore consideration | Time difference
Communication across business units | XBU consideration | How will communication be handled?
Determine which Quality Center Domain & Project will be used | XBU consideration | Access/security to the Quality Center Domain & Project
Defect resolution | Vendor consideration | How will defects be communicated to the vendor, and how will the vendor communicate when a defect has been resolved?

2.3 IDENTIFY THE ASSUMPTIONS

Identify the significant assumptions made in developing this plan which, if false, may impact the design, development or implementation of testing:

- The Business Analysts for this project will be available to answer questions when necessary.
- The <<enter test environment here>> environment will be available and stable.
- Manual testing will be completed before automation begins. The results will be available to the project team.
- If there is slippage in the project timeline or scope, automation test script creation will also be impacted.
- Existing production issues will remain unless specifically addressed by the project.
- QA resources assigned to the project will have previous knowledge of creating test scripts using the test automation tool.
- All assigned QA resources must be given a background of the project and understand those aspects prior to the start of script creation.
- Test execution will not start until sign-offs are received for all QA artifacts that require sign-off.
- All the security access needed will be given to the test automators prior to test script creation.
- Test cases will be written based on business and functional requirements and technical designs. If any of these documents are not completed, test scripts will not be in accordance with them, and there will be a risk of gaps and rework.


2.4 IDENTIFY THE RISKS

The table below outlines project or process risks to testing. These types of risk generally affect the test team's ability to work efficiently or to meet project milestones. The contents of the table below are examples only.

Estimates of project impact and probability of occurrence are listed in the table as L=Low, M=Medium, or H=High. The Risk Measure is a calculation of [Impact x Probability] with a scale of 1-9. Values are assigned as follows: L=1, M=2, H=3.

Description | Impact | Probability | Risk Measure | Mitigation
The completion of automation scripts depends on the completion of manual testing. | H | H | 9 | As and when manual test case creation and execution is complete, automation can start; there is no need to wait for all the test cases to be complete.
Certain test cases will not be automated due to high complexity, so automation will be incomplete for those areas. | | | | This might not be decided at planning time; as the automation work progresses, such scenarios need to be discussed and addressed.
Requirements will change while the project is in development due to changing regulatory requirements, business changes, etc. | | | | Project Management will review any changes, assess impact, risk and cost, and present these to the steering committee for review. This will include date change discussions.
Testers unfamiliar with new environments. | | | | The PM will provide a point of contact for environments to the QA Lead. If any environmental issues arise, QA will raise them immediately to the PM.
Security access and profile timelines. | | | | The PM has negotiated the dates for security definition.
Testing resources assigned to the project are unable to commit time for testing. | M | M | 4 | Work with testing resources to set up dedicated testing time. Work with the Project Manager to resolve any resourcing issues.
Delay in code delivery. | L | M | 2 | Additional QA effort will need to be identified to overcome the timeline gap. This will happen if the timeline is not adjusted for the delay.
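The [Impact x Probability] arithmetic described above can be sketched in a few lines. This is purely illustrative (Python used only as an example notation); the function and its names are hypothetical helpers, not part of the strategy template.

```python
# Risk measure per the strategy: L=1, M=2, H=3; measure = impact x probability.
RATING = {"L": 1, "M": 2, "H": 3}

def risk_measure(impact: str, probability: str) -> int:
    """Return the 1-9 risk measure for L/M/H impact and probability ratings."""
    return RATING[impact] * RATING[probability]

print(risk_measure("H", "H"))  # 9
print(risk_measure("M", "M"))  # 4
print(risk_measure("L", "M"))  # 2
```

The three calls reproduce the measures shown in the sample risk table rows above.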


3.0

DETERMINING THE STRATEGY


The test strategy is based on the product risk analysis and on the decision of which quality characteristics must be tested at each test level, and to what depth.

3.1 TEST LEVELS

The following test levels will be executed:

3.1.1 System Test

A system test is a test carried out by the test team with the aim of demonstrating that the developed system, or parts of the system, meets the functional specifications and the technical design. (This includes regression testing of existing functionality and interfaces with other systems.) The automation scripts will be created from the test cases and executed end to end to complete the System Testing test cases.

3.1.2 Regression Testing

Regression testing will rerun previously executed scripts to check whether previously fixed faults have re-emerged.

3.2 RISK TABLE

Determine which test phase each characteristic is tested in: System, Performance or Regression. Rate them High, Medium or Low. The manual test cases already cater to these characteristics, but if you want to emphasize test script creation, you can enter the product risk analysis and the decision of which quality characteristics the test scripts must cover per test level, and to what depth. The decision depends entirely on the feasibility of creating automation scripts and the support of the test tool.

Characteristic | System | Performance | Regression
Connectivity | | |
Continuity | | |
Data controllability | | |
Effectivity | | |
Efficiency | | |
Flexibility | | |
Functionality | | |
Suitability of infrastructure | | |
Maintainability | | |
Manageability | | |
Performance | | |
Portability | | |
Reusability | | |
Security | | |
Suitability | | |
Testability | | |
User-friendliness | | |
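The regression approach described in section 3.1.2 — rerunning previously executed scripts and flagging any previously fixed fault that has re-emerged — can be sketched as follows. This is a hypothetical illustration only; `run_script` is a stand-in for invoking the actual automation tool, and the script names are sample data.

```python
def run_script(name: str) -> bool:
    """Placeholder for invoking the automation tool; returns True on pass."""
    # Hypothetical result: the 'login_check' script fails on this run.
    return name != "login_check"

# Scripts that passed in a previous cycle; a failure now is a regression.
previously_passed = ["login_check", "quote_entry", "report_export"]

regressions = [s for s in previously_passed if not run_script(s)]
print("re-emerged faults:", regressions)  # ['login_check']
```

A real runner would read the script list from the test management tool and log any regression as a defect.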

4.0

HIGH LEVEL MILESTONES


4.1 HIGH LEVEL MILESTONE ESTIMATED DATES

Below are the high-level milestones as currently defined. For more detailed schedule and milestone information, reference the <Project Acronym in Properties "Project"> project strategy on the project share drive.

Activity | Estimated Start Date | Estimated End Date
Create Test Automation Strategy | |
Create Automation Framework | |
Identifying the reusable components and preparation | |
Creating the OR (Object Repository) | |
Test Data Preparation | |
Script Creation | |
Review | |
Rework | |
Individual Execution | |
Batch Execution | |
Report Analysis | |
Downtime/Issue Log | |
Documentation | |


5.0

ENTRANCE / EXIT CRITERIA

System Test (performed by the Automation Tester role)

Criteria | Description
Entry Criteria | Coding is complete and unit/integration testing has been reviewed and accepted by the test organization. No outstanding critical defects are open. The defined System Test environment is prepared and stable. System Test scripts have been written and reviewed for coverage and completeness. System Test data is available for test execution. The Change Request and Version Control process is in place.
Exit Criteria | Agreed-upon coverage ratios have been met and all scripts have been executed. The application has been sufficiently tested to meet requirements (by execution of scripts). No outstanding critical defects remain open.

Regression Test (performed by the Automation Tester role)

Criteria | Description
Entry Criteria | Approval to deploy the application has been received from the business. The application has been deployed to production.
Exit Criteria | All test scripts have been executed. No outstanding critical defects remain open.
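The entry criteria above are effectively a checklist: the gate passes only when every criterion is met. A minimal sketch, with hypothetical sample values (Python used purely for illustration):

```python
# Hypothetical snapshot of the System Test entry criteria from the table above.
entry_criteria = {
    "coding complete, unit/integration testing accepted": True,
    "no outstanding critical defects open": True,
    "system test environment prepared and stable": True,
    "test scripts written and reviewed": True,
    "test data available": False,
    "change request / version control process in place": True,
}

ready = all(entry_criteria.values())
blocked = [name for name, met in entry_criteria.items() if not met]
print("enter system test:", ready)  # False
print("blocked by:", blocked)       # ['test data available']
```

In this sample, one unmet criterion is enough to hold the phase at the gate.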

6.0

SUSPENSION CRITERIA
This section identifies and addresses any situation in which the automation test script creation effort might have to be temporarily suspended:

6.1 Test environment becomes unavailable.

6.2 Test cases are blocked by a documented defect.


7.0

DEFINE THE TEST PRODUCTS


Define the artifacts that will be produced for this project. Examples are included below.

7.1 TEST STRATEGY
- System Test Plan
- Regression Test Plan

7.2 TEST SCRIPTS
Test scripts, test data, documentation and other .xls and .vbs files attached to the scripts are housed in Quality Center: Domain= Project=

7.3 TEST DATA
Documentation of the data to be used for execution of test scripts, both in QC and in the .xls data sheet.

7.4 PROGRESS REPORTS
Test execution status reports: weekly reports of test execution status and application quality.

7.5 TEST EVALUATION REPORTS
- Issues after Release Advice
- Gives insight into how the test script creation and execution process has gone
- Provides results from the lessons-learned session


8.0

DEFINE THE ORGANIZATION


Define the roles and responsibilities that will be utilized on this project. If you will not be using a specific role, indicate "Not Applicable".

8.1 ROLES & RESPONSIBILITIES

Role | Responsibilities | Assigned To
QA Lead / Automation Assignment Manager | Prepare the Test Automation Strategy. Schedule and facilitate automation-test-related meetings. Provide oversight of the automation-related defect management process. Assist in mentoring and training of project testers when appropriate. Provide testing status updates throughout the automation testing phase. Provide resource estimates. |
Testers / Test Automation Developer (as per the RUP template) | Create test scripts. Execute test scripts within the timeframe allowed. Log defects and retest as necessary. Communicate status of testing to the QA Lead. |
Assignment Manager | Partner with the QA Lead to ensure successful execution and completion of automation-testing-related activities on business unit projects. |
Business Owner/SME | Work with the Business Analyst to clarify requirements. Collaborate with QA to provide clarification during test script creation, to ensure proper coverage of all business requirements and functional specifications that are already covered by the test cases. Help identify new areas for automation testing, and define validation responsibilities. |
Project Manager | Ensure the project scope is finalized before the team begins test script creation and execution. Ensure project timelines adequately support the time needed for test preparation and automation script execution. Assist the QA Lead, as a resource, in identifying the reporting needs specific to the project. |
Test Environment Support | Migrate code. Resolve test environment issues. Configure the test environment. Participate in Automation Testing Status Meetings as needed. |

8.2

TRAINING NEEDS Describe what training, if any, is needed for the test team to come up to speed. Examples may include automation tool training, Quality Center training or application knowledge.


9.0

COMMUNICATION PLAN
9.1 MEETINGS AND REPORTS

Describe how often and in what way the automation test team will report within and outside the team. Make sure this is in accordance with the customer's wishes. Describe how often and with whom meetings shall take place. This section should address all forms of communication (meetings, reporting, etc.) and list exceptions to the standards.

Event | Objective | Method (Report/Meeting) | Chair | Attendees / Audience | Frequency
Testing Kickoff | Review the timeline and introduce the Test Automation Strategy document to QA resources. | | QA Lead | Project Manager, QA Lead, Automation developer, Technical Lead, Project Sponsor | Once
Test Status Review | Review the general health of the automation testing initiative; operational issues. | | QA Lead | Test team, QA Lead | Weekly
General Project Meeting(s) | Describe any general project meetings, known at the time of the Test Strategy, that members of the test team will be attending. | | Project Manager | Technical Lead, Business Analyst, Project Sponsor, QA Lead | Weekly
Weekly Status | Show the status of automation script creation and execution; release status. | Report | QA Lead | Project Manager, QA Lead, Project Sponsor | Weekly
Defect Review | Review the status of open defects; identify any new issues. | | QA Lead | QA Lead, Project Manager, Technical Lead | Weekly or as needed
Lessons Learned | Discuss project results and identify lessons learned for future improvement. | | Project Manager | QA Lead, Project Manager, Technical Lead, Automation developer, Business Analyst | Once


9.2 TELECONFERENCE INFORMATION

Update items in red with your specific information.

US & Canada Dial-In Number: XXX-XXX-XXXX
International Dial-In Number: XXX-XXX-XXXX
Conference Code: XXX-XXX-XXXX#
Leader: enter *, then PIN: XXXX#, then 1
The system will ask you to state your name; you may do so, or leave it blank by pressing #.
The system will sound a tone, then connect the call.

9.3 WEB CONFERENCE INFORMATION

Update items in red with your specific information. Sign in by clicking the URL and keying in all the information below:

Meeting URL: https://www.livemeeting.com/cc/principal/join
Meeting ID: XXXXXX
Presenter Key: XXXXXX
Attendee Key: XXXXXXXX

9.4 PROJECT DISTRIBUTION LISTS

Include email distribution lists built for this project.

9.5 OUTLOOK PUBLIC FOLDERS

The following public folders will be used by this project for testing: Add Outlook path here

9.6 DOCUMENT STORAGE

Indicate where your documentation is stored and provide the path or a link.
SharePoint
Project folder

9.7 QUERY LOG

If your project is not utilizing PGS resources, the Query Log is Not Applicable. The Query Log within SharePoint will be utilized to communicate general questions and answers regarding terminology, resources and knowledge gaps encountered while creating automation test scripts.


10.0 DEFINE THE INFRASTRUCTURE


10.1 APPLICATION SETUP IN SYSTEM TEST ENVIRONMENT The system test environment needs to mimic the production environment. The environment, including the software and data, must be a controlled environment. Development will be delivering new builds to the System environment on an agreed upon schedule. When each new build is delivered to QA, the following application setup needs to be completed: Provide additional information as needed for your project.

10.2 TEST USERS SETUP

Specify any special user setup. Consider the needs for test accounts with various access levels.

10.3 SYSTEM SETUP

The following applications must be controlled and available in the System environment to support the System Testing effort. (This should include any hardware setup that needs to be done for the automation tool to run smoothly.)

System | Included in test? (Yes/No) | Notes
| | Note special data requirements, etc.


11.0 ORGANIZE TEST MANAGEMENT


Describe the manner in which the management of the test process, infrastructure, test products and defects is organized.

11.1 TEST PROCESS MANAGEMENT

This should be in line with manual test process management. The defects detected by executing the automation scripts should be assigned and tracked. Some in-process, production-readiness metrics can be calculated from the data gathered in the defect management tool and the test management tool. Some of these real-time metrics help predict whether the quality of the software will be acceptable by the scheduled release; others provide status on the completeness of the testing phases, which can be compared to the project plan. Typically, these metrics are calculated and reviewed as part of project status.

Note: In the metrics below, defect counts include defects only; they do not include any enhancements that may be logged. Each test level may report the metrics for that level in its weekly status.

Metric | Indicates/Reflects on
# open defects by severity | Status of the software toward production-ready. The list of open defects is used to evaluate the current status, severity and priority of open defects.
# new defects found by week | Whether the software is approaching production-ready. This is viewed graphically as a time-series chart. If the defect arrival rate does not quickly decrease, team leaders and management should promptly assess the situation.
# new defects found by week, by severity | Whether the software is approaching production-ready. This is viewed graphically as a time-series chart. Critical and major defects should decrease very quickly.
# defects resolved by week | Whether the software is approaching production-ready. This is viewed graphically as a time-series chart. If the defect resolution rate is not greater than the number of new defects found by week, team leaders and management should promptly assess the situation.
# test cases by execution status (pass/fail/no run); % test cases executed | Whether the software is converging on production-ready. If the number of test scripts passed per week does not quickly increase, team leaders and management should promptly assess the situation.
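The metrics above can be computed directly from exported defect records. A minimal sketch (Python used purely for illustration; the record layout is a hypothetical stand-in for a defect-management-tool export, not a real Quality Center schema):

```python
from collections import Counter
from datetime import date

# Hypothetical defect records; a real project would pull these from the
# defect management tool (e.g. a Quality Center export).
defects = [
    {"severity": "Critical", "status": "Open",   "found": date(2009, 10, 5)},
    {"severity": "Major",    "status": "Open",   "found": date(2009, 10, 6)},
    {"severity": "Minor",    "status": "Closed", "found": date(2009, 10, 12)},
]

# Metric: # open defects by severity
open_by_severity = Counter(d["severity"] for d in defects if d["status"] == "Open")

# Metric: # new defects found by week (keyed by ISO week number)
new_by_week = Counter(d["found"].isocalendar()[1] for d in defects)

print(open_by_severity)  # Counter({'Critical': 1, 'Major': 1})
print(new_by_week)
```

The same grouping pattern extends to the by-severity-per-week and resolved-per-week metrics by changing the key and the filter.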

11.2 TEST CASE COVERAGE RATIOS

80% is the recommended overall test case coverage ratio for test automation. This may be adjusted up or down based on project risks, priorities and impacts. The test coverage ratio is the comparison of the total number of test cases automated to the total number of test cases written.

11.3 INFRASTRUCTURE MANAGEMENT

11.3.1 Meetings

The Automation Test Manager and QA Lead will meet at an agreed-upon scheduled time to discuss what more needs to be automated and the feasibility of automating it.
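The coverage ratio defined in section 11.2 is a simple quotient of test cases automated over test cases written. A minimal sketch (Python for illustration; the counts are hypothetical sample data):

```python
def coverage_ratio(automated: int, written: int) -> float:
    """Test case coverage ratio: automated test cases / written test cases."""
    if written == 0:
        raise ValueError("no test cases written")
    return automated / written

# Hypothetical counts; 80% is the recommended overall ratio per section 11.2.
ratio = coverage_ratio(120, 150)
print(f"{ratio:.0%}")  # 80%
```

A project would compare this value against the 80% recommendation, adjusted for its own risks and priorities.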

11.3.2 Environment, Tools and Workplaces

Describe any additional items that are specific to your project. Examples are provided below.

- Test automation validation will occur in TextPad and/or Excel as needed.
- Test Environment Location: path/logical location information needed to access the Test Environment.
- Security required for access to the Test Environment.
- Script creation and execution will occur at the automation testers' desks.

11.4 DEFECT MANAGEMENT

Defects will be tracked within the following Quality Center Domain & Project: <QC Domain> <QC Project>. Justify any deviations from this process.
