
Quality Assurance Process

Revision 2008

CONFIDENTIALITY NOTE: THIS DOCUMENT, INCLUDING ANY AND ALL ATTACHMENTS, CONTAINS CONFIDENTIAL INFORMATION INTENDED ONLY FOR THE USE
OF TG. IF THE RECIPIENT OF THIS MESSAGE IS NOT AN AUTHORIZED PARTY OR AN EMPLOYEE OF AN AUTHORIZED PARTY, YOU ARE HEREBY NOTIFIED THAT
READING, DISCLOSING OR EXPLOITING THIS DOCUMENT IS STRICTLY PROHIBITED. IF YOU HAVE RECEIVED THIS DOCUMENT IN ERROR, PLEASE IMMEDIATELY
RETURN IT TO THE SENDER AND DELETE IT FROM ANY DOCUMENT STORAGE SYSTEM.

TG 1 Company Confidential 1
A printed copy of this document is considered uncontrolled. Refer to the online version for the controlled revision.
Project/Documentation Information

Contact Information

QA Lead: Trinity Sheil Phone: X4931

Document History

Revision Name Date Comments


1.0 Trinity Sheil 10/10/2007 Initial Draft
1.1 Trinity Sheil 02/26/2008 Update custom information
1.2 Trinity Sheil 03/03/2008 Change Title

Table of Contents

1. Introduction.............................................................................. 5
2. Reference Documents...............................................................6
3. Scope.......................................................................................7
4. Activities & Deliverables............................................................ 8
4.1. Requirements Analysis & Traceability...................................8
4.2. Risk Identification & Mitigation............................................8
4.3. Project Milestones...............................................................9
4.4. Records Collections and Retention........................................9
4.5. Test Case Design.................................................................9
4.6. Test Case Review ..............................................................10
5. Testing...................................................................................11
5.1. External Testing (UAT).......................................................11
5.2. Test Environments.............................................................12
5.3. Test Types.........................................................................12
5.4. Test Results......................................................................13
5.5. Test Tools.........................................................................14
5.6. Mercury Quality Center .....................................................14
5.7. Defect Reporting & Problem Resolution..............................14
5.8. Severity Definitions...........................................................15
5.9. Defect Triage ....................................................................17
5.10. Metrics............................................................................19
6. Moves to the “Q” Environments ..............................................20
7. Release Management Process – Certification ............................22
7.1. Release Management Diagram...........................................22
7.2. Certification Build Process .................................................22
7.3. Entrance & Exit Criteria......................................................22
7.4. Assumption and Risks........................................................25

Appendix I – Glossary.................................................................26

1. Introduction
The goal of the Enterprise Project Management Office (EPMO) Quality Assurance (QA) team is
to provide the oversight necessary to ensure that all software development projects for TG are
developed and delivered using sound software engineering practices.

This includes development for:


(1) Application Support of Platform Convergence, maintenance and repair,
(2) Production, and
(3) Next Generation projects.

Application Support includes, at a minimum, testing of fixes, modifications, and the addition of
minor new functionality to existing TG applications using commercial off-the-shelf (COTS)
products and any integration with third-party tools or in-house development efforts.

Production Support includes, at a minimum, adding new functionality to the existing code base
and uses the existing infrastructure comprised of the architecture of databases, applications,
devices, and connectivity.

Next Generation Support includes, at a minimum, two types of support. It includes the testing
of new features and functionality such as network monitoring and database reporting. Some or
all of these require new additions to the underlying infrastructure.

The purpose of this document is to identify the standards and processes to be utilized by QA,
including the methodology, verification, and validation for application testing.

2. Reference Documents

Document Name                SharePoint Location

Triage Process               G:\EPMO_Design & Testing Services\EPMO_QA_Team\STANDARDS\Defect Triage Process.doc
Release Management Process   To Be Added
Defect Management Process
Test Case Writing Document
Issue Management             To Be Added
Change Control Process       To Be Added

3. Scope
This document defines the activities and processes employed by QA in support of
development projects and maintenance initiatives as categorized below:
• Verification and validation support of software project initiatives and maintenance releases
• QA team management
• Support of process initiatives to enhance existing processes as deemed necessary

4. Activities & Deliverables
4.1. Requirements Analysis & Traceability
Typically traceability is performed on the following documents:

• Product Requirements,
• Functional Specification,
• Test Plans and Test Cases
However, this list will be defined by the EPMO on a per-project basis.

QA will review each of these documents for completeness, correctness, traceability, consistency,
applicability, and testability. QA also verifies there is traceability between the documents.

Traceability verifies product requirements are matched with functional requirements and that
these are in turn matched with Test Cases. In this way QA can ensure complete traceability in
both forward and backward directions.
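
The forward and backward traces described above amount to two set checks. The sketch below is illustrative only; the mappings and requirement/test IDs are hypothetical, and at TG this data lives in Quality Center, not in code:

```python
# Hypothetical traceability data: product requirement -> functional
# requirements, and functional requirement -> test cases.
product_to_functional = {
    "PR-1": ["FR-1", "FR-2"],
    "PR-2": ["FR-3"],
}
functional_to_tests = {
    "FR-1": ["TC-101"],
    "FR-2": ["TC-102", "TC-103"],
    "FR-3": [],  # traceability gap: no test case yet
}

def untested_functional_requirements(func_to_tests):
    """Forward trace: functional requirements with no test case."""
    return sorted(fr for fr, tcs in func_to_tests.items() if not tcs)

def orphan_functional_requirements(prod_to_func, func_to_tests):
    """Backward trace: functional requirements tied to no product requirement."""
    covered = {fr for frs in prod_to_func.values() for fr in frs}
    return sorted(set(func_to_tests) - covered)

print(untested_functional_requirements(functional_to_tests))  # ['FR-3']
print(orphan_functional_requirements(product_to_functional, functional_to_tests))  # []
```

Complete traceability in both directions means both checks come back empty.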

4.2. Risk Identification & Mitigation


QA will identify and submit project and process risks to the Project Manager (PM) for each
project. Although the specific list for each project will be defined by the PM, risks typically
include the following:
• Description of the Risk
• Likelihood of occurrence factor
• Ability to detect factor
• Impact of occurrence factor

4.3. Project Milestones
QA testing milestones and activities are incorporated into the overall Project schedule.
Milestone status is reported as the project progresses. Specific milestones for QA testing include:

Milestone/Activity     Deliverable
Project Start          QA Process Document; Test Strategy
Execution Phase        Requirements Traceability; Review or Create Test Plan; Performance Test Plan (if necessary); Test Cases
Test Review            Internal Test Results Audit; UAT Test Plan/Test Cases (if necessary)
Defect Review          Test Results / Defect Count
Certification Review   Test Certification – Production Readiness
Production – Go Live   Dry run in Production
Post Mortem            Participate in Review

4.4. Records Collections and Retention


All applicable documents are stored on the share in the Project folder:
G:\Corporate Projects

4.5. Test Case Design


Test Case Design Reviews will be held for all new test case creation. Reviews should confirm
that each test case is appropriately designed based on a Requirements Document, Wire Frame,
HTML mock-up, or Functional Document. Discussions are held between QA and the Business
Analysts (BAs) to address questions and/or concerns and to identify issues and risks. Successful
completion of the design review results in approved test cases.

Test Case Samples


G:\EPMO_Design & Testing Services\EPMO_QA_Team\STANDARDS\Test Case
Creation_08.doc

4.6. Test Case Review
Completed test cases are marked ready for review in QC by the tester. The test cases will be
reviewed and approved by the QA Lead on that project. Any issues or comments will be documented
and tracked via QC. The tester repairs the test cases and resubmits them for review. Upon
approval, the test cases are marked ready for execution in QC. Evaluation criteria will include
completeness, traceability, testability, and technical assessment. Supporting documents that are
base-lined are added to QC as project artifacts.

5. Testing
Testing is an integral part of the product development process. A test strategy is created for each
project that is tested by QA. This will cover testing from the lowest practical level up to the user
and customer level. QA will validate that all tests have been executed successfully prior to
entering certification testing. QA will be a contributor to and approver of test strategies, test
plans, and automated and/or manual test cases for integration, system, performance/stress, and
regression testing.

The following criteria will be used by QA to review the test plans and test cases:

• High Level Requirements traceability
• Functional Requirements traceability
• Test case traceability
• Test scope and coverage – appropriate level of testing for features or aspects of the
application(s)
• Integrity
• Correctness of assumptions or pre-conditions
• Accuracy of expected results of test cases

This data will be collected by QA.

5.1. External Testing (UAT)


UAT will be conducted to ensure the features to be released meet the user and business needs.
All product releases will require participation and approval from internal business user(s) prior to
deployment.

End users will define and execute user acceptance testing (UAT) with a defined set of business
users. Upon completion of UAT, QA will audit the test results to ensure success and readiness
for promotion to production.

This external testing will be conducted for all major releases as a final test gate to Production.

5.2. Test Environments
A controlled test environment will be used for all QA validation activities. Only QA and the
environments team will have access permissions to test systems and data unless explicitly
approved by QA for a defined, limited time-period. Any maintenance or changes to the QA test
environment shall be coordinated between QA and the environments team. In addition, QA may
randomly request login changes if the integrity of the environment is deemed at risk.

Verification and validation of the build process, and the environment setup is performed in the
test environment. It is this environment where both internal and external (UAT) testing takes
place.

Typically the test environment will be a mirror of the target production configuration,
architecture and performance and contain simulated and actual elements including 3rd-party
applications, proprietary code, servers, and networking devices. Exceptions to the test
environment and the required test data will be determined on a project-basis and defined in the
test strategy document during the Execution phase. The exception to this environment being
mirrored to production will be the advancement of a certification environment.

Issues found during QA testing and UAT will be logged and documented by QA, Development,
and the UAT tester, and stored in Quality Center (QC).

5.3. Test Types

The following types of testing will be performed by QA depending on the project:
• Functional Test
• Smoke Test
• System Test
• Regression Test
• Performance, Load and Stress Test
• User Acceptance Test
• Automated Test

5.3.1. Automated Test Architecture


Automated tests are created using the automated test architecture guide;
this ensures that all tests are created using common design principles,
reusable actions, and global repositories.

The guide can be found here:


G:\EPMO_Design & Testing Services\EPMO_QA_Team\PROCESS\QC
Automation\QC Automation Architecture.doc

5.4. Test Results
The QA team will provide test results for all automated and manual results conducted during
testing. Results contain, at minimum, the following information:

Automated
• Name of test suite
• Name of each test included in test suite
• Day, month, year and time of test run
• Results of test suite specifying total number of tests passed, failed, and not-run
at this time
• Length of run either of individual test or test suite
• Build version tested
• If a Change Request (CR) is created due to a failed test case, the number of
the test case will be included in the CR

Manual
• Name of Development tester
• Date of test
• Description of functionality being tested. If these do not match the
descriptions in this document, a matrix will be provided with the test results
• Results of each test
• Build version tested
• If a CR is created based on a test case, the test case number will be included
• The results of running each test including error message(s) when appropriate

Any issues found in the test results will be documented by QA in the defect tracking tool.
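
As a sketch of how the automated result fields above roll up into a suite summary, the fragment below counts passed/failed/not-run tests and totals run length. The record layout and field names are hypothetical, not a Quality Center export format:

```python
from collections import Counter
from datetime import datetime

# Hypothetical per-test records for one suite run.
results = [
    {"suite": "Login", "test": "valid_user", "status": "passed", "seconds": 4.2},
    {"suite": "Login", "test": "bad_password", "status": "passed", "seconds": 3.1},
    {"suite": "Login", "test": "lockout", "status": "failed", "seconds": 6.0},
    {"suite": "Login", "test": "sso", "status": "not-run", "seconds": 0.0},
]

def summarize(suite_results, build_version):
    """Roll one suite's records into the summary fields listed above."""
    counts = Counter(r["status"] for r in suite_results)
    return {
        "suite": suite_results[0]["suite"],
        "run_at": datetime.now().isoformat(timespec="seconds"),
        "passed": counts["passed"],
        "failed": counts["failed"],
        "not_run": counts["not-run"],
        "total_seconds": sum(r["seconds"] for r in suite_results),
        "build": build_version,
    }

summary = summarize(results, build_version="2.3.1")
print(summary["passed"], summary["failed"], summary["not_run"])  # 2 1 1
```

A failed record would additionally carry the test case number so it can be quoted in the resulting CR.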

5.5. Test Tools
Test tools will be utilized wherever possible to improve testing efficiency and effectiveness.
Where appropriate, automated tools will be utilized to help support and augment system testing.

The Test Tool Table below lists the specific test tools to be used, dependent upon availability,
for automation/execution and version control of all test documents and scripts.

Tool                          Purpose               Test Activity/Deliverable

Mercury Quality Center (QC)   Repository            Requirements traceability
Mercury Quality Center (QC)   Test Case Creation    System Test Cases
Mercury Quality Center (QC)   Test Case Execution   System Test Case Execution
Mercury Quality Center (QC)   Defect Tracking       Tracking of Defects
LoadRunner                    Performance Testing   Load/Stress Testing
Quick Test Pro (QTP)          Automation            Automation of System Test Cases

5.6. Mercury Quality Center


Mercury Quality Center is administered by the QA team. All projects are
managed using this tool. To gain access to the tool and the project that
you are working on, request access from alan.hampton@tgslc.org.

Classes are provided for those not familiar with QC.

5.7. Defect Reporting & Problem Resolution


The defect tracking tool, Mercury Quality Center, will be used for defect collection, tracking,
closure, and reporting. Defects found in the integration test environment will be documented by
QA. Defects found during Unit Test in the Development environment will be logged by
Development. UAT defects will be logged by the RAs and UAT testers in the appropriate
environment.

Defects will be classified by Priority and Severity. Severity is the impact of the defect on system
functionality or performance. Priority is the business team's need to fix the defect prior to
shipment.

QA will participate in managing, prioritizing, and resolving defects. A report of unresolved
issues will be generated upon the completion of Integration, QA, and User Acceptance testing
and shall be reviewed and prioritized by the Triage Team.
All defects found shall be recorded including the following information:
1) Steps to reproduce including authentication & URL information
2) Expected results
3) Actual Results
4) Assign Severity according to the definitions below

Defect Process for QC is here:


G:\EPMO_Design & Testing Services\EPMO_QA_Team\STANDARDS\QC Defect
Status.vsd

5.8. Severity Definitions


All defects shall be assigned one of the following severities. When conflicts arise in assigning
severity to defects, QA will contact business representatives (Service Delivery, Product
Marketing & Sales) for resolution.

• Severity 1- A crash, security breach, 500 or 404 web errors, missing functionality, severe
usability issues causing misuse of tool, etc.
• Severity 2- Severe bug with no work around, bad or wrong data, bug that won’t allow
testing to proceed and/or blocks functionality

• Severity 3- General bugs, that can be worked around or are not impeding functionality or
testing
• Severity 4- Cosmetic issues, very minor bugs
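
The severity definitions above can be encoded as a first-pass default for illustration; the defect field names here are hypothetical, and the real assignment involves tester judgment (and, for conflicts, the business representatives):

```python
def default_severity(defect):
    """Return 1-4 following the severity definitions above.

    `defect` is a hypothetical dict of boolean/numeric attributes, not an
    actual Quality Center record.
    """
    # Severity 1: crash, security breach, 500/404 web errors, missing functionality
    if (defect.get("crash") or defect.get("security_breach")
            or defect.get("http_error") in (500, 404)
            or defect.get("missing_functionality")):
        return 1
    # Severity 2: severe bug with no workaround, bad data, or blocks testing
    if defect.get("blocks_testing") or defect.get("bad_data"):
        return 2
    # Severity 4: cosmetic issues, very minor bugs
    if defect.get("cosmetic"):
        return 4
    # Severity 3: general bug with a workaround, not impeding testing
    return 3

print(default_severity({"http_error": 500}))       # 1
print(default_severity({"blocks_testing": True}))  # 2
print(default_severity({}))                        # 3
print(default_severity({"cosmetic": True}))        # 4
```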

5.9. Defect Triage
Defects logged are triaged for priority by the triage team using the following
process:

Bug Triage – Severity 1-4

The triage process comprises the following phases.


* Required Information
1. Defect is opened and has a severity*.
2. All defects must have a value in the Severity* field when created.
3. The defect triage team may update the Severity* and Priority field values as necessary for the
tickets selected for evaluation.
4. The creator of a new defect will assign the value of ‘New’ to the ticket Status field.
5. The creator of the new defect will assign the new ticket considering the following criteria:
a. If the ticket can be clearly assigned to a feature-set team: assign the ticket to the lead
software engineer, keeping the ‘New’ status
b. If the ticket clearly needs to be assigned to the Development group, but it is not certain
to which feature-set team, or the problem is global in nature: assign the defect to one of
the two Lead Application Architects, keeping the ‘New’ status
c. If there is a problem identifying the underlying issue: assign to the defect triage team,
keeping the ‘New’ status
6. All New tickets are assigned to a Responsible Party (Development, P&D, QA).
7. The defect triage team will analyze defects assigned or reassigned to them, either through
direct assignment or by scanning defects selected by the team for analysis.
8. The Lead Architects or the Feature-set Leads may assign tickets to the defect triage team
when clarifications are required before development can continue.
9. If incident is a Severity 1:
a. Email notification is provided to Triage Team to ensure that the Defect is on the
radar.
b. Defect is assigned to Dev
c. Dev completes repair
d. QA retests the defect
e. Defect is closed or re-opened
10. If incident is a Severity 2 or 3:
a. Email notification is provided to Triage Team to ensure that the Defect is on the
radar.
b. Defect is assigned to Dev
c. Dev completes repair

d. QA retests the defect
e. Defect is closed or re-opened
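
The assignment rule in step 5 above can be sketched as a small routing function; the team and role names here are placeholders, not actual TG assignments:

```python
def route_new_ticket(feature_team=None, clearly_development=False):
    """Return (assignee, status) following steps 5a-5c of the triage process."""
    if feature_team:
        # 5a: ticket clearly belongs to a feature-set team
        assignee = f"lead engineer, {feature_team}"
    elif clearly_development:
        # 5b: Development ticket, but the team is unclear or the problem is global
        assignee = "Lead Application Architect"
    else:
        # 5c: the underlying problem itself is unclear
        assignee = "defect triage team"
    return assignee, "New"  # new tickets always keep the 'New' status

print(route_new_ticket(feature_team="billing"))
print(route_new_ticket(clearly_development=True))
print(route_new_ticket())
```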

5.10. Metrics
Metrics will be collected on a per project basis. Metrics will be used for assessing the
effectiveness of the test planning, test execution processes, defect discovery, defect impacts, and
defect repair processes.

Reporting of test metrics will occur in real-time during the test phase or at the end of project, as
deemed appropriate. This data will be collected by QA and will include:
• Test coverage factors (dependent on available test tools)
• Test Execution Rates
• Defect discovery and fix rates by severity
• Quantity of defects by status
• Defect aging by severity
• Total Defects Open & Closed
• Success rate of releases per project
• Defect removal efficiency
• Requirements leakage percentage

Metrics will be captured from the QC Dashboard and stored in each project folder on the share.
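
One of the listed metrics, defect removal efficiency, is conventionally computed as the share of defects caught before release. A minimal sketch of that convention (not a TG-specific formula):

```python
def defect_removal_efficiency(found_in_test, found_in_production):
    """DRE = pre-release defects / total defects, as a percentage."""
    total = found_in_test + found_in_production
    return 100.0 * found_in_test / total if total else 100.0

# Example: 47 defects caught in test, 3 leaked to production.
print(defect_removal_efficiency(47, 3))  # 94.0
```

Requirements leakage percentage would be measured analogously, counting defects traceable to requirements gaps rather than code.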

6. Moves to the “Q” Environments
All code moving into any QA environment (environments starting with “Q”) must follow
the flow below.

o The flow puts all changes to an environment in the hands of a requestor
(Developer, Service Center Ticket (SCT), Work Item Request (WIR), Tester, or
Project Manager).
▪ This ensures that all versions, including back-migrated code, are available.
▪ This ensures that projects scheduled for the same delivery date are
identified.
▪ This ensures that WIRs or SCTs are not duplicated by an ongoing
project effort.
▪ This ensures that warranty work is accounted for and will be integrated
with the correct code base.

o Each change to an environment must be emailed to Don’s Group.

o Each IDMS change must be mailed to the Database group

o Don’s Group works daily with QA to ensure all environments are up to date with the
correct information – Don’s Group coordinates the effort to get the change into the
“Q” Environment at the right time and at the right version.

o If an attempt is made to place the wrong version into “Q”, the move is rejected and
the requestor is notified (by Environments or the DBAs) to make modifications to
the request.

o Once the move is approved, the Environments Team or the DBA team makes the
changes to the “Q” environment

o All parties are notified and testing can resume

QA will provide support to any requestor attempting to make moves into the “Q”
environments but will not fill out move sheets or ensure the validity of the
code prior to moving into testing.

Project Managers are asked to include a line item for environmental changes in their
project plans, since swapping of environments is not uncommon and requires some
down time.

7. Release Management Process – Certification
The Release Management Process ensures the implementation meets user expectations prior to
deployment into production.

7.1. Release Management Diagram


Being Designed

7.2. Certification Build Process


Upon successful completion of the entry criteria, XXX will check out the new/updated code
and execute a build in the Pre-prod environment. This build is validated by the XXX
Team using a pre-defined set of validation steps provided in the deployment document
supplied by the development team.

If the build is successful, the next phase of testing may begin. If a build fails, QA will notify
all teams of the failed units. The build will not enter the Production environment unless all
issues found during the build process have been resolved and the code retagged.

7.3. Entrance & Exit Criteria


Below are tables of entry and exit criteria per the source and target environment(s). All criteria
must be met prior to proceeding to the next phase or environment. Any deviations from the
criteria must be approved by the executive change control board.

Entry Criteria to Enter “Q” Testing Environment


Criteria Owner
Functional Specifications complete and approved BA
Design Specification Documents complete and approved BA/Dev
Code is functionally complete Dev
Code checked-in & Build available Dev
Test Strategy complete & approved QA
Test Cases complete QA
Successful completion of QA tests w/documented test results QA
Successful Audit of test plan/cases and results: QA
Code Freeze: Label is applied to tested code Dev
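
The gate rule above ("all criteria must be met prior to proceeding") is an all-of check; a minimal sketch, using abbreviated, illustrative criterion names rather than the exact table wording:

```python
# Hypothetical snapshot of the "Q" entry criteria; each value records
# whether the owning team has signed off.
entry_criteria = {
    "functional specs approved": True,
    "code functionally complete": True,
    "test strategy approved": True,
    "code freeze label applied": False,
}

def may_proceed(criteria):
    """The gate passes only when every criterion is met."""
    return all(criteria.values())

def unmet(criteria):
    """List the criteria still blocking the move, for the deviation request."""
    return sorted(name for name, met in criteria.items() if not met)

print(may_proceed(entry_criteria))  # False
print(unmet(entry_criteria))        # ['code freeze label applied']
```

Any deviation from an unmet criterion would still require executive change control board approval, per the paragraph above.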

Entry Criteria to Enter Pre-Prod Testing Environment
Criteria                                                                                Owner
Successful completion of Integration Test Entry Criteria                                QA
Build to Test environment is completed with no errors                                   Support
Smoke tests of QA build are executed with no errors                                     QA
Acceptance sub-set of automated and manual tests are executed successfully in QA        QA
All Sev-1 and Sev-2 defects have root cause identified                                  QA & DEV
All Sev-1 and Sev-2 defects are resolved or an action plan is in place                  QA/BA
All known issues and open defects are documented in Release Notes. Any manual
configurations needed after implementation into production are specified here.          QA & DEV
Internal test results documented and distributed                                        QA
External (UAT) tests defined                                                            Business
Test Readiness Review completed successfully                                            QA

Entry Criteria to Enter Production

Criteria                                                                                Owner
Build to Certification Test environment is completed with no errors                     IT
Smoke tests of QA build execute with no errors                                          QA
UAT cases are executed successfully                                                     QA/Business
All Sev-1 and Sev-2 defects have root cause identified                                  QA/DEV
All defects with a high priority have been resolved or an action plan is in place       QA/BA
Test results documented and distributed                                                 QA
All appropriate issues and open key defects are documented in the Release Notes.
Any manual configurations needed after implementation into production are
specified here.                                                                         DEV
Production Readiness Review Completed Successfully                                      QA

7.4. Assumptions and Risks
On-time completion of all projects is contingent upon the following assumptions and associated
risks:

Assumption: Test environment(s) for Internal/External testing are purchased and correctly configured.
Risk: Unavailability of environment(s) will block testing, preventing timely deployment to production.

Assumption: Requirements & Design documents will be completed and approved per the project plan.
Risk: Modifications to deliverables and/or their due dates will impact QA's ability to complete test milestones/activities.

Assumption: Necessary resources (people & tools) will be available upon completion of development.
Risk: Resource constraints will extend the timeframe for test and QA milestones/activities.

Assumption: Build deployments are consistent.
Risk: Inconsistent build deployments will delay the test effort and push the delivery date.

Assumption: All Internal & External testing must be complete.
Risk: Planning of parallel testing in the same environment by both QA and UAT testers will cause schedule slippage.

Appendix I – Glossary
This list represents terms used at TG:

Automated Test(s)
The term automated test, in its most generic use, implies a test that has been written in
some computer or scripting language to programmatically perform some number of
testing steps; it does not carry any additional information about scope, function, or
environment. To carry more meaning, the scope and the environment are added: Automated
Integration Tests in the Test Environment, Automated Regression Tests in Production.
Accessibility Testing
Accessibility is a general term used to describe the degree to which a product
(e.g., device, service, environment) is accessible by as many people as possible.

o Americans with Disabilities Act of 1990


o Section 508 Amendment to the Rehabilitation Act of 1973

Manual Test(s)
A test with no automated steps; every step must be performed by hand.
Tag
When collecting files in preparation for deployment a build label is used to group
them together. It contains an enumerated value used as a release designator, as
well as file names, versions, and file locations in VSS.
Change Request (CR)
Change Requests are defects that have been found at any time up to and ending
with the deployment into production. CRs are numbered sequentially, are issued a
severity, given a description, assigned to a responsible party, have steps to
reproduce, and move through various statuses such as: new, open, fixed, and
verified fixed.
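
The CR lifecycle named in this entry ("new, open, fixed, and verified fixed") can be sketched as a simple state machine. The reopen transition is an assumption, inferred from the "closed or re-opened" step in the defect triage process earlier in this document:

```python
# Allowed CR status transitions; the fixed -> open reopen path is an
# assumption, not stated in the glossary entry itself.
ALLOWED = {
    "new": {"open"},
    "open": {"fixed"},
    "fixed": {"verified fixed", "open"},
    "verified fixed": set(),
}

def advance(current, target):
    """Move a CR to a new status, rejecting illegal jumps (e.g. new -> fixed)."""
    if target not in ALLOWED[current]:
        raise ValueError(f"illegal CR transition: {current} -> {target}")
    return target

state = "new"
for step in ("open", "fixed", "verified fixed"):
    state = advance(state, step)
print(state)  # verified fixed
```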

Deployment
This is the process also known as “release” in which the developers prepare the
set of source code files in VSS to be included in the release by building a label
that references the latest versions of each individual file to be released, and then
provide instructions to the Support team, who then uses the label to move the
listed set of files into a new environment.

Development Environment
This is the environment in which the development team works closely with the
business partners, project managers, and the QA team to
(1) translate the Requirements, Functional Specifications, System Design,
(2) write Unit test to verify the code,
(3) run those tests,
(4) prepare the code for release into a test environment, and
(5) fix those defects found during deployment and testing.

The infrastructure of this environment represents only some fraction of the actual
production environment.

GUI Testing
GUI software testing is the process of verifying a product that uses a graphical
user interface to ensure it meets its written specifications. This is normally done
by executing a variety of test cases.
Load Testing

Load testing is the process of creating demand on a system or device and
measuring its response.

Integration Testing

Integration testing is the phase of software testing in which individual software
modules are combined and tested as a group. It follows unit testing and precedes
system testing.

Integration testing takes as its input modules that have been unit tested, groups
them in larger aggregates, applies tests defined in an integration test plan to those
aggregates, and delivers as its output the integrated system ready for system
testing.

Production Environment
This is the environment where the end users make use of the software product on
a daily basis while performing mission critical business functions with and for
customers.

Requirements Document
This document describes the software project at the highest level and may be
written by a Requirements Analyst. It may describe the project using business
cases or objectives. It may compare and contrast competitors using terms such as
market segmentation or the competitive landscape, and it may describe functional
requirements in terms of customer preference and usability with the inclusion of
Use Cases.
Regression Test

Regression testing is any type of software testing that seeks to uncover
regression bugs. Regression bugs occur whenever software functionality that
previously worked as desired stops working or no longer works in the way that
was previously planned. Typically, regression bugs occur as an unintended
consequence of program changes.
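The idea can be sketched as comparing current behavior against previously recorded, known-good outputs; the function and baseline values below are hypothetical, not drawn from a TG application.

```python
def discount(price, rate):
    # Hypothetical function under regression test.
    return round(price * (1 - rate), 2)

# Outputs recorded when the functionality last worked as desired.
baseline = {
    (100.0, 0.10): 90.0,
    (80.0, 0.25): 60.0,
}

# Any mismatch between the recorded and current output is a regression bug.
regressions = {
    args: (expected, discount(*args))
    for args, expected in baseline.items()
    if discount(*args) != expected
}
```

After each program change, rerunning the comparison shows whether previously working behavior has stopped working.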

Peer Review
This is the process in which selected team members review and comment on
various documents, tests, test results, and deployments using a pre-designed
format.
QA Department
The Quality Assurance department at TG is the group of people responsible for
executing tests against applications, projects, and environments at TG.
Deployment Document
The Deployment Document is prepared before a software project is deployed into
the production environment. This document includes an itemized list of steps to
follow in order to deploy the project, steps to verify the deployment, and steps
to roll back in the event of a deployment failure.

Functional Specification Document


This document contains the following sections. The problem definition may
include the functionality as seen through end-user scenarios, or in relationship
to other components and levels of the hierarchy. The software architecture is
also described at a high level, including relationships with other major
components and interfaces with external hardware and software. The document may
also include memory and performance requirements and an estimate of the
additional resources needed to satisfy the requirements.

Performance Testing
Performance testing is testing that is performed, from one perspective, to
determine how fast some aspect of a system performs under a particular
workload. It can also serve to validate and verify other quality attributes of
the system, such as scalability, reliability, and resource usage.

System Functional Specifications Document


This document contains a high-level description of the functional requirements
of the top level of the architectural design as seen by the end user. This set
of specifications describes how all of the components of the system fit together
and communicate; however, it does not specify the individual components'
functional requirements nor how any of the system components will be built. The
functions may be characterized as behavior and described in terms of how the
system enables the end user to perform work (i.e., the functions, performance,
quality requirements, etc.). This architectural design describes the components
of the system, their individual functions and performance, and how they
interface with each other.
Stress Testing
This is a form of testing that is used to determine the stability of a given system or
entity. It involves testing beyond normal operational capacity, often to a breaking
point, in order to observe the results.
Security Testing

This is the process of determining that an application maintains its data and
functionality as intended. The six basic security concepts that need to be
covered by security testing are: confidentiality, integrity, authentication,
authorization, availability, and non-repudiation.

Test Strategy
A Test Strategy is a systematic approach to testing a system. The plan typically
contains a detailed understanding of the workflow under test.
Test Case
A test case is a set of conditions or variables under which a tester determines
whether a requirement or use case for an application is partially or fully
satisfied. It may take many test cases to determine that a requirement is fully
satisfied.
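One way to picture a test case is as a record of conditions plus an expected outcome; the field names and identifiers below are illustrative, not a TG standard.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str       # identifier for the test case
    requirement: str   # requirement or use case being verified
    conditions: dict   # the conditions/variables under which the tester works
    expected: object   # outcome that (partially) satisfies the requirement

# Hypothetical example: one of several cases verifying a login requirement.
tc = TestCase(
    case_id="TC-001",
    requirement="REQ-42: login rejects empty passwords",
    conditions={"username": "alice", "password": ""},
    expected="login rejected",
)
```

Several such cases, each covering different conditions, may be needed before the requirement is considered fully satisfied.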
Unit Test
Unit testing is a procedure used to validate that individual units of source code are
working properly. A unit is the smallest testable part of an application.
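A minimal sketch with Python's standard unittest framework; the unit under test, leap_year, is a hypothetical example of the smallest testable part of an application.

```python
import io
import unittest

def leap_year(year):
    # Hypothetical unit under test.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTest(unittest.TestCase):
    # Each method validates the unit in isolation against a known expectation.
    def test_typical_leap_year(self):
        self.assertTrue(leap_year(2008))

    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_fourth_century_is_leap(self):
        self.assertTrue(leap_year(2000))

suite = unittest.TestLoader().loadTestsFromTestCase(LeapYearTest)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)
```

Developers run such tests in the development environment before preparing the code for release into a test environment.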
Smoke Test
Smoke tests are a set of tests performed immediately after the project has been
deployed into a new environment. These tests confirm the success of the
deployment. They typically contain some representative tests from multiple
categories, such as integration, system, or regression tests. They may also
contain simple confirmations of look and feel, connectivity, or operability.
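The shape of a smoke test can be sketched as a short list of quick, representative checks run right after deployment; the three checks below are hypothetical stand-ins for real connectivity and operability probes.

```python
def database_reachable():
    # In practice: open and close a connection to the deployed database.
    return True

def homepage_renders():
    # In practice: fetch the landing page and confirm it returns successfully.
    return True

def login_works():
    # In practice: log in with a known test account.
    return True

smoke_checks = [database_reachable, homepage_renders, login_works]
failures = [check.__name__ for check in smoke_checks if not check()]
deployment_ok = not failures
```

If any check fails, the deployment is not confirmed and the rollback steps in the Deployment Document apply.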

“Q” Environment
This is the environment in which automated and manual tests of all types are
performed by developers, testers, and end-users. This environment resembles the
production environment as closely as possible and contains all the major
components and functionality of the production environment.
QC
Quality Center, by Mercury, is used to collect requirements, plan tests, create
test cases, execute test cases, and track and report on defects.

User Acceptance Test

User Acceptance Testing (UAT) is a process to obtain confirmation by a Subject
Matter Expert (SME), preferably the owner or client of the object under test,
through trial or review, that the modification or addition meets mutually
agreed-upon requirements. In software development, UAT is one of the final
stages of a project and often occurs before a client or customer accepts the new
system.
