
QUALITY CENTRE

DEFECT LOGGING
& TRACKING PROCEDURE

Table of Contents

1 INTRODUCTION..............................................................................................................................3
2 DEFECT LIFECYCLE.......................................................................................................................3
2.1 DEFECT LOGGED.................................................................................................................................. 3
2.1.1 Summary Naming Convention.................................................................................................... 3
2.1.2 Defect Details............................................................................................................................. 4
2.1.3 Description.................................................................................................................................. 5
2.2 DEFECT CYCLE.................................................................................................................................... 6
2.2.1 Valid defect................................................................................................................................. 6
2.2.2 Not Enough Information.............................................................................................................. 7
2.2.3 Duplicate..................................................................................................................................... 7
2.2.4 Not a Valid Defect....................................................................................................................... 7
2.2.5 To be Fixed in a Later Release................................................................................................... 7
3 APPENDIX A – DEFECT LIFECYCLE.................................................................................................8
4 APPENDIX B – DEFECT PRIORITY CATEGORIES..............................................................................9
5 APPENDIX C – DEFECT TYPES........................................................................................................10

1 Introduction

This document defines the process by which defects should be logged, tracked and updated using Mercury
Interactive’s Quality Centre application. Within Quality Centre there are a number of ‘projects’. The project
under which a defect is logged whilst testing software is the ‘ABCD’ project.

By following this process, a developer should have sufficient information to reproduce a defect found in the
Software Under Test (SUT). Once a developer is able to recreate a defect in the development environment,
they should be able to fix the code. This will also enable any tester to retest the defect fix once it has been
deployed to the test environment.

The URL for Quality Centre is http://ntinet3tdwsq.ukroi.ABCD.org/qcbin/

2 Defect Lifecycle

The defect lifecycle, as shown in Appendix A – Defect Lifecycle, should be adhered to by all colleagues
involved in logging and updating a defect.

All defects should be reproducible by the tester.

2.1 Defect Logged


A defect can be logged by a Test Analyst from two different areas of Quality Centre – the Defects tab and
from within a test case in the Test Lab tab.

NOTE: if a defect is found whilst executing a test, the defect should be raised from within the test case in the
Test Lab. The Defects tab should only be used when a defect is found outside a test.

The following must be completed in order for a defect to be logged correctly in Quality Centre.

2.1.1 Summary Naming Convention


The summary field must be completed using the following structure.

ENV – PROJECT – FUNCTIONAL AREA – SUMMARY

ENV: Environment in which the defect was found, e.g. DevTest, INT2, OPS2 or UAT

PROJECT: Project being tested when the defect was found, e.g. Martini Phase 1

FUNCTIONAL AREA: Area within the project where the defect was found, e.g. Add To Basket

SUMMARY: Brief summary of the defect. Avoid going into too much detail.

Example

Defect found whilst testing Order Details page on ABCD. The Summary would be:

“OPS - BOB13 - ABCD - Error on Order Details page”
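
Where defect summaries are produced by a script (for example when bulk-loading defects), the same convention can be applied programmatically. The following Python sketch is purely illustrative: build_summary is a hypothetical helper, not part of Quality Centre, and it assumes the hyphen-separated form used in the example above.

def build_summary(env: str, project: str, functional_area: str, summary: str) -> str:
    """Assemble a defect summary as ENV - PROJECT - FUNCTIONAL AREA - SUMMARY.

    Illustrative helper only; the separator must match the convention agreed
    for the project being tested.
    """
    return " - ".join([env, project, functional_area, summary])


# Reproduces the example above:
print(build_summary("OPS", "BOB13", "ABCD", "Error on Order Details page"))
# OPS - BOB13 - ABCD - Error on Order Details page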


2.1.2 Defect Details

The details tab should be completed as follows:

Detected By: this field will automatically populate with the name of the person logging the defect

Build: n/a

Project: select the project where this defect was found from the list

Assigned To: select the name of the Test Lead from the list

Subject: optional, but a good idea to select for larger projects where there are numerous functional
areas

Platform: enter the platform being tested. This is particularly important when executing Compatibility
test cases

RedLightDefect: this should only be selected by the Test Lead/Test Architect

Duplicate Ref: see section 2.2.3

Detected on Date: automatically populated with current date


Priority: select the priority from the list. Priority categories are shown in Appendix B – Defect Priority
Categories

Reproducible: optional, but should be completed where possible

Status: this field will automatically populate with the status of ‘New’ when a new defect is logged

Browser: enter the browser being tested. This is particularly important when executing Compatibility
test cases

Environment: select the environment where the defect was found from the list

Defect Type: select the defect type from the list. Defect Types are shown in Appendix C – Defect Types
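
Defects are normally raised through the Quality Centre user interface as described above. Where a team chooses to raise them from a script instead, the same fields can be set through Quality Centre's OTA (Open Test Architecture) COM API. The Python sketch below is a minimal, untested illustration using pywin32; the login details, domain name and the field values shown are assumptions and must match the configuration of the 'ABCD' project.

# Minimal sketch: logging a defect via the Quality Centre OTA COM API (pywin32).
# Assumes Windows, the Quality Centre client components installed, and valid credentials.
import win32com.client

td = win32com.client.Dispatch("TDApiOle80.TDConnection")
td.InitConnectionEx("http://ntinet3tdwsq.ukroi.ABCD.org/qcbin/")  # URL from section 1
td.Login("your_user", "your_password")        # placeholder credentials
td.Connect("DEFAULT", "ABCD")                 # domain/project names are assumptions

bug = td.BugFactory.AddItem(None)             # create a new, empty defect
bug.Summary = "OPS - BOB13 - ABCD - Error on Order Details page"
bug.DetectedBy = "your_user"                  # normally auto-populated in the UI
bug.AssignedTo = "test_lead_user"             # Test Lead, as per section 2.1.2
bug.Priority = "P2 Serious"                   # must match the project's priority list
bug.Status = "New"
bug.Post()                                    # commit the defect to Quality Centre

td.Disconnect()
td.Logout()
td.ReleaseConnection()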

2.1.3 Description
The defect detail is very important as it is used by the developer to recreate the defect. It may also be used
by another tester to retest the defect once it has been fixed and deployed to the test environment, and so
should be as complete as possible.

IF THE DEFECT CANNOT BE RECREATED, IT CANNOT BE FIXED OR RETESTED!

Where a defect is logged directly from a test case in the Test Lab, the test case details will be automatically
included in the defect. REMEMBER that a developer cannot view the test cases, and so does not know what
test case was being executed when the defect occurred.


Description field:
enter as much detail as possible. Include a detailed description of the problem together with ‘Steps to
Recreate’ and any other information that would be useful to the developer

The ‘Steps to Recreate’ should include the following:


- Customer details. If logged in, then include email address and password. If not logged in then detail
whether cache and cookies were cleared before starting the test, and whether any information was
entered
- Which screens passed through or actions taken to get to the area where the problem was found
- Any specific data which relates to recreating the defect

Errors:
- Include screenshots of any error messages on the website. These can be attached to the defect
using the ‘Attach Screen Capture’ button
- If any related Errors or Warnings are found in EventViewer on the server, attach these using the
‘Paste’ button (this will attach them to the defect rather than including them in the description)
- Where there are no related Errors or Warnings, this should be stated
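
If defects are being raised through the OTA API (see the sketch in section 2.1.2), saved screenshots and exported EventViewer logs can also be attached programmatically rather than via the buttons described above. Again this is only a sketch: the file path is a placeholder and 'bug' is the defect object created in the earlier example.

# Sketch: attaching a saved screenshot or exported EventViewer log to a defect
# created via the OTA API ('bug' is the Bug object from the earlier example).
TDATT_FILE = 1                                   # OTA constant for file attachments

attachment = bug.Attachments.AddItem(None)       # new, empty attachment
attachment.FileName = r"C:\temp\order_details_error.png"  # placeholder path
attachment.Type = TDATT_FILE
attachment.Post()                                # upload and link to the defect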

Example

An error is displayed on the Order Details page where the eGiftcard table should be (even though there are
no eGiftcards to display). This error has been introduced with the deployment of the fix for DR 23549.

Error on page: "An Error has occurred:
Description:-2146233036:The type initializer for "ABCD.Common.PaymentOperations.GiftcardService" threw an exception."

Error from ntinet4tel1 attached.
Note that this error is displayed regardless of the delivery address for the order.

To recreate:
1. place a grocery order
2. log onto ABCD
3. search for customer
4. click order number link - errors

The attached error log is from EventViewer on ntinet4Tel1

2.2 Defect Cycle


Once the defect has been logged it should be assigned to the project’s Test Lead, who will review the defect
and make one of the following decisions (see Appendix A for flow diagram).

2.2.1 Valid defect


If it is decided that the defect is valid, the Test Lead will update the status to ‘Open’ and assign it to the Lead
Developer. Depending on the type of defect, the Lead Developer will be a member of one of the following
development teams:

- User Interface Dev
- Middle Tier Dev
- Database Dev

The Lead Developer will assign the defect to a developer to fix. Once the developer is working on a fix the
status should be updated to ‘Fix Pending’.

Once the defect has been fixed and retested in the DevTest environment, the developer should complete the
following:
- add a comment in the R&D Comments section of the Description. This should detail what caused the
defect and how it was fixed. It should also show the related TD or Work Package. This will help
developers in the future should the defect be reintroduced to the test environment.
Example: dllhost.exe.config had references for 2.1.144.0 component. The file has been re-deployed
with correct references.
- enter the version number in which the defect was fixed in the ‘Fixed In Version’ field on the Details tab


- update the status of the defect to ‘Fixed’
- assign the defect to the Test Lead

Once the defect fix has been deployed to the test environment, the Test Lead should update the status to
‘Fixed Retest’ and assign the defect to a tester.

The tester should retest all defects assigned to them with a status of ‘Fixed Retest’ once a Sanity/Smoke
Test has been completed on the release. There are two possible outcomes of a retested defect:
- the defect was fixed – the tester should:
 add a note to the R&D Comments field indicating that the defect has been fixed and closed
 update the status to ‘Closed’
- the defect was not fixed – the tester should:
 add a note to the R&D Comments field indicating why the defect is not fixed. Include any
additional information which may be helpful
 assign the defect to the Test Lead
 update the status to ‘Open’
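
For reporting or housekeeping scripts, the status flow described above can be expressed as a simple transition map. The Python sketch below is one reading of sections 2.2.1 to 2.2.5; Appendix A remains the authoritative definition of the lifecycle, and the status names must match those configured in the ‘ABCD’ project.

# One possible encoding of the defect status flow from section 2.2 / Appendix A.
# Purely illustrative; Appendix A is the authoritative lifecycle.
DEFECT_STATUS_FLOW = {
    "New": ["Open", "Need More Info", "Duplicate", "Rejected", "Deferred"],  # Test Lead review
    "Need More Info": ["Open"],                    # tester adds detail, returns to Test Lead
    "Open": ["Fix Pending", "Need More Info", "Rejected", "Deferred"],
    "Fix Pending": ["Fixed"],                      # developer completes and retests in DevTest
    "Fixed": ["Fixed Retest"],                     # fix deployed to the test environment
    "Fixed Retest": ["Closed", "Open"],            # retest passed / retest failed
}


def is_valid_transition(current_status: str, new_status: str) -> bool:
    """Return True if a status change follows the lifecycle sketched above."""
    return new_status in DEFECT_STATUS_FLOW.get(current_status, [])


# Example: a defect that fails retest goes back to 'Open', not straight to 'Fixed'.
assert is_valid_transition("Fixed Retest", "Open")
assert not is_valid_transition("Fixed Retest", "Fixed")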

2.2.2 Not Enough Information


Occasionally there is not enough information in the defect. In these cases the Test Lead will update the
status to ‘Need More Info’ and reassign the defect back to the tester who logged it. The tester should update
the defect with the required information, update the status to ‘Open’ and assign it back to the Test Lead for
review.

The developer may also find that they need more information from the tester, and so can also assign the
defect back to the tester who logged it after updating the status to ‘Need More Info’. The tester should update
the defect with the required information, update the status to ‘Open’ and assign it back to the Test Lead for
review.

2.2.3 Duplicate
It is the responsibility of the Test Lead to ensure that duplicate defects are not passed to developers. Where
duplicates are found, the Test Lead should update them with a status of ‘Duplicate’ and note the duplicate
defect number on the Details tab in the Duplicate Reference field. Duplicate defects should not be included in
the defect counts for reporting purposes.

2.2.4 Not a Valid Defect


The Test Lead should consider all defects and, where a defect is found to be invalid, should update the
status to ‘Rejected’. Where a developer or Lead Developer considers a defect as invalid, they should discuss
it with the Test Lead in order to validate their findings.

2.2.5 To be Fixed in a Later Release


Certain areas of functionality may be de-scoped during the testing phases. Where this occurs, the Test Lead
should update all defects relating to that functional area to ‘Deferred’. This ensures that when the functional
area is re-scoped into a later release, all related defects are available, although they should be revalidated in
the new release.


3 Appendix A – Defect Lifecycle

Process flow for a defect within Quality Centre.


4 Appendix B – Defect Priority Categories

The following table defines the priority categories to be applied to defects raised in Mercury’s Quality Centre.
Priority 1 defects can also be assigned a ‘Red Light’ indicator, which is used to prioritise a particular critical
defect over other critical defects.

Defects should be initially prioritised by the project Test Lead. The Project Manager owns the defects, and
can agree reprioritisation with the Test Lead where necessary.

P1 – Critical
A defect which prevents any further actions from being performed. This can fall into 2 categories – defects
that affect the website as a whole, and defects which affect the functionality of the release. Also known as a
‘Showstopper’. This type of defect could also involve data loss, data corruption or system failure. The
customer impact is potentially devastating.
Defects with this priority must be fixed immediately. When the ‘Red Light’ indicator is used it takes priority
over any other development or defect fixes.
Examples:
 unable to log onto the grocery website (website functionality)
 unable to select a Green Delivery Slot (BOB15 functionality)

P2 – Serious
A defect which has a severe impact, but for which there is a workaround. If there is a severe impact to the
customer experience, the defect severity may be raised to critical.
Defects with this priority require fixing once all critical defects have been fixed.
Examples:
 grocery top tabs not working
 search not returning correct search results

P3 – Minimal
A minimal defect which has a minor impact on a function or the customer experience. Customers may
consider these to be annoying, or possibly an indication of a sloppy development/test process.
Defects with this priority require fixing once all coding is complete and all serious defects have been fixed.
Examples:
 incorrect copy
 duplicate entries in a dropdown list

P4 – Minor
A cosmetic defect which does not impede the customer experience but could reduce customer confidence in
the website and so should be fixed, time permitting.
Defects with this priority require fixing once all coding is complete and all minimal defects have been fixed.
Examples:
 spelling mistakes
 content inconsistent across grocery pages (e.g. colours)


5 Appendix C – Defect Types

The following table defines the classifications for a defect.

Automation: A defect in the software which was discovered during automation testing.

Code / Content / DB Defect: A defect in the software which may be requirements based, code based or
design based.

Environment / Configuration / Deployment: A defect which may be due to the test environment itself, the
software/hardware configuration, or the deployment.

Load / Performance: A defect in the software which was discovered during load and performance testing.

Project Bad / Invalid Data: A failure occurred and it was determined to be related to bad or invalid data.

Project Change Request: An enhancement to an existing application. It could be code, data or process
related.

Project No Requirement / UI / BR: The software does not cause an incident, but does not behave according
to the supporting documentation (e.g. business requirements, wireframes or use cases).

Project Release Note / Documentation: A failure occurred and it was determined to be related to the project
release note or documentation.

Project UAT Defect: A defect in the software which was discovered during UAT.
