
SOFTWARE QUALITY

SOFTWARE QUALITY is the degree of conformance to explicit or implicit requirements and expectations.

Explanation:
● Explicit: clearly defined and documented
● Implicit: not clearly defined and documented but
indirectly suggested
● Requirements: business/product/software
requirements
● Expectations: mainly end-user expectations

In order to ensure software quality, we undertake Software Quality Assurance and Software Quality Control.
SOFTWARE QUALITY CONTROL

SOFTWARE QUALITY CONTROL (SQC) is a set of activities for ensuring quality in software products. Software Quality Control is limited to the Review/Testing phases of the Software Development Life Cycle, and the goal is to ensure that the products meet specifications/requirements.
SOFTWARE QUALITY
DIMENSIONS
When someone says "This software is of very high quality," you might want to ask, "In which dimension of quality?"

The SOFTWARE QUALITY DIMENSIONS listed here are some of the major ones out of hundreds. Which dimension is more important than another is subjective and depends on what you value most in the particular situation.
● Accessibility: The degree to which software can
be used comfortably by a wide variety of
people, including those who require assistive
technologies like screen magnifiers or voice
recognition.
● Compatibility: The suitability of software for use
in different environments, such as different
devices, operating systems, and browsers.
● Concurrency: The ability of software to service
multiple requests to the same resources at the
same time.
● Efficiency: The ability of software to perform
well or achieve a result without wasted energy,
resources, effort, time or money.
● Functionality: The ability of software to carry
out the functions as specified or desired.
● Installability: The ability of software to be
installed in a specified environment.
● Localizability: The ability of software to be used
in different languages, time zones, etc.
● Maintainability: The ease with which software
can be modified (adding features, enhancing
features, fixing bugs, etc.)
● Performance: The speed at which software
performs under a particular load.
● Portability: The ability of software to be
transferred easily from one location to another.
● Reliability: The ability of software to perform a
required function under stated conditions for the
stated period of time without any errors.
● Scalability: The measure of software’s ability to
increase or decrease in performance in
response to changes in software’s processing
demands.
● Security: The extent of protection of software
against unauthorized access, invasion of
privacy, theft, loss of data, etc.
● Testability: The ability of software to be easily
tested.
● Usability: The degree of software’s ease of
use.
TESTING LEVELS
UNIT TESTING is a level of software testing where individual units/components of a software are tested. The purpose is to validate that each unit of the software performs as designed.
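
A minimal sketch using Python's built-in unittest; the discount_price function is a hypothetical unit invented for illustration, not part of the original material:

```python
import unittest

def discount_price(price, percent):
    """Hypothetical unit under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountPriceTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(discount_price(100.0, 25), 75.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(100.0, 150)

if __name__ == "__main__":
    unittest.main()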
INTEGRATION TESTING is a level of software testing where individual units are combined and tested as a group. The purpose of this level of testing is to expose faults in the interaction between integrated units.
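
Keeping the same hypothetical units, an integration test sketch exercises the interaction between a Cart and discount_price as a group, rather than each unit in isolation:

```python
import unittest

# Units developed and unit-tested separately (both hypothetical).
def discount_price(price, percent):
    return round(price * (1 - percent / 100), 2)

class Cart:
    def __init__(self):
        self.items = []

    def add(self, price):
        self.items.append(price)

    def total(self, discount_percent=0):
        # Integration point: Cart delegates discounting to discount_price.
        return sum(discount_price(p, discount_percent) for p in self.items)

class CartDiscountIntegrationTest(unittest.TestCase):
    def test_cart_applies_discount_to_each_item(self):
        cart = Cart()
        cart.add(100.0)
        cart.add(50.0)
        # Faults in the Cart/discount_price interaction would surface here.
        self.assertEqual(cart.total(discount_percent=10), 135.0)

if __name__ == "__main__":
    unittest.main()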
SYSTEM TESTING is a level of software testing where a complete and integrated software is tested. The purpose of this test is to evaluate the system's compliance with the specified requirements.
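
A system test exercises the complete, integrated software through its public interface. In the runnable sketch below, a tiny in-process web service stands in for a real deployed system; the service and its "respond to /ping with pong" requirement are illustrative assumptions:

```python
import threading
import unittest
import urllib.request
from wsgiref.simple_server import make_server

# Hypothetical "complete system": a toy web service standing in for a real deployment.
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", "4")])
    return [b"pong"]

class SystemTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Start the fully integrated system and test it over its public interface.
        cls.server = make_server("127.0.0.1", 0, app)
        cls.port = cls.server.server_port
        threading.Thread(target=cls.server.serve_forever, daemon=True).start()

    @classmethod
    def tearDownClass(cls):
        cls.server.shutdown()

    def test_ping_endpoint_meets_specified_requirement(self):
        with urllib.request.urlopen(f"http://127.0.0.1:{self.port}/ping") as resp:
            self.assertEqual(resp.status, 200)
            self.assertEqual(resp.read(), b"pong")

if __name__ == "__main__":
    unittest.main()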
ACCEPTANCE TESTING is a level of software testing where a system is tested for acceptability. The purpose of this test is to evaluate the system's compliance with the business requirements and assess whether it is acceptable for delivery.
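
Acceptance criteria are often automated so that each test maps to one business requirement. A sketch, assuming a hypothetical "orders of $100 or more ship free" business rule:

```python
import unittest

# Hypothetical business rule under acceptance: "Orders of $100 or more ship free."
FREE_SHIPPING_THRESHOLD = 100.0

def shipping_cost(order_total):
    return 0.0 if order_total >= FREE_SHIPPING_THRESHOLD else 5.99

class AcceptanceTest(unittest.TestCase):
    """Each test corresponds to an acceptance criterion agreed with the business."""

    def test_order_at_threshold_ships_free(self):
        self.assertEqual(shipping_cost(100.0), 0.0)

    def test_order_below_threshold_pays_shipping(self):
        self.assertEqual(shipping_cost(99.99), 5.99)

if __name__ == "__main__":
    unittest.main()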
TESTING LEVELS - ANALOGY
During the process of manufacturing a ballpoint
pen, the cap, the body, the tail and clip, the ink
cartridge and the ballpoint are produced
separately and unit tested separately. When two
or more units are ready, they are assembled and
Integration Testing is performed. When the
complete pen is integrated, System Testing is
performed. Once System Testing is complete, Acceptance Testing is performed to confirm that the ballpoint pen is ready to be made available to the end-users.
DEFECT PRIORITY
also known as Bug Priority, indicates the
importance or urgency of fixing a defect. Though
priority may be initially set by the Software Tester,
it is usually finalized by the Project Manager.
DEFECT PRIORITY
CLASSIFICATION
● P1 / URGENT: Must be fixed immediately / in the next build.
● P2 / HIGH: Must be fixed in any of the upcoming builds but should be included in the release.
● P3 / MEDIUM: May be fixed after the release / in the next release.
● P4 / LOW: May or may not be fixed at all.
PRIORITY CONSIDERATIONS
● Defect Severity / Impact
● Defect Probability / Visibility
● Available Resources (Developers to fix and
Testers to verify the fixes)
● Available Time (Time for fixing, verifying the
fixes and performing regression tests after the
verification of the fixes)
DEFECT SEVERITY
is a classification of a software defect (bug) to indicate the degree of negative impact on the quality of software.
DEFECT SEVERITY
CLASSIFICATION
● S1 / CRITICAL
● S2 / MAJOR
● S3 / MINOR
● S4 / TRIVIAL
SEVERITY CLASS: CRITICAL
● The defect affects critical functionality or critical
data. It does not have a workaround.
● Example: Unsuccessful installation, complete
failure of a feature.
SEVERITY CLASS: MAJOR
● The defect affects major functionality or major data. It has a workaround, but the workaround is not obvious and is difficult.
● Example: A feature is not functional in one module, but the task is doable if 10 complicated indirect steps are followed in another module (or modules).
SEVERITY CLASS: MINOR
● The defect affects minor functionality or non-
critical data. It has an easy workaround.
● Example: A minor feature that is not functional
in one module but the same task is easily
doable from another module.
SEVERITY CLASS: TRIVIAL
● The defect does not affect functionality or data.
It does not even need a workaround. It does not
impact productivity or efficiency. It is merely an
inconvenience.
● Example: Petty layout discrepancies,
spelling/grammatical errors.
DEFECT PROBABILITY
also known as Defect Visibility or Bug Probability
or Bug Visibility, indicates the likelihood of a user
encountering the defect / bug.
● HIGH: Encountered by all or almost all the users
of the feature
● MEDIUM: Encountered by about 50% of the
users of the feature
● LOW: Encountered by very few users of the
feature

Defect Probability can also be denoted as a percentage (%).

The measure of probability/visibility is with respect to the usage of a feature, not the overall software. Hence, a bug in a rarely used feature can have a high probability if users of that feature encounter it easily. Similarly, a bug in a widely used feature can have a low probability if its users rarely detect it.
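
As a worked illustration (all numbers invented), a bug hit by 150 of a feature's 200 users has HIGH probability even if those 200 users are a tiny share of the overall user base:

```python
# Probability is measured against users of the feature, not the whole user base.
total_users = 10_000
feature_users = 200          # a rarely used feature
users_hitting_bug = 150

probability = users_hitting_bug / feature_users    # 0.75 -> HIGH
overall_share = users_hitting_bug / total_users    # 0.015 -> not what we measure
print(f"{probability:.0%} of the feature's users encounter the bug")  # 75%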
DEFECT LIFE CYCLE

DEFECT LIFE CYCLE, also known as Bug Life Cycle, is the journey of a defect from its identification to its closure. The life cycle varies from organization to organization and is governed by the software testing process the organization or project follows and/or the defect tracking tool being used.
Nevertheless, the life cycle in general resembles
the following:
Status:
● NEW
● ASSIGNED/OPEN
● DEFERRED
● DROPPED/REJECTED
● COMPLETED/FIXED/RESOLVED/TEST
● REASSIGNED/REOPENED
● CLOSED/VERIFIED
DEFECT LIFE CYCLE: NEW
Tester finds a defect and posts it with the status NEW.
This defect is yet to be studied/approved. The fate of a
NEW defect is one of ASSIGNED, DROPPED or
DEFERRED.
DEFECT LIFE CYCLE:
ASSIGNED/OPEN
Test / Development / Project lead studies the NEW defect
and if it is found to be valid it is assigned to a member of
the Development Team. The assigned Developer’s
responsibility is now to fix the defect and have it
COMPLETED. Sometimes, ASSIGNED and OPEN can
be different statuses. In that case, a defect can be open
yet unassigned.
DEFECT LIFE CYCLE: DEFERRED
If it is decided that a valid NEW or ASSIGNED defect will be fixed in an upcoming release instead of the current release, it is DEFERRED. The defect is ASSIGNED when the time comes.
DEFECT LIFE CYCLE:
DROPPED/REJECTED
Test / Development/ Project lead studies the NEW defect
and if it is found to be invalid, it is DROPPED /
REJECTED. Note that the specific reason for this action
needs to be given.
DEFECT LIFE CYCLE:
COMPLETED/FIXED/RESOLVED
Developer ‘fixes’ the defect that is ASSIGNED to him or
her. Now, the ‘fixed’ defect needs to be verified by the
Test Team and the Development Team ‘assigns’ the
defect back to the Test Team. A COMPLETED defect is
either CLOSED, if fine, or REASSIGNED, if still not fine.
If a Developer cannot fix a defect, some organizations
may offer the following statuses:
● Won’t Fix / Can’t Fix: The Developer will not or cannot fix the defect due to some reason.
● Can’t Reproduce: The Developer is unable to reproduce the defect.
● Need More Information: The Developer needs more information on the defect from the Tester.
DEFECT LIFE CYCLE:
REASSIGNED/REOPENED
If the Tester finds that the ‘fixed’ defect is in fact not fixed
or only partially fixed, it is reassigned to the Developer
who ‘fixed’ it. A REASSIGNED defect needs to be
COMPLETED again.
DEFECT LIFE CYCLE:
CLOSED/VERIFIED
If the Tester / Test Lead finds that the defect is indeed fixed and is no longer of any concern, it is CLOSED / VERIFIED. This is the happy ending.
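
Taken together, these statuses and transitions form a small state machine. A sketch in Python, with alias statuses folded into one name each; the transition table is inferred from the descriptions above and will differ between organizations and tools:

```python
# Statuses follow the life cycle above; aliases (OPEN, FIXED/RESOLVED/TEST,
# REOPENED, VERIFIED, REJECTED) are folded into one name each.
TRANSITIONS = {
    "NEW": {"ASSIGNED", "DROPPED", "DEFERRED"},
    "ASSIGNED": {"COMPLETED", "DROPPED", "DEFERRED"},
    "DEFERRED": {"ASSIGNED"},
    "COMPLETED": {"CLOSED", "REASSIGNED"},
    "REASSIGNED": {"COMPLETED"},
    "DROPPED": set(),  # terminal; a reason must be recorded
    "CLOSED": set(),   # terminal; the happy ending
}

def move(status, new_status):
    """Validate a status change against the life cycle."""
    if new_status not in TRANSITIONS[status]:
        raise ValueError(f"Illegal transition: {status} -> {new_status}")
    return new_status

# A defect that is fixed, fails verification once, is fixed again, and closes:
status = "NEW"
for nxt in ["ASSIGNED", "COMPLETED", "REASSIGNED", "COMPLETED", "CLOSED"]:
    status = move(status, nxt)
print(status)  # CLOSED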
DEFECT LIFE CYCLE:
GUIDELINES
● Make sure the entire team understands what each
defect status exactly means. Also, make sure the
defect life cycle is documented.
● Ensure that each individual clearly understands his/her
responsibility as regards each defect.
● Ensure that enough detail is entered in each status
change. For example, do not simply DROP a defect
but provide a reason for doing so.
DEFECT REPORT
DEFECT REPORT is a document that identifies
and describes a defect detected by a tester. The
purpose of a defect report is to state the problem
as clearly as possible so that developers can
replicate the defect easily and fix it.
DEFECT REPORT TEMPLATE
In most companies, a defect reporting tool is used
and the elements of a report can vary. However,
in general, a defect report can consist of the
following elements.
● ID - Unique identifier given to the defect.
(Usually automated)
● Project - Project name.
● Product - Product name.
● Release Version - Release version of the
product. (e.g. 1.2.3)
● Module - Specific module of the product where
the defect was detected.
● Detected Build Version - Build version of the
product where the defect was detected (e.g.
1.2.3.5)
● Summary - Summary of the defect. Keep this
clear and concise.
● Description - Detailed description of the defect.
Describe as much as possible but without
repeating anything or using complex words.
Keep it simple but comprehensive.
● Steps to Replicate - Step by step description of
the way to reproduce the defect. Number the
steps.
● Actual Result - The actual result you received
when you followed the steps.
● Expected Results - The expected results.
● Attachments - Attach any additional information
like screenshots and logs.
● Remarks - Any additional comments on the
defect.
● Defect Priority - Priority of the Defect.
● Reported By - The name of the person who
reported the defect.
● Assigned To - The name of the person that is
assigned to analyze/fix the defect.
● Status - The status of the defect. (Defect Life
Cycle)
● Fixed Build Version - Build version of the
product where the defect was fixed (e.g.
1.2.3.9)
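
As a sketch of how such a template maps onto a structured record, here is a Python dataclass; field names follow the template above, while the types and all sample values are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class DefectReport:
    # Field names mirror the template; types are assumptions.
    id: str                      # usually auto-generated by the tool
    project: str
    product: str
    release_version: str
    module: str
    detected_build_version: str
    summary: str
    description: str
    steps_to_replicate: list[str]
    actual_result: str
    expected_result: str
    priority: str                # P1..P4
    reported_by: str
    assigned_to: str
    status: str = "NEW"          # Defect Life Cycle status
    fixed_build_version: str = ""
    attachments: list[str] = field(default_factory=list)
    remarks: str = ""

report = DefectReport(
    id="DEF-101", project="ProjectX", product="ProductY",
    release_version="1.2.3", module="Checkout",
    detected_build_version="1.2.3.5",
    summary="Discount not applied at checkout",
    description="Applying a valid coupon leaves the order total unchanged.",
    steps_to_replicate=["Add any item to the cart",
                        "Apply coupon SAVE10",
                        "Click 'Checkout'"],
    actual_result="Total is unchanged",
    expected_result="Total is reduced by 10%",
    priority="P2", reported_by="Tester A", assigned_to="Developer B",
)
print(report.summary, "-", report.status)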
REPORTING DEFECTS
EFFECTIVELY
It is essential that you report defects effectively so that time and effort are not wasted unnecessarily in trying to understand and reproduce the defect. Here are some guidelines:
● Be specific
● Be detailed
● Be objective
● Reproduce the defect
● Review the report
REPORTING DEFECTS
GUIDELINE: BE SPECIFIC

Specify the exact action: Do not say something like 'Select ButtonB'. Do you mean 'Click ButtonB', 'Press ALT+B', or 'Focus on ButtonB and press ENTER'? Of course, if the defect can be reproduced in all three ways, it is okay to use a generic term like 'Select', but bear in mind that you might get a fix only for the 'Click ButtonB' scenario. [Note: This might be a highly unlikely example, but it is hoped that the message is clear.]

In case of multiple paths, mention the exact path you followed: Do not say something like "If you do 'A and X' or 'B and Y' or 'C and Z', you get D." Understanding all the paths at once will be difficult. Instead, say "Do 'A and X' and you get D." You can, of course, mention elsewhere in the report that "D can also be obtained by doing 'B and Y' or 'C and Z'."

Do not use vague pronouns: Do not say something like "In ApplicationA, open X, Y, and Z, and then close it." What does 'it' stand for: Z, Y, X, or ApplicationA?
REPORTING DEFECTS
GUIDELINE: BE DETAILED

Provide more information (not less). In other words, do not be lazy. Developers may or may not use all the information you provide, but they sure do not want to beg you for any information you have missed.
REPORTING DEFECTS
GUIDELINE: BE OBJECTIVE

Do not make subjective statements like "This is a lousy application" or "You fixed it real bad."

Stick to the facts and avoid emotional language.


REPORTING DEFECTS
GUIDELINE:
REPRODUCE THE DEFECT
Do not be impatient and file a defect report as soon as you uncover a defect. Replicate it at least once more to be sure. (If you cannot replicate it again, try recalling the exact test conditions and keep trying. However, if you still cannot replicate it after many trials, submit the report for further investigation anyway, stating that you are unable to reproduce the defect anymore and providing any evidence of it that you gathered.)
REPORTING DEFECTS
GUIDELINE:
REVIEW THE REPORT

Do not hit 'Submit' as soon as you write the report. Review it at least once. Remove any typos.
NEXT DISCUSSION:
SOFTWARE TESTING METHODS
SOFTWARE TESTING TYPES

Trust but Verify.
Verify but also Validate.

http://softwaretestingfundamentals.com/
