I) Introduction
a) Software, Types of Software
b) Software Project, Application and Product
c) Software Business Process
II) SDLC (Software Development Life Cycle)
a) Requirements Gathering
b) Analysis & Planning
c) Software Design
d) Coding/Implementation
e) Testing
f) Release & Maintenance
III) SDLC Models
a) Sequential Models
1) Waterfall Model
2) ‘V’ Model
b) Incremental / Iterative Models
1) Prototype Model
2) Spiral Model
3) Agile Model
IV) Test Levels
a) Unit Testing/Component Testing/Module Testing
b) Integration Testing
c) System Testing
d) Acceptance Testing
V) Software Environment
a) I-Tier or Standalone Applications
b) II-Tier or Client/Server Applications
c) III-Tier or Web Applications
d) N-Tier or Distributed Applications
VI) Test Types
a) Functional Testing
b) Non-Functional Testing
c) Structural Testing
d) Re-testing & Regression Testing
VII) Test Design Techniques
a) Black box Techniques
1) Equivalence Partitioning/ Equivalence Classes (EP/EC)
2) Boundary Value Analysis (BVA)
3) Decision Table Testing
4) State Transition Testing
5) Use Case Testing
b) White box Techniques
1) Statement Testing
2) Decision Testing
3) Condition/Multi Condition Testing
4) Mutation Testing
VIII) Testing Process (STLC)
a) Test Strategy
b) Test Planning
c) Configuration Management
d) Risk Analysis
e) Test Design (Test Scenarios, Test Cases and Test Data)
f) Test Execution
g) Defect Reporting & Tracking
h) Status Reporting
i) Test Closure
IX) Informal Testing
a) Exploratory Testing
b) Error Guessing
X) Quality Standards
a) ISO Standards
b) IEEE Standards
c) CMM/CMMI Process Guidelines
Software Quality:
Software satisfies quality only when it meets customer requirements, customer satisfaction & customer expectations. Meeting customer requirements refers to producing proper output; customer expectations refer to extra characteristics such as a good interface, speed, privacy, security, ease of operation & good functionality.
SQA defines the concepts to be followed by a company to develop software. An SQA team is responsible for monitoring & measuring the strength of the development processes.
Software Project:
A set of problems assigned by the client, which is solved by software people through the process of software engineering, is called a software project. In short: the problem, the people & the process make the project.
Requirements Gathering:
In this stage, the Business Analyst studies the requirements of the client/customer and prepares the Business Requirement Specification (BRS) document.
Analysis:
In this stage, the Sr. Analyst prepares the Software Requirement Specification (S/w RS) document with respect to the corresponding BRS document. This document consists of two sub-documents: the System Requirement Specification (SRS) & the Functional Requirement Specification (FRS). The SRS contains details about software & hardware requirements; the FRS contains details about the functionality to be used in the project.
Designing:
In the designing phase, designers create two documents: the High Level Document (HLD) & the Low Level Documents (LLDs). The HLD consists of the main modules of the project from root to leaf and points to multiple LLDs. Each LLD consists of the sub-modules of a main module along with data flow diagrams, ER-diagrams, etc. These are prepared by technical support people or designers, called internal designers.
Testing:
[Figure: V-Model, with development phases on one side and the corresponding testing phases on the other, from information gathering through build installation, testing software changes, test efficiency & maintenance.]
From the above refined form of the V-Model, small & medium scale organizations maintain a separate testing team only for the Functional & System testing stage.
After completion of the Analysis phase & its reviews, the project-level designers start the logical design of the application in terms of external & internal design (HLDs & LLDs). In this stage, they conduct reviews for completeness & correctness of the design documents. This review focuses on the below factors:
1) Are they understandable?
2) Do they meet the right requirements of the client/customer?
3) Are they complete?
4) Are they followable?
5) Do they handle errors?
After completion of design & its reviews, software programmers start coding, turning the logical design into the physical construction of the software. During this coding stage, programmers conduct Unit Testing through a set of White box testing techniques. Unit Testing is also known as Module / Component / Program / Micro testing.
1) Execution Testing
3) Mutation Testing
Mutation means a change in the program. White box testers perform changes in the program to estimate the test coverage on that program. Mutation testing can decide whether the test coverage is adequate or not.
4) Integration Testing
a) Top-Down Approach
b) Bottom-Up Approach
c) Hybrid (Sandwich) Approach
In this approach, testing is conducted using both the Top-Down & Bottom-Up approaches.
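The two approaches can be sketched with the classic stand-ins, assuming hypothetical module names: in Top-Down integration an unfinished sub-module is replaced by a stub; in Bottom-Up integration a finished sub-module is exercised by a driver because the real caller is not ready yet.

```python
# Sketch of Top-Down vs Bottom-Up integration (module names are made up).

# Real main module, which depends on a sub-module's result:
def main_module(get_rate):
    return get_rate() * 100        # uses the sub-module's output

# Top-Down: the missing sub-module is replaced by a STUB.
def rate_stub():
    return 0.1                     # hard-coded dummy value

assert main_module(rate_stub) == 10.0   # main logic tested before sub-module exists

# Bottom-Up: the finished sub-module is exercised by a DRIVER,
# a temporary caller standing in for the not-yet-ready main module.
def real_rate():
    return 0.25

def driver():
    return real_rate() * 100

assert driver() == 25.0
print("top-down and bottom-up checks passed")
```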
* Build: A fully integrated set of all modules in executable (*.exe) file form is called a build.
1) Usability Testing
2) Functional Testing
3) Performance Testing
4) Security Testing
Of the above, 1 & 2 are core level and 3 & 4 are advanced level.
1) Usability Testing
In general, the testing team starts test execution with Usability testing. During this test, the testing team validates the user-friendliness of the build's screens. During Usability testing, the testing team applies two types of sub-tests:
2) Functional Testing
The major part of Black box testing is Functional testing. During this test, the testing team concentrates on "meeting customer requirements". Functional testing is classified into the below sub-tests.
During this test, Test Engineers validate the correctness of every functionality in terms of the below coverages.
c) Recovery Testing
It is also known as Reliability testing. During this test, the testing team validates whether the application changes from an abnormal state back to a normal state or not.
d) Compatibility Testing
It is also known as Portability testing. During this test, the testing team validates whether the application build runs on the customer's expected platforms or not. During this test, Test Engineers mostly find backward compatibility issues.
Forward compatibility -> the application is ready to run, but the operating system does not support it.
Backward compatibility -> the operating system supports it, but the application has internal coding problems that prevent it from running on the operating system.
e) Configuration Testing
It is also known as Hardware compatibility testing. During this test, the testing team validates whether the application build supports hardware devices of different technologies or not.
f) Inter-Systems Testing
During this test, the testing team validates whether the application build co-exists with other existing software or not, and also tests whether any deadlock situation occurs.
g) Installation Testing
During this test, the testing team validates whether the application build, along with its supporting software, installs into customer-site-like configured systems. During this test, the testing team observes the below factors.
h) Parallel / Comparative Testing
During this test, the testing team compares the application build with competitive products in the market.
i) Sanitation Testing
During this test, the testing team tries to find extra features in the application build w.r.t customer requirements.
* Defects
During testing, the testing team reports defects to developers in terms of the below categories.
A test conducted on the application build without using any testing tool is called Manual testing; if a testing tool is used, it is called Automation testing.
In the common testing process, Test Engineers use test automation w.r.t test impact & criticality. Impact -> test repetition; criticality -> complexity of applying the test manually. Due to these two reasons, testing people use test automation.
j) Re-testing
The re-execution of a test with multiple test data to validate a function. E.g., to validate multiplication, Test Engineers use different combinations of input in terms of min, max, -ve, +ve, zero, int, float, etc.
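The multiplication example above can be sketched as a re-test over a data set covering the categories listed (zero, negative, positive, float, large values); the `multiply` function is a stand-in for the feature under test.

```python
# Minimal sketch of re-testing: the same check re-executed with multiple
# test data (zero, negative, positive int, float, large/max-style input).

def multiply(a, b):
    return a * b

test_data = [
    (0, 5, 0),             # zero
    (-3, 4, -12),          # negative
    (7, 6, 42),            # positive int
    (2.5, 4, 10.0),        # float
    (10**9, 2, 2 * 10**9), # large value
]

for a, b, expected in test_data:
    assert multiply(a, b) == expected, f"failed for {a} x {b}"
print(f"re-tested multiply with {len(test_data)} data sets: all passed")
```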
k) Regression Testing
The re-execution of a test on a modified build to ensure the bug-fixing work & to detect the occurrence of any side effects. Test Engineers conduct this test using automation.
A mistake in the code is an error. Due to errors in coding, Test Engineers get mismatches in the application build, called defects. If a defect is accepted by the developers to be solved, it is a bug.
Testing Documents
The above figure shows the various levels of documents prepared during project testing.
Test Policy is documented by Quality Control. Test Strategy & Test Methodology are
documented by Quality Analyst or Project Manager. Test Plan, Test Cases, Test Procedure,
Test Script & Defect Report are documented by Quality Assurance Engineers or Test
Engineers.
Test Policy & Test Strategy are Company Level Documents. Test Methodology, Test
Plan, Test Cases, Test Procedure, Test Script, Defect Report & Final Test Summary Report
are Project Level Documents.
1) TEST POLICY:
Testing Standards : One defect per 250 lines of code or 10 FP (Functional points)
Signature: C.E.O
TTM: Testing Team Measurements: how much testing is over & how much is yet to complete.
PCM: Process Capability Measurements: carried from old projects to the upcoming projects.
2) TEST STRATEGY:
This is a company-level document developed by Quality Analyst or Project Manager category people; it defines the "testing approach".
Components:
Testing Issues:
3) TEST METHODOLOGY:
PET Process:
[Figure: testing process flow, from Information Gathering (Business Requirement Specifications) through Unit & Integration testing to Test Closure.]
4) TEST PLANNING:
After finalization of the possible tests for the current project, Test Lead category people concentrate on test plan document preparation to define work allocation in terms of what, who, when & how to test. To prepare the test plan document, the test plan author follows the below approach:
1] Team Formation:
In general, the test planning process starts with testing team formation. To define a testing team, the test plan author depends on the below factors:
1. Availability of testers
2. Test duration
3. Availability of test environment resource
After testing team formation, the plan author analyses possible risks & their mitigation (e.g., ad hoc testing).
# Risk 7: Lack of communication between Test Engineers and the test team, and between the test team and the development team.
After completion of testing team formation & risk analysis, the test plan author concentrates on the test plan document in IEEE format.
The above (3), (4) & (5) decide which modules are to be tested -> What to test?
12) Staff & Training: names of selected Test Engineers & the training requirements for them
13) Responsibilities: work allocation to every member of the team (dependable modules are given to a single Test Engineer)
14) Schedule: dates & times of testing modules
15) Risks & Mitigations: possible testing-level risks & the solutions to overcome them
16) Approvals: signatures of the test plan author & Project Manager / Quality Analyst
After completion of plan document preparation, the test plan author conducts a review for completeness & correctness. In this review, the plan author follows the below coverage analysis.
5) TEST DESIGNING:
After completion of test planning & the required training for the testing team, the corresponding testing team members start preparing the list of test cases for their responsible modules. There are three types of test case design methods to cover core-level testing (Usability & Functional testing).
In general, Test Engineers prepare a set of test cases depending on the use cases in the S/w RS. Every use case describes a functionality in terms of inputs, process & output; depending on these use cases, Test Engineers prepare test cases to validate the functionality.
From the above model, Test Engineers prepare test cases depending on the corresponding use cases, & every test case defines a test condition to be applied.
To prepare test cases, Test Engineers study the use cases with the below approach:
Steps:
Use Case I:
A login process allows a user id & password to validate users. During these validations, the login process allows a user id in alphanumerics, 4 to 16 characters long, & a password in lowercase alphabets, 4 to 8 characters long.
Case study (ECP):
Input      Valid                            Invalid
User id    alphanumerics, 4 to 16 chars     special characters; fewer than 4 or more than 16 chars
Password   lowercase alphabets, 4 to 8 chars  uppercase, digits, special characters; fewer than 4 or more than 8 chars
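The equivalence classes for Use Case I can be sketched as code; the validator functions below are an assumed implementation of the stated rules (user id: alphanumeric, 4 to 16 characters; password: lowercase alphabets, 4 to 8 characters), used only to exercise the valid and invalid classes.

```python
# Sketch of the Use Case I login rules with ECP classes as test data.
import re

def valid_user_id(uid):
    # alphanumeric, 4 to 16 characters
    return bool(re.fullmatch(r"[A-Za-z0-9]{4,16}", uid))

def valid_password(pwd):
    # lowercase alphabets, 4 to 8 characters
    return bool(re.fullmatch(r"[a-z]{4,8}", pwd))

# Valid equivalence classes
assert valid_user_id("user1") and valid_password("secret")
# Invalid classes: too short, too long, wrong character type
assert not valid_user_id("ab1")       # below min length
assert not valid_user_id("x" * 17)    # above max length
assert not valid_password("SECRET")   # uppercase not allowed
assert not valid_password("abc")      # below min length
print("all ECP checks passed")
```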
During test design, Test Engineers write the list of test cases in IEEE format.
5) Actual:
6) Result: -> Test Execution
7) Comments:
Note: Test Engineers follow the list of test cases along with step-by-step procedures only.
Example 1:
Prepare a test procedure for the below test case: "Successful file save operation in Notepad".
b) Input Domain based Test Case design (E-R diagrams / Data Models)
In general, Test Engineers prepare most test cases depending on the use cases or functional requirements in the S/w RS. These functional specifications provide functional descriptions with input, output & process, but they do not provide information about the size & type of input objects. To collect this type of information, Test Engineers study the data model of their responsible modules (E-R diagrams). During the data model study, Test Engineers follow the below approach:
Steps:
Critical fields: A/C Name, Balance
Non-critical fields: Address
Note: In general, Test Engineers prepare step-by-step procedure-based test cases for functional testing. Test Engineers prepare valid/invalid table-based test cases for input-domain-of-object testing {Data Matrix}.
To conduct Usability testing, Test Engineers prepare a list of test cases depending on the organization's user interface standards or conventions, global user interface rules & the interests of customer-site people.
1) Spelling check
2) Graphics check (Screen level Align, Font style, Color, Size & Microsoft six rules)
3) Meaning of error messages
4) Accuracy of data displayed
5) Accuracy of data in the database as the result of user inputs; if the developer restricts the data at the database level by rounding/truncating, then the developer must also restrict the data in the front-end as well
6) Accuracy of data in the database as the result of external factors, e.g. file attachments
7) Meaningful help messages (Manual support testing)
After completion of all possible test case preparation for their responsible modules, the testing team concentrates on a review of the test cases for completeness & correctness. In this review, the testing team applies coverage analysis.
Test Case Review
1) BR based coverage
2) Use Cases based coverage
3) Data model based coverage
4) User Interface based coverage
5) TRM based coverage
At the end of this review, the Test Lead prepares the Requirement Traceability Matrix or Requirement Validation Matrix (RTM / RVM).
The RTM / RVM model defines the mapping between customer requirements & the test cases prepared to validate those requirements.
6) TEST EXECUTION:
After completion of test case selection & review, the testing team concentrates on the build release from the development side & test execution on that build.
[Figure: test execution levels. Level-1 Comprehensive testing; defect reports go to development for defect fixing; Level-2 Regression testing on the modified build; Level-3 Final Regression.]
c) Build Version Control:
In general, Test Engineers receive the build from development in the below model:
[Figure: developers place the build on a server; testers download it into the testing environment.]
During test execution, Test Engineers receive modified builds from the development team. To distinguish old & new builds, the development team maintains a unique version number in the system, which is understandable to the tester/testing team. For this version controlling, developers use version control tools (e.g. Visual SourceSafe).
During Sanity testing, Test Engineers observe the below factors on the build:
1) Understandable
2) Operable
3) Observable
4) Consistent
5) Controllable
6) Simple
7) Maintainable
8) Automatable
From the above 8 testability issues, Sanity testing is also known as Testability testing / Octangle testing.
e) Test Automation:
If test automation is possible, then the testing team concentrates on test script creation using the corresponding testing tools. Every test script consists of navigation statements along with the required checkpoints.
After completion of Sanity testing & possible test automation, the testing team concentrates on test batch formation with dependent test cases. Test batches are also known as test suites/sets. During test batch execution, Test Engineers prepare a test log document; this document consists of three types of entries.
Case I:
If the development team resolves bugs whose severity is high, Test Engineers re-execute all P0, P1 & carefully selected P2 test cases on the modified build.
Case II:
If the bug severity is medium, then all P0, carefully selected P1 & some P2 test cases.
Case III:
If the bug severity is low, then some P0, P1 & P2 test cases.
Case IV:
If the development team released a modified build due to sudden changes in the project requirements, then Test Engineers re-execute all P0, P1 & carefully selected P2 test cases w.r.t that requirement modification.
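The severity-based case rules above can be sketched as a small selection function; the function name and the returned labels are illustrative, not a standard API.

```python
# Sketch of severity-driven regression suite selection (Cases I-III above).

def regression_suite(severity):
    """Which priority classes of test cases to re-execute on a modified build."""
    if severity == "high":
        return {"P0": "all", "P1": "all", "P2": "selected"}
    if severity == "medium":
        return {"P0": "all", "P1": "selected", "P2": "some"}
    if severity == "low":
        return {"P0": "some", "P1": "some", "P2": "some"}
    raise ValueError("severity must be high, medium, or low")

assert regression_suite("high")["P1"] == "all"
assert regression_suite("medium")["P1"] == "selected"
print(regression_suite("low"))
```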
7) TEST REPORTING:
During comprehensive testing, Test Engineers report mismatches as defects to developers in IEEE format.
Defect Age: The time gap between “reported on” & “resolved on”
Defect Resolution:
After receiving a defect report from the testers, developers review the defect & send a resolution type to the tester as a reply.
01) Duplicate: rejected because the defect is the same as a previously reported defect
02) Enhancement: rejected because the defect relates to a future requirement of the customer
03) H/w limitation: rejected because the defect was raised w.r.t a limitation of hardware devices
04) S/w limitation: rejected because the defect was raised w.r.t a limitation of software technology
05) Not applicable: rejected because the defect has no proper meaning
06) Functions as designed: rejected because the coding is correct w.r.t the design documents
07) Need more information: neither accepted nor rejected; developers require extra information to understand the defect
08) Not reproducible: neither accepted nor rejected; developers require the correct procedure to reproduce the defect
09) No plan to fix it: neither accepted nor rejected; developers want extra time to fix
10) Fixed: developers accepted and will resolve
11) Fixed indirectly: accepted, but not to be resolved in this version (deferred)
12) User misunderstanding: needs extra negotiation between the testing & development teams
Types of defects:
8) TEST CLOSURE:
After completion of all possible test cycle executions, the Test Lead conducts a review to estimate the completeness & correctness of testing. In this review, the Test Lead considers the below factors with the Test Engineers:
1) Coverage Analysis
a) BR based coverage
b) Use Cases based coverage
c) Data model based coverage
d) User Interface based coverage
e) TRM based coverage
2) Bug density
At the end of this review, the testing team concentrates on high-bug-density modules, or on all modules if time is available.
After completion of User Acceptance testing & its modifications, the Test Lead concentrates on the final test summary report creation. It is a part of the Software Release Note (S/w RN). This final test summary report consists of the below documents:
• Bug Description
• Feature
• Found By
• Status (closed / deferred)
• Comments
This product maintains a default administrator to create new users, and every valid user can search data in the database.
[Figure: application running on localhost. An Admin (or user) logs in; invalid users are rejected; valid users submit search keys against the existing database.]
FUNCTIONAL POINTS: -
→ Last name
→ Last with first name
→ Last name with first name and d-o-b
→ Last name with first name and age
→ Customer id only
The "search records" window consists of "start search" and "stop search" buttons.
It allows the last name as full or partial.
It allows the customer id as full or partial, with * as a wild card.
Example: a customer id entered as 286* matches every customer id beginning with 286.
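The wild-card behaviour described above can be sketched with the standard library's `fnmatch` module; the customer ids below are made-up sample data.

```python
# Sketch of partial customer-id search with '*' as a wild card.
import fnmatch

customer_ids = ["28613", "28677", "29001", "28650"]
pattern = "286*"                     # partial id with wild card

matches = [cid for cid in customer_ids if fnmatch.fnmatch(cid, pattern)]
assert matches == ["28613", "28677", "28650"]
print(matches)
```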
7. Entry criteria: -
→ Are the necessary documents available?
→ Is the "X" product ready for release from the developers?
→ Is the supporting database available to search required records?
→ Is the test environment ready?
8. Suspension criteria:
→ Database disconnection may require suspension of testing.
→ Suspension of testing is mandatory when the record search process goes into an infinite loop.
→ If the admin fails to create a new user, a decision can be made to continue testing on the "searching records" module using the admin or other existing valid users.
9. Exit Criteria: -
→ Ensure that the "X" product provides the required services.
→ Ensure that all test documents have been completed and are up to date.
→ All high-severity bugs are resolved.
14. Responsibilities:
Document / Report          Responsibility               Effort
Test completion reports    Test Lead with K Srinaiah    At the end of the testing
15. Schedule:
Task Effort Start date End date
2. Lack of time
→ Over-time
17. APPROVALS:
Signature of PM and signature of test lead.

Test case 1:
BVA (size)
Min = 4 chars → pass
Min + 1 = 5 chars → pass
Min – 1 = 3 chars → fail
Max = 8 chars → pass
Max – 1 = 7 chars → pass
Max + 1 = 9 chars → fail
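The boundary values listed above (a size rule of 4 to 8 characters) can be sketched as code; the checker function is an assumed implementation used only to exercise the min/max boundaries and the values one step inside and outside them.

```python
# Sketch of Boundary Value Analysis on a size rule of 4 to 8 characters.

def length_ok(text, lo=4, hi=8):
    return lo <= len(text) <= hi

boundaries = {
    3: False,  # min - 1 -> fail
    4: True,   # min     -> pass
    5: True,   # min + 1 -> pass
    7: True,   # max - 1 -> pass
    8: True,   # max     -> pass
    9: False,  # max + 1 -> fail
}
for size, expected in boundaries.items():
    assert length_ok("a" * size) == expected, f"boundary {size} failed"
print("BVA boundary checks passed")
```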
Test case 6: unsuccessful selection of this option because the current user is not admin.
Test case 10: unsuccessful new user creation because the given user id is not unique.
Test case 11: successful closing of the new user creation window using cancel (after entering the user id and after entering the password).
BVA (range):
Day: min → 01
Year: min → 00, max → 99
min → 01, max → 22

ECP (type):
Valid: 0-9
Invalid: a-z, A-Z, special characters
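The ECP (type) rule above, with digits 0-9 as the valid class and alphabets/special characters as invalid classes, can be sketched as follows; the checker and the two-character field width (following the dd\mm\yy example later in these notes) are assumptions.

```python
# Sketch of ECP (type) checking on a two-digit d-o-b field.

def valid_dob_part(text, width=2):
    return len(text) == width and text.isdigit()

assert valid_dob_part("01")       # valid class: digits only
assert valid_dob_part("99")       # valid: year max
assert not valid_dob_part("ab")   # invalid class: alphabets
assert not valid_dob_part("1@")   # invalid class: special character
assert not valid_dob_part("123")  # invalid: wrong width
print("d-o-b ECP checks passed")
```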
Test case 18: successful entry of age to
Test case 19: successful display of matched records with full last name and all other fields blank
Test case 20: successful display of matched records with full last name and first name
Test case 21: successful display of matched records with full last name, first name and d-o-b
Test case 22: successful display of matched records with full last name, first name and age
Test case 23: unsuccessful search operation due to an invalid combination of filled fields
Test case 25: unsuccessful search operation due to 'age from' being greater than 'age to'
Test case 26: successful display of records with search key as partial last name and other fields blank
Test case 27: successful display of records with search key as partial last name and first name
Test case 28: successful display of records with search key as partial last name, first name and d-o-b
Test case 29: successful display of records with search key as partial last name, first name
Test case 30: successful display of records with search key as customer id
Test case 31: successful display of records with search key as partial customer id with * as wild card
Test case 32: unsuccessful display of records due to no matching records in the database w.r.t the given search keys
Test case 33: unsuccessful display of records due to "too many records to display" when the number of matched records is greater than 1000
Test case 35: successful termination of the search operation when clicking the "stop search" button
Test case 36: successful closing of the records window by clicking OK after searching
Test case 37: spelling check in every screen
Test case 39: accuracy of data displayed, e.g. d-o-b as dd\mm\yy