Version 1.2
Document Control
Document Approval/Signatures:
| Date | Approver | Role | Version | Signature |
1 Introduction
1.1 Purpose of this Document
1.2 Project Overview
1.3 In Scope
1.4 Out of Scope
1.5 Assumptions
1.6 Document References
2 Non-Functional Test Overview
2.1 Performance Testing
2.2 NFT Types
2.3 Technical Test Objective
2.4 Other Non-Functional Test Phase Objectives
2.5 Non-Functional Test Schedule and Milestones
3 Test Approach
3.1 Test Planning and Requirements Gathering
3.2 Test Design
3.3 Test Development
3.4 Test Execution
3.5 Test Monitoring
3.6 Test Delivery
3.7 Test Data
3.8 Testing Standards
3.9 Progress Reporting
3.10 Contingencies
4 Test Tools
4.1 Non-Functional Testing
5 Resourcing
5.1 Test Phase Organization
5.2 Roles and Responsibilities
6 Pre-Production Environment (ELAB)
6.1 Application Environment
6.2 Test Environment
7 Production Environment (LIVE)
7.1 Application Environment
7.2 Test Environment
8 Risks, Issues and Dependencies
8.1 Risks & Mitigation
Project Neptune was established to outsource the Lloyds Cash Management Operation to Securicor Cash
Centres (SCC) to deliver an improved service at lower cost. As a result, LTSB transferred bank cash centres,
staff and assets to SCC. The IT systems will support cash accounting and ensure overall control of the
outsourced operation.
Cash management owned by LTSB is outsourced to Securicor Cash Centres. Following the outsourcing, SCC
provides a full range of cash processing functions, including the electronic supply of transaction
information back to LTSB for LTSB branches, ATMs, bulk customers, bulk cash tills and coin stores. Files can
be created and exchanged between SCC and LTSB via a secure Connect:Direct (TCP/IP) connection on
all working weekdays (not weekends or Bank Holidays). Cash management and forecasting is handled
by a web-based tool called iCom.
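The weekday-only transfer window described above can be expressed as a simple calendar check. This is a purely illustrative Python sketch, not part of the actual Connect:Direct configuration; the bank-holiday date used is a placeholder, and a real implementation would take the holiday list from a calendar feed.

```python
import datetime

# Placeholder bank-holiday list; in practice this would come from a
# calendar feed, not a hard-coded set.
BANK_HOLIDAYS = {datetime.date(2008, 8, 25)}  # illustrative UK bank holiday

def is_transfer_day(d: datetime.date) -> bool:
    """Files are exchanged on working weekdays only: Monday-Friday,
    excluding bank holidays."""
    return d.weekday() < 5 and d not in BANK_HOLIDAYS

print(is_transfer_day(datetime.date(2008, 7, 14)))  # a Monday -> True
print(is_transfer_day(datetime.date(2008, 7, 12)))  # a Saturday -> False
```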
1.3 In Scope
The testing covered by this document for the Britannia project includes:
Performance tuning of the ICOM application and of the ICOM and WMS batch jobs.
An extensive code review against norms other than performance and scalability (for example, code quality
and adherence to standards or best practices) will not be part of this engagement.
The scope will be redefined, in consultation with the appropriate Test Manager and the Project CIO, in the
event that inherent system limitations are noticed during the course of the project.
1.5 Assumptions
The following assumptions have been made when defining the Non-Functional Test Plan:
The controlled environment (ELAB) is configured and ready for testing for two weeks.
The controlled environment (Production) is configured and ready for testing for the remaining weeks.
The ELAB and Production environments are similar in all respects, including the network.
Test data will be provided by the respective team (System Testing team or Development team)
for each test execution.
The Capacity Planning team will monitor the servers during NFT execution.
Performance testing and monitoring tools are available, with sufficient licenses.
Technical and functional support from the Britannia team is available to support Non-Functional testing of
the Britannia application.
Any Non-Functional Testing done using simulators will be carried out with the help of the System testers.
Demonstrate that the non-functional requirements defined in the Business Requirements have been met
by the end-to-end design and infrastructure build.
Demonstrate that the Britannia changes cause no degradation of, or impact on, existing performance and
operational service.
The results of the performance testing should meet the response times specified in the NFR document.
The table below defines the types of Non-Functional testing that will be performed for the ICOM application
and the ICOM & WMS batch jobs during the different NFT phases:

| S.No | Type of Testing | Modules |
| 1 | Load Test | ICOM Application |
| 2 | Expected Response Times | ICOM Application, Batch Jobs for ICOM & WMS |
| 3 | Stress Test | ICOM application |
| 4 | Scalability Test | ICOM application |
The test data used by the functional test team will be reused for the NFT activity, or data from the TD
Payments team, ICOM team or Fiserv team will be used.
WMS Jobs
S4A1
S4A2
S4A3
S4A4
S1
S4B1
DBC Processor
Limit Application
RTF Implementer
RTF Processor
Indirect Customer charging
S4export
Resource:
The schedule for the planned executions can be found in the below table:
The diagram below illustrates the approach used for performance testing the Britannia application. Test
runs through the online interfaces will be coordinated and executed by the Performance Test Team, based on
the execution procedure provided by the Development team.
The following major tasks will be completed as part of the test planning stage:
Interact with the application's business and technical owners to determine the requirements for
performance testing.
Understand the application architecture and evaluate the scope and requirement documents to estimate the
performance test effort and compile the performance test planning material.
The Non-Functional Test Plan (this document) is produced in the planning stage; it details all testing
activities, tasks, durations and resources, from which team and individual work plans are produced.
The following major tasks will be completed as part of the test design stage:
Collect application and test environment details from the architecture team and plan a process for
usage of the environment.
Interact with business experts to determine the process flows.
Identify all data necessary to execute the performance test scripts.
The number of VUsers will be calculated based upon the results of the smoke test.
Build automated performance test scripts for the identified business flows/traversals using the
LoadRunner Web (HTTP/HTML) protocol for the ICOM application.
Link the test scripts to the associated test data (data from the System testers or the core team will be used).
Verify each script by running it individually against the identified traversals and confirm its integrity.
If the application changes, the test scripts will be re-worked as needed and re-validated.
Simulators will be used to inject load into the ICOM application during performance testing.
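The VUser calculation from smoke-test results is commonly done with Little's Law: concurrent users = throughput × (response time + think time). A minimal sketch of that arithmetic, in which every figure is an assumed placeholder rather than a value from the actual smoke test:

```python
# Hypothetical smoke-test figures -- replace with measured values.
target_throughput_tps = 5.0  # target transactions per second at peak load
avg_response_time_s = 2.0    # average response time measured in the smoke test
think_time_s = 8.0           # assumed pause between user actions

# Little's Law: concurrent users = throughput x (response time + think time)
vusers = target_throughput_tps * (avg_response_time_s + think_time_s)
print(f"Estimated virtual users needed: {vusers:.0f}")  # -> 50
```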
Non-Functional testing will commence only when the system is proven to be functionally stable. A sub-set of the
listed non-functional tests will be executed onsite, in a priority order agreed by all stakeholders involved.
Scheduling will depend on the availability of environments and resources.
The following section details how the test execution and analysis stages will be conducted:
Test Types:
1. Load Test
Definition
Linearly increase the transaction volumes until the expected production load is reached, and gauge the
ability of the systems to support the full load on the application.
Objectives
Linearly increase the number of concurrent user sessions in order to find the maximum
number of users the application server can handle within the given response-time criteria.
Identify the throughput and the number of requests/sec (hits/sec) at this user load.
Resource Needs
Sufficient data and resources should be provided for all the web transactions identified for
testing. Please refer to the Test Data section (section 3.7).
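LoadRunner itself manages the ramp-up schedule; purely as an illustration of the linear ramp-up idea described above, the following Python sketch grows batches of simulated user sessions in fixed steps. The session body is stubbed out, and the step sizes are placeholders rather than figures from this test plan.

```python
import concurrent.futures
import time

TARGET_USERS = 20  # placeholder target concurrency
STEP = 5           # placeholder ramp-up increment

def user_session() -> float:
    """Stand-in for one scripted traversal; a real load test would issue
    the recorded HTTP requests for the ICOM application here."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server response time
    return time.perf_counter() - start

timings = []
with concurrent.futures.ThreadPoolExecutor(max_workers=TARGET_USERS) as pool:
    # Ramp up linearly: 5, 10, 15, then 20 concurrent sessions per step.
    for users in range(STEP, TARGET_USERS + 1, STEP):
        futures = [pool.submit(user_session) for _ in range(users)]
        timings.extend(f.result() for f in futures)

print(f"completed {len(timings)} sessions")  # 5 + 10 + 15 + 20 = 50
```

At each step, hits/sec can be approximated as the number of active users divided by the mean response time, which is how the throughput objective above would be tracked.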
2. Stress Test
Definition
Start with the given number of users and increase the total number of transactions until 2-3 times the peak
volume is achieved, and gauge system resource utilization, page-load response times and application
scalability.
Resource Needs
Same as the Load Test (the quantity of data required will be two to three times the Load Test
requirement).
This section details the monitoring tools that will be deployed in the test environment to support the performance
test team in identifying performance bottlenecks, and to assist in tuning the systems, applications and databases
for improved performance.
The Capacity Planning team will monitor the servers during NFT test execution.
The Capacity Planning team will share the logs with the NFT team for analysis.
A detailed list of counters for each server will be shared for monitoring.
The following metrics will be monitored during the load-testing process to draw conclusions on system
performance and to fine-tune it:
| S.No | Reports | Presented As |
| 1 | Client Side statistics | |
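As an illustration of how the collected client-side statistics might be checked against the NFR response-time targets, the sketch below computes a nearest-rank 90th percentile. The sample values and the 5-second threshold are assumptions for the example, not figures from the NFR document.

```python
def percentile(samples, pct):
    """Nearest-rank percentile: the value at rank ceil(pct/100 * N)."""
    ordered = sorted(samples)
    rank = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[rank]

# Illustrative response-time samples in seconds.
response_times_s = [1.2, 0.9, 2.5, 1.1, 3.8, 1.4, 0.7, 2.2, 1.9, 4.6]
p90 = percentile(response_times_s, 90)
nfr_limit_s = 5.0  # assumed NFR target, for illustration only
print(f"90th percentile: {p90}s -> {'PASS' if p90 <= nfr_limit_s else 'FAIL'}")
```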
All documents and reports should strictly adhere to the NFT templates and contain enough information to
allow the tests to be re-run by a different tester. No information should be retained solely by an individual
tester. The minimum requirement is:
Test plan
Test Objectives
Pass/Fail criteria
Project Diary
A project diary should be maintained for each project and stored for reference after the test has finished. It
should record, as a minimum, the script, scenario, result and analysis file names, and any issues that are
discovered.
A daily status report on the tests run, scripts created and issues faced will be sent to the NFT Lead.
3.10 Contingencies
Please refer to sec. 7.1
The Capacity Planning team's support is required for monitoring the servers during NFT execution.
| S.No. | Type | Server Name / IP Address | Hardware Requirements | Software |

| S.No. | Type | Server Name / IP Address | Hardware Requirements | Software |
The objective of this section is to identify any known risks to the testing of the project, how they will be
managed and what contingency plans can be put in place. All risks and issues will additionally be managed
through the Project Risks & Issues Log.
| # | Risk | Mitigation |
| 3 | Resource: Testing will be executed by the onsite team; the onsite team will require clarifications from all the teams during the execution. The availability of the above people at the right time is subject to risk. | The Test Team will co-ordinate with the concerned teams and will ensure their availability for discussions and clarifications. |
| | Analysis of some areas needs assistance from experts in other teams. The availability of the above people at the right time is subject to risk. | The Test Team will co-ordinate with the concerned teams and will ensure their availability for discussions and clarifications. |
All issues arising during execution will be compiled in an issue log sheet. The Test Lead will be responsible for
tracking all issues to closure.
Issues are captured in the DSR.
8.3 Dependencies
Support is required from the ICOM, TD Payments, Capacity Planning, DBS and SIT teams to carry out the
NFT.
| Version # | Date | Section # | Details | Prepared by | Approved by |
| Draft | 30-June-08 | | Draft version | Raja P | |
| 1.0 | 09-Jul-08 | 1.5, 3.4, 3.5, 4.0, 8.0, 8.3, 9.0 | Comments updated | Raja P | Rakesh V |
| 1.1 | 11-Jul-08 | 2.2, 2.3, 2.4, 3.3, 5.1, 5.2, 9.0 | Comments updated | Raja P. | Shankar |
| 1.2 | 14-Jul-08 | 3.2, 6.2, 7.2, 9.0, 10.0 | Pending details updated | Raja P. | |