
Lloyds TSB – Britannia

Non Functional Test Plan

Version 1.2
Document Control

Document Modification Control:


Date Author Role Version Comments Reviewer
30-Jun-08 Raja P. NFT - Lead 0.1 Initial Version Rakesh V
09-Jul-08 Raja P. NFT - Lead 1.0 Updated the comments Shankar
11-Jul-08 Raja P. NFT - Lead 1.1 Updated the comments
14-Jul-08 Raja P. NFT - Lead 1.2 Pending details updated

Document Approval/Signatures:
Date Approver Role Version Signature

Distribution List for Information:


Date Name Role Version Comments
Debashis Functional Test Manager
Girish Sr. Test Lead
Mark NFT Manager
Rajkumar

TABLE OF CONTENTS

1 Introduction 5
1.1 Purpose of this Document 5
1.2 Project Overview 5
1.3 In Scope 5
1.4 Out of Scope 5
1.5 Assumptions 6
1.6 Document References 6
2 Non Functional Test Overview 7
2.1 Performance Testing 7
2.2 NFT Types 7
2.3 Technical Test Objective 8
2.4 Other Non Functional Test Phase Objectives 9
2.5 Non Functional Test schedule and milestones 11
3 Test Approach 12
3.1 Test Planning and Requirements Gathering 12
3.2 Test Design 12
3.3 Test Development 13
3.4 Test Execution 14
3.5 Test Monitoring 15
3.6 Test Delivery 16
3.7 Test Data 17
3.8 Testing Standards 17
3.9 Progress Reporting 17
3.10 Contingencies 17
4 Test Tools 18
4.1 Non-Functional Testing 18
5 Resourcing 19
5.1 Test Phase Organization 19
5.2 Roles and Responsibilities 19
6 Pre - Production Environment (ELAB) 20
6.1 Application Environment 20
6.2 Test Environment 20
7 Production Environment (LIVE) 21
7.1 Application Environment 21
7.2 Test Environment 21
8 Risks, Issues and Dependencies 22
8.1 Risks & Mitigation 22

8.2 Issues 23
8.3 Dependencies 23
9 Outstanding actions 24
10 Change Log 25

1 Introduction

1.1 Purpose of this Document


The purpose of this document is to explain the approach and procedures followed in the Non-Functional testing of
Britannia. It is intended for the Test Managers of Lloyds TSB and other stakeholders interested in the Britannia
testing activity.

1.2 Project Overview

Project Neptune was established to outsource the Lloyds Cash Management Operation to Securicor Cash
Centres (SCC) to deliver an improved service at lower cost. As a result, LTSB transferred bank cash centres,
staff and assets to SCC. The IT systems will support cash accounting and ensure overall control of the
outsourced operation.

Cash management owned by LTSB is outsourced to Securicor Cash Centres. After outsourcing, SCC provides
the full range of cash processing functions, including the electronic supply of transaction information back to
LTSB for LTSB branches, ATMs, bulk customers, bulk cash tills and coin stores. Files can be created and
exported to LTSB by SCC, and vice versa, over a secure Connect:Direct (TCP/IP) connection on all working
weekdays (not weekends or Bank Holidays). Cash management and forecasting are handled by a web-based
tool called iCom.

1.3 In Scope

The testing in scope for this document, within the Britannia project, includes:

• ICOM Application (refer to Section 2.1)

• ICOM Batch Jobs (refer to Section 2.4)

• WMS Batch Jobs (refer to Section 2.4)

1.4 Out of Scope

The following is out of scope for this document:

• Performance tuning of the ICOM application and of the ICOM and WMS batch jobs.
• Extensive code review for criteria other than performance and scalability (for example, quality of code,
adherence to standards or best practices) will not be part of this engagement.
• The scope will be redefined, in consultation with the appropriate Test Manager and Project CIO, in the event
of inherent system limitations being noticed during the course of the project.
• Hardware sizing / capacity planning.

1.5 Assumptions

The following assumptions have been made when defining the Non Functional Test Plan:

• The application is functionally stable for NFT.

• The controlled environment (ELAB) is configured and ready for testing for the first two weeks.

• The controlled environment (Production) is configured and ready for testing for the remaining weeks.

• The ELAB and Production environments are similar in all respects, including the network.

• The System Testing environment is configured and ready for scripting.

• The LoadRunner environment is configured and available for test execution.

• Test data is provided by the respective team (System Testing team or Development team) for each test
execution.

• The Capacity Planning team will monitor the servers during NFT execution.

• Performance testing and monitoring tools are available with sufficient licences.

• Technical and functional support from the Britannia team is available to support Non-Functional testing of
the Britannia application.

• Non-Functional testing done using any simulators will be carried out with the help of the System testers.

1.6 Document References

The following documents are referenced in this review:


Document Name                                      Version   Author
Cash Management Time Lines                         –         Rajkumar
Cash Management _Overview                          –         Rajkumar
CF Payments CFY068 - E2EPD                         1.2       Debashis
ICOM_Screenshots                                   –         Rajkumar
Britannia PKI Master Test Plan                     1.0       Debashis
NFT NFR                                            0.5       Debashis
Solution Overview                                  –         Debashis
Non_Functional_Test_Matrix_v0 4                    0.4       Girish
Britannia_Other_Non_Functional_Test Phase_0.1      0.1       Raja
ICOM screen shots                                  –         Vaishali

2 Non Functional Test Overview
The overall objectives of the Non-Functional testing are to:

• Demonstrate that the non-functional requirements defined in the Business Requirements have been met
by the end-to-end design and infrastructure build.

• Demonstrate that the Britannia changes have no degradation of, or impact on, existing performance and
operational service.

2.1 Performance Testing


Non-Functional testing will be carried out for the following components during Performance testing:
• ICOM application
• Cash Orders
  o Branch ATM Orders
  o RATM Orders
  o Branch Orders
• Confirmations
  o Branch ATM Orders
  o RATM Orders
  o Branch Orders

The results of the performance testing should meet the response times specified in the NFR document.

2.2 NFT Types

The table below defines the types of Non-Functional testing that will be performed for the ICOM application
and the ICOM & WMS batch jobs across the Performance test and the other NFT types.

S.No   Type of Testing            Modules
1      Load Test                  ICOM Application
2      Expected Response Times    ICOM Application, Batch Jobs for ICOM & WMS
3      Stress Test                ICOM Application
4      Scalability Test           ICOM Application

2.3 Technical Test Objective
The aim of the Technical Test is to measure the ability of the application to process all of the business functions
within the timeframes or rates of concurrency specified by the service level agreements (SLA). The tests
detailed below should be carried out during this test phase if applicable and there is a requirement to do so;
the business or other requirements will usually determine this. The testing specified below will be carried out
during Performance testing and Non-Functional testing.

The test data used by the functional test team will be reused for the NFT activity, or data from the TD Payments,
ICOM or Fiserv teams will be used.

Technical Testing          Included   Comments
Load Test                  Yes        Load test will be performed for the ICOM application.
Expected Response Times    Yes        Response times will be captured as part of the Load test. This test will be
                                      performed for the ICOM application.
Stress Test                Yes        Please refer to Section 3.4. This test will be performed for the ICOM
                                      application.
Scalability Test           Yes        Stress tests help identify the maximum number of users the system can
                                      handle. This test will be performed for the ICOM application.

Refer to the Non_Functional_Test_Matrix_v0 4 document for more information on each test type.

2.4 Other Non Functional Test Phase Objectives

Accessibility
Testing to ensure that the application complies with the Disability Discrimination Act (DDA) and that processes
and systems do not disadvantage users with disabilities.
This test will be performed for the following requirement:
• The user interface should comply with DDA requirements.
Resource:

Archiving
The aim of this test is to verify that the archiving function works according to specification and that any
pre-determined limits or constraints are tested for. It is possible that this testing has taken place as part of
Functional Testing. The level of testing required will differ if the project introduces a new application(s) or
changes to an existing one.
This test will be performed for the following requirements:
• The archive must have the capacity to archive all orders and associated fulfillment files and customer
details that are over 7 years old, and store the records for 10 years.
• The system archives all reconciled ledger and statement items that are over 2 years old.
• The data stored in the iCom and SQL Server Fulfillment database will need to be backed up and archived
daily (Mon – Fri).
Resource:

Recovery Test
The aim of this test is to ensure that the system can be brought back into normal operation in the case of a
failure such as a server disk failure or data corruption. The level of testing required will differ if the project
introduces a new application(s) or changes to an existing one.
This test will be performed for the following requirements:
• The system is backed up and recoverable within 1 hour.
• Hardware contingency for the data file transfers will be required at both ends (LTSB and G4S). Data
transferred/received must not be lost. In the event of data transfer failure, the data will need to be
re-transferred within 1 hour of transfer failure notification.
• All data files sent to and received from G4S/Loomis/Security Plus must be backed up nightly. The time
window for performing these backups should be between 4am and 8am.
Resource:

Security Test
Testing to determine the security of the software product, i.e. penetration testing. Usually conducted via IT
Security on customer-facing applications.
This test will be performed for the following requirements:
• The system should use SSO to support separation of concerns with regard to provisioning and
authorization.
• Leased lines should be used for communication to 3rd parties.
• Externally communicated transmissions need to be encrypted.
Resource:

Batch Run Time
Ensures that any Batch processes required execute successfully and in acceptable timeframes.
This test will be performed for the following jobs:

ICOM Jobs list
• Nightly Default Main
• Nightly Main
Resource:

WMS Jobs
• S4A1
• S4A2
• S4A3
• S4A4
• S1
• S4B1
• DBC Processor
• Limit Application
• RTF Implementer
• RTF Processor
• Indirect Customer charging
• S4export
Resource:

2.5 Non Functional Test schedule and milestones

The schedule for the planned executions can be found in the below table:

Britannia NFT - Schedule

Activities                                            Planned Start   Planned End   Actual Start   Actual End

Requirements Gathering and Test Planning Stage
Requirements gathering and planning                   23/06/2008      04/07/2008    23/06/2008
Test Plan creation and submission                     03/07/2008      11/07/2008    03/07/2008
MILESTONE - Performance Test Plan Sign Off            11/07/2008      11/07/2008

Test Design Stage
Decompose Critical Business Transactions              21/07/2008      21/07/2008
Test Environment and tool verification                08/07/2008      10/07/2008
MILESTONE - Transaction Traversal Document            21/07/2008      21/07/2008

Test Development Stage
Performance Test Environment Deployment and Validation 09/07/2008     21/07/2008
Build Test Data                                       22/07/2008      23/07/2008
Build Test Scripts                                    23/07/2008      30/07/2008
MILESTONE - Smoke Test                                30/07/2008      30/07/2008

Test Execution
Dry Run                                               31/07/2008      31/07/2008
Test Execution & preliminary report preparation       31/07/2008      11/08/2008
MILESTONE - Execution Complete                        11/08/2008      11/08/2008

Report Generation
Report Generation                                     12/08/2008      15/08/2008
MILESTONE - Final Comprehensive Report                15/08/2008      15/08/2008

Other NFT Tests
Batch Test                                            21/07/2008      14/08/2008
Accessibility Test                                    04/08/2008      08/08/2008
Security Test                                         04/08/2008      14/08/2008
Availability                                          11/08/2008      14/08/2008
Archiving & Purge                                     04/08/2008      08/08/2008
Recovery                                              11/08/2008      14/08/2008
MILESTONE - Final NFT Test Report                     14/08/2008      15/08/2008

3 Test Approach

The diagram below illustrates the approach used for performance testing the Britannia application. Test runs
through the online interfaces will be coordinated and executed by the Performance Test Team, based on the
procedure for execution provided by the Development team.

3.1 Test Planning and Requirements Gathering

The following major tasks will be completed as part of the test planning stage:

• Interact with application business and technical owners to determine the requirements for performance
testing.

• Understand the application architecture and evaluate the scope and requirement documents in order to
estimate the performance test effort and compile performance test planning material.

• Produce the Non-Functional Test Plan (this document), which details all testing activities, tasks, durations
and resources in order to produce team and individual work plans.

3.2 Test Design

The following major tasks will be completed as part of the test design stage:
• Collect application and test environment details from the architecture team and plan a process for the
usage of the environment.
• Interact with business experts to determine the process flows.
• Identify all data necessary to execute the performance test scripts.

The transaction volumes to be achieved by the performance test for the ICOM application are detailed in the
table below:

Work load distribution details

                                      Load Test                       Stress Test
Scenario                              Orders/Day   Response Time      Orders/Day   Response Time

Branch ATM Order
(Place Order, Order Approval)         3000         2 sec              6000         <Need to benchmark>

RATM Order
(Place Order, Order Approval)         750          2 sec              1500         <Need to benchmark>

Branch Order
(Place Order, Order Approval)         2000         2 sec              4000         <Need to benchmark>

The Number of VUsers will be calculated based upon the performance of the smoke test.
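
As a rough illustration of how the VUser count might be derived from the volumes above once the smoke test
has confirmed them, the sketch below applies Little's Law (concurrent users = arrival rate x time spent per
transaction). The peak window and think time used here are assumptions for the example, not figures from the
NFR.

#include <stdio.h>

/* Illustrative sizing sketch only. Orders/day and the 2-second response time
   come from the work load distribution table; the peak window and think time
   are assumed values and would be confirmed from the smoke test and the NFR. */
int main(void)
{
    double orders_per_day     = 3000.0;          /* Branch ATM orders, load test target  */
    double peak_window_secs   = 4.0 * 3600.0;    /* assumed 4-hour peak ordering window  */
    double response_time_secs = 2.0;             /* NFR response time                    */
    double think_time_secs    = 30.0;            /* assumed operator think time          */

    double arrival_rate = orders_per_day / peak_window_secs;                   /* orders/sec */
    double vusers       = arrival_rate * (response_time_secs + think_time_secs);

    printf("Arrival rate: %.3f orders/sec, estimated concurrent VUsers: %.1f\n",
           arrival_rate, vusers);
    return 0;
}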

3.3 Test Development


• Non-Functional testing will commence only when the system is proven to be functionally stable.

• Build automated performance test scripts for the identified business flows/traversals using the LoadRunner
Web (HTTP/HTML) protocol for the ICOM application (an illustrative script skeleton follows this list).

• Link test scripts to the associated test data (data from the System testers or the core team will be used).

• All scripting will be developed offshore.

• The test scripts should be configured appropriately in the run-time settings.

• Verify that the scripts run correctly over multiple iterations.

• Verify the scripts by running them individually against the identified traversals and confirm their integrity.

• In case of changes to the application, test scripts will be re-worked if needed and re-validated.

• Simulators will be used to inject load for the ICOM application during performance testing.
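
As a hedged illustration of the scripting approach, the skeleton below shows what a LoadRunner VuGen Web
(HTTP/HTML) action for an ICOM order traversal could look like. The URL, form field names, parameter names
and transaction name are assumptions for the example; the real values would come from recording against the
ICOM application.

/* Illustrative VuGen action only; runs inside the LoadRunner VuGen environment. */
Action()
{
    lr_start_transaction("ICOM_Place_Branch_ATM_Order");

    /* Open the order entry page (assumed URL). */
    web_url("icom_order_page",
            "URL=http://icom.elab.internal/orders/new",
            "Resource=0",
            "Mode=HTML",
            LAST);

    /* Submit the order form; {pBranchId} and {pOrderAmount} are
       parameterised from the test data (assumed field names). */
    web_submit_data("submit_order",
            "Action=http://icom.elab.internal/orders/submit",
            "Method=POST",
            "Mode=HTML",
            ITEMDATA,
            "Name=branchId", "Value={pBranchId}",    ENDITEM,
            "Name=amount",   "Value={pOrderAmount}", ENDITEM,
            LAST);

    lr_end_transaction("ICOM_Place_Branch_ATM_Order", LR_AUTO);

    return 0;
}

Pacing, think time and iteration counts would then be set through the run-time settings referred to above.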

3.4 Test Execution

Non-Functional testing will commence only when the system is proven to be functionally stable. A sub-set of the
listed non-functional tests will be executed onsite, in a priority order agreed upon by all stakeholders involved.
Scheduling will depend on the availability of environments and resources.

The following section details how the test execution and analysis stages will be conducted:

Test Types:

1. Load Test

Definition

Linearly increase the transaction volumes until the expected production load is reached, and gauge the
ability of the systems to support the full load on the application.

Objectives

• Linearly increase the number of concurrent user sessions in order to find the maximum number of users
the application server can handle within the given response time criteria.

• Identify the throughput and the number of requests/sec (hits/sec) at this user load.

Resource Needs

• Sufficient data/resources should be provided for all the web transactions identified for testing. Please
refer to the Test Data section (Section 3.7).

2. Stress Test

Definition

Start with the given number of users and increase the total number of transactions until 2–3 times the peak
volume is achieved, and gauge the behaviour of system resource utilization, page load response times and
application scalability.

Resource Needs

• Same as the Load Test (the data requirement will be two to three times the Load Test requirement).

3.5 Test Monitoring

This section details the monitoring tools that will be deployed in the test environment to support the performance
test team in identifying performance bottlenecks and to assist in tuning systems/applications/databases for
improved performance.
• The Capacity Planning team will monitor the servers during NFT test execution.

• The Capacity Planning team will share the logs with the NFT team for analysis.

Monitoring Counters

System/Application       Monitoring Tool              Counters

OS Counters              Perfmon (Windows)            Processor, Process, Memory, System, Physical disk,
                                                      Network interface, Server

Web Server (IIS)         LoadRunner / server-status   Busy Servers, Idle Servers, Hits/Sec, Garbage collections,
                                                      current queue length, Request failed, session timeout,
                                                      request/sec

App Server (WebSphere)   LoadRunner / Tivoli          JVM counters, Connection Pool counters and Thread Pool
                                                      statistics

SQL Server 2005          Perfmon (Windows)            SQL locks, User connections, buffer manager, latch object
                                                      statistics, SQL server general

Detailed list of counters for each server will be shared for monitoring.
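
For illustration only (this is not the Capacity Planning team's actual tooling), the sketch below shows how a
couple of the Windows OS counters listed above could be sampled programmatically via the Performance Data
Helper (PDH) API; the counter paths are standard Perfmon paths.

#include <windows.h>
#include <pdh.h>
#include <stdio.h>

#pragma comment(lib, "pdh.lib")

/* Minimal sketch: samples two of the Windows OS counters from the table above,
   once, via the PDH API. Error handling is omitted for brevity. */
int main(void)
{
    PDH_HQUERY            query;
    PDH_HCOUNTER          cpu, mem;
    PDH_FMT_COUNTERVALUE  value;

    PdhOpenQuery(NULL, 0, &query);
    PdhAddEnglishCounter(query, TEXT("\\Processor(_Total)\\% Processor Time"), 0, &cpu);
    PdhAddEnglishCounter(query, TEXT("\\Memory\\Available MBytes"), 0, &mem);

    PdhCollectQueryData(query);   /* baseline sample                        */
    Sleep(1000);                  /* wait one sampling interval (1 second)  */
    PdhCollectQueryData(query);   /* second sample for rate-based counters  */

    PdhGetFormattedCounterValue(cpu, PDH_FMT_DOUBLE, NULL, &value);
    printf("Processor time: %.1f%%\n", value.doubleValue);
    PdhGetFormattedCounterValue(mem, PDH_FMT_DOUBLE, NULL, &value);
    printf("Available memory: %.0f MB\n", value.doubleValue);

    PdhCloseQuery(query);
    return 0;
}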

3.6 Test Delivery

The following metrics will be monitored during the load testing process to draw a conclusion on system
performance and to fine-tune it for better performance (a small illustrative calculation of the client-side
statistics follows the table):

S.No   Report                                                           Presented As

Client Side statistics
1      Min/Max/Avg response time for all transactions against the      Table
       defined NFRs
2      Top 10 poorly performing transactions                           Table
3      Scalability report (Throughput vs Load, HPS vs Load)            Graph
4      Error statistics report                                         Table

System Resource statistics
5      Each server's statistics: CPU utilization, memory utilization   Graph

Server Side statistics
6      Web server statistics                                           Graph
7      App server statistics                                           Graph
8      Database statistics                                             Report
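
As a minimal sketch of how the client-side response time figures in report 1 could be computed from raw
LoadRunner samples: the sample values are invented for the example, and the nearest-rank 90th percentile is
an assumed addition to the Min/Max/Avg figures named above.

#include <stdio.h>
#include <stdlib.h>

/* Sorts a set of transaction response times and reports min, max, average
   and an approximate 90th percentile. Sample data is illustrative only. */
static int cmp_double(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}

int main(void)
{
    double samples[] = { 1.2, 1.9, 2.4, 1.7, 3.1, 2.0, 1.5, 2.8, 1.8, 2.2 };
    size_t n = sizeof samples / sizeof samples[0];
    double sum = 0.0;

    qsort(samples, n, sizeof samples[0], cmp_double);   /* sort ascending */
    for (size_t i = 0; i < n; i++)
        sum += samples[i];

    size_t p90 = (size_t)(0.9 * (double)(n - 1));        /* nearest-rank style index */
    printf("Min %.2fs  Max %.2fs  Avg %.2fs  90th percentile %.2fs\n",
           samples[0], samples[n - 1], sum / (double)n, samples[p90]);
    return 0;
}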

3.7 Test Data


Test data will be managed as follows:

• Performance testing will be carried out on production-level volumes of data for more realistic load
simulation.
• The minimum test data required for each scenario will be estimated; the data used by the functional test
team will be reused for the performance testing activity, or data from the data analysis team will be used.

For detailed test data requirements, refer to Section 9.

3.8 Testing Standards

All documents and reports should strictly adhere to the NFT templates and contain sufficiently exhaustive
information to allow a test to be re-run by a different tester. No information should be retained solely by a
tester. A minimum requirement would be:

• Test plan

• Detailed Traversal Flow document (e.g. Business Processes)

• Test Objectives

• Pass/Fail criteria

• Business Load Model, including:

  o Prioritised list of Business Processes

  o Business volumes (at live, +1 year, +2 years)

  o Application and data feeds

Project Diary

A project diary should be maintained for each project and stored for reference after the test has finished. This
should record, as a minimum, the scripts, scenarios, results and analysis file names, and any issues that are
discovered.

3.9 Progress Reporting

• Daily status report to the NFT Lead on the tests run, scripts created and issues faced

• Test summary upon completion of every test

• Weekly status report to the Britannia Test Manager

• Comprehensive test summary report at the end of each test cycle.

3.10 Contingencies
Please refer to sec. 7.1

4 Test Tools

4.1 Non-Functional Testing


The VuGen component needs to be installed on an offshore machine for scripting. Onsite, a machine needs to
be identified on which to install the LoadRunner software for NFT execution.

Capacity Planning team support is required for monitoring the servers during NFT execution.

S.No   Tool                     Component        Description

1      Mercury LoadRunner       VuGen            1) Generate user load through the application programming
                                                    interface to stress web, application and database servers
                                                 2) Simulate HTTP protocol requests being executed through
                                                    the front-end user interface
                                                 3) Capture transaction response times
                                                 4) Control the rate of transaction execution
                                Controller       Execution of the test run
                                Analysis         Analysis of the test results
                                Load Generator   Simulates VUsers

2      Mercury Quality Centre                    1) Test documents, scripts, scenarios and results
                                                 2) Defect management
                                                 3) Metrics for progress reporting

3      Monitoring Tools         PerfMon          Server resource monitoring
                                Tivoli           WebSphere performance monitoring

5 Resourcing

5.1 Test Phase Organization

Test Organization chart is placed below

5.2 Roles and Responsibilities


The following table identifies each of the individuals in the above organisation chart with their associated role
and responsibilities within their team.

Name             Role           Responsibilities                                      Team

Raja Periannan   NFT Lead       Requirements gathering, test strategy and planning    Non Functional Test Team
TA1              Test Analyst   Scripting and report preparation                      Non Functional Test Team
6 Pre - Production Environment (ELAB)

6.1 Application Environment

S.No   Type                 Server Name / IP Address   Hardware Requirements            Software

1      Application Server   IISTTVIRTUAL00A            2 x Virtual CPU, 4GB RAM,        Microsoft Internet Information Server (IIS),
       (VMware Virtual                                 1 GB network connection          IBM WebSphere Application Server,
       Machine)                                        (to the Application VLAN)        iCom Enterprise Application

2      Batch jobs server    APPTTVIRTUAL019            2 x Virtual CPU, 4GB RAM,        Sterling Commerce Connect:Direct,
       (VMware Virtual                                 additional data drive,           Sterling Commerce Connect:Direct Secure+ Module,
       Machine)                                        1 GB network connection          C&L (Cash & Logistics) Batch Server Application Framework,
                                                       (to the Application VLAN)        C&L (Cash & Logistics) Batch Server Service,
                                                                                        Java Runtime Engine (JRE)

3      Database Server      SQLTPB1XAX53108            2 x Dual Core CPUs, 8GB RAM      Microsoft Windows Server 2003 Enterprise X64

6.2 Test Environment

S.No   Type             Hardware Requirements               Software         S/w Qty   H/w Qty

1      Controller       HASP plugin                         LoadRunner 8.1   1         1
2      Load Generator   1 GB RAM, Pentium 4 CPU 2.00 GHz    LoadRunner 8.1   1         1

7 Production Environment (LIVE)

7.1 Application Environment

S.No   Type                 Server Name / IP Address   Hardware Requirements        Software

1      Application Server   IISLGVIRTUAL011,           2 x Virtual CPU, 4GB RAM     Microsoft Internet Information Server (IIS),
       (VMware Virtual      IISLGVIRTUAL010                                         IBM WebSphere Application Server,
       Machine)                                                                     iCom Enterprise Application

2      Batch jobs server    APPLGVIRTUAL026            2 x Virtual CPU, 4GB RAM,    Sterling Commerce Connect:Direct,
       (VMware Virtual                                 additional data drive        Sterling Commerce Connect:Direct Secure+ Module,
       Machine)                                                                     C&L (Cash & Logistics) Batch Server Application Framework,
                                                                                    C&L (Cash & Logistics) Batch Server Service,
                                                                                    Java Runtime Engine (JRE)

3      Database Server      SQLLPB2DBL54208            2 x Dual Core CPUs,          Microsoft Windows Server 2003 Enterprise X64
                                                       8GB RAM

7.2 Test Environment

S.No   Type             Hardware Requirements               Software         S/w Qty   H/w Qty

1      Controller       HASP plugin                         LoadRunner 8.1   1         1
2      Load Generator   1 GB RAM, Pentium 4 CPU 2.00 GHz    LoadRunner 8.1   1         1

8 Risks, Issues and Dependencies

The objective of this section is to identify any known risks to the testing of the project, how they will be
managed, and what contingency plans can be put in place. All risks and issues will additionally be managed
through the Project Risks & Issues Log.

8.1 Risks & Mitigation


1. Schedule

   Risk: The Test Schedule is subject to on-time delivery of the software and of support to the testing team.
   Mitigation: The performance testing team will closely monitor the progress of the development teams and
   escalate anticipated delays in advance. In case of delays, the team shall spare no effort at completing
   tasks in time.

   Risk: Changes in the application might require re-work of test scripts, which requires additional effort.
   Mitigation: The effort involved in the rework of scripts will be communicated to project management, and
   the test team shall spare no effort at completing tasks in time.

   Risk: In case of server failure during the course of load testing, the testing will come to an abrupt halt.
   Additional effort is required to bring the environment back up and running.
   Mitigation: The effort involved in restoring the environment will be communicated to the development team
   after consultation with support services, and the test team shall spare no effort at completing tasks in time.

   Risk: If there is any change in the application due to performance tuning, the application should be
   re-tested for performance.
   Mitigation: The effort involved in restoring the environment will be communicated to the development team
   after consultation with support services, and the test team shall spare no effort at completing tasks in time.

2. Technology

   Risk: Testing is subject to the availability of the environment.
   Mitigation: The issue will be escalated to the Infrastructure support team for the problem to be resolved.

   Risk: Movement to the production environment.
   Mitigation: The environment should be as similar as possible to the ELAB environment (application
   deployment, data setup, network setup, server configuration, etc.).

3. Resource

   Risk: Testing will be executed by the onsite team; the onsite team will require clarifications from all the
   teams during the execution. The availability of these people at the right time is subject to risk.
   Mitigation: The Test Team will co-ordinate with the concerned teams and will ensure their availability for
   discussions and clarifications.

   Risk: Analysis of some areas needs assistance from experts in other teams. The availability of these
   people at the right time is subject to risk.
   Mitigation: The Test Team will co-ordinate with the concerned teams and will ensure their availability for
   discussions and clarifications.

8.2 Issues

All issues during execution will be compiled in an issue log sheet. The Test Lead will be responsible for tracking
all issues to closure.
Issues are captured in DSR.

8.3 Dependencies

Dependency        Type       Contact Person              Description
(pre-/co-/post-   (IT/Bus)
requisite)

Pre-requisite     IT         Debashis – ICOM related,    Support resource to help in Non-Functional testing of the
                             Rajkumar – WMS Jobs         ICOM application, ICOM Jobs and WMS Jobs.

Pre-requisite     IT         Debashis                    Controlled or dedicated ELAB environment – ICOM application
                                                         available, with required access rights provided to the
                                                         resources, for the first 2 weeks.
                                                         Controlled or dedicated Production environment – ICOM
                                                         application available, with required access rights provided
                                                         to the resources, for the remaining weeks.

Co-requisite      IT         Debashis                    Testing is dependent on the test infrastructure being
                                                         available in time and support being provided during test
                                                         execution.

Co-requisite      IT         Debashis                    Sufficient time, as estimated, should be provided for
                                                         carrying out the non-functional testing.

Co-requisite      IT         Debashis                    In case of any changes to the requirements or functional
                                                         specifications, adequate time should be provided for
                                                         re-planning, reworking and test execution.

9 Outstanding actions

Support is required from the ICOM, TD Payments, Capacity Planning, DBS and SIT teams for carrying out the
NFT testing.

Britannia NFT Team


Support - High level plan

10 Change Log
This section details the changes this document has undergone.

Date / Version #     Section #                           Details                   Prepared by   Approved by

30-June-08 / Draft   –                                   Draft version             Raja P        –
09-Jul-08 / 1.0      1.5, 3.4, 3.5, 4.0, 8.0, 8.3, 9.0   Comments updated          Raja P        Rakesh V
11-Jul-08 / 1.1      2.2, 2.3, 2.4, 3.3, 5.1, 5.2, 9.0   Comments updated          Raja P        Shankar
14-Jul-08 / 1.2      3.2, 6.2, 7.2, 9.0, 10.0            Pending details updated   Raja P        –
