When the test target is to be tested, the testers follow the instructions in this FVS and check whether the result of each test matches that stated in the FVS.
In the FVS, the whole test process is specified step by step. The FVS states the precondition, the test actions, and the expected result for each test. In order to automate some of the test cases, the test developers usually have to read through the FVS and select the test cases that do not require any human intervention [7].
combinations of sectors and carriers; it does not concern the other values (such as HS). Thus, if the field's value is 'ALL', it only means that all the sectorXcarrier attributes are included. The value HS indicates that the script needs an HSDPA-capable RBS. EUL_2ms_TTI means enhanced uplink with a 2-millisecond transmission time interval, so the RBS must support this type of carrier. These features are defined in the 3GPP specifications.
rbsSw: RBS software version. Supports both project revisions (such as P4, P5) and software revisions (such as R3A).
Examples:
'P4-': the script supports revisions from project P4 onwards.
'-P5': the script supports revisions up to and including P5.
'R2A-R2D': the script supports revisions between R2A and R2D.
'P5:R3A': the script will run if the R3A revision of the software, in the P5 project, is on the node.
A dash ('-') should be used to specify that the script supports revisions up to (or from) a given revision.
nbapVer: NBAP version.
Examples: 4.6-, 5.9-, 6.9-, and ALL.
niteVer: NITE version supported by the script.
Examples:
R<version number> (e.g., R20): R20 of NITE is needed.
-R<version number> (e.g., -R15): NITE versions up to and including R15 are supported.
R<version number>- (e.g., R25-): NITE versions from R25 and later are supported.
extConfig: External tools that the test script needs.
Value abbreviations: HS = HSDPA resource required; EUL_2MS_TTI = enhanced uplink with a 2-millisecond transmission time interval; TX_DIV = Tx diversity; MIMO = multiple-antenna input/output; RET = Remote Electronic Tilt unit required; ASC = Antenna System Controller required; ALL = no special configuration required.
WCDMA Automation Workflow Analysis and Implementation
Supported values: RDBTH, MDL, MSSIM, RNCSIM, FCCU.
If more than one value is given for this field, the test position configuration must contain equipment that satisfies all of the values; that is, there is an AND relationship between the values. If MSSIM is included, the script needs an MS simulator; if RNCSIM is included, the script needs an RNC simulator.
msConfig: MS simulator configuration and capabilities.
Supported values: HSDPA, HSUPA, EUL, MUE, MBMS, MIMO, TX_DIV.
transmissionType: Transmission type capability.
Supported values: Chan_STM1_E1, E1, E3, IP, J1, STM1, T1, T3, ALL.
transmissionProtocol: The type of the transmission protocol(s) supported. Note that IP_ATM is not supported until later in P6.
Supported values: IP, ATM, IP_ATM, ALL.
rbsUeConnections: Which cells need to be connected to the UE. For example, a test case could require S1C1 and S1C2 to be connected.
Supported values: S1C1, S2C1, S3C1, ...
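The version fields above (niteVer and, similarly, rbsSw) follow a simple range syntax: 'R25-', '-R15', 'R2A-R2D', or ALL. A minimal sketch of how a test runner might decide whether a script supports the installed revision is shown below; the function name and the plain string comparison of revision labels are illustrative assumptions, not the actual NITE implementation.

```python
def nite_version_matches(spec, installed):
    """Check an installed NITE revision (e.g. 'R20') against a niteVer
    specification such as 'R25-', '-R15', 'R2A-R2D' or 'ALL'.
    Plain lexicographic comparison of revision labels is assumed."""
    if spec == 'ALL':
        return True
    if spec.endswith('-'):                  # 'R25-': this revision and later
        return installed >= spec[:-1]
    if spec.startswith('-'):                # '-R15': up to and including R15
        return installed <= spec[1:]
    if '-' in spec:                         # 'R2A-R2D': inclusive range
        low, high = spec.split('-')
        return low <= installed <= high
    return installed == spec                # exact revision required

# Example: deciding whether a script can run on the current test position.
print(nite_version_matches('R25-', 'R27'))  # True
print(nite_version_matches('-R15', 'R20'))  # False
```

A real implementation would parse the numeric part of the revision label rather than compare strings, since e.g. 'R9' sorts after 'R10' lexicographically.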
Test Script Body
1st Step: Initialize the environment variables
At the very beginning of every test script, initialization is required. This avoids possible conflicts between tests in a sequence of test cases in a test suite. At initialization, all of the variables are set to an initial value. Since every test case begins with this step, the initialization method is usually placed in the test base class, so that every script can reuse this function.
2nd Step: Initialize Pre-conditions
As specified in the FVS, some test cases need specific preconditions to be satisfied in order to run. These requirements are defined in the setTcData block. Additionally, a test case precondition may include some parameter changes, system configuration control, and so on. Thus, setting up the system to satisfy all of the preconditions is the 2nd step in each test case script.
3rd Step: Test Script Body
The test script's body contains all of the actions and specifies the expected results for this test case. Test developers should follow the FVS, preferably using methods provided by either the test base class or the NITE platform. After each action, the script must check the result to confirm that the RBS system under test is working properly. A variety of check functions have been implemented in the test base class and the NITE platform; a list of these check functions can be found in [30].
Note that in different RBS software projects, i.e., P6 or P7, the RBS working procedure, interfaces, or parameters may have changed (from earlier versions). Therefore, test developers should be aware of such changes and carefully implement the test script with the necessary adaptations to these changes.
Moreover, if the result check is unsuccessful, then an exception will be
raised and the test case terminated after generating a log entry indicating
the nature of the error for subsequent analysis.
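The three steps above, together with the failure handling, can be sketched as a script skeleton. All class and method names here (TestBase, init_environment, and so on) are illustrative assumptions rather than the actual test base class API.

```python
class TestBase(object):
    """Hypothetical test base class; the names are assumptions."""

    def init_environment(self):
        # 1st step: reset all variables to known initial values so that
        # earlier test cases in the suite cannot influence this one.
        self.actions = []

    def check(self, ok, message):
        # After each action the result must be checked; on failure, a log
        # entry is generated and an exception terminates the test case.
        if not ok:
            print('ERROR: %s' % message)   # log entry for later analysis
            raise AssertionError(message)


class ExampleTestCase(TestBase):
    def run(self):
        self.init_environment()            # 1st step
        self.setup_preconditions()         # 2nd step: satisfy setTcData
        self.execute_body()                # 3rd step: actions + checks

    def setup_preconditions(self):
        pass                               # parameter changes, configuration

    def execute_body(self):
        result = 42                        # stand-in for a real RBS action
        self.check(result == 42, 'unexpected result from action')


ExampleTestCase().run()
```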
Test Script End
Since the preconditions and scenarios for each test case are different, when a test case execution has finished, the set-up preconditions, parameters, and states must be cleaned up so that this test will not influence the next test case. As a test design principle, every action in the test case should be stored in an action stack. At the end of each test case, the restore process runs the opposite actions, in reverse order, according to the action stack. If the restoration procedure fails, an exception is raised to warn the tester, and an RBS restart operation should be performed.
Note that inverting the actions to return the RBS to a given state is preferable to performing a restart, because in a live network the RBSs are not frequently restarted; an RBS's stable working status must be maintained until something goes wrong.
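The action stack described above can be sketched as follows: each configuration action is recorded together with its inverse, and at the end of the test case the inverses are run in reverse (LIFO) order. The class and method names are illustrative, not the actual test base class API.

```python
class ActionStack(object):
    """Record each configuration action together with its inverse so that
    the test case can be undone in reverse order at the end."""

    def __init__(self):
        self._undo = []

    def do(self, action, undo):
        action()
        self._undo.append(undo)   # remember how to revert this step

    def restore(self):
        # Run the opposite actions in reverse (LIFO) order.
        while self._undo:
            self._undo.pop()()


# Example: two parameter changes, undone newest-first at restore time.
params = {'power': 10, 'tilt': 0}
stack = ActionStack()
stack.do(lambda: params.update(power=20), lambda: params.update(power=10))
stack.do(lambda: params.update(tilt=5), lambda: params.update(tilt=0))
stack.restore()
print(params)   # {'power': 10, 'tilt': 0}
```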
Restart RBS
For the test cases that change the RBS configuration, these changes are not saved after the test case's execution; in other words, any (new) settings are temporary. Ericsson's WRBS setting parameters are stored in a file called the configuration version. Thus, changes in parameters are not saved permanently in the configuration version file, but are only modified for the current test case. After an RBS restart, the unsaved changes are erased (i.e., they do not become permanent changes). This makes it possible to restore an RBS to a known initial state with a restart operation. The restart operation can be invoked by calling the NITE function AL_API_restartrbs(). After the RBS restart, to ensure that the next test case runs smoothly, this new test case first checks the RBS's status and continues to execute only after the RBS software has been re-loaded and all the relevant components in the RBS have been properly configured.
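The restart-and-wait behaviour can be sketched as below. AL_API_restartrbs() is the NITE function named above, passed in here as a callable; the status-check helper and the timing values are illustrative assumptions.

```python
import time

def restart_and_wait(restart_rbs, rbs_is_configured,
                     timeout=600.0, poll_interval=10.0):
    """Restart the RBS (discarding unsaved configuration changes) and
    block until the software has re-loaded and all relevant components
    are configured, or raise on timeout.

    restart_rbs       -- e.g. the NITE call AL_API_restartrbs()
    rbs_is_configured -- assumed status-check helper returning True/False
    """
    restart_rbs()
    deadline = time.time() + timeout
    while time.time() < deadline:
        if rbs_is_configured():    # software re-loaded, components ready?
            return
        time.sleep(poll_interval)
    raise RuntimeError('RBS did not reach a configured state in time')
```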
5.2.3 Legacy Regression Suite Usage
Within Ericsson's WRBS I&V department, the test scripts are integrated in a test suite called the legacy regression suite. When testers upgrade an RBS with the latest software build, they start their testing with the legacy suite to verify that the legacy functions generate the same output as previously. Currently, the legacy regression suite takes several hours to run in its entirety (including the time to log any errors).
Usually, the testers start the regression test in the afternoon and examine the results of this test suite the next morning. Given these test results, the testers report their result(s) in a database and analyze all of the results. If there is any problem with the RBS software or with the test script itself, then a trouble report should be submitted indicating where the problem is.
Legacy Suite Run-time Environment
The legacy suite is executed in a UNIX (or Linux) environment. Moreover, the workstation will (pre-)load several load modules to ensure that the legacy suite can be executed correctly. These modules are:
Python 2.5.1
NITE Platform (latest version)
NITE Scripts (latest version)
Mobile station simulator with the latest software version
RNC simulator with the latest software version
RBS software, test version
Test environment, such as a decoder, matched to the RBS software
These modules should be carefully chosen and correctly loaded to ensure that the test process will be accurate. This may require that the human tester ensure that a specific version of the software is loaded into all of the external test equipment (including the simulators).
Run Legacy Suite
Depending on the node configuration and RBS type, some test cases will be skipped, as specified in the TcData description. Three examples of running the test suite are shown below:

Running the whole test suite:
python regr_TestSuite.py wrbs096 --testsuite=TestSuite \
--clearLogDir --nbapver=6.9

Running one object of the test suite (in this case mbdSuite):
python regr_TestSuite.py wrbs096 --testsuite=mbdSuite \
--clearLogDir --nbapver=6.9

Running only one TC (in this case mbd_R4N3.Test):
python regr_TestSuite.py wrbs096 --clearLogDir \
--nbapver=6.9 --testcase=mbd_R4N3.Test

NOTE: The --nbapver option defines the NodeB Application Protocol (NBAP) version, and --clearLogDir clears the logs, which are stored in a temporary directory.
Failure Analysis
1. The legacy suite is mainly used for regression testing. The main goal is to ensure the quality of each software build. At a minimum the RBS has to work correctly, so it is very important that testing starts by running the legacy test suite. Following this test suite, the testers check whether any alarms occurred and whether the data in the database is as expected. The testers also need to check the software feature license status before activating any of the features that they want to test. (Ericsson WRBS software features are all license-based; customers have to buy and activate a feature license if they want a given function to run in their network.) This procedure is based on the software testing philosophy in Ericsson WRBS I&V: all features should be deactivated unless a single test case needs to test a certain feature, in which case that feature is activated.
2. Since the testing environment is a combination of different parts, such as the mobile station simulator and the RNC simulator, the testers may have to check the status of each of these test facilities. Usually, the testers try to ping the MS and RNC to see if the equipment (or emulation extension) is on-line.
3. After running a test suite, the failed test cases will be re-run several times (the number of times can be defined by the testers). As the testers collect the test results after the whole test process, they analyze the results and re-run failed test cases to establish whether these are reproducible errors.
4. The legacy test suite generates test results along with log file entries. During analysis, the testers use the log to identify whether the apparent problem is in the RBS software or in the test script itself.
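Step 3 above (re-running failed test cases a tester-defined number of times) can be sketched as follows; the runner interface and the verdict labels are illustrative assumptions.

```python
def rerun_failed(failed_cases, run_case, retries=3):
    """Re-run each failed test case several times and classify it as a
    reproducible error (fails every time), an intermittent error, or one
    that could not be reproduced. run_case(name) returns True on pass."""
    verdicts = {}
    for name in failed_cases:
        failures = sum(1 for _ in range(retries) if not run_case(name))
        if failures == retries:
            verdicts[name] = 'reproducible'
        elif failures == 0:
            verdicts[name] = 'not reproduced'
        else:
            verdicts[name] = 'intermittent'
    return verdicts

# Example with a stand-in runner that always fails one case.
print(rerun_failed(['mbd_R4N3.Test'], lambda name: False))
# {'mbd_R4N3.Test': 'reproducible'}
```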
Chapter 6 Automation Tool Evaluation and Status
In this chapter, an evaluation of the automated tool development is presented, and some future work is suggested. The evaluation compares the cost of developing and maintaining an automated test environment versus the cost of doing the same testing manually or of relying on any faults being found (or not found at all) in other parts of the verification chain.
6.1 Test Coverage Improvement
In Ericsson's WRBS P7 Wiona Project, automated regression testing was moved to the development (testing) phase. As a result, a significant number of trouble reports (in the range of 50-70%) are detected internally within the RBS prior to delivery. This means that during the development phase, the automated legacy test suite can be very helpful in finding bugs concerning the legacy functionality. Therefore, the number of test cases executed during development testing will significantly increase, as will the number of trouble reports generated. The legacy suite also has support for configurations other than those which were used (in the field) in order to increase the breadth of the test coverage, thus enabling factor testing to find likely faults. (For details of factor testing, see section 9.5.2 of [20].)
6.2 Money saved by avoiding errors slipping into the next stage
There is a factor-of-two increase in cost (based on earlier figures from the WBTS project [8]) if a potential problem propagates to the next stage. Therefore, if the problem remains undiscovered beyond the current stage, the cost of finding this problem will increase even more. The automated test tool is designed to give a daily indication of the software's current status. During the development phase, legacy regression testing provides an indication of the software's status. Although not all failures of legacy tests lead to trouble reports, the automated tests give the developers a heads-up about the software's status, while providing project management with information related to the planned software deliveries to the subsequent stages of the development chain (or delivery to the end user).
Moreover, some of the faults that legacy testing may detect might never have been found during manual verification (i.e., as manual verification or integration testing may not perform the same functional tests), or only found at a later stage in the development and testing chain (or after deployment at the customer's site, i.e., in the operator's network). Detecting these faults during the development phase by using automated tools greatly decreases the cost of errors. By stopping problems from reaching the live network, a large cost savings can be achieved and the risk associated with an upgrade is reduced.
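The factor-of-two figure above implies that the cost of fixing a fault roughly doubles for every stage it slips through. A short illustration (the base cost of 1.0 is an arbitrary assumption, not a figure from the source):

```python
def fix_cost(base_cost, stages_slipped, factor=2.0):
    """Cost of fixing a fault that has propagated a number of stages,
    assuming the cost doubles per stage (the factor of two from [8])."""
    return base_cost * factor ** stages_slipped

# A fault caught immediately vs. one found three stages later:
print(fix_cost(1.0, 0))   # 1.0
print(fix_cost(1.0, 3))   # 8.0
```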
6.3 Impact on Development
The RBS system is a large and complex system, so it is very hard (almost impossible) to predict the impact on legacy code of introducing new functionality, whether due to new software or new hardware. Therefore, development projects that introduce changes to legacy code will probably increase the number of trouble reports generated. This is especially true given that, reportedly, in P7 things are done to the hardware that it was not originally designed for. Whether this statement is true or not, the RBS system is a very large system, and changes that cause the hardware to operate in ways that it was not anticipated to operate are even more likely to cause unexpected problems. In this setting, an automated testing method gives the company the ability to test for unexpected impact(s) on the legacy functionality. If problems are detected early, it may be possible to correct a problem while the cost of correcting it is low.
A potential effect of integrating legacy testing with development testing is to increase the stability of the resulting system, as it is more likely that both the new and legacy functions will work and that they will be compatible. Another potential effect is that developers can be more aggressive in making changes to the legacy code, rather than simply trying to make minimal patches, as they can quickly learn whether a major reorganization of the code retains the legacy functionality while facilitating the addition of new functionality. If this effort is realized, then it may be possible to simplify the code while enhancing the functionality, thus resulting in fewer lines of code and fewer potential bugs!
6.4 Manual versus automated testing
Today the automated test tool automates more than 40% of all functional regression tests. The cost of maintaining the automated tool versus manually executing the automated test cases is an important metric. Manually executing the current test cases for every build requires more human resources than are available. However, with the automated tool these tests can be executed for a number of different configurations several times per week, at a cost that is far less than what it would cost in human resources to conduct far fewer tests.
The number of trouble reports (concerning the RBS software and not the test scripts themselves) written after the introduction of the automated test environment has increased compared to the earlier manual case⁴. This is perhaps an indicator that the automated tool significantly increases the probability of finding potential bugs.
⁴ Measurements show that the automated testing is up to twelve times more efficient than the manual regression test within the same time period.
6.5 Test coverage over time
Using the earlier manual regression test, all of the earlier functions could only be checked once within the entire project period. However, the automated test environment enables regression testing to be executed for a large number of test cases in a single night, thus giving developers useful feedback the next day. This approach also reduced the time it takes for a new package build to reach a customer. Figure 8 shows the increasing number of test cases that are being used with this new automated test tool. (Note that the flatness of the curve for the July to August period reflects the summer vacation period for most employees, and the spurt of growth just before this period is due to many people trying to finish up their work before the start of their vacation.)
[Figure 8 appears here: "KPI NITE Automation - Goal Scorechart", plotting the outcome (number of test cases, y-axis from 0 to 250) per month from apr/08 to jan/09.]
Figure 8: Increase in the number of test cases of the automated test tool as a function of time.
This thesis project accomplished its basic goal of automating part of the regression testing in Ericsson WRBS I&V. However, there remains additional work to be done; some of the most immediate work is described in the next section.
6.6 Future work
Today automated testing handles 40% of all test cases. This increases to 57% if all of the tests which require manual interaction are excluded. These manual test cases (17% of all test cases) need human intervention (such as the observation of an LED or the removal/insertion of a plug-in module). In the near future, the development team is considering deploying a web camera to further increase the number of test cases which can be automated. It is anticipated that 70% of the existing manual test cases can be automated. Thus the introduction of automated testing will supplant more than 70% of all the previous manual test cases, while enabling these tests to be run daily, rather than only once over the entire testing cycle for a given release!
References
[1] Robert Oshana, DSP Software Development Techniques for Embedded and Real-time Systems, Newnes, ISBN 0750677597, 9780750677592, page 345, 2005.
[2] Armstrong Process Group, Inc., Test Case Design with UML, course description, Armstrong Process Group, Inc., New Richmond, Wisconsin, USA, 15 June 2006, http://www.aprocessgroup.com/training/pdf/APG_TestCaseDesign_CourseDesc_01_0804_v2_0.pdf, last visited 2009.02.01.
[3] Ericsson, WRBS Regression Function Verification Specification, June 2008 (confidential document).
[4] Yuanfang Cai, Sunny Huynh, and Tao Xie, A Framework and Tool Supports for Testing Modularity of Software Design, http://www.cs.drexel.edu/~yfcai/Papers/ASE2007.pdf, last visited 2009.02.02.
[5] Martin Pol, Ruud Teunissen, and Erik van Veenendaal, Software Testing: A Guide to the TMap Approach, ISBN 0-201-74571-2, Pearson Education Limited, UK, 2002.
[6] Erik van Veenendaal and Martin Pol, A Test Management Approach for Structured Testing, in Achieving Software Product Quality, Erik van Veenendaal and Julie McMullan (eds.), UTN Publishers, Den Bosch, The Netherlands, 1997, http://www.improveqs.nl/pdf/tmap.pdf.
[7] Gunnar Kalsson, NITE TestScript Design Rules, 1/102 60-CXC 124 1020+ Uen, Ericsson internal document.
[8] Petter Isaksson (NITE Script Developer), Summary of WBTS Project, 2005.11.10.
[9] Pierr Jahan and Denis Degioanni, Third Generation Partnership Project (3GPP) web site, http://www.3gpp.org, 2008, last accessed 2009.01.31.
[10] Harri Holma and Antti Toskala (eds.), WCDMA for UMTS: Radio Access for Third Generation Mobile Communications, Wiley Technology Publishing, first edition, June 2000, 344 pages, ISBN-10 0471720518, ISBN-13 978-0471720515.
[11] Erik Dahlman, Stefan Parkvall, Johan Skold, and Per Beming, 3G Evolution: HSPA and LTE for Mobile Broadband, Academic Press, second edition, October 2008, 648 pages, ISBN-10 0123745381, ISBN-13 978-0123745385.
[12] Rosaline Makar, "Get started with unit and component testing using IBM Rational tools: A comprehensive guide for unit and component testing", IBM, on-line tutorial, 11 October 2007, http://www.ibm.com/developerworks/edu/ws-dw-ws-testing.html.
[13] Dimitri van Heesch, Doxygen: Source code documentation generator tool, web page, last modified 2 January 2009, http://www.stack.nl/~dimitri/doxygen/.
[14] RNC Simulator User Guide (TietoEnator, confidential), latest release 2008.11.
[15] Test Mobile Manual Book (confidential), latest release 2009.2.
[16] Ericsson WRBS3000 User Guide (confidential), latest release 2009.4.6.
[17] RDBTH User Guide, 1553-CAL 115 0784 (confidential), latest release 2009.3.17.
[18] FCCU User Guide (Ericsson product, confidential), latest release 2008.8.
[19] Python community, "pylint 0.16.0: python code static checker", web page, Python Software Foundation, http://pypi.python.org/pypi/pylint, last accessed 2009.02.01.
[20] Daniel Galin, Software Quality Assurance: From Theory to Implementation, Pearson Education, 2004, 590 pages, ISBN 0-201-70945-7, 9780201709452.
[21] IBM ClearCase introduction, http://www-01.ibm.com/software/awdtools/clearcase/features/index.html?S_CMP=rnav, last visited 2009.03.05.
[22] Standardization and related activities -- General vocabulary, ISO/IEC Guide 2:2004 (replaced ISO/IEC Guide 2:1996, which had replaced ISO/IEC Guide 2:1991), 8th edition (trilingual), International Organization for Standardization, Geneva, Switzerland, 2004, 60 pages.
[23] Yang Bin, Quality Assurance, Master's thesis, Department of Communication Systems, Royal Institute of Technology, Stockholm, Sweden, work in progress.
[24] Christine B. Tayntor, Six Sigma Software Development, CRC Press, 2002, 322 pages, ISBN 0849311934, 9780849311932.
[25] Capers Jones, Applied Software Measurement, 3rd edition, McGraw-Hill Professional, 2008, 662 pages, ISBN 0071502440, 9780071502443.
[26] W. H. C. Bassetti, William E. Lewis, and Gunasekaran Veerapillai, Software Testing and Continuous Quality Improvement, 2nd edition, CRC Press, 2004, 560 pages, ISBN 0849325242, 9780849325243.
[27] Stephen H. Kan, Metrics and Models in Software Quality Engineering, 3rd edition, Addison-Wesley, 2003, 528 pages, ISBN 0201729156, 9780201729153.
[28] Peter Farrell-Vinay, Manage Software Testing, CRC Press, 2008, 600 pages, ISBN 0849393833, 9780849393839.
[29] Jeff Younker, Foundations of Agile Python Development, Apress, 2008, 416 pages, ISBN 1590599810, 9781590599815.
[30] NITE Scripts Function List (Ericsson confidential), latest release 2009.4.19.
www.kth.se
TRITA-ICT-EX-2009:4