
MEAP:

MEMORY EXERCISE OF
ALZHEIMER’S PATIENT

Quality Assurance Plan

TEAM 07:
KAVYA BOHRA
ANGUS CHEN
AMIRSAMAN FAZELIPOUR
TRAVIS FRIDAY
DESMOND TRANG
GRACE ZHANG
Table of Contents

Revision History
Testing Framework
Internal Deadlines
    Version 1
    Version 2
    Version 3 (Final Version)
Software Testing
    Unit Testing
    Integration Testing
    System Testing
    User Acceptance Testing (Alpha Testing)
Project Complexity
References
Revision History

Revision  Status                                                    Date                 By
1.0       Created the document; sectioned the requirements          October 18th, 2018   Desmond Trang
          into appropriate headings
1.1       Finished "Testing Framework"; began "Internal Deadlines"  October 18th, 2018   Desmond Trang
1.2       Researched and finished "Internal Deadlines" as well as   October 19th, 2018   Desmond Trang
          "Software Testing"
1.3       Finished "Project Complexity"                             October 20th, 2018   Desmond Trang
2.0       Formatted and edited document                             October 20th, 2018   Desmond Trang

Testing Framework
Quality assurance testing is a major part of the app development cycle, so many projects emphasize using or creating a testing framework that lets them test all aspects of their product. Table 1 below lists several frameworks that can be used for automated testing, along with a brief description of each one's main functionality [1].

Framework (Vendor)    Brief Overview

XCTest (Apple)        - Lets users create automated unit tests using pre-made functions
                      - Used for Xcode projects
                      - Supports unit testing as well as UI testing

Calabash (Calabash)   - Cross-platform support (iOS and Android)
                      - Used to automate acceptance tests
                      - Tests scenarios using behavior-driven development
                      - Active development discontinued in April 2017

EarlGrey (Google)     - iOS UI automation test framework
                      - Automatically synchronizes with the UI, network requests, and queues
                      - Synchronization ensures the UI is stable before testing

Appium (Appium)       - Open-source automation test framework for iOS, Android, and Windows apps
                      - Allows full access to back-end APIs
                      - Strictly made for mobile testing

Table 1. Brief Description of the Different Frameworks

Of the frameworks listed in Table 1, we chose XCTest because it provides the most flexibility for our needs, and because XCTest was made for Xcode projects it will integrate smoothly into all of our testing. XCTest allows us to conduct unit testing for both the back end and the front end, the latter in the form of UI test cases. To generate our unit test cases, the QA Developers will communicate with the Software Developers to identify which functionalities are most important to the system as a whole and test those functions accordingly.

Internal Deadlines
The internal deadlines are both for the QA Developers as well as the Software
Developers because after the QA conduct unit, integration, system, or unit acceptance testing any
bugs must be fixed by the Software Developers before it can be released; hence both teams must
be given sufficient time in order to find the bugs and apply the fixes. There will be three hard
deadlines, version one, two, and three, as well as multiple soft deadlines. Unit and integration
testing will be conducted during version 1 and 2, and as features are finished being implemented
by version 3, we will conduct system and acceptance (Alpha) testing.

Version 1

The hard deadline for version 1 is November 5th, 2018, and the features mentioned in the design document will have to be finished and tested by that date. The soft deadline for the Software Developers to release the code to the QA Developers will be October 29th, 2018. The QA Developers will conduct unit testing and integration testing for the features on October 30th, 2018 at SFU. Any testing that is not completed will be finished the next day, and any bugs found will be passed back to the Software team to give them sufficient time to incorporate the fixes before the hard deadline.

Version 2

The hard deadline for version 2 is November 19th, 2018, and, as with version 1, the features mentioned will have to be tested by that date. The soft deadline for the Software Developers will be November 14th, 2018, and the testing by the QA team will be conducted on November 15th and 16th, 2018 at SFU. Along with unit and integration testing for version 2 features, the QA team will conduct regression testing on any bugs found during version 1 to make sure they are fixed. The Software team will have three days to fix the bugs before the deadline.

Version 3 (Final Version)

The hard deadline for version 3 is December 3rd, 2018. Version 3 testing will differ from previous versions: because the significant features of the application will be finished, we will begin testing the system as a whole and making sure it fits what the customer desires (refer to Alpha Testing for the complete procedure). The soft deadline for the Software Developers is November 26th, 2018. The QA Developers will conduct unit, integration, and regression testing for any features that are added, but the QA team's main responsibility will be system and alpha testing. The QA and Software teams will conduct system and alpha testing together from November 27th to November 28th at SFU, and the Software team will have five days to fix any bugs before the final deadline.

Software Testing

Unit Testing

The QA Developers will conduct unit testing after the soft deadline for every version to make sure the code does what is intended. An important aspect of our unit testing is that we will keep the functions as independent as possible and prevent interactions between components, because that allows us to accurately determine the cause of a bug. Whenever a bug occurs, we will create a test case for that defect and continue testing for it even after it has been patched, in the form of regression testing [2].

An example of unit testing would be making sure the scoring algorithm works properly. We would test this by holding all of the variables constant except one; as we adjust that variable, the output should change accordingly. The other variables will be tested with similar unit test cases. Along with unit testing the code, we will also test UI elements to verify that each one behaves correctly: the "pause" button pauses the game, the "menu" button brings up the menu, the games display the correct titles, and so on. XCTest supports efficient unit testing through its XCTestCase subclass; a test method checks a function's output against an expected value and reports a custom failure message when the assertion fails [3].
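As a sketch of what such a test case could look like (the computeScore function, its parameters, and its scoring rule below are hypothetical placeholders, not our actual implementation), an XCTestCase subclass varying one input at a time might be written as:

```swift
import XCTest

// Hypothetical scoring function under test; the real signature and
// formula will depend on the game's implementation.
func computeScore(correctAnswers: Int, timeTaken: Double) -> Int {
    // Placeholder rule: 10 points per correct answer,
    // minus 1 point per 5 seconds taken.
    return correctAnswers * 10 - Int(timeTaken / 5.0)
}

class ScoringTests: XCTestCase {
    func testScoreIncreasesWithCorrectAnswers() {
        // Hold timeTaken constant and vary only correctAnswers.
        let low = computeScore(correctAnswers: 2, timeTaken: 30.0)
        let high = computeScore(correctAnswers: 8, timeTaken: 30.0)
        XCTAssertGreaterThan(high, low, "Score should grow with more correct answers")
    }

    func testScoreDecreasesWithTimeTaken() {
        // Hold correctAnswers constant and vary only timeTaken.
        let fast = computeScore(correctAnswers: 5, timeTaken: 10.0)
        let slow = computeScore(correctAnswers: 5, timeTaken: 60.0)
        XCTAssertGreaterThan(fast, slow, "Score should shrink as time taken grows")
    }
}
```

Each test method isolates a single variable, so a failure points directly at how that variable affects the score.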

Integration Testing

Integration testing will take place after unit testing is done for each version release. It involves testing multiple units together to ensure they function properly in combination, because components will often work independently but fail to achieve the intended result when joined together. The testing methodology our team will use is bottom-up testing [4]: the lower-level functions and classes that were tested during unit testing are exercised in conjunction with the higher-level functions that call upon them. This approach suits our team because we will be developing the lower-level methods first in order to build towards the completed system [5]. Figure 1 shows how the higher-up elements in the hierarchy call upon the lower-level elements.

Figure 1. Bottom-Up Integration Testing

Integration testing differs from unit testing in that unit testing revolves around testing each component individually, whereas integration testing takes the different units and tests their interaction with each other. Doing integration testing after unit testing ensures that if a higher-level element fails, the error lies in that element rather than in its already-tested dependencies.
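As an illustrative sketch of this bottom-up approach (all function names and logic below are hypothetical, not our actual code), a higher-level function composed of already unit-tested lower-level functions could be exercised like this:

```swift
import XCTest

// Lower-level components, assumed to be already covered by unit tests.
func scoreForGame(correctAnswers: Int) -> Int { correctAnswers * 10 }
func formatScore(_ score: Int) -> String { "Score: \(score)" }

// Higher-level component that calls upon the lower-level ones.
func summaryLine(correctAnswers: Int) -> String {
    formatScore(scoreForGame(correctAnswers: correctAnswers))
}

class SummaryIntegrationTests: XCTestCase {
    // If this test fails while the unit tests for scoreForGame and
    // formatScore pass, the defect is in summaryLine itself,
    // not in its dependencies.
    func testSummaryLineCombinesScoringAndFormatting() {
        XCTAssertEqual(summaryLine(correctAnswers: 3), "Score: 30")
    }
}
```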

System Testing

System testing is very similar to user acceptance testing: both rely on testing the completed system, but the main difference is that the customer's requirements are not considered in system testing, whereas they are a key aspect of user acceptance testing [6]. System testing involves testing our mobile application after all of the key features have been implemented. This will occur during version 3, or earlier depending on when the major features are finished, after we have conducted unit and integration testing.

User Acceptance Testing (Alpha Testing)

User acceptance testing is done to make sure the completed system meets the customer's expectations. This stage of testing can be split into several forms: alpha testing, beta testing, ad-hoc testing, etc. [7]. Alpha testing imitates real users and how they would normally interact with the application on a daily basis, to make sure the key functionalities are bug-free. It is conducted within the internal organization, usually by members who are not involved in developing or testing the code. Beta testing involves selecting developers or users outside the organization, or team, to test the application for an extended period of time and provide feedback to the team [8]. Because our project is limited in time, we will only conduct alpha testing of our mobile application, during the version 3 release.

Alpha testing for our application will begin with the user trying to input a username upon opening the app. The user will enter a predetermined username that is already in our database to test that usernames must be unique; the user will be prompted to try a different name, and after they successfully enter a unique name they will arrive at an automatic tutorial. After the tutorial, the user is brought to the main page, where they will click both the "menu" button and the "question mark" button to make sure both buttons work accordingly. After checking that both work as intended, the user will go back to the home page and proceed to play the two games. By following the instructions and completing both games, the user will be brought to a summary page, where they can see their results, and a history page. The user will verify that their score during the game matches the scoring in the graphs. The user will play the games multiple times over a span of a few days to make sure that the scoring is saved and represented correctly in the graphs.

Project Complexity
Instruments by Apple is a versatile tool that excels at performance analysis using a variety of methods: tracking the performance of an iOS mobile application, pinpointing source code errors, general system troubleshooting, etc. Instruments is part of Xcode, so, like XCTest, it will integrate smoothly with our development tools [9].

CLOC is a source code counter that, given a directory to scan, counts lines of code, lines of comments, and the number of files. Our team chose CLOC because it is user-friendly and easy to use; the project provides a thorough tutorial and example outputs. Figure 2 below shows an example output after using CLOC to scan a directory. CLOC also supports a range of file types in case our development produces different kinds of files [10].

Figure 2. Example Output using CLOC
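As an example of how we might invoke CLOC (the MEAP/ directory name is a placeholder for our project folder), scanning the repository could look like:

```shell
# Count lines of code, comments, and blank lines in the project
# directory; --by-file adds a per-file breakdown to the summary.
cloc --by-file MEAP/

# Restrict the count to Swift sources only, in case the directory
# also contains assets or generated files.
cloc --include-lang=Swift MEAP/
```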

We will measure the lines of code and number of files for versions 1, 2, and 3 and create a graph using Microsoft Excel, allowing us to see how the complexity and size of our project grow as we add more functionality to the app in each version. We will plot the lines of code and the number of files on the Y-axis, represented by two different colors, against the three versions on the X-axis. Figures 3.1 and 3.2 show what our graphs could look like at the end of version 3.

Figure 3.1 Lines of Code for Each Version

Figure 3.2 Number of Files for Each Version

References
[1] Vince Power, "Top 5 iOS Testing Frameworks | Sauce Labs", 14-Sept-2017. [Online]. Available: https://saucelabs.com/blog/top-5-ios-testing-frameworks. [Accessed: 18-Oct-2018].

[2] Software Testing Fundamentals (STF), "Unit Testing - Software Testing Fundamentals". [Online]. Available: http://softwaretestingfundamentals.com/unit-testing/. [Accessed: 19-Oct-2018].

[3] Apple, "Defining Test Cases and Test Methods | Apple Developer Documentation". [Online]. Available: https://developer.apple.com/documentation/xctest/defining_test_cases_and_test_methods. [Accessed: 18-Oct-2018].

[4] Tutorials Point (I) Pvt. Ltd, "Bottom Up Testing". [Online]. Available: https://www.tutorialspoint.com/software_testing_dictionary/bottom_up_testing.htm. [Accessed: 18-Oct-2018].

[5] Software Testing Fundamentals (STF), "Integration Testing - Software Testing Fundamentals". [Online]. Available: http://softwaretestingfundamentals.com/integration-testing/. [Accessed: 19-Oct-2018].

[6] Software Testing Fundamentals (STF), "System Testing - Software Testing Fundamentals". [Online]. Available: http://softwaretestingfundamentals.com/system-testing/. [Accessed: 19-Oct-2018].

[7] Software Testing Fundamentals (STF), "Acceptance Testing - Software Testing Fundamentals". [Online]. Available: http://softwaretestingfundamentals.com/acceptance-testing/. [Accessed: 19-Oct-2018].

[8] Guru99, "Alpha Testing Vs Beta Testing". [Online]. Available: https://www.guru99.com/alpha-beta-testing-demystified.html. [Accessed: 19-Oct-2018].

[9] Apple, "Instruments Overview - Instruments Help". [Online]. Available: https://help.apple.com/instruments/mac/current/#/dev7b09c84f5. [Accessed: 20-Oct-2018].

[10] Al Danial, "CLOC -- Count Lines of Code". [Online]. Available: http://cloc.sourceforge.net/. [Accessed: 20-Oct-2018].

