
Centers for Medicare & Medicaid Services

<Business Owners Office/Center>


<Business Owners Group>
7500 Security Blvd
Baltimore, MD 21244-1850

Test Summary Report Template

Test Summary Report

Version: <1.0>
Last Modified: <Month Day, Year>
[Test Summary Report Template Version 1.1 August 26, 2008
Approved for use by ESD Deliverables Workgroup]

Document Number: <document's configuration item control number>


Contract Number: <current contract number of company maintaining document>

<System Name and/or Acronym>

Note to the Author


[This document is a template for creating a Test Summary Report for a given project. Multiple
Test Summary Reports may be created throughout the project's life cycle. For example, a Test
Summary Report may be generated at the end of development testing, at the end of validation
testing, and at the end of implementation testing. Separate Test Summary Reports may also be
prepared and referenced, as appropriate, for individual test functions (e.g., user acceptance
testing (UAT), Section 508 testing, Security Test and Evaluation (ST&E), etc.).
Ensure that the content of the prepared Test Summary Report(s) is consistent with the
terminology and testing functions defined in the CMS Testing Framework Overview published
as a part of the CMS Integrated IT Investment & System Life Cycle Framework available at
http://www.cms.hhs.gov/SystemLifecycleFramework/.
It is recommended that a CMS-approved automated testing tool be used to collect and report test
results. Use of an automated testing tool will make it much easier to maintain the volume of test
results that may be collected for a given project, and enable more efficient delivery of the
information that is to be included in the Test Summary Report. In the absence of an automated
testing tool, other general office automation tools such as Microsoft Excel may also be used.
This template includes instructions to the author, boilerplate text, and fields that should be
replaced with the values specific to the particular project.

Blue italicized text enclosed in square brackets (i.e., [text]) provides instructions to the
document author, or describes the intent, assumptions and context for content included in
this document.

Blue text enclosed in angle brackets (i.e., <text>) indicates a field that should be
replaced with information specific to the particular project.

Text and tables in black are provided as boilerplate examples of wording and formats
that may be used or modified as appropriate.

When using this template, follow these steps:


1.) Replace all text enclosed in angle brackets (e.g., <System Name (Acronym)>) with the
appropriate information for the specific project. These angle brackets appear in both the
body of the document and in headers and footers.
2.) Modify any boilerplate text as appropriate to the specific project.
3.) To add any new sections to the document, ensure that the appropriate header and body
text styles are maintained. The style used for section headings is Heading 1 (Times New
Roman 16 pt); section sub-headings use Heading 2 (Times New Roman 14 pt). The style
used for boilerplate and body text is Body Text (Times New Roman 12 pt).
4.) To update the Table of Contents, right-click and select Update field and choose the
option Update entire table. Ensure that sub-headings at each level in the Table of
Contents are appropriately indented for improved readability.
_____________________________________________________________________________________________
Test Summary Report <Version # / Date>
i



5.) Delete this Note to the Author page and all instructions to the author (i.e., all blue
italicized text enclosed in square brackets) before finalizing the initial draft of the Test
Summary Report.]


APPROVALS
[Obtain signature approval of the final document from the delivering organization's Project
Manager and the primary CMS recipient (i.e., generally the Government Task Leader (GTL)).
Additional signature lines may be added as needed (e.g., CMS Business Owner).]
Submitting Organization's Approving Authority:

Signature

Printed Name

Date

Phone Number

<Position Title> [e.g., <System Name and/or Acronym> Project Manager]


CMS Approving Authority:

Signature

Printed Name

Date

Phone Number

<Position Title> [e.g., <Contract or System Name> Government Task Leader]


REVISION HISTORY
[Use the table below to record information regarding changes made to the document over time.]

Version   Date         Organization/Point of Contact                         Description of Changes
1.0       <mm/dd/yy>   <Organization Identifier / Point-of-Contact Name>     Baseline Version


TABLE OF CONTENTS
1. INTRODUCTION......................................................................................................................1
2. REFERENCED DOCUMENTS...............................................................................................1
3. OVERVIEW...............................................................................................................................1
4. ASSUMPTIONS/CONSTRAINTS/RISKS.............................................................................2
4.1. Assumptions ..........................................................................................................................2
4.2. Constraints..............................................................................................................................2
4.3. Risks........................................................................................................................................2
5. SUMMARY ASSESSMENT ....................................................................................................2
6. DETAILED TEST RESULTS .................................................................................................4
6.1. <Test Category/Function>....................................................................................................4
6.2. <Test Category/Function>..................................................................................5
7. VARIANCES..............................................................................................................................5
8. TEST INCIDENTS....................................................................................................................5
8.1. Resolved Test Incidents.........................................................................................................5
8.2. Unresolved Test Incidents.....................................................................................................6
9. RECOMMENDATIONS...........................................................................................................6
10. GLOSSARY..............................................................................................................................6
11. ACRONYMS............................................................................................................................6
12. APPENDICES..........................................................................................................................7
APPENDIX A: TEST INCIDENT REPORTS...........................................................................8


LIST OF FIGURES
[Insert a List of Figures appearing within the Test Summary Report along with a page reference
for each identified figure as appropriate. Labels of Figure titles and descriptions are to be
placed centered, above the figure within the main body of the document. All figures must have
an associated tag providing appropriate alternative text for Section 508 compliance.]
<Figure #: Figure Title or Description.........................................................Page Number>

LIST OF TABLES
[Insert a List of Tables appearing within the Test Summary Report along with a page reference
for each identified table as appropriate. Labels of Table titles and descriptions are to be placed
centered, above the table within the main body of the document.]
<Table #: Table Title or Description...........................................................Page Number>
TABLE 1: REFERENCED DOCUMENTS...............................................................................1
TABLE 2: TEST CASE SUMMARY RESULTS......................................................................3
TABLE 3: TEST INCIDENT SUMMARY RESULTS.............................................................4
TABLE 4: <TEST CATEGORY/FUNCTION> RESULTS.....................................................5


1. INTRODUCTION
[Provide full identifying information for the automated system, application, or situation for
which the Test Summary Report applies, including, as applicable, identification number(s),
title(s)/name(s), abbreviation(s)/acronym(s), part number(s), version number(s), and release
number(s). Summarize the purpose of the document, the scope of activities that resulted in its
development, the intended audience for the document, and expected evolution of the document.
Also describe any security or privacy considerations associated with use of the Test Summary
Report.]

2. REFERENCED DOCUMENTS
[Summarize the relationship of this document to other relevant documents (e.g., Project Process
Agreement (PPA), Test Plan(s), Test Case Specification, other Test Summary Reports, and
Version Description Document (VDD), if they exist).
Provide identifying information for all documents used to arrive at and/or referenced within the
Test Summary Report (e.g., related and/or companion documents, prerequisite documents,
relevant technical documentation, etc.). For example, an associated Version Description
Document (VDD) should generally be referenced. Include reference also to the corresponding
Test Plan(s) and Test Case Specification.]
Table 1: Referenced Documents

Document Name      Document Number                                    Issuance Date
<document name>    <document's configuration item control number>     <Month Day, Year>

3. OVERVIEW
[Provide a brief description of the testing process employed. Summarize what testing activities
took place, including the versions/releases of the software, environment, etc. Identify the test
functions performed, the test period(s), test location(s), and the test participants and their roles
in the testing process.]


4. ASSUMPTIONS/CONSTRAINTS/RISKS
4.1. Assumptions
[Describe any assumptions and/or dependencies that may have impacted actual testing, test
results, and test summarization.]

4.2. Constraints
[Describe any limitations or constraints that had a significant impact on the testing of the system
and the test results. Such constraints may have been imposed by any of the following (the list is
not exhaustive):
a) Hardware or software environment
b) End-user environment
c) Availability of resources
d) Interoperability requirements
e) Interface/protocol requirements
f) Data repository and distribution requirements]

4.3. Risks
[Describe any risks associated with the test results and proposed mitigation strategies.]

5. SUMMARY ASSESSMENT
[Provide an overall assessment of the build or release tested, with a summary of the test results,
including the number of test incidents summarized by impact/severity level. Include in the
Glossary section of this document operational definitions for each of the reported
impact/severity levels established for the project. If test results are maintained in an automated
tool, the information may be exported or printed from the tool for inclusion in this document.]
The following is a summary of the test case results obtained for the reported test effort. Refer to
subordinate sections of this document for detailed results and explanations of any reported
variances.



Table 2: Test Case Summary Results

(Columns: Summary Assessment; Total Number Test Cases; % of Total Planned; Comments)

Test Cases Planned: <total number of test cases that were planned for the entire reported test effort>; 100% of total planned
Test Cases Run: <total number of test cases that were run (i.e., completed) for the entire reported test effort>; <(Total Number Test Cases Run divided by Total Number Test Cases Planned) multiplied by 100>% of total planned
Test Cases Reviewed: <total number of test cases that were reviewed for the entire reported test effort>; <(Total Number Test Cases Reviewed divided by Total Number Test Cases Planned) multiplied by 100>% of total planned
Test Cases Passed: <total number of test cases that passed for the entire reported test effort>; <(Total Number Test Cases Passed divided by Total Number Test Cases Planned) multiplied by 100>% of total planned
Test Cases Failed: <total number of test cases that failed for the entire reported test effort>; <(Total Number Test Cases Failed divided by Total Number Test Cases Planned) multiplied by 100>% of total planned
Test Cases To Be Run: <total number of test cases that were incomplete and still need to be run for the entire reported test effort>; <(Total Number Test Cases To Be Run divided by Total Number Test Cases Planned) multiplied by 100>% of total planned
Test Cases Held: <total number of test cases that were held (i.e., not attempted) for the entire reported test effort>; <(Total Number Test Cases Held divided by Total Number Test Cases Planned) multiplied by 100>% of total planned
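If neither a CMS-approved automated testing tool nor a spreadsheet is used, the "% of Total Planned" arithmetic described above can be scripted. The sketch below is illustrative only; the counts are hypothetical placeholders, not values prescribed by this template:

```python
# Minimal sketch of the Table 2 percentage calculations.
# All counts below are hypothetical examples for one reported test effort.
counts = {
    "Planned": 120,   # test cases planned for the entire reported test effort
    "Run": 100,       # test cases run (i.e., completed)
    "Reviewed": 95,   # test cases reviewed
    "Passed": 90,     # test cases that passed
    "Failed": 10,     # test cases that failed
    "To Be Run": 15,  # incomplete test cases still to be run
    "Held": 5,        # test cases held (i.e., not attempted)
}

# Each "% of Total Planned" value is (count / Planned) * 100.
percentages = {
    name: round(count / counts["Planned"] * 100, 1)
    for name, count in counts.items()
}

for name, pct in percentages.items():
    print(f"Test Cases {name}: {counts[name]} ({pct}% of total planned)")
```

Each percentage is the row count divided by the planned count, multiplied by 100, matching the formulas in Table 2.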



The following is a summary of the test incidents (i.e., unexpected results, problems, and/or
defects) that were reported during the testing:
Table 3: Test Incident Summary Results

(Columns: Impact/Severity Level; Total Reported; Total # Resolved; % Total Resolved; Total # Unresolved; % Total Unresolved)

<project-established impact/severity level (e.g., High/Severe, Moderate/Serious, Low/Insignificant)>:
Total Reported: <total number of Test Incident Reports (TIRs) associated with this impact/severity level>
Total # Resolved: <total number of resolved TIRs associated with this impact/severity level>
% Total Resolved: <(Total # Resolved TIRs for this impact/severity level divided by the Total Reported for this level) multiplied by 100>%
Total # Unresolved: <total number of unresolved TIRs for this impact/severity level>
% Total Unresolved: <(Total # Unresolved TIRs for this impact/severity level divided by the Total Reported for this level) multiplied by 100>%

Combined Totals:
Total Reported: <total number of TIRs>
Total # Resolved: <total number of resolved TIRs>
% Total Resolved: <percent of resolved TIRs to total TIRs>%
Total # Unresolved: <total number of unresolved TIRs>
% Total Unresolved: <percent of unresolved TIRs to total TIRs>%
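The per-level and combined TIR percentages can be computed in the same way. This sketch uses hypothetical counts and the example severity labels from the table; neither is prescribed by this template:

```python
# Hypothetical TIR counts per impact/severity level (placeholders only).
tirs = {
    "High/Severe":       {"reported": 4,  "resolved": 3},
    "Moderate/Serious":  {"reported": 10, "resolved": 8},
    "Low/Insignificant": {"reported": 6,  "resolved": 6},
}

# Per-level resolved/unresolved percentages, relative to that level's total.
for level, t in tirs.items():
    unresolved = t["reported"] - t["resolved"]
    pct_resolved = t["resolved"] / t["reported"] * 100
    print(f"{level}: {t['reported']} reported, "
          f"{t['resolved']} resolved ({pct_resolved:.0f}%), "
          f"{unresolved} unresolved ({100 - pct_resolved:.0f}%)")

# Combined totals across all impact/severity levels.
total_reported = sum(t["reported"] for t in tirs.values())
total_resolved = sum(t["resolved"] for t in tirs.values())
print(f"Combined: {total_reported} reported, {total_resolved} resolved "
      f"({total_resolved / total_reported * 100:.0f}% resolved)")
```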

6. DETAILED TEST RESULTS


[Provide a brief description of the testing process employed for each test category (i.e.,
development testing, validation testing, implementation testing, and operational testing) and
each test function performed (i.e., a collection of related test cases comprising a specific type of
test (e.g., user acceptance testing, Section 508 testing, regression testing, system acceptance
testing, ST&E, etc.)). Also provide a summary of the test results for each test category/function.
As appropriate, separate sub-sections should be included for each test category/function
performed. If test results are maintained in an automated tool, the information may be exported
or printed from the tool for inclusion in this document.]

6.1. <Test Category/Function>


The table below summarizes the test cases employed for <test category/function> and the test
results obtained for each test case:



Table 4: <Test Category/Function> Results

(Columns: Test Case/Script ID; Test Case/Script Description; Date Tested; Pass/Fail; Comments)

[If the test case failed, list the corresponding TIR ID in the Comments column.]
The calculated level of success for <test category/function> was <the percentage of the total
number of test cases defined for the test that passed>%.
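The calculated level of success is simply the pass rate for the category. A brief illustration, using hypothetical counts:

```python
# Hypothetical counts for one test category/function (placeholders only).
passed, defined = 18, 20  # test cases passed vs. total defined for the category
success = passed / defined * 100  # calculated level of success, as a percent
print(f"Calculated level of success: {success:.0f}%")
```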

6.2. <Test Category/Function>


[All of the information described above in section 6.1 should be replicated for each defined test
category/function. The reported test categories/functions should be consistent with those
defined in the corresponding Test Plan.]

7. VARIANCES
[Describe any variances between the testing that was planned and the testing that actually
occurred. Also, explain if the number of planned tests has changed from a previous report. It is
important to account for all planned tests. Also, provide an assessment of the manner in which
the test environment may be different from the operational environment and the effect of this
difference on the test results.]

8. TEST INCIDENTS
[Provide a brief description of the unexpected results, problems, or defects that occurred during
the testing.]

8.1. Resolved Test Incidents


[Identify all resolved test incidents and summarize their resolutions. Reference may be made to
Test Incident Reports that describe in detail the unexpected results, problems, or defects
reported during testing, along with their documented resolutions, which may be included as an
appendix to this document. If test results are maintained in an automated tool, the information
may be exported or printed from the tool for inclusion in this document.]


8.2. Unresolved Test Incidents


[Identify all unresolved test incidents and provide a plan of action for their resolution.
Reference may be made to Test Incident Reports that describe in detail the unexpected results,
problems, or defects reported during testing, which may be included as an appendix to this
document. If test results are maintained in an automated tool, the information may be exported
or printed from the tool for inclusion in this document.]

9. RECOMMENDATIONS
[Provide any recommended improvements in the design, operation, or future testing of the
business product that resulted from the testing being reported. A discussion of each
recommendation and its impact on the business product may be provided. If there are no
recommendations to report, then simply state as such.]

10. GLOSSARY
[Provide clear and concise definitions for terms used in the Test Summary Report that may be
unfamiliar to readers of the document. For example, include operational definitions for each of
the reported impact/severity levels established for the project. Terms are to be listed in
alphabetical order.]
<Term Name>
<Term definition>
<Term Name>
<Term definition>

11. ACRONYMS
[Provide a list of acronyms and associated literal translations used within the document. List the
acronyms in alphabetical order utilizing a tabular format as depicted below.]
<ACRONYM>   <Literal Translation>
CMS         Centers for Medicare & Medicaid Services
ST&E        Security Test and Evaluation
TIR         Test Incident Report
UAT         User Acceptance Testing
VDD         Version Description Document

12. APPENDICES
[Utilize appendices to facilitate ease of use and maintenance of the Test Summary Report. Each
appendix should be referenced in the main body of the document where that information would
normally have been provided. Suggested appendices include (but are not limited to):
a) Resolved Test Incident Reports - include a completed TIR for each unexpected result,
problem, or defect reported and resolved during testing.
b) Unresolved Test Incident Reports - include a completed TIR for each unexpected result,
problem, or defect reported during testing that remains unresolved.]


APPENDIX A: TEST INCIDENT REPORTS


TEST INCIDENT REPORT (TIR)

Test Incident ID: ___________________     Test Incident Date: _____________
Test Case ID: ___________________         Test Incident Time: _____________
Tester Name: ___________________          Tester Phone: _____________

Incident Description:
1. Error message and/or description of unexpected result, problem, or defect. For unexpected
results, describe how the actual results differed from the expected results:
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
2. Test case procedure step where incident occurred, if applicable:
_____________________________________________________________________________
_____________________________________________________________________________
3. Failed software (e.g., program name, screen name, etc.), if known:
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
4. Test case anomalies or special circumstances (e.g., inputs, environment, etc):
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
5. Impact on testing or test item:
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
Description Prepared By: _________________________     Date: ________________



Test Incident ID: ___________________                                TIR - Page 2

Incident Resolution:
1. Incident referred to: __________________________     Date: ________________

2. Incident determined to be the result of:


____ Program Error (Program/Module Name: ___________________________________)
____ Data Error
____ Environmental Problem
3. Impact/Severity level determined to be:
____ High/Severe
____ Moderate/Serious
____ Low/Insignificant
4. Description of all resolution activities:
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________
_____________________________________________________________________________

Resolution Prepared By: __________________________     Date: ________________

