Test Summary Report
Version: <1.0>
Last Modified: <Month Day, Year>
[Test Summary Report Template Version 1.1, August 26, 2008
Approved for use by ESD Deliverables Workgroup]
Blue italicized text enclosed in square brackets (i.e., [text]) provides instructions to the
document author, or describes the intent, assumptions and context for content included in
this document.
Blue text enclosed in angle brackets (i.e., <text>) indicates a field that should be
replaced with information specific to the particular project.
Text and tables in black are provided as boilerplate examples of wording and formats
that may be used or modified as appropriate.
_____________________________________________________________________________________________
Test Summary Report <Version # / Date>
ii
APPROVALS
[Obtain signature approval of the final document from the delivering organization's Project
Manager and the primary CMS recipient (i.e., generally the Government Task Leader (GTL)).
Additional signature lines may be added as needed (e.g., CMS Business Owner).]
Submitting Organization's Approving Authority:
Signature
Printed Name
Date
Phone Number
Signature
Printed Name
Date
Phone Number
REVISION HISTORY
[Use the table below to record information regarding changes made to the document over time.]
Version | Date | Organization/Point of Contact | Description of Changes
1.0 | <mm/dd/yy> | <Organization Identifier / Point-of-Contact Name> | Baseline Version
TABLE OF CONTENTS
1. INTRODUCTION
2. REFERENCED DOCUMENTS
3. OVERVIEW
4. ASSUMPTIONS/CONSTRAINTS/RISKS
4.1. Assumptions
4.2. Constraints
4.3. Risks
5. SUMMARY ASSESSMENT
6. DETAILED TEST RESULTS
6.1. <Test Category/Function>
6.2. <Test Category/Function>
7. VARIANCES
8. TEST INCIDENTS
8.1. Resolved Test Incidents
8.2. Unresolved Test Incidents
9. RECOMMENDATIONS
10. GLOSSARY
11. ACRONYMS
12. APPENDICES
APPENDIX A: TEST INCIDENT REPORTS
LIST OF FIGURES
[Insert a List of Figures appearing within the Test Summary Report along with a page reference
for each identified figure as appropriate. Labels of Figure titles and descriptions are to be
placed centered, above the figure within the main body of the document. All figures must have
an associated tag providing appropriate alternative text for Section 508 compliance.]
<Figure #: Figure Title or Description>
LIST OF TABLES
[Insert a List of Tables appearing within the Test Summary Report along with a page reference
for each identified table as appropriate. Labels of Table titles and descriptions are to be placed
centered, above the table within the main body of the document.]
<Table #: Table Title or Description>
TABLE 1: REFERENCED DOCUMENTS
TABLE 2: TEST CASE SUMMARY RESULTS
TABLE 3: TEST INCIDENT SUMMARY RESULTS
TABLE 4: <TEST CATEGORY/FUNCTION> RESULTS
1. INTRODUCTION
[Provide full identifying information for the automated system, application, or situation to
which the Test Summary Report applies, including, as applicable, identification number(s),
title(s)/name(s), abbreviation(s)/acronym(s), part number(s), version number(s), and release
number(s). Summarize the purpose of the document, the scope of activities that resulted in its
development, the intended audience for the document, and expected evolution of the document.
Also describe any security or privacy considerations associated with use of the Test Summary
Report.]
2. REFERENCED DOCUMENTS
[Summarize the relationship of this document to other relevant documents (e.g., Project Process
Agreement (PPA), Test Plan(s), Test Case Specification, other Test Summary Reports, and
Version Description Document (VDD), if they exist).
Provide identifying information for all documents used to develop, and/or referenced within,
the Test Summary Report (e.g., related and/or companion documents, prerequisite documents,
relevant technical documentation, etc.). For example, an associated Version Description
Document (VDD) should generally be referenced, along with the corresponding Test Plan(s)
and Test Case Specification.]
Table 1: Referenced Documents

Document Name | Document Number | Issuance Date
<document name> | <document's configuration item control number> | <Month Day, Year>
3. OVERVIEW
[Provide a brief description of the testing process employed. Summarize what testing activities
took place, including the versions/releases of the software, environment, etc. Identify the test
functions performed, the test period(s), test location(s), and the test participants and their roles
in the testing process.]
4. ASSUMPTIONS/CONSTRAINTS/RISKS
4.1. Assumptions
[Describe any assumptions and/or dependencies that may have impacted actual testing, test
results, and test summarization.]
4.2. Constraints
[Describe any limitations or constraints that had a significant impact on the testing of the system
and the test results. Such constraints may have been imposed by any of the following (the list is
not exhaustive):
a) Hardware or software environment
b) End-user environment
c) Availability of resources
d) Interoperability requirements
e) Interface/protocol requirements
f) Data repository and distribution requirements]
4.3. Risks
[Describe any risks associated with the test results and proposed mitigation strategies.]
5. SUMMARY ASSESSMENT
[Provide an overall assessment of the build or release tested, with a summary of the test results,
including the number of test incidents summarized by impact/severity level. Include in the
Glossary section of this document operational definitions for each of the reported
impact/severity levels established for the project. If test results are maintained in an automated
tool, the information may be exported or printed from the tool for inclusion in this document.]
The following is a summary of the test case results obtained for the reported test effort. Refer to
subordinate sections of this document for detailed results and explanations of any reported
variances.
Table 2: Test Case Summary Results

Summary Assessment | Total Number Test Cases | % of Total Planned | Comments
Test Cases Planned | <total number of test cases that were planned for the entire reported test effort> | 100% |
Test Cases Run | <total number of test cases that were run (i.e., completed) for the entire reported test effort> | |
Test Cases Reviewed | <total number of test cases that were reviewed for the entire reported test effort> | |
Test Cases Passed | <total number of test cases that passed for the entire reported test effort> | |
Test Cases Failed | <total number of test cases that failed for the entire reported test effort> | |
Test Cases Incomplete | <total number of test cases that were incomplete and still need to be run for the entire reported test effort> | |
Test Cases Held | <total number of test cases that were held (i.e., not attempted) for the entire reported test effort> | |
Table 3: Test Incident Summary Results

Impact/Severity Level | Total Reported | Total # Resolved | % Total Resolved | Total # Unresolved | % Total Unresolved
<project-established impact/severity level (e.g., High/Severe, Moderate/Serious, Low/Insignificant)> | <total number of Test Incident Reports (TIRs) associated with this impact/severity level> | <total number of resolved TIRs associated with this impact/severity level> | <(Total # Resolved TIRs for this impact/severity level divided by total TIRs) multiplied by 100>% | <total number of unresolved TIRs for this impact/severity level> | <(Total # Unresolved TIRs for this impact/severity level divided by total TIRs) multiplied by 100>%
Combined Totals | <total number of TIRs> | <total number of resolved TIRs> | <percent of resolved TIRs to total TIRs>% | <total number of unresolved TIRs> | <percent of unresolved TIRs to total TIRs>%
6. DETAILED TEST RESULTS
6.1. <Test Category/Function>
Table 4: <Test Category/Function> Results

Test Case/Script | Description | Date Tested | Pass/Fail | Comments

[If the test case failed, list the corresponding TIR ID in the Comments column.]
The calculated level of success for <test category/function> was <the percentage of the total
number of test cases defined for the test that passed>%.
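The calculated level of success is the share of defined test cases that passed, expressed as a percentage. A minimal sketch with hypothetical counts (both numbers are illustrative, not from any real test category):

```python
# Hypothetical counts for one test category/function (illustrative only).
test_cases_defined = 40  # test cases defined for this category/function
test_cases_passed = 34   # test cases that passed

# Level of success = (passed / defined) * 100.
level_of_success = test_cases_passed / test_cases_defined * 100
print(f"Calculated level of success: {level_of_success:.0f}%")
```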
7. VARIANCES
[Describe any variances between the testing that was planned and the testing that actually
occurred. Also, explain if the number of planned tests has changed from a previous report. It is
important to account for all planned tests. Also, provide an assessment of the manner in which
the test environment may be different from the operational environment and the effect of this
difference on the test results.]
8. TEST INCIDENTS
[Provide a brief description of the unexpected results, problems, or defects that occurred during
the testing.]
9. RECOMMENDATIONS
[Provide any recommended improvements in the design, operation, or future testing of the
business product that resulted from the testing being reported. A discussion of each
recommendation and its impact on the business product may be provided. If there are no
recommendations to report, then simply state as such.]
10. GLOSSARY
[Provide clear and concise definitions for terms used in the Test Summary Report that may be
unfamiliar to readers of the document. For example, include operational definitions for each of
the reported impact/severity levels established for the project. Terms are to be listed in
alphabetical order.]
<Term Name>
<Term definition>
<Term Name>
<Term definition>
11. ACRONYMS
[Provide a list of acronyms and associated literal translations used within the document. List the
acronyms in alphabetical order utilizing a tabular format as depicted below.]
<ACRONYM> | <Literal Translation>
CMS | Centers for Medicare & Medicaid Services
ST&E | Security Test & Evaluation
TIR | Test Incident Report
UAT | User Acceptance Test
12. APPENDICES
[Utilize appendices to facilitate ease of use and maintenance of the Test Summary Report. Each
appendix should be referenced in the main body of the document where that information would
normally have been provided. Suggested appendices include (but are not limited to):
a) Resolved Test Incident Reports - include a completed TIR for each unexpected result,
problem, or defect reported and resolved during testing.
b) Unresolved Test Incident Reports - include a completed TIR for each unexpected result,
problem, or defect reported during testing that remains unresolved.]
APPENDIX A: TEST INCIDENT REPORTS

Tester Name: ___________________
Tester Phone: _____________
Incident Description:
1. Error message and/or description of unexpected result, problem, or defect. For unexpected
results, describe how the actual results differed from the expected results:
______________________________________________________________________________
______________________________________________________________________________
______________________________________________________________________________
2. Test case procedure step where incident occurred, if applicable:
______________________________________________________________________________
______________________________________________________________________________
3. Failed software (e.g., program name, screen name, etc.), if known:
______________________________________________________________________________
______________________________________________________________________________
4. Test case anomalies or special circumstances (e.g., inputs, environment, etc.):
______________________________________________________________________________
_________________________ Date: ________________
TIR - Page 2
Incident Resolution:
1. Incident referred to:
__________________________ Date: ________________
__________________________ Date: ________________