Digitally printed by
Colourtech, Ashford, Kent
www.colourtechgroup.com
INTERNATIONAL STANDARDS FOR
THE QUALIFICATION OF AEROPLANE
FLIGHT SIMULATORS
THIRD EDITION
JUNE 2005
Evaluation Handbook 3rd Edition
PREFACE
In spite of the technological advances in both the gathering and processing of aircraft
flight test data and the development and proving of mathematical models for flight
simulators, there are still many unknown factors present within any flight test data which
cause considerable difficulties when those data are used in the design of a flight
simulator. One purpose of this document, therefore, is to assist all sections of the
industry in applying "engineering judgement" to those situations. Hopefully, it will also
provide guidance for constructing a Qualification Test Guide and conducting simulator
evaluation tests. This Handbook, known as Volume 1, is complemented by a second,
separate volume aimed at providing guidance in the conduct of the Functions and
Subjective Tests - as distinct from the Validation Tests, which are the primary subject
of this volume.
In 1995, the International Civil Aviation Organisation (ICAO) published the
“Manual of Criteria for the Qualification of Flight Simulators” (hereafter
known as the ICAO Manual). During 2001 an international working group under the
joint chairmanship of the European Joint Aviation Authorities and the United States
Federal Aviation Administration reviewed and modernised the Standards contained in
the Manual. The majority of the validation tests of Appendix B of the ICAO Manual
(what was Appendix 2 in the original RAeS International Standards document) were
revised. A few tests were added, and about an equal number were deleted. This Third
Edition of the Evaluation Handbook is intended to be a companion to the Second
Edition of the ICAO Manual. There are additional improvements to expand the
usefulness of the Handbook both for flight simulator evaluation and for planning and
conducting validation tests.
The document, in common with the general atmosphere of cooperation found within the
flight simulation industry, has been generated from contributions received from
interested parties worldwide. Nevertheless, it is not intended that the Evaluation
Handbook be referred to or quoted as the definitive reference source for
determining policy - that function remains with the regulatory authorities themselves.
It is hoped that this Third Edition of the Handbook will continue to develop as a useful
and 'living' document, and provide some of the background and details on testing and
evaluation needed by those engineers, pilots, managers and regulatory authorities who
are entrusted with the complex task of evaluating an aeroplane flight simulator.
Thanks are once again due to many individuals and organisations within the flight
simulation industry and it seems fair to acknowledge their valued co-operation, without
which this Third Edition could not have been produced.
M I Blackwood
June 2005
DISCLAIMER
The Royal Aeronautical Society does not accept responsibility for the
technical accuracy nor for the opinions expressed within this publication.
Published by
The Royal Aeronautical Society, 4 Hamilton Place, London, W1J 7BQ
AMENDMENT RECORD
2nd Edition (Reformatted for PDF)    01/02/02    First Issue to the World Wide Web
LIST OF CONTENTS
1.0 INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xx
1.1 BACKGROUND . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xx
1.2 SIMULATOR STANDARDS . . . . . . . . . . . . . . . . . . . . . . xx
1.3 VALIDATION TESTS . . . . . . . . . . . . . . . . . . . . . . . . . . . xx
1.4 QTG REVIEW TECHNICAL EVALUATION . . . . . . . . xxiii
1.5 FUNCTIONS AND SUBJECTIVE TESTS . . . . . . . . . xxvi
9.0 REFERENCES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . l
LIST OF FIGURES
There follows a list of diagrams in the order in which they appear. Note that the
diagrams are included for general information only and have been deemed to
form a reasonable cross-section of the listed tests. There is no intention to
suggest that any of them fulfil the precise requirements of the test to which they
belong - indeed many of them have been specifically chosen because they fail
to fulfil those requirements. In any case the reader should be aware they will
differ in many details from those plots encountered during the evaluation of a
given simulator. The comments associated with the diagrams are intended to
either provide insight into that test, or to make a comment which can be applied
more generally across a range of tests.
Section 1
Section 2
Section 3
Section 4
Section 5
Appendices
E-9 Elevator Error for Closed Loop Match, 20% Reduction in Effectiveness . . . E-11
1.0 INTRODUCTION
1.1 BACKGROUND
1.3.1 General
1.3.2 Parameters
The parameter list for a test may vary from simulator to simulator
depending on the way the flight test was conducted, the method of data
gathering used and also on the basic aeroplane configuration. To provide
flexibility to the industry in meeting the Standard, a required parameter list
for each test was not part of the ICAO Manual. (A list of recommended
test parameters for validation tests is included in the IATA Document,
“Flight Simulator Design and Performance Data Requirements”, Appendix
D, Reference 12). In general, parameters plotted must include all those
which have a tolerance applied, all those required to confirm initial
conditions or verify the flight condition, all input parameters throughout the
time history, and any other parameters that may affect the test result. In
essence, all those parameters which are necessary and helpful to
determine the outcome of the test.
The parameters which are being used as the test drivers, for example
pilot control positions, must also be noted. For each Validation Test case
discussed in this Handbook, a suggested list of plot parameters is
provided, though it should be noted that these lists are by no means
definitive in every case. One other comment worth making is that it is not
necessary to provide vast numbers of plots for every test - though in
general the issue has in the past been that too few, rather than too many
plots have been provided.
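The parameter-selection guidance above can be sketched in code. This is a hypothetical illustration only: the field names and data structure are invented for the sketch, not drawn from the ICAO Manual or the IATA document.

```python
# Hypothetical sketch: assemble the plot-parameter list for one validation test.
# Field names and structure are illustrative, not taken from any data standard.

def plot_parameters(test_params):
    """Return the names of parameters that should appear on the test plots:
    anything carrying a tolerance, anything confirming the flight condition,
    and every input (driver) parameter."""
    return [p["name"] for p in test_params
            if p.get("has_tolerance") or p.get("confirms_condition") or p.get("is_input")]

params = [
    {"name": "pitch_angle", "has_tolerance": True},
    {"name": "column_position", "is_input": True},          # test driver
    {"name": "calibrated_airspeed", "confirms_condition": True},
    {"name": "cabin_altitude"},  # irrelevant to this test: not plotted
]
print(plot_parameters(params))
```

The point of the sketch is simply that the selection rule is mechanical once each parameter is tagged; the engineering judgement lies in the tagging.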
The Standard requires proof of match within tolerance for the specified
variables. In every case where performance is out of tolerance and
correct implementation of the simulation mathematical model is ensured,
the first step should be to bring this condition to the attention of the data
provider. If the data provider is unavailable or is unable to resolve the
problem, and since it is ultimately the responsibility of the operator to
demonstrate the performance of the simulator, the operator may choose
to finely tune the mathematical model to bring the performance within
tolerance or else to search for another flight data sample which supports
a test that will meet the Standard. If fine-tuning proves impossible, or
alternative data do not exist, and at the discretion of the evaluator
(meaning the regulatory authority in charge of the evaluation),
performance mismatches may be discussed with a view to the use of
rational engineering judgement which may provide a reason for the
mismatch. Such reasons should typically only be provided for extremely
short duration excursions which can clearly be explained by reference to
other presented flight data or accounted for by an acceptable, logical
explanation. This type of procedure would address situations where short
duration variations in the flight data (for example a significant wind gust,
which may not be fully accommodated in the flight model or the test data)
cause an excursion in a performance parameter outside of the tolerance.
Out-of-tolerance results which are to be justified in this manner should not
cover a significant proportion of the test parameters, or of the time history.
Ideally, the out-of-tolerance excursion should not last more than 10% of
the pertinent portion of the time history. Such examples of the use of
engineering judgement are to be adequately documented with each test.
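The 10% guideline above can be illustrated with a simple calculation. This is a sketch only: it assumes a uniformly sampled time history and a single symmetric tolerance band, which a real QTG test may not have.

```python
# Sketch of the 10% guideline: measure how long a parameter stays outside
# tolerance relative to the pertinent portion of the time history.
# A fixed sample rate is assumed for simplicity.

def out_of_tolerance_fraction(flight, simulator, tolerance):
    """Fraction of samples where |simulator - flight| exceeds the tolerance."""
    excursions = sum(1 for f, s in zip(flight, simulator) if abs(s - f) > tolerance)
    return excursions / len(flight)

flight    = [2.0, 2.1, 2.2, 2.3, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9]
simulator = [2.0, 2.1, 2.2, 3.1, 2.4, 2.5, 2.6, 2.7, 2.8, 2.9]  # one gust-like spike

fraction = out_of_tolerance_fraction(flight, simulator, tolerance=0.5)
print(f"{fraction:.0%} of the time history out of tolerance")  # prints "10% ..."
```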
The main value of automatic testing is the ease and rapidity of conducting
tests and the inherent repeatability. Care must be exercised in the design
and implementation of automatic tests to ensure that the objective of
repeatability does not overshadow the objective of the test itself; i.e.
validation of a feature of the simulator. For more information, see Section
2.0 on page xxvii.
Manual testing serves to cross check and verify automatic testing and is,
therefore, very important to simulator validation. These tests must be run
“end-to-end” and without any backdriven parameters to confirm correct
implementation of the automatic tests as well as compliant simulator
performance. The manual test procedure listed for the test must be
complete enough so that the pilot evaluator can conduct the test using
The Standard demands that all manual tests must meet the same
performance match as the automatic tests. To complete all the QTG
Validation Tests manually to an exact performance match would be a time
consuming task; therefore, at the discretion of the evaluator, manual test
results can be accepted that do not overlay the test data provided that a
logical interpretation of the results indicates a performance match. For
example, if a stick shaker speed is the performance parameter required,
the time history need only show that the simulator configuration and the
environment matched the test conditions, and that the aircraft was in a
steady, 1 knot/sec deceleration at the time the shaker activated. Where
multiple parameters must be in tolerance, each excursion beyond
tolerance of any parameter must be explainable by deviations in other
presented parameters. These excursions should be of short duration, i.e.
less than 10% of the pertinent time history. Failing these rationalisations
of results, the test must be rerun until a suitable match is obtained or until
it is determined that a problem with the test exists. For more information,
see Section 4.0 on page xxxv.
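As a sketch of the stick-shaker example, the steady deceleration leading up to shaker activation can be estimated directly from the recorded time history. The sample times and speeds below are invented for illustration.

```python
# Illustrative check: confirm the aircraft held a steady deceleration of about
# 1 knot/second up to stick-shaker activation. Sample values are invented.

def mean_deceleration(times_s, speeds_kt):
    """Average deceleration (kt/s) over the recorded interval."""
    return (speeds_kt[0] - speeds_kt[-1]) / (times_s[-1] - times_s[0])

times  = [0.0, 5.0, 10.0, 15.0, 20.0]       # seconds up to shaker activation
speeds = [140.0, 135.1, 129.9, 125.0, 120.1]  # roughly 1 kt/s bleed rate

decel = mean_deceleration(times, speeds)
print(f"mean deceleration: {decel:.2f} kt/s")
```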
1.4.1 General
First check to see if the QTG is complete. Are all the required tests
presented, all the statements of compliance provided, and all the pertinent
information on the device provided?
b) the procedure matched the flight data, i.e. are the actions of the
test pilot being accurately duplicated?
Once it is determined that the test meets the objective then the following
should be reviewed:
ii) Data Sources - are the correct data sources nominated and
included in the QTG? Are the data presented adequate to prove
the test? Overplotting flight test data on simulator data is the
preferred method of test presentation. However, overplots alone are
insufficient to meet the data presentation requirements. Copies
of the reference data must also be part of the QTG. With the
advent of Electronic Qualification Test Guides, it may be that the
‘copies’ of the data are on a CD or other media only, but they
should nevertheless be included.
2.1 INTRODUCTION
In the early days of Approval Test Guide (“ATG”, the term has now been
superseded by Qualification Test Guide, “QTG”) evaluations, the
validation tests would all be run manually, including the setting up of the
simulator initial conditions. Better design techniques, together with the
regulatory authority requirements for recurrent testing of the simulator at
regular intervals, have enabled all validation tests to be performed
automatically, using computer-driven stimuli.
The principles of automatic testing are not difficult to understand once the
basic testing requirements have been analysed. Essentially, the QTG
validation tests fall into two fundamental categories:
The evaluator will be looking for a set of results which demonstrates that
the test criteria have been met. Short term excursions exceeding the
tolerances may well be acceptable, but ideally, all validation tests will
remain within the specified tolerances for the entire duration of the test.
However, in reality this may not always be the case because of the many
vagaries of the flight test data, of the atmosphere and of the mathematical
model itself. It is under these circumstances that engineering judgement
must be used by the evaluator.
The 2nd edition of the ICAO Manual introduces much more definitive
criteria for QTG tests which have been developed and run against
engineering resource data rather than flight test resource data.
Specifically, the tolerances to be used have been reduced to 20% of
those which would apply against actual flight test parameters. This
requirement is also relevant to, for example, tests based on engineering
data which has been used for backup purposes because the flight test
data has been supplied with a different engine or avionics fit.
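A minimal sketch of this reduced-tolerance rule, assuming a simple scalar tolerance; the function and argument names are illustrative, not from the ICAO Manual.

```python
# Sketch: tests validated against engineering data use 20% of the
# flight-test tolerance, per the 2nd edition of the ICAO Manual.

def applicable_tolerance(flight_test_tolerance, data_source):
    """Return the tolerance to apply for the given reference data source."""
    if data_source == "engineering":
        return 0.2 * flight_test_tolerance
    return flight_test_tolerance

print(f"{applicable_tolerance(1.5, 'flight_test'):.2f}")   # prints 1.50
print(f"{applicable_tolerance(1.5, 'engineering'):.2f}")   # prints 0.30
```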
3.1 INTRODUCTION
Given aircraft manufacturer's flight test data, it might be assumed that the
method of performing a particular test for a simulator QTG is obvious. In
practice, however, this is not necessarily the case, largely because of the
challenges experienced when any mathematical model is applied to
represent a real-world situation.
3.2 BACKGROUND
Over time, it has become clear that the validation tests which are to be
run in the simulator need to be run automatically, primarily for the
following reasons:
b) When flying the simulator, pilots have difficulty matching the exact
inputs of the flight test pilot for a given test. This becomes very
time intensive.
This in itself does not mean that a test run in-house cannot be carried
across to the simulator, or can be carried across only with a low degree of confidence.
Indeed for many QTG tests it makes very little difference whether it is run
on a fully integrated and functioning simulator if one ignores any
requirement to make the test ‘look good’ from an aesthetic point of view
in the cockpit. A prime example is a minimum radius turn, however it may
be driven, for which the results should look identical with or without the
simulator hardware.
In an ideal world, there would either be a single engine variant for a given
aircraft type, or else the simulator data package would contain sufficient
flight test data for a complete QTG to be generated for each
aircraft-plus-engine combination. For example, for one large jet transport
aeroplane, the flight test was performed with a Pratt & Whitney engine,
whereas some operators use General Electric engines and others use
Rolls-Royce engines. The simulator data are presented in terms of thrust,
not EPR or PLA/CSA (TRA for a FADEC engine), and in fact a Gross
Thrust/Ram Drag model is required for the tests to work properly. Thrust
overwrites do solve this problem, but more rigorous testing of the engine
model is really needed and can only be achieved by driving a parameter
which is much closer to the pilot's controls, i.e. Power Lever Angle,
Cross-Shaft Angle, Throttle Resolver Angle etc. Sometimes the only
solution in the simulator is to set and/or drive the throttles to arbitrary
values which produce the requisite levels of net thrust.
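Setting the throttles to "arbitrary values which produce the requisite levels of net thrust" is essentially a root-finding exercise against the engine model. The sketch below uses bisection with an invented linear stand-in for the engine model; a real simulator engine model is nonlinear and would be queried in place of `engine_net_thrust`.

```python
# Hypothetical sketch: bisect on throttle resolver angle (TRA) until the
# engine model's net thrust matches the data-package value.

def engine_net_thrust(tra_deg):
    """Placeholder engine model: net thrust (lbf) vs TRA. Invented and linear;
    a real engine model is nonlinear and state-dependent."""
    return 600.0 * tra_deg

def tra_for_thrust(target_lbf, lo=0.0, hi=85.0, tol=1.0):
    """Bisect TRA until modelled net thrust is within tol (lbf) of the target.
    Assumes thrust increases monotonically with TRA over [lo, hi]."""
    while True:
        mid = 0.5 * (lo + hi)
        thrust = engine_net_thrust(mid)
        if abs(thrust - target_lbf) <= tol:
            return mid
        if thrust < target_lbf:
            lo = mid
        else:
            hi = mid

tra = tra_for_thrust(24000.0)
print(f"TRA {tra:.2f} deg gives {engine_net_thrust(tra):.0f} lbf")
```

The drawbacks noted in the next paragraph apply directly: the TRA found this way is specific to one engine model build, so any engine software modification can invalidate the stored drive values.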
The disadvantages of driving the simulated engines in this way are firstly
that slight differences in an engine type (e.g. derated engine) mean that
a different TRA/PLA etc may be required for each simulator and secondly
that the QTG tests can become highly susceptible to any modifications
made to the engine system software in the simulator. Also, engine
characteristics differ slightly from one engine to another even with
engines of the same type - and they change over time. The flight test
engines may not be representative of an ‘average’ engine in the fleet.
Modern jet transport fleets rarely retain the same avionics systems
throughout their useful life, and the flight simulator will reflect this. For
example, the installation of a TCAS system in the aeroplane is almost
always followed by a similar installation in the simulator. Whilst such
systems may not affect performance or handling as such, the QTG must
be flexible enough to take account of such changes, at least by
referencing their use in the Functions and Subjective section. Other types
of avionics changes may have a significant effect on the aeroplane
4.1 INTRODUCTION
In the past, attention has sometimes been lacking concerning the clarity
and practicality of QTG manual test procedures. Therefore, it is
considered both useful and necessary to redefine the objectives and
emphasise the techniques when performing QTG tests manually. The
regulatory authorities consider the ability to run each test manually an
important feature of simulator testing, and one which should certainly not
be discarded in favour of the tendency towards automatic testing. Both
methods should be treated with equal importance.
4.2 OBJECTIVES
4.3 TECHNIQUES
more behind the pilot's seat should be borne in mind and avoided. The
presence of a second crew member may help in these circumstances, as
it is possible that access to values that are not instrumented in the flight
deck may be necessary (e.g. rudder position, engine thrust, etc.).
second person may also be able to assist so that the tasks can be divided
or by giving call-outs.
For some tests, it can be helpful if the exact technique differs slightly from
the way the test is run automatically. One possible example is climb
tests, for which it is considerably easier to run the test manually if the
initial altitude is set to, say, 1000 feet below the altitude at which the test
officially begins, so as to allow the pilot to stabilise the condition prior to
recording. Also, the use of the autopilot might be considered in this case.
All instructions should ideally reference one parameter only. This will
usually be time, but can also be height or airspeed depending upon the
test being flown. Complicated or long-duration manoeuvres are usually
best described by simplifying or removing the less important details in
favour of allowing the pilot to understand what he is trying to achieve.
Whilst this may result in the pilot’s actions not precisely replicating the test
data, it is normally much easier to examine the simulator results for the
correct trends, and will often save testing time by alleviating the need for
several attempts at the same manoeuvre.
When carrying out a manual QTG check the pilot should first observe an
automatic check and note throttle and control positions, rates and any
relevant indications before starting the manual test. It may also be worth
checking that the pedals are aligned with zero rudder, as some earlier
simulators were prone to lose their datum with wear.
5.1 DISCUSSION
Functions tests should be run in a logical flight sequence at the same time
as performance and handling assessments. This also permits real time
simulator running for 2 to 3 hours, without repositioning or flight/position
freeze, thereby providing (at least to a degree) proof of
reliability.
The ground and flight tests and other checks required for qualification are
listed in the ICAO Manual, Appendix C, table of functions and subjective
tests. The table includes manoeuvres and procedures to ensure that the
simulator functions and performs appropriately for use in pilot training and
checking in the manoeuvres and procedures normally required of a
training and checking program.
6.1 GENERAL
The way that the handling qualities of these aeroplanes are experienced
by the pilot may therefore depend on the operating mode of these
computers or even the sensor inputs they use or the state of the hydraulic
systems required to actually move the surfaces.
6.2 DISCUSSION
Tests in the non-normal state should always include the least augmented
state. Tests for other levels of control state degradation may be required
if significantly different handling qualities result from these states. The
An aeroplane where the pilot inputs to the primary control surfaces are
transferred and augmented via computers.
Normal control is the state where the intended control, augmentation and
protection functions for the activation of the primary control surfaces are
fully available.
Note:
N - Normal Control
NN - Non-normal Control
6.6 NOTES
c) Where "NN" appears it indicates that test data must be provided for
one or more non-normal control states, including the least
augmented state, if (and only if) the envelope protection is still active,
but different in this degraded mode.
There have been many examples of engineers losing sight of basic test
accuracies through using line printer formats that give aeroplane variables
to untold decimal places. Outside air temperature to 3 decimal places,
centre of gravity to 3 or 4 decimal places, pitch angle and angle of attack
to 3 decimal places, etc. Acceptable tabulated simulator result accuracies
should be defined in order to keep the engineers 'on track' and also give
a better hard copy result.
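One way to keep tabulated output 'on track' is to declare an acceptable print precision per parameter and format every result through it. The parameter names and precisions below are examples only, not prescribed values.

```python
# Illustrative fix for the over-precision problem: a declared print precision
# per parameter, applied uniformly to tabulated results.

PRECISION = {
    "oat_degc": 1,        # outside air temperature: 0.1 degC is plenty
    "cg_percent_mac": 1,  # centre of gravity
    "pitch_deg": 1,
    "altitude_ft": 0,
}

def format_result(name, value):
    """Format one tabulated value to its declared precision."""
    return f"{name}: {value:.{PRECISION[name]}f}"

print(format_result("oat_degc", 15.0284))      # "oat_degc: 15.0"
print(format_result("altitude_ft", 10003.62))  # "altitude_ft: 10004"
```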
Instances also abound of data being plotted to scales that do not allow
proper or easy interpretation.
Usually the most convenient method is to plot to the same scales as the
data supplier's hard copy, but there may be occasions where expanding or
contracting the scales facilitates better understanding of the results.
It is possible that the only available flight test results have problems, or that
no flight-test data exist for some parameters or tests.
Some examples:
c) No flight-test data available - use alternate data (see list below for
recommended order of priority) and request exemption from the
regulatory authority.
d) Certain variables not available from the flight test - request use of an
alternate variable or plot the simulator variable and use a rationale
to justify the value.
For each test a set of basic parameters should be printed out which defines
the initial conditions for the test. These conditions should include:
Additional parameter time histories are required as defined for the test.
Appendix D of the IATA Document, “Flight Simulator Design and
Performance Data Requirements” (Reference 12) provides a list of
minimum parameters that are recommended for each validation test. The
main section of this Handbook also gives guidance on a test-by-test basis.
8.1 GENERAL
The CCS should clearly establish what changes may be made to the
simulator during the course of simulator maintenance. This should cover
both hardware and software changes that deviate from the simulator
configuration as qualified. The system should provide procedures to
advise management of any maintenance items requiring modification to
correct a discrepancy and the proposed change should be reviewed to
ensure it does not alter the configuration of the simulator as qualified.
The SCM should provide backup and recovery procedures which enable
the operator to recover from unintended software losses.
9.0 REFERENCES
The following list is not exhaustive but is indicative of the major documents
which were used in compiling the International Standards for the
Qualification of Aeroplane Flight Simulators published by the RAeS in 1992
and subsequently released by ICAO as the “Manual of Criteria for the
Qualification of Flight Simulators” in 1995 and revised in 2003. For
completeness, the ICAO Manual itself is included as Reference 19.
This Handbook recognises that several of the above documents are now
essentially obsolete, but they have been left in for the benefit of those readers
who may wish to know the history of the ICAO Manual and its requirements.
This Handbook is the product of much thought and effort by a large number
of people within the flight simulator industry. The international working
group convened under the auspices of the Royal Aeronautical Society in
1990 met on a total of 5 occasions plus additional meetings of various
subcommittees. In 2001, an international working group was convened
under the co-sponsorship of the European Joint Aviation Authorities and
the U.S. Federal Aviation Administration to develop the Second Edition of
the ICAO Manual. The total membership of both these working groups
consisted of the following organisations:
Aeroformation
Air France
American Airlines
Ansett Airlines of Australia
British Airways
Delta Airlines
Deutsche Lufthansa
FlightSafety Boeing
GECAT
KLM
Monarch Airlines
Northwest Airlines
QANTAS
SAS
Simuflite Training International
Swissair
United Airlines
United Parcel Service
US Air
10.5 OTHERS
The author would also especially like to thank the following individuals for
the valuable input they have given to this 3rd edition:
Ian Bateman
Bob Curnutt
Gerry Elloy
Murph Morrison
Ken Neville
Andy Ramsden
Ron Sarich
Dave Shikany
Each test prescribed in Appendix B of the ICAO Manual is listed in the INDEX on
the following pages, and described in the Sections that follow the index. In both
the index and the Sections following, each test is assigned a number or
designator which corresponds to the designation in the ICAO Manual "Table of
Validation Tests". For example, in the ICAO Manual, the first main test heading
is "1. PERFORMANCE". The first sub-heading is "a) taxi", and the first test
listed is "(1) Minimum Radius Turn".
In this Handbook, the first section is Section 1, Performance and the contents of
Section 1 are 1a Taxi, 1b Takeoff and so on. Section 1a Taxi lists tests 1a(1),
Minimum Radius Turn and 1a(2), Rate of Turn versus Nosewheel Steering Angle.
This Handbook is arranged, therefore, so that each test designator corresponds
to the same test in the ICAO Manual of Criteria for the Qualification of Flight
Simulators (Reference 19).
The "PAGE" referred to below is with reference to the sections of this handbook
which follow, the purpose of which is to give general guidance on individual tests.
1 PERFORMANCE 1-1
1a TAXI 1A-1
1b TAKEOFF 1B-1
1c CLIMB 1C-1
1e STOPPING 1E-1
1f ENGINES 1F-1
2c LONGITUDINAL 2C-1
2e LANDINGS 2E-1
2g WINDSHEAR 2G-1
* all tests should be run in both normal and non-normal control states where
the function is different.
12.1 GENERAL
The Evaluation Criteria for a particular test within a simulator QTG may
well be very specific to that test or that aeroplane, and take account of
anomalies in the data, etc. To use the word 'Criteria' in a document which
by its very nature can only be generic may be misleading.
Finally, many sections contain example test results, and for many of these
it was decided to choose marginal failure cases, as there seemed little
merit in producing a Handbook in which all cases met the criteria perfectly,
since to do so would not be representative. The nature of the engineering
task associated with generating a QTG is such that it would be unheard
of to produce a document that invites no comments whatsoever, but it is
not the intent of this Handbook to suggest that QTGs should contain
results that fail so that the regulators can find and comment on them. The
examples given are in some cases illustrations of things that can go wrong
during test development, and in others merely a comment on the lack of
perfection which will, for the foreseeable future, always be inherent in the
flight simulator evaluation process. That, after all, is the reason for this
Handbook.
Note that any set of results shown by example in this Handbook can only
be a sub-set of the full results for that test, as space does not permit full
sets of results to be included. Also, the comments accompanying these
plots may in many cases be applied to other tests as well - the intention
is to give a fair cross-section of possible problems which may arise during
QTG development in the hope of promoting increased understanding of
the task and thereby a greater ability to evaluate.
SECTION 1
PERFORMANCE
1a TAXI
1b TAKEOFF
1c CLIMB
1d CRUISE/DESCENT
1e STOPPING
1f ENGINES
Concerning the engine variant, it is often the case that little or no
flight test data are available (for simulator use) from the aeroplane
manufacturer with the airframe/engine combination operated by the
purchaser of the simulator. Under these circumstances it may be
necessary to present the engine information in terms of thrust (or torque,
etc) but this does not remove any need to also show the engine
parameter displayed to the flight crew so that it can be proven that the
QTG test is being run with the engines (and other) mathematical model
included as comprehensively as possible.
SECTION 1a
TAXI
EVALUATION NOTES
In the past, Operations Manual data were often used for this check, since these data were not typically recorded during an aeroplane flight test program. More recently, however, Ops Manual data are not used, being replaced by time histories of the relevant ground steering information, including the locus of the cg calculated from flight data and the paths of the nosewheel and each main gear, which are plotted
MANUAL TESTING
The simulated aeroplane should be aligned with either left or right main landing
gear nearest to the runway edge to ensure enough runway width for the
manoeuvre. The aeroplane speed should be kept constant (using engine
power as necessary) at a value just sufficient for the manoeuvre (typically less
than 10 knots, so that the tyre slip angle is kept to a minimum) and the tiller
should be at its maximum deflection.
EXAMPLE
Figure 1a1-1 shows some of the time history plots for an older simulator. The
results look very good versus the aeroplane data, but the calculation of the
turning radius should be shown along with these results so that the simulator
can be compared with the aeroplane data.
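The turning-radius calculation referred to above can be derived from the recorded ground speed and yaw rate once the turn has stabilised. The sketch below is illustrative only; the function name and the sample values are hypothetical rather than taken from any data package.

```python
import math

def turn_radius_m(ground_speed_kt: float, yaw_rate_deg_s: float) -> float:
    """Instantaneous turn radius of the cg, R = V / omega.

    Ground speed is converted from knots to m/s and yaw rate from
    deg/s to rad/s before the ratio is taken.
    """
    v_ms = ground_speed_kt * 0.51444            # knots -> m/s
    omega_rad_s = math.radians(yaw_rate_deg_s)  # deg/s -> rad/s
    return v_ms / omega_rad_s

# Hypothetical steady-state values during a minimum radius turn:
# ~8 kt ground speed with ~6 deg/s yaw rate.
print(f"cg turn radius: {turn_radius_m(8.0, 6.0):.1f} m")
```

Applying the same calculation to both the simulator and the aeroplane records allows the two radii to be compared directly.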
Figure 1a1-1
Example of Simulator Test Results for Minimum Radius Turn
Figure 1a1-2
Example of OPS Manual Data for Minimum Turn Radius
TEST 1a(2): RATE OF TURN VERSUS NOSEWHEEL STEERING ANGLE
EVALUATION NOTES
Compare the simulator yaw rate with that of the
aeroplane for the various combinations of nosewheel
angle and ground speeds provided. The simulator
ground speed is typically driven to the flight test
values to ensure the test conditions are correct. In
some older data packages, it may have been the
case that neither the engine power settings (except
perhaps thrust) nor runway slope and condition were
specified in the flight test data. Nevertheless the
evaluator should be satisfied that no external influences such as crosswinds or asymmetric engine thrust are affecting the results.
MANUAL TESTING
EXAMPLE
Figure 1a2-1 shows a continuous time history, but the ground speed only
varies from around 20 knots down to 18.5 knots, and so does not fulfil the
requirements, though the time histories shown do meet the tolerances.
Figure 1a2-1
Example of Simulator Test Results for Rate of Turn versus Nosewheel Steering Angle
SECTION 1b
TAKEOFF
Most flight test packages for simulator use provide data for more than
one normal takeoff test, though for some aeroplanes there is only one
test available for the engine inoperative and/or the crosswind takeoffs.
The regulatory authorities are concerned that a spread of test data be used to validate the simulator, identifying the correct characteristics across all commonly used takeoff flap settings, so care must be taken
when choosing which set of data is to be used for the QTG for tests
1b(3), 1b(4), 1b(5) and 1b(6).
Often this choice will be limited to which ‘normal’ takeoff data to use,
but if, as is the case in at least one data package, the only available
engine inoperative and crosswind takeoff tests were performed at the
same flap setting, then it will be necessary to use separate flap settings
for the maximum and light weight scenarios required for test 1b(4).
However, the requirements do allow for use of the same test data for
both the Ground Acceleration test (1b(1)) and either the Normal Takeoff
test (1b(4)) or the Rejected Takeoff test (1b(7)), since the former test
must start at brake release and therefore includes the entire ground roll.
TEST 1b(1): GROUND ACCELERATION TIME AND DISTANCE
RECORDED PARAMETERS:
AIRSPEED
GROUND SPEED
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
PITCH ANGLE
STABILISER ANGLE
WIND SPEED COMPONENTS
ENGINES KEY PARAMETERS
DISTANCE ALONG RUNWAY
MANUAL TESTING
EXAMPLE
The test results shown in Figures 1b1-1 and 1b1-2 show good correlation
between the aeroplane and the simulator. The engine thrust could perhaps
have been slightly better matched by driving the power levers more judiciously
to give the desired thrust increase at approximately 15 seconds, but the total distance for the simulator is only a little lower than for the aeroplane and so correlates well. The airspeed clearly shows that there was some wind present
during the manoeuvre.
Figure 1b1-1
Example of Simulator Test Results for Ground Acceleration Time & Distance (Part 1)
Figure 1b1-2
Example of Simulator Test Results for Ground Acceleration Time & Distance (Part 2)
TEST 1b(2): MINIMUM CONTROL SPEED, GROUND
RECORDED PARAMETERS:
AIRSPEED
ENGINES KEY PARAMETERS
LATERAL DEVIATION FROM RUNWAY CENTRELINE
RUDDER PEDAL POSITION
RUDDER ANGLE
NOSEWHEEL ANGLE
ROLL CONTROLLER POSITION
AILERON ANGLE
SPOILER ANGLES
SIDESLIP ANGLE
HEADING ANGLE
YAW RATE
BANK ANGLE
WIND SPEED COMPONENTS
RUDDER PEDAL FORCE (IF REVERSIBLE CONTROLS)
MANUAL TESTING
The minimum control speed test (or equivalent) is very dependent on the timely
input of both the engine failure (fuel cut/throttle slam to idle, etc.) and the
appropriate rudder input at the required speed to correct the resultant yaw.
Thus it will almost certainly be necessary for some form of assistance to be
obtained by most pilots when manually running this test on the simulator. The
normal procedure will be to commence a normal takeoff roll with the power set
accurately and allow the pilot flying to concentrate on maintaining control of the
simulator. At the requisite speed the engine should be cut such that the thrust
decay compares favourably with the data and, immediately upon recognition of
this failure, the pilot should input full rudder to arrest the yaw such that the
lateral deviation stops increasing (or even decreases) and then regain and
maintain directional control. It is not necessary to perform the takeoff itself, but
the test should be terminated once it is confirmed that directional control has
been regained.
EXAMPLE
The test result shown in Figure 1b2-1 shows a very poor match for lateral
distance. Examination of the pilot input traces for engine thrust and rudder
position clearly reveals that the engine thrust decay begins approximately one-
third of a second late. The effect is that the rudder is driven correctly in relation
to time and airspeed but not in relation to the state of the engine, and the
simulated aeroplane responds to the rudder before it should. This is a classic
example of why the engine failure needs to be within ±1kt of the aeroplane
data, but there may be certain occasions and certain simulators for which the
timing of the engine failure is even more critical than this. Here, the flight test
shows the engine failure to be at approximately 94 knots, but the simulator
engine failure is around 95.5 knots, so the difference is only slightly greater
than the requisite 1 knot, yet the test still fails badly.
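The timing comparison described above reduces to a simple tolerance check on the engine failure speed. The following is a minimal sketch using the 1 knot figure and the speeds quoted in this example; the function name is hypothetical.

```python
def engine_failure_speed_ok(sim_kt: float, flight_test_kt: float,
                            tolerance_kt: float = 1.0) -> bool:
    """True if the simulator engine failure speed lies within the
    stated tolerance of the flight test value."""
    return abs(sim_kt - flight_test_kt) <= tolerance_kt

# Values from the example: flight test ~94 kt, simulator ~95.5 kt,
# a difference of 1.5 kt, which exceeds the 1 kt tolerance.
print(engine_failure_speed_ok(95.5, 94.0))
```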
Figure 1b2-1
Example of Simulator Test Results for Minimum Control Speed, Ground
TEST 1b(3): MINIMUM UNSTICK SPEED
Note that it is Vmu itself which should be within 3 kt; it is not sufficient for the airspeed trace to remain within 3 kt while unstick occurs, for example, several seconds late. Conversely, if the airspeed match does not hold within 3 kt throughout the time history (within reason), but unstick is on-speed and on-pitch, then the test is a pass.
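One way to apply the note above is to locate the unstick point in each record and compare the airspeeds at those points, rather than demanding that the whole airspeed trace stay within the band. The following is a sketch only; the gear-height threshold and the sample records are hypothetical.

```python
def unstick_speed(time_s, airspeed_kt, gear_height_m, threshold_m=0.01):
    """Return (time, airspeed) at unstick, taken as the first sample
    where main gear height above ground becomes positive."""
    for t, v, h in zip(time_s, airspeed_kt, gear_height_m):
        if h > threshold_m:
            return t, v
    raise ValueError("no unstick point found in this record")

# Hypothetical 1 Hz records around rotation:
t = [0, 1, 2, 3, 4]
aeroplane = unstick_speed(t, [118, 120, 122, 124, 126], [0, 0, 0, 0.05, 0.4])
simulator = unstick_speed(t, [117, 119, 121, 123, 125], [0, 0, 0, 0, 0.06])
# The simulator unsticks one second later but only 1 kt different,
# so the Vmu comparison itself is within 3 kt.
print("within 3 kt:", abs(simulator[1] - aeroplane[1]) <= 3.0)
```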
MANUAL TESTING
The easiest way to run this test manually is to begin the takeoff roll from the
static condition. This will enable the power to be set accurately well in advance
of the elevator being applied. The plotting need not begin until 10 knots before
the start of rotation, though. The pitch control input must match the aeroplane data
as precisely as possible or the test is unlikely to pass. This may or may not
mean that full nose up elevator is applied - typically the elevator is relaxed by
the flight test pilot in order to gently achieve ground contact pitch attitude. The
test may be concluded no sooner than 5 seconds after the main gear has left
the ground.
EXAMPLE
Figure 1b3-1
Example of Simulator Test Results for Minimum Unstick Speed
TEST 1b(4): NORMAL TAKEOFF
RECORDED PARAMETERS:
AIRSPEED
PITCH CONTROLLER POSITION
PITCH CONTROLLER FORCE (IF REVERSIBLE CONTROLS)
ELEVATOR ANGLE
MAIN GEAR HEIGHT ABOVE GROUND/RADIO ALTITUDE
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
RATE OF CLIMB
FLIGHT PATH ANGLE
LANDING GEAR POSITION
STABILISER ANGLE
BANK ANGLE
ROLL CONTROLLER POSITION
AILERON ANGLE(S)
SPOILER ANGLES
WIND SPEED COMPONENTS
ENGINES KEY PARAMETERS
MANUAL TESTING
The typical form of the aeroplane flight test data will require the pilot to perform
a normal takeoff with the requisite preset conditions. The main issues in this
test are controllability on the runway, rotation forces at Vr and ground effects.
Careful reference to the flight test data may reveal some abnormality in the
method used during the flight test itself and hence the "feel" of this particular
test should not be used solely for any criticism of the simulated takeoff
performance. Ideally the data should call for the full regime from brake release
through at least 61m (200ft) above ground level, but if this is not the case the
manual test will probably be easier to execute when carried out in this fashion,
rather than beginning the test at a preset speed.
Obtaining a good match of airspeed, pitch angle and angle of attack may take
several attempts as it will be very difficult to follow all of the flight test pilot's
inputs simultaneously. Comparison of the results should therefore take this into
account.
EXAMPLE
The results shown in Figure 1b4-1 are generally quite good, but also illustrate
the principle mentioned above, that it can be difficult to precisely match all
parameters within tolerance continuously for the entire duration of the test.
Ideally for a normal takeoff, there would be no wind, but in this test there is at
least one wind component which exhibits gusting of several knots, and it may
be this that accounts for the larger-than-desired bank angle excursion on liftoff.
Bank angle is, strictly speaking, not a toleranced parameter for a normal
takeoff test, but the regulatory authorities may choose not to ignore it as being
relevant if its value differs significantly from the aeroplane data. Another
feature of these plots is the scale used for the radio altitude trace, which has
been poorly chosen such that it is difficult to determine compliance with the
±20ft tolerance.
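A continuous-band comparison such as the ±20 ft radio altitude tolerance mentioned above can be checked numerically rather than read off a poorly scaled plot, provided both records share a common time base. This sketch is illustrative and the sample values are hypothetical.

```python
def outside_band(sim, ref, tol):
    """Indices at which the simulator trace falls outside ref +/- tol."""
    return [i for i, (s, r) in enumerate(zip(sim, ref)) if abs(s - r) > tol]

# Hypothetical radio altitude samples (ft) on a shared time base:
aeroplane_ft = [0, 5, 18, 42, 80, 130]
simulator_ft = [0, 4, 15, 40, 105, 140]
violations = outside_band(simulator_ft, aeroplane_ft, tol=20.0)
print("samples outside +/-20 ft:", violations)
```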
Figure 1b4-1
Example of Simulator Test Results for Normal Takeoff
TEST 1b(5): ENGINE INOPERATIVE TAKEOFF
MANUAL TESTING
The data will usually provide for the use of normal takeoff procedures up to the
point at which the engine is failed. The timing of the failure is obviously critical,
and whilst the ICAO Manual allows the engine failure speed to be within ±3
knots of the flight test speed, the reality is that a speed much more accurate
than this is needed in order to facilitate a reasonable comparison. Reasons for
this are explained in the comments for test 1b(2). After the engine has been
failed, the pilot should continue with the takeoff manoeuvre while maintaining
and assessing both bank and heading control. The entire takeoff profile should
be recorded from brake release through 61m (200ft) above ground level. See
notes for the Normal Takeoff test regarding achievement of a good match.
EXAMPLE
The results shown in Figure 1b5-1 again illustrate the point that it can be
problematic to fully comply with the tolerances for the entire test duration. The
sideslip angle deviates from the requisite ±2 degrees, but this could probably
be improved by a better match of bank angle (and therefore heading, not
shown). Interestingly, however, the captain’s wheel position is already
significantly greater than the aeroplane data, and using wheel to reduce the
bank angle would make this even worse. The methods used to improve such
results as these can be very complex, but usually entail fine adjustments of
several parameters - particularly the airspeed and the liftoff point in this
particular case.
Figure 1b5-1
Example of Simulator Test Results for Engine Inoperative Takeoff
TEST 1b(6): CROSSWIND TAKEOFF
RECORDED PARAMETERS:
AIRSPEED
ROLL CONTROLLER POSITION
CONTROL WHEEL FORCE (IF REVERSIBLE CONTROLS)
BANK ANGLE
ROLL RATE
AILERON ANGLE(S)
SPOILER ANGLES
RUDDER PEDAL POSITION
RUDDER PEDAL FORCE (IF REVERSIBLE CONTROLS)
RUDDER ANGLE
ELEVATOR ANGLE
PITCH CONTROLLER POSITION
LONGITUDINAL CONTROL FORCE (IF REVERSIBLE CONTROLS)
STABILISER ANGLE
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
MAIN GEAR HEIGHT ABOVE GROUND/RADIO ALTITUDE
ENGINES KEY PARAMETERS
NOSEWHEEL ANGLE
SIDESLIP ANGLE
HEADING ANGLE
WIND SPEED COMPONENTS
MANUAL TESTING
Perform takeoff manoeuvre with the preset conditions as per the flight test
data. The entire ground run should be flown, from brake release through at
least 61m (200ft) above ground level while maintaining heading control. The
simulation of this test will require the use of special test data for the wind profile
rather than the simple insertion of a generalised crosswind from the instructor
station.
Pitch angle, angle of attack, airspeed, height above ground, bank and sideslip
angles should be compared versus actual aeroplane data, but see the notes
from the Normal Takeoff test concerning the achievement of an exact match.
EXAMPLE
The example test results in Figure 1b6-1 show a generally good match
between the simulator and the aeroplane data for those parameters shown.
However, the plots begin when the airspeed reaches approximately 120 knots,
whereas the requirements call for the entire ground roll to be compared. Some
aeroplane data for this test have in the past been presented in two sections: brake release to liftoff, and liftoff to 200 ft above ground. When the data are presented in this way, merging the two sets into a single proof of match can present problems when attempting to use the data directly in the simulation. The engine thrust match indicates that the engines were correctly
driven using power lever angles (or equivalent), but there is a significant
reduction in thrust - present in the aeroplane data, and matched by the
simulator - which may warrant explanation from the data provider.
Figure 1b6-1
Example of Simulator Test Results for Crosswind Takeoff
TEST 1b(7): REJECTED TAKEOFF
MANUAL TESTING
The usual procedure calls for a full takeoff roll with the preset conditions as per
the validation data. The manoeuvre should be executed from brake release
and the aborted takeoff initiated at the appropriate speed. Maximum braking
effort (auto or manual) should be applied, and braking continued until the
aeroplane comes to a complete stop. Autobrakes should be used if they are
available.
This is usually a fairly simple test and so a good match can often be achieved
versus the flight test data. It is necessary, though, to properly coordinate the actions required to initiate the rejected takeoff.
EXAMPLE
The example in Figure 1b7-1 provides for generally good matches for each of
the parameters shown, though the initial value of engine thrust could perhaps
be increased slightly to remove the offset. However, the aeroplane data begins
at approximately 150 knots, only 3 seconds or so before the abort manoeuvre is initiated, rather than showing the entire ground roll from brake release as per
the requirements. Little can be done by the simulator manufacturer if the data
is presented in this way, especially if there is wind profile data present, since all
parameters have indeterminate values below the initial speed provided in the
data. If there is a choice, then selection of a different set of data may be the
answer. In this particular case, there was no alternative.
Figure 1b7-1
Example of Simulator Test Results for Rejected Takeoff
TEST 1b(8): DYNAMIC ENGINE FAILURE AFTER TAKEOFF
SPOILER ANGLES
ENGINES KEY PARAMETERS
EVALUATION NOTES
When comparing the aeroplane time history with the
simulator test results, it should be borne in mind that
the parameters in question for this test are the
simulated aeroplane body rates (roll, pitch and yaw)
which are generated by the engine failure with the
aeroplane in takeoff configuration. These in turn will
be highly dependent on the rate at which the failed
engine thrust decays and also on the thrust levels at
both takeoff and windmill (or idle) power, as well as
on the simulated aerodynamics themselves.
Therefore great care should be taken to ensure that
these parameters above all others are being
accurately reproduced during the test. This is
especially so, bearing in mind the very short duration
of the test after the engine has been failed (5
seconds or less). Further changes in the aeroplane
state beyond this period should not be expected to
correspond well with the aeroplane data, especially if
there is pilot activity for this portion of the
manoeuvre. The engine failure is usually replicated
by a snap deceleration to the idle position rather than
a fuel cut. The speed at which the engine failure is
introduced must be within ±3 knots of the aeroplane
data.
MANUAL TESTING
After takeoff, fail the critical engine as per flight test data. The test parameters
should be recorded from 5 seconds before engine failure (which must be
actuated as close as possible to the speed at which the engine was failed
during the flight test, and certainly within ±3 knots) to 5 seconds after engine
failure or when the aeroplane reaches a bank angle of 30 deg, whichever
occurs first. Naturally, the simulated aeroplane should not be allowed to fly out
of control beyond the above time or bank angle, but in any case the test will
usually be terminated as soon as it is feasible to do so, which may be prior to
wings level recovery.
The tolerances apply to pitch rate, roll rate and yaw rate, but for manual testing
it may be more useful to base the procedure on the appropriate angles
themselves rather than on the angular rates, which can be awkward to
manually quantify to the extent required.
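Where the angular rates themselves are awkward to quantify but the corresponding angles have been recorded, the rates can be recovered by differencing the angle time history. The following is a sketch using central differences; the sample heading record is hypothetical.

```python
def rate_from_angle(time_s, angle_deg):
    """Approximate angular rate (deg/s) from a recorded angle time
    history using central differences on the interior samples."""
    rates = []
    for i in range(1, len(angle_deg) - 1):
        dt = time_s[i + 1] - time_s[i - 1]
        rates.append((angle_deg[i + 1] - angle_deg[i - 1]) / dt)
    return rates

# Hypothetical heading samples at 1 Hz following an engine failure:
yaw_rates = rate_from_angle([0, 1, 2, 3, 4], [90.0, 90.5, 91.5, 93.0, 95.0])
print(yaw_rates)  # deg/s at the three interior samples
```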
EXAMPLE
The example result in Figure 1b8-1 appears to exhibit generally good matches
for the three angular rates (pitch rate, roll rate and yaw rate), but closer
examination is likely to reveal that the ‘aeroplane’ data used here is actually
engineering simulation data, not true flight test. On this basis, the parameters
in question must follow the data more closely than for flight test data, and the
result shown would not adequately meet the tolerances.
Figure 1b8-1
Example of Simulator Test Results for Dynamic Engine Failure After Takeoff
SECTION 1c
CLIMB
TEST 1c(1): CLIMB IN CLEAN CONFIGURATION
EVALUATION NOTES
The prime consideration for this test is whether the recorded rate of climb matches that of the aeroplane. However, rate of climb time histories can exhibit sensitivities which are difficult to follow in the simulator. An equivalent method of measuring the overall climb rate during the test is therefore to measure the time taken to climb from one specific altitude to another at the airspeed recorded during the flight test program. The altitude interval must be at least 300 m (1000 ft).
MANUAL TESTING
Typically, the method for this test will require the pilot to climb at constant
calibrated airspeed for at least 60 seconds using stabiliser trim as required. The
average rate of climb for the whole manoeuvre is checked by measuring the
altitude change over 60 seconds, provided that the altitude change during this
period exceeds 300m (1000ft). It will be useful to the pilot flying this test if the
altitude is initially set to 1000 feet or so below that at which the recording/plotting
needs to commence, so as to allow him to stabilise the aeroplane before the
tolerances are applied. Also, it may be feasible to make use of the autopilot.
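The average rate of climb measurement described above is a simple calculation; the sketch below uses hypothetical altitudes and timing.

```python
def average_rate_of_climb_fpm(alt_start_ft, alt_end_ft, duration_s):
    """Average rate of climb in ft/min over the measured interval."""
    return (alt_end_ft - alt_start_ft) / duration_s * 60.0

# Hypothetical measurement: 1050 ft gained in 60 seconds.  The altitude
# change exceeds the 1000 ft minimum quoted above, so the measurement
# interval is valid.
roc = average_rate_of_climb_fpm(5000.0, 6050.0, 60.0)
print(f"average rate of climb: {roc:.0f} ft/min")
```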
EXAMPLE
Figure 1c1-1
Example of Simulator Test Results for Climb in Clean Configuration
TEST 1c(2): ENGINE INOPERATIVE CLIMB, SECOND SEGMENT
EVALUATION NOTES
The prime consideration for this test is whether the recorded steady state rate of climb matches that of the aeroplane.
MANUAL TESTING
The ICAO Manual specifies that the climb be maintained over an altitude interval
of at least 300 m (1000 ft). Typically, stabiliser trim is used as required. Maintaining the aeroplane data airspeed is of particular importance if good results are to be obtained. Rudder trim should be used to balance the asymmetric thrust and minimise sideslip so as to maintain heading, and the bank angle should be determined from the data and followed as closely
as possible. It will be useful to the pilot flying this test if the altitude is initially set
to 1000 feet or so below that at which the recording/plotting needs to commence,
so as to allow him to stabilise the aeroplane before the tolerances are applied.
If there are two pilots, one can fly the manoeuvre and the other check and correct
engine power settings.
EXAMPLE
Figure 1c2-1 is a clear illustration of a set of aircraft data that were inadequate (no wind speeds were offered by the data provider) but whose shortcomings make very little difference to the overall test result. Under these circumstances it is doubtful that it would be necessary to follow the wheel position accurately throughout the test, and a closed-loop controller has been used to maintain bank angle (for which, again, no data were provided, but a mean value of -2 degrees or so would
presumably have been used). The rate of climb and rudder angle are clearly
within acceptable limits for this very long duration climb test.
Figure 1c2-1
Example of Simulator Test Results for Engine Inoperative Climb, Second Segment
TEST 1c(3): ENGINE INOPERATIVE ENROUTE CLIMB
EVALUATION NOTES
The prime consideration for this test is whether the recorded time to climb, distance travelled and fuel used match those of the aeroplane data for the engine inoperative climb condition. The altitude interval must be at least 1500 m (5000 ft).
MANUAL TESTING
Typically, the method for this test will require the pilot to climb at constant
calibrated airspeed for several minutes (corresponding to at least 1500 m / 5000
ft) using stabiliser trim as required. Rudder trim should be used to balance the
asymmetric thrust and minimise sideslip so as to maintain heading. Maintaining
the aeroplane data airspeed is of particular importance if good results are to be
obtained. The time and perhaps the fuel used may be recorded by the pilot in the
cockpit, but the distance travelled for the whole manoeuvre will be checked by
examination of the computed values of these parameters.
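Where the computed values are checked against sampled time histories, the distance travelled and fuel used can be recovered by numerical integration of ground speed and fuel flow. The following is a sketch using the trapezoidal rule; the sample values are hypothetical.

```python
def trapezoid(time_s, values):
    """Trapezoidal integral of a sampled signal with respect to time."""
    total = 0.0
    for i in range(1, len(time_s)):
        total += 0.5 * (values[i] + values[i - 1]) * (time_s[i] - time_s[i - 1])
    return total

# Hypothetical samples at 30 s spacing over one minute of the climb:
t = [0.0, 30.0, 60.0]
ground_speed_ms = [100.0, 102.0, 104.0]   # m/s
fuel_flow_kg_s = [0.50, 0.49, 0.48]       # kg/s
print("distance travelled (m):", trapezoid(t, ground_speed_ms))
print("fuel used (kg):", trapezoid(t, fuel_flow_kg_s))
```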
EXAMPLE
Figure 1c3-1
Example of Simulator Test Results for Engine
Inoperative Enroute Climb
TEST 1c(4): ENGINE INOPERATIVE CLIMB, APPROACH
DEMONSTRATION
Establish a steady climb with one engine out and go-around power on the operating engine(s) with
approach or go-around flaps and landing gear
retracted over an altitude interval of at least 300 m
(1000 ft). All anti-ice or de-icing systems should be
operating normally. It is not intended that ice
accumulation be present on the lifting surfaces.
Operational considerations for approach in icing, such
as adjustment to airspeed and weight limit, should be
in effect.
RECORDED PARAMETERS:
AILERON ANGLE
SPOILER ANGLES
YAW CONTROLLER POSITION
RUDDER ANGLE
SIDESLIP ANGLE
WIND SPEED COMPONENTS
EVALUATION NOTES
See notes for Test 1c(2). This test will almost certainly use a very similar, if not identical, technique. This test is only required for those aeroplanes whose Approved Flight Manuals require "Icing Accountability"; in other words, it is intended to apply only to aeroplanes for which there is an operational requirement for flight in icing. If the AFM does not state any operational limitations (such as an approach speed increment or a modified flap setting) in icing conditions, then the aircraft does not have icing accountability; most jet transport aircraft do not, while many turboprop aircraft do.
MANUAL TESTING
See notes for Test 1c(2). This test will almost certainly use a very similar, if not
identical technique, the exception being that the aeroplane should be configured
with all anti-ice and de-ice systems operating normally, gear up and go-around
flap. All icing accountability considerations, in accordance with the AFM for an
approach in icing conditions, should be applied.
EXAMPLE
In Figure 1c4-1 the idle thrust (on the #2 engine) is lower by approximately 100
lbs than the aircraft data suggests. While this is small compared with the
combined total net thrust, nevertheless an attempt has been made to offset this
small inconsistency by applying an equivalent extra amount on the #1 engine.
The difference in the rate of climb that would result if this were not applied would
most likely be negligible, but the modification was at least made for a logical
reason. Few regulators would be concerned with this result, but it may still be
worth an explanatory note in the QTG.
Figure 1c4-1
Example of Simulator Test Results for Engine Inoperative Climb, Approach
SECTION 1d
CRUISE/DESCENT
TEST 1d(1): LEVEL FLIGHT ACCELERATION
MANUAL TESTING
The aeroplane should be trimmed for steady level flight in the cruise configuration
and ideally allowed to remain in this state for at least 5 seconds whilst recording
takes place prior to the initiation of the manoeuvre.
The engine throttle levers should be steadily pushed forward so that the engine
power increases to the requisite constant level, which should ideally be exactly
the same for each engine, and the test commenced only after the engine power
has stabilised. This may require the initial airspeed to be set lower than that at
which recording begins. The autopilot may be used to maintain altitude (or the
pilot may elect to do this manually) until the speed has increased by at least 50
knots from the initial value.
EXAMPLE
Referring to Figure 1d1-1, the aeroplane data (dotted line) indicates that the total
time taken to increase speed from 200 knots to 253 knots is 140 seconds.
Applying the 5% tolerance to this value gives a maximum time of 147 seconds,
whereas the simulator takes around 149 seconds. Hence the test is a marginal
failure. However, the thrust (at least on no. 1 engine) is a few hundred pounds
low. Increasing this to the correct value would probably result in the test result
just coming within the tolerance band, albeit still slightly on the high side of the
aeroplane value.
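The tolerance arithmetic in this example can be written out directly; the 5% figure is the tolerance applied in the text and the times are those quoted above. The function name is hypothetical.

```python
def within_time_tolerance(sim_time_s, ref_time_s, tol_fraction=0.05):
    """True if the simulator time to accelerate lies within the given
    fractional tolerance of the aeroplane value."""
    return abs(sim_time_s - ref_time_s) <= tol_fraction * ref_time_s

limit_s = 140.0 * 1.05   # maximum allowed time: 147 s
print(f"allowed maximum: {limit_s:.0f} s")
print(within_time_tolerance(149.0, 140.0))  # the marginal failure above
```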
Figure 1d1-1
Example of Simulator Test Results for Level Flight Acceleration
TEST 1d(2): LEVEL FLIGHT DECELERATION
MANUAL TESTING
The aeroplane should be trimmed for steady level flight in the cruise configuration
and ideally allowed to remain in this state for at least 5 seconds whilst recording
takes place prior to the initiation of the manoeuvre.
The engine throttle levers should be steadily brought back to the flight test
(usually idle) position. The autopilot may be used to maintain altitude (or the pilot
may elect to do this manually) until the speed has decreased by at least 50 knots
from the trim value.
EXAMPLE
Figure 1d2-1
Example of Simulator Test Results for Level Flight Deceleration
EVALUATION NOTES
The prime purpose of this test is to ascertain that the engine parameters are consistent with one another and with the aeroplane during a steady state cruise situation. Hence the test does not need to be plotted as a time history, especially since all that would be seen on the plots is a series of virtually straight lines. Instead, two separate sets of the same parameters should be recorded - one at the beginning of the 3-minute period and the second at the end - to check that these items correspond well with the aeroplane data. Alternatively, a single snapshot may be presented showing instantaneous fuel flow. Also, the total fuel used over the period may be compared with the aeroplane data.
MANUAL TESTING
With the simulator trimmed at the preset conditions for level flight, continue to fly
with wings level and at constant altitude for a period of at least 3 minutes. Do not
alter the throttle settings, unless such alterations are evident from the flight test
data recordings. Record either EPR, N1 or Engine Torque and also Fuel Flow
and after the test is complete compare them with the aeroplane data. It may
assist the pilot if an instructor station maintenance page is displayed which gives
values of, for example, engine thrusts.
EXAMPLE
There are two sets of results shown in Figure 1d3-1, the first at time=0 seconds
and then the second 300 seconds later. This duration was formerly specified in
the ICAO Manual 2nd Edition, but is now reduced to 3 minutes (180 seconds) as
a minimum.
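A snapshot comparison of this kind reduces to checking each recorded parameter against the aeroplane datum. A minimal sketch, with illustrative fuel-flow values and an assumed 5% tolerance (consult the ICAO Manual for the applicable figures):

```python
def snapshot_within_tolerance(sim_value: float,
                              aero_value: float,
                              tol_frac: float = 0.05) -> bool:
    """Compare one recorded engine parameter against the aeroplane datum,
    using a fractional tolerance (assumed value, for illustration only)."""
    return abs(sim_value - aero_value) <= tol_frac * abs(aero_value)

# Hypothetical fuel-flow readings (lb/hr) at the start and end of the
# 3-minute cruise period, against an aeroplane datum of 5000 lb/hr.
checks = [snapshot_within_tolerance(5150.0, 5000.0),   # t = 0 s
          snapshot_within_tolerance(4990.0, 5000.0)]   # t = 180 s
```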
Figure 1d3-1
Example of Simulator Test Results for Cruise Performance
EVALUATION NOTES
The prime purpose of this test is to ascertain that the achieved rate of descent corresponds well with the aeroplane during a descent with engines idle. The test should be recorded over an altitude interval of at least 300m (1000ft). If the aircraft validation data were gathered using an engine variant that is not present on the simulator, a second test should be run using engine thrusts rather than pilot controls as the driving input, to show that the simulator gives the same result.
MANUAL TESTING
With the simulator trimmed at the preset conditions for the descent, continue to
fly in a stabilized descent at the prescribed airspeed for an altitude interval of at
least 300m (1000ft). Note the rate of descent for comparison with the aeroplane
data.
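Since the comparison is made over a fixed altitude interval, the average rate of descent falls straight out of the recorded altitudes and elapsed time. A minimal sketch (the function name and figures are illustrative):

```python
def average_rate_of_descent_fpm(alt_start_ft: float,
                                alt_end_ft: float,
                                duration_s: float) -> float:
    """Average rate of descent in ft/min over the recorded interval."""
    return (alt_start_ft - alt_end_ft) / duration_s * 60.0

# Illustrative figures: 1000 ft lost in 40 s gives 1500 ft/min.
rod = average_rate_of_descent_fpm(15000.0, 14000.0, 40.0)
```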
EXAMPLE
The way in which a test is run can have a significant effect on the plotted results,
as Figure 1d4-1 illustrates. The test has clearly been run using automatic drivers,
probably attempting to maintain airspeed using the pitch controller. The result as
shown does not indicate any particular problem with the simulation itself, but the
automatic driver gains have been set too high, or are otherwise incorrectly
programmed, such that the result is unlikely to be acceptable to the authorities.
Figure 1d4-1
Example of Simulator Test Results for Idle Descent
EVALUATION NOTES
The prime purpose of this test is to ascertain that the achieved rate of descent corresponds well with the aeroplane during an emergency descent. The test should be recorded over an altitude interval of at least 900m (3000ft). If the aircraft validation data were gathered using an engine variant that is not present on the simulator, a second test should be run using engine thrusts rather than pilot controls as the driving input, to show that the simulator gives the same result.
MANUAL TESTING
With the simulator stabilized at mid-altitude and at or near Vmo for the descent,
continue to fly in a stabilized descent at constant airspeed for an altitude interval
of at least 900m (3000ft). Note the rate of descent for comparison with the
aeroplane data, and perform a brief check of spoiler blowdown.
EXAMPLE
Running each QTG test manually has always been important to the regulators,
and Figure 1d5-1 is an example of a manually run emergency descent. The
duration is considerably longer than required by the ICAO Manual, but controlling
the simulated aeroplane for long periods should not be problematic for relatively
steady state tests such as this. The initial deviation in rate of descent may be because the trimmed state is slightly incorrect, but it should be borne in mind that the value yielded during a snapshot check of descent rate - especially at high altitude/Mach number combinations - will not be the same as when the test is run as a time history.
Figure 1d5-1
Example of Simulator Test Results for Emergency Descent (Manual)
SECTION 1e
STOPPING
EVALUATION NOTES
The test will begin with the simulated aeroplane set up on the runway at the prescribed speed and runway reference heading, usually with speed and position frozen. It is not necessary, and may even confuse the results, for a complete landing or rejected takeoff manoeuvre to be executed.
MANUAL TESTING
This test is fairly simple to execute as it merely involves setting the simulated
aeroplane up on the runway at the prescribed configuration, speed and runway
reference heading, allowing it to stabilize with speed and position frozen, and
then releasing both freezes with the brakes fully applied. Since the runway is
dry and the braking symmetrical, no significant steering inputs should be
necessary.
Manual braking (not auto) is to be used for this test, unless the data specify
otherwise. Ideally, ground spoilers will not be used, but this will be dependent
on the aeroplane data.
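Where stopping distance is not recorded directly, it can be recovered by integrating ground speed over the run. A sketch using trapezoidal integration; the sample data are illustrative, not flight test values:

```python
def stopping_distance_ft(times_s, ground_speeds_kt):
    """Trapezoidal integration of ground speed (knots) against time (s)
    to give the distance covered, in feet."""
    KT_TO_FPS = 1.68781  # knots to feet per second
    dist = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        v_avg = 0.5 * (ground_speeds_kt[i] + ground_speeds_kt[i - 1])
        dist += v_avg * KT_TO_FPS * dt
    return dist

# Illustrative run: uniform deceleration from 120 kt to rest in 20 s.
d = stopping_distance_ft([0.0, 10.0, 20.0], [120.0, 60.0, 0.0])
```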
EXAMPLE
In Figure 1e1-1 the result is just out of tolerance with the airspeed taking too
little time to reduce relative to the aircraft. This is almost certainly because the
heading deviation has caused the aircraft to yaw which would have the effect
of adding a slight extra stopping force. However, the other item of note, which
Figure 1e1-1
Example of Simulator Test Results for Stopping Time & Distance, Dry Runway
EVALUATION NOTES
The test will begin with the simulated aeroplane set up on the runway at the prescribed speed and runway reference heading, usually with speed and position frozen. It is not necessary, and may even confuse the results, for a complete landing or rejected takeoff manoeuvre to be executed.
MANUAL TESTING
This test is not as simple to execute as Test 1e(1) because whilst it also
involves setting the simulated aeroplane up on the runway at the prescribed
configuration, speed and runway reference heading, following the flight test
pilot's actions for the correct movement of the power levers to achieve the flight
test values of EPR, N1 or Thrust can be quite difficult. Thus it may require
some very well-worded manual test procedures to obtain a good match. Once
again though, since the runway is dry and the reverse thrust (usually)
symmetrical, no significant steering inputs should be necessary. Ideally,
spoilers will not be used so that just the effects of reverse thrust can be
ascertained, but this will be dependent on the aeroplane data.
EXAMPLE
Two (partial) sets of contrasting results are displayed in Figures 1e2-1 and 1e2-2 below. The first produces an exact match for both distance and ground speed, whereas the second gives a result which is slightly out of tolerance for both parameters. The differences were eventually resolved, but the 'first passes' at each test shown here reveal how much easier it usually is to match engineering simulation data, which was the source for the first of the two sets of results.
Figure 1e2-1
Example of Simulator Test Results for Reverse Thrust Stopping Time & Distance (1)
Figure 1e2-2
Example of Simulator Test Results for Reverse Thrust Stopping Time & Distance (2)
EVALUATION NOTES
See the notes for test 1e(1). Either flight test or manufacturer's performance manual data must be used where available. An acceptable alternative is to use engineering data based on dry runway flight-test stopping distance and the effects of wet runway braking coefficient. Clearly, there should be an increase in the time and distance to stop over that achieved during the dry runway test, although
MANUAL TESTING
See the notes for test 1e(1). The test is typically run in a very similar, if not
identical fashion.
EXAMPLE
The diagram below, Figure 1e3-1, contains a partial set of plots and partial printed pass/fail data for a wet runway stopping distance test. The asterisks by the check points on the printout indicate that the test has failed; however, a close look at the plotted data shows that these parameters - including both ground speed and ground distance - are virtually exact overlays of the aeroplane data. The obvious conclusion to draw is that the 'aeroplane' values against which the simulator is being checked have been incorrectly specified by the simulator engineer in the test information. This situation is surprisingly common, but also very easy to rectify.
Figure 1e3-1
Example of Simulator Test Results for Stopping Time & Distance, Wet Runway
EVALUATION NOTES
See the notes for test 1e(1). Either flight-test or manufacturer's performance manual data should be used, though flight test data are not often available for this test. An acceptable alternative is to use engineering data based on dry runway flight-test stopping distance and the effects of icy runway braking coefficient. Clearly, there should be an increase in the time and distance to stop over that achieved during the dry runway test.
MANUAL TESTING
See the notes for test 1e(1). The test is typically run in a very similar, if not
identical fashion. For icy runways, it may be expected that the simulated
aeroplane will overshoot the end of the runway, but this may be dependent on
the selected configuration.
EXAMPLE
The results shown in Figure 1e4-1 below are for a footprint test, indicating that no aeroplane manufacturer's data were available for this condition. The most up-to-date packages will contain engineering simulator data, but that was not
the case for this aircraft type. The only potentially confusing item on the plots is
the reference to ‘Flight Test Data’ in the bottom left corner, which may lead an
evaluator to the erroneous conclusion that the simulator matches the
aeroplane so perfectly that the plots are indistinguishable.
Figure 1e4-1
Example of Simulator Test Results for Stopping Time & Distance with Icy Runway
SECTION 1f
ENGINES
1f(1) Acceleration
1f(2) Deceleration
EVALUATION NOTES
The items of importance for this test are the prime (key) engine parameters presented in the data, and in particular the time taken to achieve the values. The definitions of Ti and Tt are given in the ICAO Manual but are repeated below for clarity. The actual aeroplane response is not the issue for this test, but obviously the test conditions should be accurately represented so that a fair comparison can be made.
MANUAL TESTING
Typical procedures call for the simulator and engines to be set for approach in
a trimmed condition. The throttles are then rapidly advanced to Go-Around
power. Record critical engine performance parameters (Power Lever Angle,
Net Thrust per engine, N1, N2, EGT, Fuel Flow and EPR) and compare them with the aeroplane data. Following flight test aeroplane parameters such as airspeed
and altitude is desirable if they are available, but small deviations in these
values should not adversely affect the results.
EXAMPLE
A good match has been achieved in the simulator when compared with the manufacturer's proof of match (see Figures 1f1-1 and 1f1-2), but the chosen plot scale does not readily allow an evaluator to determine whether the test passes or fails. This is one example where the manufacturer's original plot scale can be improved upon when designing the simulator QTG.
Figure 1f1-1
Example of Aeroplane Manufacturer's Proof of Match Data
(Engine Acceleration)
Figure 1f1-2
Example of Simulator Test Results for Engine Acceleration
EVALUATION NOTES
The items of importance for this test are the prime (key) engine parameters presented in the data, and in particular the time taken to achieve the values, usually idle thrust. The definitions of Ti and Tt are given in the ICAO Manual but are repeated below for clarity. The actual aeroplane response is not the issue for this test, especially as it is to be performed on the ground, but obviously the test conditions should be accurately represented so that a fair comparison can be made. Note that the final time to achieve the same value of (idle) thrust is not the foremost issue, as it is recognised that the simulator and aeroplane times need not be the same if the thrust is way below a level of significance.
MANUAL TESTING
Typical procedures call for the simulator and engines to be set on ground static
in a stabilised condition with the throttles set for Takeoff power. The throttle
levers should then be rapidly retarded to the idle position whilst the critical
engine performance parameters (Power Lever Angle, Net Thrust per engine,
N1, N2, EGT, Fuel Flow and EPR) are recorded and compared with the aeroplane data. The test is usually best performed with the parking brake on,
but note should be taken of the method used during the acquisition of the flight
test data.
EXAMPLE
The example in Figure 1f2-1 shows that, while it is possible to get a match
which looks good when plotted, the nature of the tolerances for this test is such
that the test still technically fails in the acceleration phase though not in the
deceleration phase. Other parameters, if available in the data, should also be
shown, and the power lever angle (or equivalent) should be driven exactly as in
the data.
Figure 1f2-1
Example of Simulator Test Results for Engine Acceleration & Deceleration (Combined)
SECTION 2
HANDLING QUALITIES
2c LONGITUDINAL
2d LATERAL DIRECTIONAL
2e LANDINGS
2f GROUND EFFECT
2g WINDSHEAR
The purpose of this section of the QTG is to provide the evaluator of the
simulator with adequate, objective evidence that the handling qualities of
the simulator correspond within reasonable limits to those of the aeroplane
being simulated. The validation tests which assist in this function have
been chosen to ensure that the stability and control characteristics of the
simulator are satisfactory relative to the actual aeroplane throughout its
speed, altitude, weight and centre of gravity envelope. Whilst much of the
training of jet transport flight crew is carried out at low altitude and in the
vicinity of the airfield, this does not remove the necessity to prove the
capability of the simulator as an effective training tool in cruise conditions
as well. It is important to pilot training and certification (licensing) that the
simulator handling qualities closely match those of the respective
aeroplane. These essential characteristics of simulators, therefore, must
be demonstrated and must be repeatable. Repeatability must not, however,
be so designed into the testing system that the objective to demonstrate
proper handling qualities is diminished.
As with the Performance Tests, the most effective way of carrying out these
tests to give the desired accuracy is by using an automatic test system.
However, confirming the automatic test result with a selection of tests
which have been flown manually by a suitably qualified pilot is perhaps of
even greater importance when assessing handling qualities than it is when
evaluating the simulated aeroplane performance.
Due to the nature of control systems, and the very limited possibilities to
measure actual pilot control force without affecting the control system
characteristics, the following considerations should be taken into account
when comparing measured control system characteristics of an aeroplane
and a simulator.
For any control system, especially if the applied control forces are
important, the exact system configuration during the test must be
documented (e.g. yaw damper on/off, control wheel steering on/off,
hydraulics on/off, feel pressure). Furthermore, it must be verified that the
controls are free to move, and not obstructed by any crew member or
equipment (e.g. co-pilot's knee obstructing wheel movement, manuals on
co-pilot's seat obstructing column movement).
In general, flight test recorded control forces and control deflections are not
measured at the point of pilot force application. This implies that if data
from an aeroplane-installed data acquisition system are used, the exact
location in the control system where the signal is measured, and the
applied conversions to obtain equivalent pilot control force and control
deflection must be specified (e.g. measured in the control cables,
measured at the aft quadrant, etc).
Due to inertial effects and the filtering of strain gauge signals, the measured control force will inevitably differ from the theoretical pilot control force. For example, if the pilot releases the control, the control force will by definition instantaneously become zero, but recorded data will always show a less abrupt change in force due to the inertia and damping of the controls.
Also, some contributions to the pilot control force, such as friction and
unbalance in the pilot controls, may not be reflected in the forces as
measured by the data acquisition system. Known differences must be
documented for each test by the data supplier, in order to be able to judge
the acceptability of observed differences.
Finally, note that force versus position testing in several of the tests
contained in sections 2a (Static Controls Checks) and 2b (Dynamic
Controls Checks) is not required if an actual aeroplane hardware controller
is employed in the simulator. This typically applies to certain Computer
Controlled Aircraft.
SECTION 2a
STATIC CONTROLS CHECKS
In order to exclude dynamic effects, the tests must be performed using very
small control deflection rates. Also, the control deflection rate should be as
constant as possible.
The tests are performed by very slowly moving the pilot control over its full
range: from neutral to the stop, then to the opposite stop, then back to the
neutral position (full sweep). The exceptions here are the pitch trim tests
and the throttle lever test, which can be accomplished by spot checking
rates and positions as appropriate, and therefore do not need a full sweep
in both directions.
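The slow, constant-rate full sweep described above can be expressed as a commanded position profile for an automatic test driver. The following is a sketch only: the nominal 100-second duration matches the manual-testing guidance for these tests, while the time step and function name are illustrative choices.

```python
def full_sweep_profile(stop_neg: float, stop_pos: float,
                       duration_s: float = 100.0, dt: float = 0.1):
    """Commanded controller position for a slow full sweep at a constant
    deflection rate: neutral -> one stop -> opposite stop -> neutral.
    Returns a list of (time_s, position) samples."""
    # Total travel: half range out, full range across, half range back.
    travel = 2.0 * (stop_pos - stop_neg)
    rate = travel / duration_s           # constant deflection rate
    legs = [(0.0, stop_pos), (stop_pos, stop_neg), (stop_neg, 0.0)]
    profile, t = [], 0.0
    for a, b in legs:
        steps = max(1, round(abs(b - a) / (rate * dt)))
        for i in range(1, steps + 1):
            t += dt
            profile.append((t, a + (b - a) * i / steps))
    return profile
```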
Except as noted for the pitch trim and throttle lever, tolerances for these
tests are on pilot control force and surface position (or brake system
pressure in test 2a(9)). Compliance should be shown by comparison of
cross-plots of control force versus pilot control position and surface position
versus pilot control position rather than time histories, except for the pitch
trim rate test.
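Cross-plot compliance checking amounts to interpolating the aeroplane force at each simulator control position and flagging any excess difference. A minimal sketch; the 2 lb tolerance is an illustrative value, not the regulatory one:

```python
from bisect import bisect_left

def interp_force(pos, aero_pos, aero_force):
    """Linearly interpolate the aeroplane force at a given controller
    position (aero_pos must be in ascending order)."""
    if pos <= aero_pos[0]:
        return aero_force[0]
    if pos >= aero_pos[-1]:
        return aero_force[-1]
    i = bisect_left(aero_pos, pos)
    frac = (pos - aero_pos[i - 1]) / (aero_pos[i] - aero_pos[i - 1])
    return aero_force[i - 1] + frac * (aero_force[i] - aero_force[i - 1])

def out_of_tolerance_points(sim_pos, sim_force, aero_pos, aero_force,
                            tol_lb=2.0):
    """Return the cross-plot points whose force differs from the
    interpolated aeroplane force by more than tol_lb."""
    return [(p, f) for p, f in zip(sim_pos, sim_force)
            if abs(f - interp_force(p, aero_pos, aero_force)) > tol_lb]
```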
The controller position versus force shall be measured at the pilot control.
An alternative method would be to instrument the simulator in an equivalent
manner to the flight test aeroplane. The force and position data from this
instrumentation can be directly recorded and matched to the aeroplane
data. Prior to an initial evaluation, the regulatory authorities require that a
physical calibration is performed using a control force measuring (CFM)
system on the primary controls (the so-called ‘Fokker’ tests, though some
simulator manufacturers use methods other than that which was devised
by Fokker themselves). These tests are usually time-consuming and
laborious to perform, but are important in that they are designed to give
confidence that the computed values - as displayed on the IOS for example
- are a good match for the true values of force and position so that the
computed values may be used for recurrent and any subsequent tests. In
general, the regulatory authorities do not ask for the calibration tests using
CFM equipment to be re-run on a recurrent basis, but they are entitled to
request them, and do so occasionally if there appears to be good reason.
Note that for some aeroplanes with reversible flight controls, the tests will
need to be run at a suitable airspeed condition.
MANUAL TESTING
Slowly move the pitch controller such that approximately 100 seconds are
required to achieve a full sweep. A full sweep is defined as movement of the
controller from neutral to the stop (forward or aft), then to the opposite stop, then
back to the neutral position. Before performing the test, verify that the pitch
controller movement is not obstructed by any crew member or equipment (e.g.
manuals on co-pilot's seat obstructing control column movement) as this can
cause severe deformations in the measured control characteristics, especially
near the stops. Note that the cockpit controller position versus force
measurement is not required if a self contained aeroplane controller (i.e.
aeroplane part(s)) which has integrated force and damping systems is used in the
simulator. Be aware that on some aeroplanes feel forces vary with stabiliser trim
position.
EXAMPLE
Figure 2a1-1a shows a good result for force versus position; however, the 'aeroplane data' used for the overplot is almost certainly not taken directly from an aeroplane using CFM equipment - the data trends are too 'smooth' and the end-stops do not represent what actually occurs when using CFM equipment in the aeroplane. This does not render the test results invalid, as there may be good
reasons why such data has been used (for example, this is what the aeroplane
manufacturer provided in their proof-of-match document), though this should be
properly explained in the QTG.
Figure 2a1-1a
Example of Simulator Test Results for Pitch Controller Force
versus Position Calibration
Figure 2a1-1b
Example of Simulator Test Results for Elevator versus Pitch Controller Position
Calibration
EVALUATION NOTES
After confirming that the roll controller is in the neutral position, the test is run by slowly driving the controller to either the left or the right stop, then slowly driving it back through neutral to the other stop, then finally driving it back again to the neutral position. The results should be overplotted with those obtained on the aeroplane to enable an effective comparison to be made. For initial evaluations, or at the request of the authorities, repeat the test with a control force measurement (CFM) system fitted. It is not necessary to run this test provided the aeroplane cockpit controller unit has been employed in the simulator and it has not been modified from its status in the aeroplane. The lateral control system characteristics for all aeroplane types are further validated by the tests included in Sections 2d and 2e.
MANUAL TESTING
Slowly move the roll controller such that approximately 100 seconds are required
to achieve a full sweep. A full sweep is defined as movement of the controller
from neutral to the stop (right or left), then to the opposite stop, then back to the
neutral position. Before performing the test, verify that the roll controller
movement is not obstructed by any crew member or equipment (e.g. manuals on
co-pilot's seat obstructing control wheel movement). Note that the cockpit
controller position versus force measurement is not required if a self contained
aeroplane controller (i.e. aeroplane part(s)) which has integrated force and
damping systems is used in the simulator.
EXAMPLE
The lack of an aeroplane data overplot for the example in Figure 2a2-1a shows
the general format and shape of a typical roll controller force (in this case wheel
force) calibration test for a conventionally-controlled (i.e. non-computer
controlled) aeroplane. Note that the breakout force, at a little under 6 lbs, is a
fairly large percentage of the maximum effort required to apply full wheel. This
plot was run on an older simulator which did not use overplots for the control
force calibration tests, relying instead on transparency copies of the aeroplane
data to facilitate the comparison.
Figure 2a2-1a
Example of Simulator Test Results for Roll Controller Force versus Position Calibration
Figure 2a2-1b below shows the roll controller (wheel) versus surface position
plots for the same simulator, illustrating both the small breakout value of the
aileron surface and also the amount of wheel required before spoiler movement
is initiated.
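Breakout values such as these can be read off the cross-plot data programmatically. A sketch, assuming the surface is deemed to have broken out once it moves beyond a small threshold from its initial position; both the threshold and the sample data are illustrative:

```python
def breakout_force(forces_lb, surface_deg, threshold_deg=0.25):
    """Estimate breakout force: the applied control force at the first
    sample where the surface has moved beyond a small threshold from
    its initial position. Returns None if the surface never moves."""
    start = surface_deg[0]
    for force, pos in zip(forces_lb, surface_deg):
        if abs(pos - start) > threshold_deg:
            return force
    return None

# Illustrative wheel-force sweep (lb) against aileron deflection (deg).
wheel_force = [0.0, 2.0, 4.0, 6.0, 8.0]
aileron = [0.0, 0.0, 0.1, 0.5, 2.0]
bo = breakout_force(wheel_force, aileron)   # 6.0 lb on these samples
```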
Figure 2a2-1b
Example of Simulator Test Results for Aileron & Spoiler versus Roll Controller Position Calibration
EVALUATION NOTES
After confirming that the rudder pedals are in the neutral position, the test is run by slowly driving the pedals to either the left or the right stop, then slowly driving them back through neutral to the other stop, then finally driving them back again to the neutral position. The results should be overplotted with those obtained on the aeroplane to enable an effective comparison to be made. For initial evaluations, or at the request of the authorities, repeat the test with a control force measurement (CFM) system fitted. The directional control system characteristics are further validated by the tests included in Sections 1b, 2d and 2e.
MANUAL TESTING
Slowly move the rudder pedals such that approximately 100 seconds are
required to achieve a full sweep. A full sweep is defined as movement of the
controller from neutral to the stop, usually the right stop, then to the opposite
stop, then back to the neutral position. Before performing the test, verify that the
rudder pedal movement is not obstructed by any crew member or equipment.
EXAMPLE
What at first looks like a major problem with the simulated pedal force in Figure
2a3-1 (the simulator is the full line, the aeroplane data is the dotted line) turned
out to be nothing more than a problem related to the rate at which the automatic
test system was driving the rudder pedals (see the upper plot). This result was
taken from a simulator that was in service at the time, and some engineering
effort was obviously required to find and eliminate the cause of the anomaly
during the autotest, but running the test manually (as illustrated in the lower plot)
revealed no actual problem with the pedals themselves.
Figure 2a3-1
Example of Simulator Test Results for Rudder Pedal Force versus Position Calibration
EVALUATION NOTES After confirming that the tiller is in the neutral position,
the test is run by slowly driving the nosewheel steering
controller to either the left or the right stop, then slowly
driving it back through neutral to the other stop, then
finally driving it back again to the neutral position. The
results should be overplotted with those obtained on
the aeroplane to enable an effective comparison to be
made. The nosewheel steering system characteristics
are further validated by the tests included in Section
1a.
MANUAL TESTING
Slowly move the tiller such that approximately 100 seconds are required to
achieve a full sweep. A full sweep is defined as movement of the controller from
neutral to the stop, either right or left, then to the opposite stop, then back to the
neutral position. Before performing the test, verify that the tiller movement is not
obstructed by any crew member or equipment.
EXAMPLE
Figure 2a4-1 again shows a typical plot for a nosewheel steering controller force
versus position test. As is also a typical feature of the roll controller calibration
test, the breakout is a very large percentage of the maximum force required for
full deflection.
Figure 2a4-1
Example of Simulator Test Results for Nosewheel Steering Controller Force versus Position
MANUAL TESTING
The rudder pedal steering force characteristics are usually the same as the
rudder control forces. Therefore, the check may be executed as part of the rudder
calibration test. While not typically necessary, it may better represent the
aeroplane data if the simulated runway friction is set to a very low value while
running this test so as to better simulate the ‘greasy plate’ sometimes employed
by aircraft manufacturers when conducting this test.
EXAMPLE
The rudder pedal steering check, which is often done in conjunction with the
rudder pedal position versus force calibration, is exemplified by Figure 2a5-1. The
small discontinuity in the plot has little overall significance in terms of the test
evaluation, but could probably be eliminated by running the test for a little longer
or by nudging the pedals back past the neutral position prior to actually ending
the test.
Figure 2a5-1
Example of Simulator Test Results for Rudder Pedal Steering Calibration
MANUAL TESTING
See the 'EVALUATION NOTES' section. Because this test requires manual
confirmation that the computed and indicated values agree within the tolerance,
the manual test will typically only differ from the automatic one in that the
stabiliser trim switch is used to set the indicated stabiliser position instead of the
autotest system performing this function.
EXAMPLE
A plotted example would be of little benefit for this test, since it is essentially just
a check that the indicated position is correctly aligned with the computed value,
available on the IOS and/or on an engineering terminal. Clearly, the value
indicated to the pilot in the cockpit must be confirmed by a pilot seated in the
correct position.
EVALUATION NOTES This test should be run after the calibration test 2a(6)
has confirmed that the actual and indicated trim
positions are closely aligned. These tests can be run
fully automated, but should also be run manually on a
recurrent basis to check that the trim rate as perceived
by the pilot in the cockpit corresponds closely with the
computed value. The trim rate should first be checked
on ground static, then in a go-around configuration.
MANUAL TESTING
This test requires firstly that the simulator be set on ground in a static condition,
then the manual trim switch(es) used to drive the stabiliser from near one
extreme to near the opposite extreme, noting the time taken to travel between the
two points. The trim rate can then be cross-checked against the plotted value of
stabiliser position. The second case entails setting up an approach condition, then either
repeating the method used on ground or using autopilot trim for a go-around
manoeuvre. Clearly, if large excursions in pitch angle are to be avoided, the test
at the approach condition must be of short duration, just sufficient to determine
the trim rate, and this may be achieved using either the manual switches or the
autopilot.
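The cross-check of trim rate against the plotted stabiliser trace can be sketched as follows. The trace and the two threshold angles are hypothetical, and the stabiliser is assumed to move monotonically between the two points, as it does in the manual test described above.

```python
import numpy as np

def trim_rate(t, stab, a0, a1):
    """Average trim rate (deg/s) between stabiliser angles a0 and a1.

    The stabiliser trace is assumed monotonic over the sweep, so the
    passage times can be found by interpolation; a0 and a1 are chosen
    near (but not at) the two extremes, as in the manual test.
    """
    t0 = np.interp(a0, stab, t)
    t1 = np.interp(a1, stab, t)
    return (a1 - a0) / (t1 - t0)

# Hypothetical trace: stabiliser driven at a constant 0.3 deg/s.
t = np.linspace(0.0, 40.0, 401)
stab = -4.0 + 0.3 * t
print(trim_rate(t, stab, -3.0, 7.0))  # recovers 0.3 deg/s
```

The value so obtained can then be compared with the trim rate given in the aeroplane data for the corresponding configuration.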
EXAMPLE
A plotted example of this test is shown in Figure 2a7-1. Aside from the slightly
odd values on the time axis, the test shows very well the co-ordination between
pitch trim rate and position, and even directly plots the value for trim rate which
can easily be used to compare with the aeroplane data.
Figure 2a7-1
Example of Simulator Test Results for Pitch Trim Rate Test
MANUAL TESTING
The method for this test typically requires the person conducting the test to first
set the throttles to idle, then to each of several specified throttle lever angles, and
then wait for the engines to stabilise before noting the value of each of the prime
engine parameters at that point. This procedure is then repeated several times
over the range of throttle movement. It may be possible to move the throttle
levers slowly enough and at a sufficiently constant speed to obtain a smooth set
of time history plots, but it is more usual to plot the engine parameters in question
versus the throttle lever angle. Note that the cancellation of any configuration
warning should not affect the outcome of the test.
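Deciding when the engines have "stabilised" before each snapshot is a matter of judgement; one possible numerical criterion (illustrative only — the window length and band below are not values taken from any standard) is that every recent sample of the parameter lies within a small band of the window mean:

```python
import numpy as np

def is_stabilised(samples, band=0.005):
    """Treat an engine parameter (e.g. EPR) as stabilised when every
    sample in a recent window lies within a small band of the window
    mean. Band and window length are illustrative choices only.
    """
    s = np.asarray(samples, dtype=float)
    return bool(np.all(np.abs(s - s.mean()) <= band))

print(is_stabilised([1.301, 1.302, 1.301, 1.300]))  # True: settled
print(is_stabilised([1.25, 1.28, 1.31, 1.34]))      # False: still spooling
```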
EXAMPLE
There may be occasions when a set of plots is especially useful, but the
important aspect of this test is that the cockpit throttle levers are properly aligned,
both with each other and with the appropriate engine indications for each throttle
position. The intent is of course to ensure that the pilot does not learn to position
the levers in different positions on the simulator compared to the aeroplane for
the same thrust levels. Whilst not specifically stated in the ICAO Manual, the test
condition must include the correct pressure altitude, static air temperature and
Mach number for a proper comparison of the results with the aeroplane data.
Figure 2a8-1
Example of Simulator Test Results for Cockpit Throttle Lever versus EPR
DEMONSTRATION Depress the brake pedals very slowly until their full range
has been achieved, then slowly release them until they
return to neutral.
MANUAL TESTING
Slowly move the brake pedals such that at least 30 seconds are required to
achieve a full sweep. Before performing the test, verify that the brake movement
is not obstructed by any crew member or equipment. It is worth noting, however,
that the method of checking brake pedal forces manually can be difficult,
especially on older simulators.
EXAMPLE
In most modern transport aeroplanes, brake pedal loads result from pressure
feedback and deflection of the normal and alternate brake metering valve internal
return springs from the active brake hydraulic system. Pedal load characteristics
taken from a simulator test are shown in Figure 2a9-1 and tend to be
representative of both the normal and alternate brake systems. These
characteristics will change only if both active hydraulic systems are lost and the
accumulator is depressurized. Brake pedal deflections are due to stretch in the
brake cables and deflection of the metering valve return springs. Metered
pressures as a function of pedal position are shown in the lower plot. A small
initial deflection of the brake pedals is necessary before the valves begin
metering pressure to the brakes, and this should be represented in the simulator
within the prescribed tolerances.
No aeroplane data has been overplotted in Figure 2a9-1, but the plots are fairly
typical.
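The pedal-to-pressure relationship described above — zero metered pressure until a small initial pedal deflection, then a rise towards system pressure — might be sketched very crudely as below. The deadband and pressure values are purely illustrative and are not taken from any aircraft.

```python
def metered_pressure(pedal, deadband=0.08, p_max=3000.0):
    """Crude metering-valve sketch: no brake pressure until a small
    initial pedal deflection (the valve 'deadband'), then a linear
    rise to full system pressure at full pedal travel.

    pedal: fraction of full travel (0..1); returns pressure in psi.
    All numbers here are illustrative, not from any aircraft.
    """
    if pedal <= deadband:
        return 0.0
    return p_max * (pedal - deadband) / (1.0 - deadband)

print(metered_pressure(0.05))  # 0.0: still inside the initial deflection
print(metered_pressure(1.0))   # 3000.0: full metered pressure
```

A real metering characteristic is of course nonlinear; the point of the sketch is only the initial deadband, which the simulator must represent within the prescribed tolerances.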
Figure 2a9-1
Example of Simulator Test Results for Brake Pedal Calibration
SECTION 2b
2B CONTROL DYNAMICS
2B.1 GENERAL
Note that the characteristics of the controller position time history are
influenced significantly by how "cleanly" the controller is released. To
provide the best matching of controller position time histories, it is
recommended that the initial portions of the force release time histories
which may be shown in the aeroplane data be approximated in the
simulator and emphasis placed on the precise positioning of the
controller. The forces of interest begin just before controller release and
continue until controller force initially approaches zero. Subsequent
forces that may be present in the aeroplane data are almost always
noise produced by aeroplane instrumentation and should be ignored.
For aeroplanes with reversible controls, the aero force gradient has an
overwhelming influence on the dynamic response of the control.
Therefore, a different dynamic response of the simulator as compared to
the aeroplane may spoil the dynamic response of the simulator control
completely due to its effect on the actual simulated aero force gradient.
Note that a relatively small offset in the surface deflection may cause a
markedly different response of the simulated aeroplane. In some cases
it may be necessary to force the simulated aeroplane to follow the flight test data.
Tests to verify that control feel dynamics represent the aeroplane must
show that the dynamic damping cycles (free response of the controls)
match that of the aeroplane within specified tolerances. The method of
evaluating the response and the tolerance to be applied is described in
the next two subparagraphs for the underdamped and critically damped
cases.
Two measurements are required for the period, the time to first zero
crossing (in case a rate limit is present) and the subsequent frequency
of oscillation. It is necessary to measure cycles on an individual basis
in case there are non-uniform periods in the response. Each period will
be independently compared to the respective period of the aeroplane
control system and, consequently, will enjoy the full tolerance specified
for that period.
Figure 2b-1
Underdamped Step Response
2B.3 TOLERANCES
T(P0) ±10% of P0
T(P1) ±20% of P1
T(P2) ±30% of P2
T(Pn) ±10(n+1)% of Pn
T(An) ±10% of A1
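The growing period tolerance above — ±10(n+1)% of Pn — can be applied to each individually measured period as in the sketch below. The measured values are hypothetical.

```python
def period_in_tolerance(n, sim_p, aero_p):
    """Compare simulator period n against the aeroplane value.

    n = 0 is the time to first zero crossing; n = 1, 2, ... are the
    subsequent individually measured periods. The tolerance grows as
    +/-10(n+1)% of the aeroplane period: T(P0) 10%, T(P1) 20%, ...
    """
    return abs(sim_p - aero_p) <= 0.10 * (n + 1) * aero_p

# Hypothetical measured periods (seconds):
aero_periods = [0.40, 0.82, 0.85]
sim_periods = [0.43, 0.70, 0.95]
print([period_in_tolerance(n, s, a)
       for n, (s, a) in enumerate(zip(sim_periods, aero_periods))])
```

Because each period enjoys its own tolerance, non-uniform periods in the response are handled naturally by this cycle-by-cycle comparison.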
Figure 2b-2
Critically Damped Step Response
For each axis of pitch, roll and yaw, the control shall be forced to its
maximum extreme position for the following distinct rates. These tests
shall be conducted at typical taxi, takeoff, cruise and landing conditions.
Slowly move the control such that approximately 100 seconds are required
to achieve a full sweep.
NOTE: Dynamic sweeps may be limited to forces not exceeding 44.5 daN
(100 lb).
2B.4.4 Tolerances
a) Static Test - see Tests 2a(1), 2a(2) and 2a(3), Section 2a.
The authorities are open to alternative means such as the one described
above. Such alternatives must, however, be justified and appropriate to the
application. For example, the method described here may not apply to all
manufacturers' systems and certainly not to aeroplanes with reversible
control systems. Hence, each case must be considered on its own merit on
an ad-hoc basis. Should the authority find that alternative methods do not
result in satisfactory performance, then more conventionally accepted
methods must be used.
MANUAL TESTING
This test typically is carried out with the simulator carefully trimmed in the correct
configuration and at the appropriate condition. The pitch controller is displaced
to the value specified in the flight test data, held briefly and then released and
allowed to freely respond for a few seconds whilst the controller position is
plotted. For the alternative method using full control sweeps, the slow test is
equivalent to the static test contained in test 2a(1), and the medium and fast
sweeps should be carried out in a similar manner, but at faster rates.
EXAMPLE
A typical result for conventional pitch control dynamics is shown in Figure 2b1-1.
It clearly illustrates a classic underdamped response, which is almost always more
difficult to replicate in the simulator than a much more heavily damped response.
The result below may not meet the required tolerance for the amplitude of the first
overshoot.
Figure 2b1-1
Example of Simulator Test Results for Pitch Control Dynamics
MANUAL TESTING
This test typically is carried out with the simulator carefully trimmed in the correct
configuration and at the appropriate condition.
EXAMPLE
Figure 2b2-1 gives a typical result for a control wheel dynamics test. There are
usually only a small number of overshoots, and in this case certainly no more than
two that ‘count’ (i.e. are ≥5% of the initial displacement). Whilst the time for the
zero crossing is probably just about within the 10% tolerance, the simulator result
may be problematic in that the amplitude of the first overshoot is too low. Strictly
speaking the number of overshoots is within the tolerance of ±1, but some
regulatory authorities might consider it worth further scrutiny.
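Counting the overshoots that 'count' can be sketched as below: interior local extrema of the controller trace about neutral are found, and only those with amplitude of at least 5% of the initial displacement are counted. The trace is synthetic and the criterion is a simplification for illustration only.

```python
import numpy as np

def count_overshoots(x, neutral, initial_disp, threshold=0.05):
    """Count the overshoots that 'count' in a controller free response:
    interior local extrema of the trace about the neutral position whose
    amplitude is at least `threshold` (here 5%) of the initial
    displacement. A simplistic criterion, for illustration only.
    """
    dev = np.asarray(x, dtype=float) - neutral
    d = np.diff(dev)
    extrema = [dev[i + 1] for i in range(len(d) - 1) if d[i] * d[i + 1] < 0]
    limit = threshold * abs(initial_disp)
    return sum(1 for e in extrema if abs(e) >= limit)

# Hypothetical trace: wheel released from +10 deg, damped oscillation.
t = np.linspace(0.0, 3.0, 601)
x = 10.0 * np.exp(-2.5 * t) * np.cos(2.0 * np.pi * t)
print(count_overshoots(x, neutral=0.0, initial_disp=10.0))  # 2
```

On real flight test data some smoothing would be needed first, since instrumentation noise produces spurious small extrema.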
Figure 2b2-1
Example of Simulator Test Result for Roll Control Dynamics
MANUAL TESTING
This test is typically carried out with the simulator carefully trimmed in the correct
configuration and at the appropriate condition.
EXAMPLE
In general, obtaining a really good result for pedal position dynamics seems to
cause more problems than the other two axes. Figure 2b3-1 shows a surprisingly
good match, as long as one takes into account the fact that the third overshoot
does not need to be counted.
Figure 2b3-1
Example of Simulator Test Results for Yaw Control Dynamics
MANUAL TESTING
EXAMPLE
By no means a terrible result, the plots shown in Figure 2b4-1 could still be
improved upon. The initial speed is slightly off, and the combination of angle of
attack and pitch angle differences may mean that the initial rate of climb was
incorrect. However, the movement of the control column (not shown) gives the
correct change in elevator angle and the pitch responds quite well.
Figure 2b4-1
Example of Simulator Test Results for Small Control Inputs,
Pitch
EVALUATION NOTES The purpose of this test is to determine that the area
of pilot lateral control "feel" for small roll control inputs
is well represented in the simulator.
MANUAL TESTING
EXAMPLE
Something is clearly amiss in the result shown in Figure 2b5-1. There appears
to be a roll response even before the aileron is displaced. The most likely
explanation is that the initialisation procedure failed to complete properly; the
control displacements used for the trimming process would then be incorrectly
positioned for the start of the test. One other possibility is that the controls were
positioned correctly at the completion of the trim phase, but then were released
prematurely prior to the proper beginning of the test.
Figure 2b5-1
Example of Simulator Test Results for Small Control Inputs, Roll
EVALUATION NOTES The purpose of this test is to determine that the area
of directional pilot control "feel" for small control inputs
is well represented in the simulator.
MANUAL TESTING
EXAMPLE
The result shown in Figure 2b6-1 is at best marginal, probably because the
rudder pedal input (not shown) was insufficient to give the required rudder
displacement. It is to be expected that using the correct amount of rudder and/or
following the change in engine thrust would significantly improve the match.
Figure 2b6-1
Example of Simulator Test Results for Small Control Inputs, Yaw
SECTION 2c
LONGITUDINAL
MANUAL TESTING
Above all else, it is important that the simulated aeroplane be accurately trimmed
in accordance with the validation data before commencing this test.
EXAMPLE
Figure 2c1-1
Example of Simulator Test Results for Power Change Dynamics
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
FLAP LEVER POSITION/FLAP SURFACE ANGLE(S)
INDICATED FLAP ANGLE (to demonstrate timing)
PRESSURE ALTITUDE
RATE OF CLIMB
ENGINES KEY PARAMETERS
BANK ANGLE
WIND SPEED COMPONENTS
MANUAL TESTING
It is usually better to fly the manoeuvre from a stable, trimmed condition and
account for the differences in the results obtained by use of engineering judgement.
EXAMPLE
There is nothing particularly untoward about the plots shown in Figure 2c2-1, but
it can clearly be seen that the aeroplane flight test data (the dotted line) ceased
to be hands-off after approximately 30 seconds. However, the flap motion ended
at around 7 seconds, so in any case the requirement of the ICAO Manual was
satisfied by 22 seconds (7 seconds + 15 seconds). There has been no attempt
to replicate the pilot’s manipulation of the elevator for the last two seconds, nor
is there any necessity to do so.
Figure 2c2-1
Example of Simulator Test Results for Flap Change Dynamics, Retraction
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
FLAP LEVER POSITION/FLAP SURFACE ANGLE(S)
INDICATED FLAP ANGLE (to demonstrate timing)
PRESSURE ALTITUDE
RATE OF CLIMB
ENGINES KEY PARAMETERS
BANK ANGLE
WIND SPEED COMPONENTS
EVALUATION NOTES See notes for test 2c(2a). The difference is that in this
test the flaps are to be extended rather than retracted.
AIRSPEED ±3 Kts
ALTITUDE ±30 m (100 Ft)
MANUAL TESTING
EXAMPLE
The result in Figure 2c2-2 is just about in tolerance, with the possible exception
of the airspeed at 65 seconds, though this may be beyond the 15 second period
after the flaps and slats (not shown) have achieved the commanded value. Even
if it is within the 15 seconds, a tiny excursion outside the tolerance band such as
this is unlikely, as an isolated case, to cause the regulators to take drastic action
against the qualification of the simulator. The result should be improved if it is
possible to do so, and careful use may need to be made of the aeroplane proof-
of-match data. This is an example of a test which is slightly affected by an
atmospheric disturbance (i.e. wind/turbulence, visible in the airspeed trace), but
this can be easily explained in a note accompanying the test result.
Figure 2c2-2
Example of Simulator Test Results for Flap Change Dynamics, Extension
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
SPEEDBRAKE HANDLE POSITION
SPOILER ANGLES
PRESSURE ALTITUDE
RATE OF CLIMB
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
BANK ANGLE
WIND SPEED COMPONENTS
MANUAL TESTING
It is usually better to fly the manoeuvre from a stable, trimmed condition and
account for the differences in the results obtained by use of engineering judgement.
EXAMPLE
Figure 2c3-1 is another example of a test where the results end up being
slightly out of tolerance at the very end. This is not uncommon for the
uncontrolled free-response tests such as this one, but there are usually ways to
make slight improvements that will ensure the test remains in tolerance to the
end, though sometimes it may mean that an intermediate portion of the time
history (which is usually well within tolerance) is slightly worse.
Figure 2c3-1
Example of Simulator Test Results for Speedbrake Change Dynamics, Extension
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
SPEEDBRAKE HANDLE POSITION
SPOILER ANGLES
PRESSURE ALTITUDE
RATE OF CLIMB
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
BANK ANGLE
WIND SPEED COMPONENTS
EVALUATION NOTES See notes for test 2c(3a). The difference is that in this
test the speedbrake is to be retracted rather than
extended.
AIRSPEED ±3 Kts
MANUAL TESTING
EXAMPLE
Figure 2c3-2
Example of Simulator Test Results for Speedbrake Change Dynamics, Retraction
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
LANDING GEAR HANDLE POSITION
LANDING GEAR INDIVIDUAL POSITIONS
GEAR POSITION INDICATION/LIGHTS STATUS (to
demonstrate timing)
PRESSURE ALTITUDE
RATE OF CLIMB
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
BANK ANGLE
WIND SPEED COMPONENTS
MANUAL TESTING
EXAMPLE
Figure 2c4-1 is a typical result for this test. In most modern jet transport
aeroplanes retracting the gear does not usually have a very large effect on
flightpath. If the test conditions have been correctly initialised and the trimmed
state defined in the validation data properly replicated, in most cases little can go
wrong.
Figure 2c4-1
Example of Simulator Test Results for Gear Change Dynamics, Retraction
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
LANDING GEAR HANDLE POSITION
LANDING GEAR INDIVIDUAL POSITIONS
GEAR POSITION INDICATION/LIGHTS STATUS (to
demonstrate timing)
PRESSURE ALTITUDE
RATE OF CLIMB
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
BANK ANGLE
WIND SPEED COMPONENTS
EVALUATION NOTES See notes for test 2c(4a). The difference is that in this
test the landing gear is to be extended rather than
retracted.
MANUAL TESTING
See notes for test 2c(4a). The difference is that in this test the landing gear is to
be extended rather than retracted.
EXAMPLE
The situation for the gear extension test, illustrated in Figure 2c4-2, is similar
to that for the gear retraction test 2c(4a). In addition, both these tests are usually
quite straightforward to run manually and achieve a pass.
Figure 2c4-2
Example of Simulator Test Results for Gear Change Dynamics, Extension
RECORDED PARAMETERS
AIRSPEED
PITCH ANGLE
ANGLE OF ATTACK
PRESSURE ALTITUDE
RATE OF CLIMB
STABILISER ANGLE
PITCH CONTROLLER POSITION
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
LINEAR ACCELERATIONS (Longitudinal, Lateral,
Vertical)
MANUAL TESTING
These cases are to verify the recorded parameters in a stable condition at the
weight and configuration specified in the data. Sufficient time should be spent in
ensuring that the trim is accurate.
EXAMPLE
The cruise case result in Figure 2c5-1 shows clearly that the results as given are
within the tolerances, but elevator angle, which is a toleranced parameter, has
not been printed, nor is it demonstrated that the condition being checked is
actually a trim. Linear (and preferably rotational as well) accelerations should be
printed to enable an evaluator to be sure that the result is as it should be. Also,
the use of the term ‘glideslope angle’ for this printout is confusing. It can be
assumed that the actual parameter referred to is flightpath angle, but the result
should state this clearly.
Figure 2c5-1
Example of Simulator Test Results for Longitudinal Trim
MANUAL TESTING
The principal purpose of this test is to determine the pitch controller force
required to maintain speed at a specified bank angle. Note that longitudinal
stability augmentation systems must be as stated in the validation data and that no trim
change should be used during the banking manoeuvre or at the new bank angle.
The test may consist of a continuous time history with slowly increasing bank
angle, or a series of steady-state conditions at stabilised bank angles. The
results may be "snapshot" once the aeroplane has been stabilized at the required
bank angle and at the trim airspeed, or the data may call for a time history to be
run, in which case careful study should be made of the way that the flight test
data was gathered before attempting to replicate it in the simulator. Selecting
altitude freeze will help to provide sufficient time to set up each case. The engine
power setting must not be altered.
Note that with some computer-controlled aircraft the bank angle protection
functions can make it difficult to maintain some of the higher bank angles.
EXAMPLE
Figure 2c6-1
Example of Simulator Test Results for Longitudinal Manoeuvring Stability
RECORDED PARAMETERS
AIRSPEED
PRESSURE ALTITUDE
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
ELEVATOR ANGLE
STABILISER ANGLE
BANK ANGLE
NORMAL ACCELERATION (or NORMAL LOAD FACTOR)
PITCH CONTROLLER FORCE & POSITION
ENGINES KEY PARAMETERS
WIND SPEED COMPONENTS
EVALUATION NOTES
A critical factor for this test is once again to obtain an accurate trim before
displacing the longitudinal controller either forward or aft to achieve the
requisite airspeeds. If conducting the test using discrete airspeed snapshots, it
may be expedient to attempt to displace the pitch controller in one direction only
from
MANUAL TESTING
The principal purpose of this test is to determine the pitch controller force
required to maintain specific airspeeds without re-trimming. Each case starts
from a stable trimmed condition, then control column or pitch controller force is
applied to achieve and stabilise at the desired airspeeds. It is a consequence of
this type of manoeuvre that a rate of climb or descent will develop, so it is usual
to perform the test with the altitude frozen at the value specified in the data, and
this will not appreciably affect the results. Note that the longitudinal stability
augmentation systems must be as stated in the validation data and if the results
of the aeroplane flight test data are "snapshots", then a time history plot is not
necessary. The engine power and stabiliser settings must not be altered from
trim.
EXAMPLE
Figure 2c7-1
Example of Simulator Test Results for Longitudinal Static Stability
RECORDED PARAMETERS
AIRSPEED
PRESSURE ALTITUDE
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
ELEVATOR ANGLE
STABILISER ANGLE
BANK ANGLE
NORMAL ACCELERATION (or NORMAL LOAD FACTOR)
PITCH CONTROLLER POSITION
PITCH CONTROLLER FORCE (if reversible controls)
ENGINES KEY PARAMETERS
WIND SPEED COMPONENTS
EVALUATION NOTES
The aeroplane flight test may or may not have been performed using a
consistent 1 kt/second deceleration rate. This may not be critical, but in any
case attempts should be made to match the actual data rather than perform the
manoeuvre using classical techniques. Nevertheless, the main criteria are
whether the
MANUAL TESTING
Accurate stall warning test results are obtained by using classical aeroplane
'flight test' techniques. Thus ensure that the entry rate to the stall is close to 1
kt/sec, but take into account any significant deviations from this value visible on
the flight test results. The stick shaker speed may be defined from the flight test
data or it may have been obtained from another approved data source. The
aeroplane manufacturers may also provide the buffet and minimum stall speeds
in tabular form, though the test should be run as a time history. For aeroplanes
with stall protection systems, special care needs to be taken to ensure that the
published stall speeds are not those at which the stick pusher or other such
device operates, but the stall speed itself. Note that the stability augmentation
systems must be as stated in the validation data, and whilst the bank angle
tolerance has now been removed from the requirements, the results will be
affected if the bank angle differs greatly from the flight test data.
EXAMPLE
The reason the initial airspeed trace in Figure 2c8-1 suddenly ‘jumps’ from
130 knots down to 119 knots is that the wind speeds were inadvertently
omitted from the initialisation process. This needs to be corrected in this test,
along with the noticeable difference in initial altitude. The overall effect of both
these errors is very small, however, showing that, at least for simulation testing
purposes, the precise speed at which the aeroplane is trimmed ready to begin
a stall manoeuvre does not necessarily have much bearing on the outcome of the
test. The assumption is made with this result that the stall warning, initial buffet
and minimum speeds are all recorded elsewhere; otherwise the time histories
would benefit from markers to show those values.
Figure 2c8-1
Example of Simulator Test Results for Stall Characteristics
RECORDED PARAMETERS
AIRSPEED
PRESSURE ALTITUDE
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
ELEVATOR ANGLE
STABILISER ANGLE
BANK ANGLE
NORMAL ACCELERATION (or NORMAL LOAD FACTOR)
PITCH CONTROLLER POSITION
ENGINES KEY PARAMETERS
WIND SPEED COMPONENTS
MANUAL TESTING
Typical procedures for this test call for the aeroplane to be set up in a level flight
trimmed condition, and then for the pilot to either pull or push the pitch controller
to a given value for a specific period (usually a few seconds) to reduce or
increase the airspeed. Once the time has expired and the target airspeed
achieved (ideally simultaneously) the pitch controller is released to the neutral
position and the remainder of the test duration is ‘hands-off’. A minimum of three
full cycles is typically needed to determine the time to half (or double) amplitude.
Parameters are recorded as above and the calculations and analysis will
probably be carried out within the simulator computer. The exact duplication of
the flight test control inputs should not be strictly necessary, though as always
the closer they can be matched the easier it may be to interpret the results. It
may be that some small lateral adjustments using the roll controller are needed
during the test in order to maintain a wings level configuration. Alternatively some
pilots have found it useful to use slight pressure on the rudder pedals to achieve
the same end. Ensure that the simulator stability augmentation systems are
configured as stated in the validation data.
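The period and time to half (or double) amplitude mentioned above can be estimated from successive oscillation peaks. The sketch below assumes a simple exponential decay envelope and is an illustration only, not the Handbook's prescribed analysis method.

```python
import math

def phugoid_characteristics(peak_times, peak_amps):
    """Estimate the phugoid period and time to half amplitude from the
    times and amplitudes of successive same-sign peaks, assuming an
    exponential decay envelope (illustrative analysis only)."""
    # Mean spacing between successive same-sign peaks gives the damped period
    period = (peak_times[-1] - peak_times[0]) / (len(peak_times) - 1)
    # Decay rate fitted across the whole record
    decay = math.log(peak_amps[0] / peak_amps[-1]) / (peak_times[-1] - peak_times[0])
    t_half = math.log(2.0) / decay
    return period, t_half

# Three airspeed peaks 60 s apart, amplitude halving over 120 s
period, t_half = phugoid_characteristics([0.0, 60.0, 120.0], [10.0, 7.07, 5.0])
print(period, t_half)  # 60.0 and ~120.0 s
```

For a divergent phugoid the fitted decay rate is negative, and the same figure then gives the time to double amplitude.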
EXAMPLE
The deviation in the pressure altitude value in Figure 2c9-1 is not necessarily
significant from the point of view of the analysis, but it would be of doubtful
use for ascertaining the period and time to half amplitude. Typically, simulator
autotest systems make allowances for such data and permit a choice of one of
several longitudinal parameters. Here, the pitch angle looks the most promising
parameter to compare with the aeroplane data. The overall result is not of a high
standard, but in this particular case there had been many industry comments
from several simulator manufacturers who had been struggling to replicate the
validation data results.
Figure 2c9-1
Example of Simulator Test Results for Phugoid Dynamics
RECORDED PARAMETERS
AIRSPEED
PRESSURE ALTITUDE
PITCH ANGLE
PITCH RATE
ANGLE OF ATTACK
ELEVATOR ANGLE
STABILISER ANGLE
BANK ANGLE
NORMAL ACCELERATION (or NORMAL LOAD FACTOR)
PITCH CONTROLLER POSITION
ENGINES KEY PARAMETERS
WIND SPEED COMPONENTS
MANUAL TESTING
The short period mode is typically excited with a pitch controller pulse or double
input. The test is usually begun with the aeroplane perfectly trimmed in level flight
(though the data should be carefully checked for any slight pitch rate, climb rate,
etc.). The control column or longitudinal controller is then displaced minimally but
rapidly to induce the short period pitching oscillation whilst the relevant
parameters are being recorded. Because of the very small movement required,
which must be tailored to achieve the desired result, this case should be
practised beforehand whilst the control parameters are being monitored at an
engineering terminal or workstation. The simulated stability augmentation
systems must be configured as stated in the validation data.
EXAMPLE
Figure 2c10-1 illustrates the point that, being of short duration, it is often easy to
obtain a good result for this test. The plots also serve to illustrate quite well the
type of oscillation to be expected from a jet transport when the short period has
been excited. This test does not suffer from the susceptibility to cumulative
errors that is inherent in, for example, the phugoid test.
Figure 2c10-1
Example of Simulator Test Results for Short Period Dynamics
SECTION 2d
LATERAL DIRECTIONAL
MANUAL TESTING
Unless the pilot flying the test (in either the aeroplane or the simulator) is very
experienced and well practised at performing VMCA manoeuvres, it is likely that
accurate flying of a classical VMCA test will take several attempts to get right. The
test should ideally start with the simulated aeroplane trimmed for level flight at
around 1.3Vstall. The appropriate engine(s) can then be set to idle thrust or shut
down (depending on the method used to acquire the validation data) and the
thrust on the operating engine(s) increased to takeoff level whilst heading is
maintained and the bank angle kept at or below five degrees. The airspeed will
then need to be reduced using elevator control only. It is quite possible that the
aeroplane VMCA is below the stall speed and if this is so, the simulator should
reflect this also. If the data is presented in the form of a snapshot, the simulated
aeroplane should be trimmed at this point and the relevant parameters recorded.
It may still be difficult to fly, but the results, once obtained, are usually easier to
interpret.
EXAMPLE
The result shown in Figure 2d1-1 is one section of a snapshot version of this test.
Obviously there have been three previous sections, each demonstrating a trim
condition at progressively lower airspeed, to show the increase in rudder required
to maintain heading.
Figure 2d1-1
Example of Simulator Test Results for Minimum Control Speed, Air
Figure 2d1-2
Example of Simulator Test Results for Minimum Control Speed, Air - Time History
MANUAL TESTING
The normal procedure for this test calls for the pilot to establish the simulator in
a trimmed condition with symmetrical engine power and at an initial bank angle
appropriate to the test aeroplane, which may be wings level or may be at an
initial bank angle of, say, 30 degrees. After a few seconds have been allowed to
confirm stability, apply a roll control input to match that of the aeroplane. Use
longitudinal control as necessary to maintain a pitch angle as closely as possible
to that of the aeroplane. When the required final bank angle has been achieved,
return the roll controller to neutral. Ideally, the roll controller deflection will be
around one third of maximum, but as always this will be dependent on the
aeroplane data.
EXAMPLE
Figure 2d2-1
Example of Simulator Test Results for Roll Response (Cruise Condition)
EVALUATION NOTES
Whilst this test, like test 2d(2), will give a good
MANUAL TESTING
EXAMPLE
Figure 2d3-1 shows a slight roll overshoot, as the roll controller input was
returned fully to neutral between 14 and 15 seconds. The plots are another
example of the use of tolerance bands and show the test to pass, except that
they have been applied to roll rate, which was the parameter required in the
previous version of the ICAO Manual. Under the new requirements the same test
would fail, as it is not within the requisite ±2° or ±10% of bank angle, nor does
it allow for at least 10 seconds of free response after the control input has been
removed.
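A combined tolerance of this form can be expressed as a simple check. The interpretation below (the larger of the absolute and percentage allowances applies) is an assumption made for illustration, not a statement of the regulatory intent.

```python
def bank_within_tolerance(sim_deg, ref_deg, abs_tol=2.0, rel_tol=0.10):
    """Pass when the simulator bank angle is within +/-2 degrees or
    +/-10% of the aeroplane value, whichever allowance is larger
    (assumed interpretation of the combined tolerance)."""
    allowance = max(abs_tol, rel_tol * abs(ref_deg))
    return abs(sim_deg - ref_deg) <= allowance

print(bank_within_tolerance(31.0, 30.0))  # True: 1 deg error vs 3 deg allowed
print(bank_within_tolerance(20.0, 15.0))  # False: 5 deg error vs 2 deg allowed
```

Applying the check at each sample over the free-response period would flag exactly where a time history falls outside the band.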
Figure 2d3-1
Example of Simulator Test Results for Step Input of Cockpit Roll Controller
RUDDER ANGLE
ROLL RATE
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
WIND SPEED COMPONENTS
EVALUATION NOTES
Like test 2d(3), the spiral stability test has as its main purpose the
determination that the simulator behaves like the aeroplane during the free
response which follows the removal of a lateral control input. For the spiral
mode, however, the control is removed slowly once the aeroplane is stable at a
given bank angle, and the tendency (if any) of the simulated aeroplane to either
continue banking (unstable) or to return to wings level (stable) is recorded. The
flight test pitch angle should be maintained fairly closely so that the airspeed
does not deviate significantly; this again may be achieved by use of an automatic
closed-loop controller or by manipulating the pitch controller as needed when
running manually. The yaw damper must be as in the aeroplane, but will usually
be off. Bearing in mind that most modern jet transport aeroplanes tend to exhibit
a relatively neutral spiral mode, it is very important that any slight asymmetries
present in the flight test data (such as rudder or engine thrust differences) are
properly recognised during analysis of the spiral mode test. Therefore this test
should be performed in both directions. The duration of the test should be at
least 20 seconds after the free response period has commenced.
As an alternative, the test may be performed by establishing a steady bank
angle of about 30 degrees, then simply maintaining the bank with a constant roll
control input. For the tolerance on bank angle, “correct trend” means that the
simulator should exhibit the same tendency as the aeroplane to either increase
or decrease the bank angle during a free response, or require the same lateral
controller direction for the alternative method.
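A minimal sketch of a "correct trend" comparison for the free-response method follows. Treating the trend as the sign of the net bank angle change, with a small dead band for a neutral spiral mode, is an assumed interpretation for illustration only.

```python
def correct_bank_trend(sim_bank, ref_bank, neutral_band=0.5):
    """Compare the direction of bank angle drift during the free
    response. Changes smaller than neutral_band (deg) are treated as
    a neutral spiral mode (assumed threshold, illustrative only)."""
    def trend(series):
        change = series[-1] - series[0]
        if abs(change) < neutral_band:
            return 0            # effectively neutral
        return 1 if change > 0 else -1
    return trend(sim_bank) == trend(ref_bank)

# Both records drift further into the bank, so the trend agrees
print(correct_bank_trend([30.0, 31.5, 33.2], [30.0, 31.0, 32.8]))  # True
```

Because most jet transports are nearly neutral in the spiral mode, the dead band matters: without it, tiny numerical drifts in either record could flip the computed trend.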
MANUAL TESTING
The pilot is usually required to establish the simulator in a trimmed flight condition
with symmetrical engine power and zero bank angle. Particular care should be
exercised to precisely trim the simulator in wings level, stable flight with
symmetrical power, since initial trim strongly affects the test result. Smoothly roll
the simulator to the bank angle of the test aeroplane, and stabilise airspeed and
bank angle. Return the roll control slowly to neutral and allow a free response of
the simulator in roll. Pitch control may be applied to match the pitch angle of the
aeroplane. If the aeroplane test had the yaw damper on, verify that the rudder in
the simulator is very close to that of the aeroplane.
For the alternative test method, bank the aeroplane smoothly to the requisite
angle (should be about 30 degrees) and maintain that angle using roll control.
Airspeed must also be maintained using pitch control.
EXAMPLE
The result in Figure 2d4-1 is essentially a good one, and again uses tolerance
banding to prove the simulator meets the requirements. The absence of the initial
oscillation in roll which is present in the aeroplane data has no significant effect
on the outcome of the test.
Figure 2d4-1
Example of Simulator Test Results for Spiral Stability
EVALUATION NOTES
These two tests are designed to ascertain that the simulated rudder allows the
same degree of control power as the real rudder does in the aeroplane. They
are akin to the longitudinal trim tests found in 2c(5) and as such tend to be quite
straightforward to perform. Whilst the main tolerance parameters do not include
items such as airspeed, pitch angle or rate of climb, it is worth checking these
values as well so as to be sure that the flight condition is correct according to
the aeroplane data. Slightly different results can be obtained depending on the
engine-out trim technique used, so the data should be carefully studied so that
the flight test manoeuvre is repeated accurately, and preferably in a manner
similar to that in which a pilot is trained to trim an engine failure condition. The
tests should be run dynamically, but the results can be confirmed using a
snapshot technique once the simulated aeroplane has been established in a
steady-state condition for a few seconds. It is important to match the aeroplane
bank angle accurately, as there can be a large change of rudder and sideslip
with bank angle for a stabilised constant-heading condition. The ICAO Manual
states that for the approach or landing test the power should be set to thrust for
level flight (i.e. level flight for the engine-inoperative condition).
MANUAL TESTING
substantial yaw control. For the approach condition, thrust will probably have
been set for level flight, but the validation data should be followed whichever
technique is used. Use lateral and directional trim to minimise the pilot control
forces as necessary to match the aeroplane conditions. Maintain this stabilised
flight condition for several seconds before verifying the results.
If the test involves a climb or descent, it may be helpful for the manual test to
start at a lower or higher altitude, respectively, so as to be stabilised at the
required pressure altitude of the aeroplane data.
EXAMPLE
The result shown in Figure 2d5-1 is another example of a snapshot test, showing
all the required parameters, and including a pass/fail assessment on engine
power lever angle as well as the requisite rudder deflection and sideslip angle.
While it is sometimes useful to mention items such as ‘Parameters with no a/c
data’, some explanation as to the meaning of this statement should be included
in the QTG.
Figure 2d5-1
Example of Simulator Test Results for Engine Inoperative Trim
ELEVATOR ANGLE
ENGINES KEY PARAMETERS
WIND SPEED COMPONENTS
MANUAL TESTING
The data will most likely require the pilot to establish the simulator in a steady
level flight condition with symmetric engine power in the configuration specified.
Apply a rapid rudder pedal movement to match that of the aeroplane, but other
lateral/directional control inputs should not be used, nor should there be any
change away from the initial trim or power settings. Ideally there should follow a
free response of the simulator, but if the data deems it necessary the pilot should
maintain the rudder pedal (and thus the rudder) as close to the aeroplane
recorded time history as possible. Typical test duration will be around 20
seconds. The test technique is similar for both settings of the stability
augmentation (e.g. yaw damper) system.
EXAMPLE
Figure 2d6-1 is included because it was taken from the beginning of a Dutch roll
test. The duration however, is too short to determine the difference in
characteristics between the response with yaw damper off and on. Figure 2d6-2
has the yaw damper engaged. This test was driven with the rudder pedals (not
shown) and so allows the yaw damper simulation to be properly exercised.
Figure 2d6-1
Example of Simulator Test Results for Rudder Response (Yaw Damper Off)
Figure 2d6-2
Example of Simulator Test Results for Rudder Response (Yaw Damper On)
EVALUATION NOTES
Of primary consideration in this test are the dutch roll characteristics excited
by the rudder pedal movements, usually a 'doublet', in which the pedals are
depressed the same amount in each direction before being released. This
technique should then result in the oscillations being roughly symmetrical about
zero bank angle, making the results easier to assess. With the rudder pedals
returned to the neutral position, the oscillations will be free, allowing
mathematical analysis after about 6 cycles, with a typical dutch roll period in the
order of 5 to 10 seconds. The test should ideally be driven through the rudder
pedals, so as to confirm the relationship between control position and control
surface position, but any automatic rudder pedal driver should be released after
the pedals have been returned to neutral. The amount of rudder deflection need
not be excessive, and like test 2d(6) should typically be limited to around 25%
of full rudder pedal throw.
MANUAL TESTING
The validation data will most likely require the pilot to establish the simulator in
a steady level flight condition with symmetric engine power in the configuration
specified and with the yaw damper switched off. Apply a rapid rudder pedal
movement (or doublet) to match that of the aeroplane, but other lateral/directional
control inputs should not be used, nor should there be any change away from the
initial trim or power settings. There should follow a free response of the simulator
for approximately 60 seconds (at least 6 Dutch Roll cycles) with no control inputs
except as necessary to maintain the approximate flight test pitch angle.
It is desirable, but may not be necessary, to get an exact match of the aeroplane
rudder time history, since the purpose here is to check the dutch roll dynamic
characteristics - period, damping and bank angle to sideslip data.
It will be necessary to have a rudder input that results in the same general bank
trend as the aeroplane, following the rudder pedal release. The rudder pedal
input to excite the dutch roll is normally a doublet; one pedal in for a period of
time, then release, followed by the other pedal in for the same period of time,
then release. The general bank trend - either an oscillation with the average
drifting right wing down, an oscillation with the average drifting left wing down, or
an oscillation with the average near zero - can be controlled by the symmetry of
the doublet. A bank angle trend that clearly differs from the data may well result
in airspeed and angle of attack deviations, both of which can significantly affect
dutch roll dynamic characteristics.
EXAMPLE
The criteria for passing the dutch roll test do not have to include the precise
matching of all parameters such as roll rate and bank angle, as the result in
Figure 2d7-1 illustrates. The result as shown passes, but this may not be
immediately obvious until the mathematical analysis is carried out. See Appendix
B for more details of such analysis.
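The mathematical analysis referred to above typically extracts the period and damping from the free oscillation. The sketch below uses the logarithmic decrement on successive positive bank angle peaks and assumes a lightly damped second-order response; it is an illustration, not the Appendix B procedure.

```python
import math

def dutch_roll_characteristics(peak_times, peak_amps):
    """Estimate damped period and damping ratio from successive
    positive peaks of a free dutch roll oscillation, via the
    logarithmic decrement (illustrative analysis only)."""
    n = len(peak_times) - 1                              # number of full cycles
    period = (peak_times[-1] - peak_times[0]) / n
    delta = math.log(peak_amps[0] / peak_amps[-1]) / n   # log decrement per cycle
    zeta = delta / math.sqrt(4.0 * math.pi**2 + delta**2)
    return period, zeta

# Four peaks 8 s apart, amplitude ratio 0.8 per cycle
period, zeta = dutch_roll_characteristics([0.0, 8.0, 16.0, 24.0],
                                          [5.0, 4.0, 3.2, 2.56])
print(period, round(zeta, 3))  # 8.0 and roughly 0.035
```

Averaging the decrement over several cycles, as here, is less sensitive to noise on any single peak than using one pair of peaks alone.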
Figure 2d7-1
Example of Simulator Test Results for Dutch Roll
EVALUATION NOTES
The usual way in which this test is run is to begin by configuring the aeroplane
for trimmed level flight, then applying a specified rudder pedal position and
holding it steady whilst maintaining heading using the control wheel or roll
controller. The sideslip angle thus developed is probably the most critical
parameter, since a small deviation in this value can result in a significant
difference in bank angle and/or roll controller angle. Several rudder deflections
should be used (at least two, preferably three or four), one of which should be
near the maximum available rudder deflection. For propeller-driven aeroplanes
the test should be performed with deflections both to the left and to the right.
Some flight test data may be presented such that there are inconsistencies in
the aeroplane results. If this is the case then it may be necessary to take an
average value or to carefully scrutinise the values so that the best data set is
utilised. During these tests in the aeroplane, it is possible for the wing fuel to
move (usually inboard on the upper wing), which gives a lateral centre of gravity
movement and a rolling moment. This affects the aileron angle required for trim.
Care must be taken to verify this in the aeroplane data and subsequently in the
simulator.
SPOILER or EQUIVALENT
ROLL CONTROLLER POSITION or FORCE ±10% or ±5°
MANUAL TESTING
The simulator should first be established in steady level flight with symmetric
engine power in the configuration and flight condition specified by the flight test
data. Whilst maintaining constant airspeed, apply a rudder deflection appropriate
to the validation data (using rudder pedal or rudder trim - whichever was
employed in the aeroplane) and use roll control to stabilise the simulator at a
bank angle required to hold a constant heading.
It may be possible or helpful to make use of the autopilot, in airspeed and
heading hold, to assist in this manoeuvre, though usually these tests are not
difficult to fly, and are of relatively short duration.
Once the steady sideslip has been achieved, acquire a snapshot of the stabilised
condition. Repeat for at least two rudder angles, ensuring that one of these is
near the maximum allowable rudder, as there is significant interest in
characteristics at large sideslip angles.
EXAMPLE
The result shown in Figure 2d8-1 is a magnificent array of passes against all
the required parameters, but this is not particularly significant, since the
condition is clearly the initial wings-level (symmetrical) trim (with zero rudder
deflection). Other, subsequent conditions no doubt show the true status of the
test results.
Figure 2d8-1
Example of Simulator Test Results for Steady State Sideslip
SECTION 2e
LANDINGS
EVALUATION NOTES
This test is designed to show that the simulator's overall normal landing
characteristics, primarily in the longitudinal axis, are sufficiently like those of the
aeroplane to allow pilot training for landing manoeuvres. The test should be
commenced at a radio altitude of not less than 61 metres (200 feet), so that all
ground effects can be examined as the simulated aeroplane descends. It is not
necessary to show the entire landing ground roll, but the time history must
include details of the nose gear touchdown. The important parameters are
many, since this is such a critical area for pilot training, but it may be unrealistic
to expect all parameters to be in tolerance all of the time, due to the complex
nature of the pilot activity usually present during the aeroplane flight test. When
the test is run automatically it will typically be controlled by closed-loop drivers
on, for example, pitch angle (driven by the pitch controller, or the elevator as a
last resort) and possibly bank angle (with the roll controller), though the latter
should not in theory be significant for this particular test. Flare characteristics
should be examined carefully to ensure that over- or under-rotation has not
occurred and that the elevator used to perform the flare does not deviate
inexplicably from the validation data.
MANUAL TESTING
EXAMPLE
Figure 2e1-1 shows the difference that incorrect thrust is likely to make to a
landing test. The average thrust here is too high, but this has little effect on the
airspeed over the first 15 seconds. Where the problem manifests itself is in the
poor match with angle of attack (and also pitch angle, not shown). This test was
easily corrected by setting the correct thrusts prior to the commencement of the
test execution phase.
Figure 2e1-1
Example of Simulator Test Results for Normal Landing
MANUAL TESTING
that the pilot is able to reproduce as accurately as possible the control inputs
used during the flight test. The most critical part of the manoeuvre is the flare
from 50 ft to touchdown, along with the possibility of scraping the tail. The
threshold speed should be as in the data. Perfect matches of all parameters
should not be expected, and several attempts will quite probably be needed
before even a reasonably acceptable result is obtained, owing to the
complexity of coordinating and repeating several simultaneous pilot inputs. If
the aeroplane has autoland capability, it should be used only if the flight test
also employed this method of achieving the landing.
EXAMPLE
Figure 2e2-1
Example of Simulator Test Results for Minimum Flap Landing
ROLL CONTROLLER FORCE    ±10% or ±1.3 daN (3 lbs)
RUDDER PEDAL FORCE       ±10% or ±2.2 daN (5 lbs)
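A tolerance expressed as "±10% or ±1.3 daN" is normally read as whichever band is wider, so that the percentage band does not collapse to nothing as the reference force passes through zero. A minimal sketch of such a check (the function name and defaults are illustrative):

```python
def within_tolerance(sim_value, ref_value, pct=0.10, abs_band=1.3):
    """True when sim_value lies within +/-pct of ref_value, or within
    +/-abs_band of it, whichever band is wider (e.g. "10% or 1.3 daN")."""
    band = max(abs(ref_value) * pct, abs_band)
    return abs(sim_value - ref_value) <= band
```

Applied pointwise along a force time history, a check of this kind yields the in-tolerance and out-of-tolerance segments discussed in the examples throughout this section.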
MANUAL TESTING
The simulator will probably be automatically trimmed (with the crosswind active)
in the correct configuration with an appropriate descent rate to allow the pilot to
easily fly down and complete the landing manoeuvre. The important points to
assess are firstly that the simulation is synchronised with the aeroplane data,
especially for the de-crab, flare and touchdown portions, and secondly that the
control positions and thrusts are the same or very similar. The data should be
carefully studied before beginning the manoeuvre so that the pilot is able to
reproduce as accurately as possible all the control inputs used during the flight
test. The most critical part of the manoeuvre is the de-crab and flare from 50 ft
to touchdown and then the speed decrease during the ground roll. The threshold
speed should be as in the data. Perfect matches of all parameters should not
be expected, and several attempts will quite probably be needed before even a
reasonably acceptable result is obtained, owing to the complexity of
coordinating and repeating several simultaneous pilot inputs. If the aeroplane
has autoland capability, it should be used only if the flight test also employed
this method of achieving the landing. The technique used can
vary between the wing down or crabbed approach and kick-off drift prior to
touchdown.
EXAMPLE
Figure 2e3-1 on the next page is a development version of one such test. The
plots appear to show that the ground effect is at fault, when in fact the solution
was to slightly increase the rate of descent during the trim phase. This had the
effect of causing the simulated aeroplane to touch down slightly earlier,
thereby avoiding the erroneous pitch attitude which ensued when the simulated
aeroplane should already have been in ground contact. Slightly more difficult to
reconcile was the rate of airspeed decrease after touchdown. The validation data
clearly show an anomaly at around 105 knots; unless a wind speed component
is available but has not been properly implemented, further information may
need to be sought from the data provider.
Figure 2e3-1
Example of Simulator Test Results for Crosswind Landing
EVALUATION NOTES The purpose of this test is to show that the simulator
characteristics for a landing with one engine
inoperative match the validation data.
MANUAL TESTING
The simulator will probably be automatically trimmed (with the appropriate engine
inoperative) in the correct configuration with a descent rate corresponding to the
flight test data to allow the pilot to easily fly down and complete the landing
manoeuvre. The important points to assess are firstly that the simulation is
synchronised with the aeroplane data, especially for the de-crab, flare and
touchdown portions, and secondly that the control positions and thrusts are the
same or very similar. The data should be carefully studied before beginning the
manoeuvre so that the pilot is able to reproduce as accurately as possible all the
control inputs used during the flight test. The most critical part of the manoeuvre
is the de-crab and flare from 50 ft to touchdown and then the speed decrease
during the ground roll. The threshold speed should be as in the data. Perfect
matches of all parameters should not be expected, and several attempts will
quite probably be needed before even a reasonably acceptable result is
obtained, owing to the complexity of coordinating and repeating several
simultaneous pilot inputs. If the aeroplane has autoland capability, it should be
used only if the flight test also employed this method of achieving the landing.
When conducting the test manually, ensure the simulator is controllable
with reverse thrust on the unaffected engine(s) down to the speed that reverse
thrust is disengaged.
EXAMPLE
In Figure 2e4-1 there are problems with both the de-rotation characteristics and
with the speed decrease during the ground roll-out, as well as the fact that the
aeroplane data do not decrease sufficiently to match the requirement of 50% of
touchdown speed, though it is usually possible to run the test for longer to show
the simulator characteristics at lower speeds.
Figure 2e4-1
Example of Simulator Test Results for Engine Inoperative Landing
Figure 2e4-2
Example of Simulator Test Results for Engine Inoperative Landing, Alternate Engine Fit
MANUAL TESTING
Usual procedures call for the simulated aeroplane to be trimmed such that the
autoland system is able to cope with the aeroplane dynamics when the reposition
is complete. The pilot intervention in this test should be minimal, since the
intention is to check out the automatic landing system. Thus the test is in itself
manual rather than ‘automatic’ in the conventional (i.e. autotest) sense and the
inputs from the autotest system are typically confined to wind speed components
and terrain height. Where the aeroplane will provide lateral control to a full stop,
the simulator must also.
It may be that the results are sensitive to pilot inputs; on certain computer-
controlled aeroplanes the flare time is dependent on the exact moment of
throttle-back during the flare.
EXAMPLE
An ‘interesting’ result is shown in Figure 2e5-1, but it is one that arguably looks
worse than it actually is. The overall synchronisation of the manoeuvre is not
correct, as exemplified by the radio altitude trace passing 56 feet approximately
0.5 second late. The flare begins over a second late as a result and is slightly too
long. Correct setup of the initial conditions cured the problem, as is often the
case.
EVALUATION NOTES The intention behind this test is to determine that the
simulator autopilot exhibits the correct characteristics
MANUAL TESTING
The test will normally commence with the simulated aeroplane trimmed in the
appropriate configuration for a descent down the glideslope. Once the
manoeuvre has begun it is important to ensure that the autopilot behaviour is
synchronised in accordance with the aeroplane data; this will confirm that all
associated longitudinal (and, if appropriate, lateral and directional) trim changes
are correctly reflected.
EXAMPLE
Figure 2e6-1
Example of Simulator Test Results for Go-Around, All Engines Operating
MANUAL TESTING
The test will normally commence with the simulated aeroplane trimmed in the
appropriate configuration for a descent down the glideslope. Once the
manoeuvre has begun it is important to ensure that the power increase, flap
selection and gear selection are synchronised in accordance with the aeroplane
data; this will confirm that all associated longitudinal, lateral and directional trim
changes are correctly reflected.
For the autopilot case, the autopilot should be used with normal approach
procedures for one engine inoperative.
EXAMPLE
The captain’s wheel position plot in Figure 2e7-1 indicates that a closed-loop
controller was used to assist the simulator in maintaining the bank angle
specified (not shown) for the duration of the test. The gains used for the
closed-loop driver are probably too high, though, as it is somewhat unlikely
that the wheel would have been used to the extent indicated by the plot.
Unfortunately, the wheel angle was not supplied by the data provider, so there
is no way of knowing for certain. Running the test manually would serve as a
good comparison in this case.
RECORDED PARAMETERS:
AIRSPEED
RUDDER PEDAL POSITION
RUDDER ANGLE
NOSEWHEEL STEERING ANGLE
HEADING ANGLE
SIDESLIP ANGLE
YAW RATE
YAW ACCELERATION
ENGINES KEY PARAMETERS
LATERAL DEVIATION FROM RUNWAY CENTRE LINE
WIND SPEED COMPONENTS
MANUAL TESTING
EXAMPLE
Figures 2e8-1 and 2e8-2 below illustrate one of the reasons why obtaining
integrated validation data is important. Figure 2e8-1 has been run by driving both
the rudder surface position and the nosewheel angle independently and
overwriting the software locations directly. The yaw rate and speed are not
perfect matches, but they are very close and it may be that the reverse thrust
was not initialised quite correctly.
Figure 2e8-1
Example of Simulator Test Results for Directional Control with Symmetric Reverse
Thrust (1)
Figure 2e8-2 was run by driving rudder pedal position and allowing the
nosewheel to castor as it would when ‘flying’ the simulated aircraft normally. The
rudder pedal position had to be derived for the purposes of the test, based on the
rudder angle, as it was not provided as part of the validation data package. The
obvious difference between this result and that shown on Figure 2e8-1 is that the
nosewheel response is totally different - and in fact much more logical in this
result than in the former. This has resulted in a somewhat different yaw rate
profile, and an airspeed which is not within tolerance beyond approximately 12
seconds.
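Where, as here, the pedal position has to be derived from the recorded rudder angle, one approach is to invert the pedal-to-rudder gearing. The sketch below assumes a simple monotonic, piecewise-linear gearing table; the table values are purely illustrative, and a real aircraft's gearing (plus any yaw damper or trim contribution) would need to come from the control system data.

```python
# Back-derive rudder pedal position from recorded rudder angle by
# inverting an assumed piecewise-linear pedal-to-rudder gearing.
# The gearing table below is purely illustrative.

from bisect import bisect_left

# (pedal displacement, rudder angle deg) pairs, ascending in both columns
GEARING = [(-1.0, -30.0), (0.0, 0.0), (1.0, 30.0)]

def pedal_from_rudder(rudder_deg):
    """Pedal displacement corresponding to a recorded rudder angle,
    clipped to the ends of the gearing table."""
    angles = [a for _, a in GEARING]
    i = bisect_left(angles, rudder_deg)
    if i == 0:
        return GEARING[0][0]
    if i == len(GEARING):
        return GEARING[-1][0]
    (p0, a0), (p1, a1) = GEARING[i - 1], GEARING[i]
    # linear interpolation within the bracketing gearing segment
    return p0 + (p1 - p0) * (rudder_deg - a0) / (a1 - a0)
```

A pedal trace derived this way is only as good as the assumed gearing, which is one reason the nosewheel response in the two figures differs so markedly.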
Figure 2e8-2
Example of Simulator Test Results for Directional Control with Symmetric Reverse Thrust (2)
RECORDED PARAMETERS:
AIRSPEED
RUDDER PEDAL POSITION
RUDDER ANGLE
NOSEWHEEL STEERING ANGLE
HEADING ANGLE
SIDESLIP ANGLE
YAW RATE
YAW ACCELERATION
ENGINES KEY PARAMETERS
LATERAL DEVIATION FROM RUNWAY CENTRE LINE
WIND SPEED COMPONENTS
EVALUATION NOTES The Evaluation Notes for test 2e(8) above apply to this
test, except that the test should be performed by using
a steadily increasing pilot rudder control input in the
direction to counter the yawing moment due to the
asymmetric thrust. The tolerance on airspeed applies
throughout the test, but it is especially important to
note the speed at the point where heading can no
longer be maintained (or at the declared minimum
speed for thrust reverser operation) to check that the
simulator maximum rudder effectiveness in the
presence of asymmetric reverse thrust matches the
validation data. The speed profile should be
maintained, along with the pedal steering inputs.
Typically a slow decrease in speed is combined with
a slow increase in pedal position to counteract the
yaw moment from the engine asymmetry. In some
flight tests, a nosewheel angle is present, even though
the nosegear steering system may be disconnected,
and the use or non-use of this nosewheel value may
have a significant influence on the test results.
MANUAL TESTING
EXAMPLE
The result shown in Figure 2e9-1 illustrates how easily the lateral distance can
deviate even when the yaw rate is easily within tolerance. There may be reasons
why the lateral deviation plot should be omitted, but the regulatory authorities
Figure 2e9-1
Example of Simulator Test Results for Directional Control with Asymmetric Reverse Thrust
SECTION 2f
GROUND EFFECT
2F GROUND EFFECT
The selection of the test method and procedures to validate ground effect
is at the option of the organisation performing the flight tests; however, the
flight test should be performed with enough duration near the ground to
sufficiently validate the ground-effect model.
EVALUATION NOTES The purpose of this test is to show that the simulator
aerodynamic model includes terms to adequately
represent the longitudinal ground effect on the
aeroplane. As discussed in Paragraph 2F above,
there are at least two acceptable means to
demonstrate that the longitudinal ground effect
characteristics match the validation data. For the level
fly-by method, the aeroplane will have been trimmed
for level flight at three or more heights both within and
just above ground effect at approximately the same
airspeed and configuration. The results should
illustrate the different longitudinal control required at
each height with the same trim position that was
established for the free-air fly-by condition, and the
thrust change required to maintain a constant
airspeed as the height decreases.
MANUAL TESTING
The fly-by test will usually consist of a series of longitudinal trims at several
heights above the ground with all except one in ground effect. The purpose is to
demonstrate that the trim conditions, either with regard to stabiliser position for
no pitch control input or pitch control for a constant stabiliser position, are
different for each height. In addition, small throttle adjustments will be required
to maintain a constant airspeed at each height. The differences, however, are
likely to be small, so it is of the utmost importance that every attempt is made to
obtain accurate steady-state trim conditions so that the differences are easy to
distinguish. The results will be snapshots and as such do not require or benefit
from being plotted as a time history, and it may be reasonable to set and lock the
simulated aeroplane at the correct radio altitude to make the test points easier
to fly.
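Because each fly-by point is a steady-state snapshot, the comparison reduces to differencing the trimmed values at each height rather than overlaying time histories. A minimal sketch follows; the heights, parameter names and numbers are illustrative assumptions, not flight data.

```python
# Compare snapshot trim values at each fly-by height against validation
# data.  Heights, parameter names and values below are illustrative only.

def snapshot_deltas(sim_points, ref_points, keys=("elevator", "thrust")):
    """sim_points/ref_points: dicts keyed by radio height (ft), each value
    a dict of trimmed parameters.  Returns sim-minus-reference deltas
    for the chosen parameters at every reference height."""
    return {h: {k: sim_points[h][k] - ref[k] for k in keys}
            for h, ref in ref_points.items()}

# Illustrative snapshot values only (elevator in deg, thrust in % N1)
ref = {400: {"elevator": -2.0, "thrust": 48.0},
        50: {"elevator": -3.1, "thrust": 45.5}}
sim = {400: {"elevator": -2.1, "thrust": 48.4},
        50: {"elevator": -3.0, "thrust": 45.2}}
deltas = snapshot_deltas(sim, ref)
```

Each per-height delta is then judged against the relevant snapshot tolerances, and the set of heights should reproduce the trend in trim and thrust described above as the ground effect strengthens.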
EXAMPLE
Referring to the example results in Figure 2f1-1, at first sight the result may
indicate that the ground effect is in error - obviously fundamental if the simulator
is to be used for landing manoeuvres. However, examination of the setup
conditions in this case revealed that the gross weight had not initialised correctly,
hence the large discrepancy between the aeroplane and simulator net thrust
values. The remedy was simply to re-run the test and make sure that the weight
was correct the second time. One-off anomalous test runs such as this are not
as common as they used to be, but can still sometimes happen.
Figure 2f1-1
Example of Simulator Test Results for Ground Effect Demonstration (Snapshots)
SECTION 2g
WINDSHEAR
2G WINDSHEAR
2G.1 GENERAL
2G.2 REQUIREMENTS
For the purposes of flight crew training, there are four critical phases of flight
for which wind models should be available in the simulator:
Whilst the Windshear Training Aid profiles provide one solution to the ICAO
Manual requirement, other sources of wind model data are also acceptable,
provided they are from a recognised source such as the UK Royal
Aerospace Establishment, Bedford or the Joint Airport Weather Studies
Project. See References 13 and 14 respectively for further details.
Figure 2g-1
Wind Training Aid Model #1
[Three-panel plot: U.Wind (kts) and two further wind components plotted against Distance Travelled (ft), 0 to 12,000 ft]
Figure 2g-2
Wind Training Aid Model #2
[Three-panel plot: U.Wind (kts), V.Wind (kts) and a third component plotted against Distance Travelled (ft), 0 to 20,000 ft]
Figure 2g-3
Wind Training Aid Model #3
[Three-panel plot: U.Wind (kts), a lateral component and W.Wind (kts) plotted against Distance Travelled (ft), 0 to 14,000 ft]
Figure 2g-4
Wind Training Aid Model #4
[Three-panel plot: wind components plotted against Distance Travelled, 0 to 25,000 ft]
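Profiles such as these are normally stored as tables of wind components against distance travelled and linearly interpolated every frame as the simulated aeroplane advances. A minimal sketch follows; the table is illustrative, not an actual Windshear Training Aid profile.

```python
# Look up a windshear component by linear interpolation against
# distance travelled, as a simulator would each frame.  The table
# below is illustrative, not a Windshear Training Aid profile.

def wind_at(distance_ft, table):
    """table: (distance_ft, wind_kts) pairs, ascending in distance.
    Holds the end values outside the tabulated range."""
    if distance_ft <= table[0][0]:
        return table[0][1]
    for (d0, w0), (d1, w1) in zip(table, table[1:]):
        if distance_ft <= d1:
            # linear interpolation within the bracketing segment
            return w0 + (w1 - w0) * (distance_ft - d0) / (d1 - d0)
    return table[-1][1]

U_WIND = [(0, 0.0), (4000, -10.0), (8000, -40.0), (12000, -50.0)]
```

The same lookup would be repeated for the vertical and lateral components, with the three results summed into the wind vector applied to the equations of motion.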
Figure 2g-5
Wind Training Aid Wind Factor Chart
[Chart: WIND FACTOR (0.8 to 1.9) against a parameter from 0.16 to 0.32, annotated "3 & 4 ENGINE AIRPLANES USE THIS DATA"]
Figure 2g-6
United Kingdom Royal Aerospace Establishment (now Qinetiq) Microburst Vortex Ring Air
Flow Model
TOLERANCES None
MANUAL TESTING
This test should be performed after it has been determined that the nominated
model has been implemented correctly by comparing simulator data
(windshear profiles) with the flight (model) data. The effect of windshear in
each of the two flight conditions should be demonstrated firstly by the pilot
performing the appropriate manoeuvre (i.e. takeoff and approach/go around)
with no windshear inserted and then with the windshear, using a corresponding
model for each which should be provided (and clearly labelled as either takeoff
or approach/landing) on the instructor's station. Recording and plotting the
various parameters of interest will assist greatly in the subsequent evaluation
of the results. It may enhance the effectiveness of the test if the pilot flies
through each of the models without being briefed on which profile to expect. If
the profile selected is recognised by the flight crew as a windshear problem,
then the demonstration is successful.
EXAMPLE
Figure 2g1-1 below shows a partial set of equivalent results for a simulated
takeoff both with and without windshear present. There is no aeroplane data
for comparison, nor does there need to be, as the intent is to show the effects
of the windshear on the simulation. These tests should be backed up by
subjective evaluation.
Figure 2g1-1
Example of Simulator Test Results for Takeoff Windshear Demonstration
SECTION 2h
2h(1) Overspeed
As a general note, all of the tests in this section are applicable only to
Computer Controlled Aeroplanes and should show time history results of
the response to control inputs during entry into each envelope protection
function. The requirements of the ICAO Manual state that all these tests
must be run in both normal and degraded control states if the function is
different. However, it is evident that for envelope limiting functions, there
is little to be gained by running a test where that function is inactive,
hence the requirement really refers to testing for the most degraded
states where the function is still active.
EVALUATION NOTES The test should first be run with all aeroplane
systems functional so that the normal operation of
the overspeed protection system can be tested; this
may have been achieved either by the utilisation of
the actual flight warning unit or else by software
simulation. Running the test automatically will
probably involve the driving of the pitch controller
position, so that the test does not bypass or ignore
any interaction between the electronic flight control
system and the overspeed protection system. The
method of detecting the onset of the overspeed
protection should be ascertained; this can typically
be identified by changes in the control surface
MANUAL TESTING
This test should be continued until the airspeed is stable for the current
altitude. The entry into the overspeed function should be smooth, with a
constant control position, which may be stick forward or stick neutral, as
required for accelerating flight in the specific aeroplane configuration.
EXAMPLE
Figure 2h1-1
Example of Simulator Test Results for Overspeed Protection Function
EVALUATION NOTES The test should first be run with all aeroplane
systems functional so that the normal operation of
the minimum speed system can be tested. Running
the test automatically will probably involve the
driving of the elevator surface position, but this
should not adversely affect the test results providing
such methods do not bypass or ignore any
interaction between the electronic flight control
system and the minimum speed warning system.
The method of detecting the minimum speed
warning should also be ascertained, so that the
MANUAL TESTING
The entry into the speed limiting function should be smooth, with a constant
control position. This may be stick aft or stick neutral, as required for
decelerating flight in the specific aeroplane configuration. Any other incidence
limiting function, such as an alpha floor function on the engine power, should
be inactive during this test (these functions will be tested in test 2h(6)).
EXAMPLE
In the example in Figure 2h2-1, although the result exceeds tolerance at the
end of the time history, it is within tolerance during the minimum speed
portion of the test and is therefore acceptable.
Figure 2h2-1
Example of Simulator Test Results for Minimum Speed Protection Function
EVALUATION NOTES The test should first be run with all aeroplane
systems functional so that the normal operation of
the load factor protection system can be tested; this
may have been achieved either by the utilisation of
the actual aeroplane flight warning unit or else by
software simulation. Running the test automatically
will probably involve the driving of the pitch and roll
controller positions, so that the test does not bypass
or ignore any interaction between the electronic
flight control system and the load factor protection
system. The method of detecting the onset of the
MANUAL TESTING
The test should commence when trimmed for steady level flight, after which
the pilot needs to bank the aeroplane using the control wheel/lateral controller.
The entry into the load factor limiting function should be smooth, and the
aeroplane should be held against its load factor limit for long enough to
establish the stabilised value. Systems designed to limit bank angle above a
certain load factor should also be demonstrated by this test, which should be
of sufficient length for the aeroplane to obtain a stabilised bank angle in these
conditions. Care must be taken that the test demonstrates the correct
protection, as even slight mishandling can cause other protections, such as
the AOA, alpha floor or pitch angle protections, to become active as well. It may
be appropriate to disable, if possible, other such limiting functions, in order
that the load factor protection can be demonstrated in isolation.
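A quick cross-check on the stabilised values in this test is the idealised level-turn relation n = 1/cos(bank), which links the bank angle at which the aeroplane settles to the load factor being limited. Real protection laws are more involved, so this is a sanity check rather than a pass/fail criterion.

```python
import math

def level_turn_load_factor(bank_deg):
    """Load factor in an idealised steady level turn: n = 1 / cos(bank)."""
    return 1.0 / math.cos(math.radians(bank_deg))

def bank_for_load_factor(n):
    """Bank angle (deg) at which an idealised level turn reaches n g."""
    return math.degrees(math.acos(1.0 / n))
```

For example, a 2.5 g limit corresponds to roughly 66 degrees of bank in an idealised level turn, so a stabilised bank angle far from that value would warrant a closer look at either the limiter or the test setup.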
EVALUATION NOTES The test should first be run with all aeroplane
systems functional so that the normal operation of
the pitch angle protection system can be tested; this
may have been achieved either by the utilisation of
the actual flight warning unit or else by software
simulation. Running the test automatically will
probably involve the driving of the pitch controller
position, so that the test does not bypass or ignore
any interaction between the electronic flight control
system and the pitch angle protection system. The
method of detecting the onset of the protection
should be ascertained, which is typically identified
by changes in the control surface angles that do not
MANUAL TESTING
With the aeroplane stabilised in steady level flight, the entry into the pitch
angle limiting function should be made using the control column or pitch
controller and should be smooth, with the input being held for long enough to
establish the stabilised value. Care must be taken that the test demonstrates
the correct protection, as even slight mishandling can cause other protections,
such as the AOA, normal load factor or minimum speed protections to become
active also. It may be appropriate to disable, if possible, other such limiting
functions, in order that the pitch angle protection can be demonstrated in
isolation.
EXAMPLE
In this example (Figures 2h4-1a and 1b) the pitch angle is stabilised by the
protection at 25 deg following a small overshoot, but before the end of the time
history the minimum speed protection becomes active and starts to reduce the
pitch angle still further without any input from the pilot.
EVALUATION NOTES The test should first be run with all aeroplane
systems functional so that the normal operation of
the bank angle protection system can be tested; this
may have been achieved either by the utilisation of
the actual flight warning unit or else by software
simulation. Running the test automatically will
probably involve the driving of the roll controller
position, so that the test does not bypass or ignore
any interaction between the electronic flight control
system and the bank angle protection system. The
method of detecting the onset of the protection
should be ascertained, which is typically identified
MANUAL TESTING
With the aeroplane stabilised in steady level flight, the entry into the bank
angle limiting function should be made using the control wheel or lateral
controller and should be smooth, with the input being held for long enough to
establish the stabilised value. Care must be taken that the test demonstrates
the correct protection, as even slight mishandling can cause other protections,
such as the AOA or normal load factor protections to become active as well.
EXAMPLE
Figure 2h5-1
Example of Simulator Test Results for Bank Angle Protection Function
EVALUATION NOTES The test should first be run with all aeroplane
systems functional so that the normal operation of
the angle of attack protection system can be tested;
this may have been achieved either by the utilisation
of the actual flight warning unit or else by software
simulation. Running the test automatically will
probably involve the driving of the pitch controller
position, so that the test does not bypass or ignore
any interaction between the electronic flight control
system and the angle of attack protection system.
The intention is to demonstrate protection of the
aeroplane against excessive angles of attack. Note
2H-19
Evaluation Handbook 3rd Edition
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
Having ascertained that the simulated aeroplane is in trim for steady level
flight, the entry into the angle of attack limiting function should be smooth, with
a constant longitudinal control position. This may be stick aft or stick neutral,
as required for decelerating flight in the specific aeroplane configuration. Care
must be taken that the test demonstrates the correct protection, as even slight
mishandling can cause other protections, such as the normal load factor or
minimum speed protections, to become active as well. It may be appropriate to
disable, if possible, other such limiting functions, in order that the angle of
attack protection can be demonstrated in isolation.
The test should continue for long enough to demonstrate recovery from, or
stabilisation at, the high incidence condition.
EXAMPLE
Figure 2h6-1 shows an example of the use of tolerance banding on the plots.
Although not considered universally useful, tolerance bands are appropriate in
this test because of the need to check the simulated system against the
aeroplane limitation.
Figure 2h6-1
Example of Simulator Test Results for Angle of Attack Protection Function
SECTION 3
MOTION SYSTEM
3a FREQUENCY RESPONSE
3b LEG BALANCE
3c TURN-AROUND CHECK
3d MOTION EFFECTS
3.1 INTRODUCTION
Figure 3-1
Flight Simulator Six-Axis Synergistic Motion System
During the flight of a real aeroplane, the movement of the aeroplane gives
rise to perceivable stimuli to the pilot's sensory organs. These stimuli,
referred to as motion cues, form an important source of information to the
pilot when performing the task of controlling the aeroplane. It is important
therefore to produce the relevant motion cues in a simulator to obtain the
overall fidelity necessary for using it as an adequate training tool.
Figure 3-2
Simulator Motion System Drive Block Diagram (Simplified)
The frequency response, leg balance and turn-around bump tests are
the traditional motion tests and have been in existence for many
years. More precise definitions of these items and the way in which they
are applied to flight simulators are given in the succeeding pages.
Usually a special program, which bypasses the main motion drive
software normally used for training purposes, is used to run these tests.
This is because the tests are not checking the types of cues to be
sensed on the flight deck (these are covered in the functions and
subjective test section of the QTG) but the performance of the motion
system itself (with its "payload").
To ensure that very small test input amplitudes could not be used,
guidance was provided in the form of suggested minimums for the test
inputs. Since this was a new concept of test, there was no experience to
draw upon in defining the thresholds set for these minimums. An initial
estimate was proposed with the view that it could be adjusted as
experience was gained. The intent was that these thresholds should
appear in guidance material rather than in the regulations, but the way the
document evolved precluded this, as the guidance material became
absorbed into the section entitled Test Requirements.
Isolating the aircraft characteristics from the motion system ensured that
there was no need for aircraft data to be used. This made the test generic
and independent of the aircraft simulated. Test results across platforms
could differ because of the gains set in the motion cueing software.
It was recognised that software cueing gains could be different for on-
ground and in-air cases to optimise the motion performance. For this
reason, two test conditions were specified: an on-ground test and an in-
air test. The shape and form of the injected test input was left to the
discretion of the operator, hence the wording "One test case on-ground: to
be determined by the Operator" and "One test case in-air: to be
determined by the Operator".
The concept of the test was to generate a footprint test during the initial
evaluation, which would produce the Master QTG result. Subsequent
recurrent evaluations would be compared with the MQTG result to highlight
any changes in both the hardware and software performance of the motion
cueing system. Specifically, the amplitude during recurrent testing must
remain within ±0.05g relative to that measured during the initial
qualification.
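As a sketch of how such a footprint comparison might be automated (the function and array names are illustrative assumptions, not taken from the Manual), a recurrent result could be checked against the MQTG trace as follows:

```python
def within_footprint(mqtg_accel, recurrent_accel, tol_g=0.05):
    """Check that every recurrent-test acceleration sample stays within
    +/-tol_g (in g) of the Master QTG footprint recorded at initial
    qualification.  Both traces are lists of accelerations in g, sampled
    at the same instants."""
    if len(mqtg_accel) != len(recurrent_accel):
        raise ValueError("traces must align sample-for-sample")
    worst = max(abs(r - m) for m, r in zip(mqtg_accel, recurrent_accel))
    return worst <= tol_g, worst

# Illustrative traces: the recurrent result drifts by at most 0.03 g,
# so it stays inside the +/-0.05 g band.
mqtg = [0.00, 0.10, 0.20, 0.10, 0.00]
recurrent = [0.01, 0.12, 0.23, 0.11, -0.01]
ok, worst = within_footprint(mqtg, recurrent)
```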
SECTION 3a
FREQUENCY RESPONSE
3a FREQUENCY RESPONSE
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
EVALUATION NOTES The purpose of this test is to determine the phase lag
and the attenuation experienced by the motion system
when subjected to an oscillatory input. It is normally
run at two separate frequencies: a low frequency
(typically 0.1 Hz) and a higher frequency (typically 0.5
Hz). At 0.1 Hz, the phase lag between the reference
input and the actuator position feedback will usually be
less than 10 degrees, whereas at 0.5 Hz the phase lag
will be more in the region of 30 degrees, though some
variation can be expected for different motion systems.
The attenuation will usually be greater than -1.0 dB for
both cases, though naturally both phase and gain are
frequency dependent.
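A minimal sketch of how phase lag and attenuation might be extracted from recorded drive and feedback signals, using a single-frequency Fourier correlation (the signal names and sampling assumptions are illustrative):

```python
import math

def phase_gain(reference, feedback, freq_hz, sample_rate_hz):
    """Estimate gain (dB) and phase (degrees) of 'feedback' relative to
    'reference' at the single drive frequency, by correlating each signal
    with sine and cosine at that frequency (a one-bin Fourier analysis).
    A negative phase means the feedback lags the reference."""
    def component(signal):
        n = len(signal)
        w = 2.0 * math.pi * freq_hz / sample_rate_hz
        re = sum(s * math.cos(w * i) for i, s in enumerate(signal)) * 2.0 / n
        im = sum(s * math.sin(w * i) for i, s in enumerate(signal)) * 2.0 / n
        return complex(re, -im)

    ratio = component(feedback) / component(reference)
    gain_db = 20.0 * math.log10(abs(ratio))
    phase_deg = math.degrees(math.atan2(ratio.imag, ratio.real))
    return gain_db, phase_deg

# Synthetic 0.5 Hz check: feedback attenuated to 0.9 of the reference and
# lagging it by 30 degrees; 1000 samples at 100 Hz gives 5 whole cycles.
fs, f, n = 100.0, 0.5, 1000
ref = [math.sin(2 * math.pi * f * i / fs) for i in range(n)]
fb = [0.9 * math.sin(2 * math.pi * f * i / fs - math.radians(30))
      for i in range(n)]
gain_db, phase_deg = phase_gain(ref, fb, f, fs)
```

Averaging over whole cycles of the drive keeps the correlation free of leakage, which is why the test input is held for several complete cycles.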
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
It is not recommended that this test be performed manually, because of the
special driving signals needed to run the test correctly and permit analysis of
the results. In any case, it is not the motion cues experienced by the flight
crew which are under scrutiny, but the physical response of the motion
system itself.
EXAMPLE
Figures 3a-1 and 3a-2 on the following pages show two differing sets of
simulator test results for Motion System Frequency Response. The first set is
in the form of a table which gives the phase and gain values for each jack after
running the test at a particular frequency. This table will be accompanied by
plots of the response of each of the actuators driven simultaneously by a
sinusoidal reference input. The second set is in the more conventional Bode
plot format, which shows the phase and gain over a range of frequencies
against a logarithmic frequency axis (1 Hz, 10 Hz, 100 Hz in this case). Note
that the response falls away rapidly above approximately 10 Hz.
Figure 3a-1
Frequency Response Results Example 1
Figure 3a-2a
Frequency Response Results Example 2
Figure 3a-2b
Frequency Response Results Example 2
SECTION 3b
LEG BALANCE
3b LEG BALANCE
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
It is not recommended that this test be performed manually, because of the
special driving signals needed to run the test correctly and permit analysis of
the results. In any case, it is not the motion cues experienced by the flight
crew which are under scrutiny, but the physical response of the motion
system itself.
EXAMPLE
See Figures 3b-1 and 3b-2.
Figure 3b-1
Example of Simulator Test Results for Motion System Cross-Drive (Leg Balance) at 0.5 Hz
Figure 3b-2
Example of Simulator Test Results for Motion System Cross-Drive (Leg Balance) at 3 Hz
SECTION 3c
TURN AROUND
3c TURN AROUND
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
It is not recommended that this test be performed manually, because of the
special driving signals needed to run the test correctly and permit analysis of
the results. In any case, it is not the motion cues experienced by the flight
crew which are under scrutiny, but the physical response of the motion
system itself.
EXAMPLE
The example below (Figure 3c-1) shows the response of two motion system
actuators (‘jacks’) when subjected to a sinusoidal reference drive input at 0.5 Hz.
Both plots exhibit reversal characteristics at peak amplitude, i.e. there are visible
non-linearities just as the direction of travel is beginning to reverse, though the
nature of the non-linearities is slightly different between the two actuators. The
deviation from the ‘standard’ sinusoidal path can be clearly seen in both of
these plots; however, all such deviations are within the ±0.01g tolerance limit
set by the simulator manufacturer.
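A sketch of how such a tolerance check might be automated, assuming the measured acceleration trace is time-aligned with the ideal sinusoidal reference drive (the names and numerical values are illustrative, not from any particular simulator):

```python
import math

def max_deviation_from_reference(measured_g, freq_hz, amplitude_g,
                                 sample_rate_hz, tol_g=0.01):
    """Compare a measured platform acceleration trace against the ideal
    sinusoidal reference drive and report the worst deviation, flagging
    whether it stays inside the tolerance band (here +/-0.01 g)."""
    deviations = [
        abs(m - amplitude_g *
            math.sin(2 * math.pi * freq_hz * i / sample_rate_hz))
        for i, m in enumerate(measured_g)
    ]
    worst = max(deviations)
    return worst, worst <= tol_g

# Synthetic 0.5 Hz heave trace at 100 Hz sampling: a small 0.005 g 'bump'
# is superimposed near each amplitude peak, where reversal occurs.
fs, f, a = 100.0, 0.5, 0.1
measured = [a * math.sin(2 * math.pi * f * i / fs)
            + (0.005 if i % 100 == 50 else 0.0)
            for i in range(400)]
worst, ok = max_deviation_from_reference(measured, f, a, fs)
```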
Figure 3c-1
Example of Simulator Test Results for Motion System Turn Around/Smoothness
(Actuator #1, upper plot & Actuator #5, lower plot)
(Platform heave motion based on a reference drive demand at 0.5 Hz)
SECTION 3d
MOTION EFFECTS
(These requirements are stated as being Validation Tests, but are specified
in the Functions and Subjective Testing section of the ICAO Manual)
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
EVALUATION NOTES This test is only required to check the motion buffets,
but the general ‘feel’ of the main motion system
simulation can also be subjectively examined at the
same time. Whilst some of the flight conditions and
aeroplane configurations may be set up using the
simulator autotest system, automatic running and
checking against tolerances is not possible, because
there is no requirement to meet tolerances and
because pilot input is required during this test. The
reader is referred to Volume II of this Handbook, as
these tests essentially fall into the category of
‘Functions & Subjective Tests’.
TOLERANCES NONE
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
The tests require the simulator to be flown in all the specified regimes.
SECTION 3e
MOTION REPEATABILITY
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
RECORDED PARAMETERS TIME
MOTION LINEAR ACCELERATION DEMANDS
MOTION ROTATIONAL ACCELERATION DEMANDS
MOTION ROTATIONAL VELOCITY DEMANDS
MOTION LINEAR ACCELEROMETER - X, Y, Z
MOTION JACK POSITIONS
EVALUATION NOTES This test is required to make sure that the motion
system is being properly maintained over the life of
the simulator. It is a test of both the motion
hardware and the motion cueing and filtering
software to monitor change. The test represents the
motion system's reaction to a series of demands to
the motion cueing software and is totally
independent of aircraft data. Pre-defined demands
are injected at the point where the aircraft centre of
gravity accelerations and velocities would be
transformed into the pilot reference point
accelerations and velocities prior to entering the
motion cueing software.
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
EXAMPLE
Figure 3e-1 (six diagrams on the following pages) shows the Master footprint
result for the Motion Repeatability Test - On Ground. Note the linear
acceleration inputs have been provided for X, Y and Z as independent inputs,
with responses both in terms of accelerations measured by accelerometer and
also with motion jack positions. Angular acceleration/rate inputs in roll, pitch
and yaw are then provided as independent inputs.
Figure 3e-1b
Figure 3e-1c
Figure 3e-1d
Figure 3e-1e
Figure 3e-1f
SECTION 3f
MOTION CUEING PERFORMANCE SIGNATURE
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
RECORDED PARAMETERS TIME
USUAL PARAMETERS FROM THE RECORDED PARAMETER LIST FOR THE ORIGINAL SOURCE TEST
LINEAR ACCELERATIONS AT PILOT REFERENCE POINT
ANGULAR ACCELERATIONS AT PILOT REFERENCE POINT
ANGULAR RATES AT PILOT REFERENCE POINT
MOTION ACTUATOR POSITIONS
MOTION PLATFORM LINEAR DISPLACEMENT AND ANGULAR POSITION
MOTION LINEAR ACCELEROMETER - X, Y, Z
EVALUATION NOTES This test is required to make sure that the motion
system cues in various flight regimes remain
consistent and that this cueing is being properly
maintained over the life of the simulator. No
tolerances are prescribed, but the tests are to check
the ability of the motion system to give repeatable
cues; the response should not change markedly
over time.
TOLERANCES NONE
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
EXAMPLE
The plots of Figure 3f-1 show test 1b(4), Normal Takeoff, with the additional
motion-related plots, to provide a general example of a Motion Cueing
Performance Signature Test. Please note that the full complement of plots for
the original 1b(4) source test has not been provided; this example places
emphasis on the additional motion-related plots.
Figure 3f-1a
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1b
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1c
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1d
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1e
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1f
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1g
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1h
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1i
Motion Cueing Performance Signature - Normal Takeoff
Figure 3f-1j
Motion Cueing Performance Signature - Normal Takeoff
SECTION 3g
3G.1 INTRODUCTION
The human pilot is sensitive to both the amplitude and the frequency of
vibration. Research has determined that humans are particularly
sensitive to vibratory accelerations occurring in the frequency range
from 1-10 Hz. Human physiological sensitivity to vibration is dependent
on body position, method of support relative to the direction of
oscillation, and what portions of the body resonate. The whole body
natural frequency for a human when seated occurs at 3-5 Hz, and
disorientation (vertigo) can occur at about 1 Hz. The maximum motion
of the head relative to the seat occurs at 3-6 Hz. If vibration levels
become severe, breathing becomes difficult for structural responses
within the range 1-4 Hz, and chest pains result from 3-10 Hz
oscillations.
the simulator data, and allowances made when comparing the resulting
APSD. The following sub-sections provide a brief introduction to how
the major control parameters affect the resulting APSD.
3G.5.1 Windowing
3G.5.2 Bandwidth
The following plots are for exactly the same simulator recording
processed so as to generate APSD plots with different bandwidths.
Figure 3g-2a
APSD Plot, Processed using 0.25 Hz Bandwidth
Figure 3g-2b
APSD Plot, Processed using 2.0 Hz Bandwidth
Clearly the increased bandwidth has smoothed the peaks of the trace
and reduced the apparent magnitude. If a direct comparison is to be
made between the aircraft trace (dotted) and the simulator result,
both should be processed to the same bandwidth.
transformation stages are repeated several times and the results
averaged. The aircraft trace will usually contain more random variation
than the simulator result, and therefore benefits more from averaging.
Using a single pass analysis, when the windowing function gain (Figure
3g-3a, dotted) is multiplied with the input signal, a large percentage of
the signal is processed at less than full gain.
Figure 3g-3a
Single Pass Analysis
Figure 3g-3b
Multiple Pass Analysis
However, if too many averages are performed upon a short duration
signal, then the benefits of the windowing function are lost, as the fade
in and out of the signal by the windowing function becomes too rapid.
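The segment, window and average procedure described above is essentially Welch's method of PSD estimation. A minimal numpy sketch follows; the segment length, overlap and window choice are illustrative assumptions, not prescribed values:

```python
import numpy as np

def welch_psd(x, fs, nperseg=256, noverlap=None):
    """Hann-windowed, overlapped, averaged PSD estimate (Welch's method).
    The analysis bandwidth is roughly fs/nperseg: longer segments give a
    finer bandwidth but fewer averages; shorter segments give the smoother,
    lower-peaked trace discussed in the Bandwidth sub-section."""
    if noverlap is None:
        noverlap = nperseg // 2
    step = nperseg - noverlap
    window = np.hanning(nperseg)
    scale = fs * np.sum(window ** 2)                 # density normalisation
    passes = []
    for start in range(0, len(x) - nperseg + 1, step):
        seg = x[start:start + nperseg] * window      # fade each pass in/out
        spec = np.abs(np.fft.rfft(seg)) ** 2 / scale # one-sided periodogram
        spec[1:-1] *= 2.0                            # fold negative freqs
        passes.append(spec)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs, np.mean(passes, axis=0)            # average the passes

# Synthetic buffet-like signal: a 5 Hz tone buried in noise, 30 s at 100 Hz.
rng = np.random.default_rng(0)
fs = 100.0
t = np.arange(0, 30.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5.0 * t) + 0.5 * rng.standard_normal(t.size)
freqs, psd = welch_psd(x, fs)
peak_hz = freqs[np.argmax(psd)]
```

Processing both the aircraft and the simulator traces with the same segment length keeps their analysis bandwidths comparable, as the text recommends.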
3G.5.4 Smoothing
Usually, two types of final data plots are prepared for most of the
operating conditions:
2. Power spectral density plots are created for each axis of acceleration.
These plots display frequency from 1.0 to 20.0 Hz as the abscissa and
acceleration power spectral density, in g²/Hz, as the ordinate.
For each flight case checked, a set of time histories for X, Y and Z axes
should be provided from the simulator which allows side-by-side viewing
with similar plots from the aeroplane.
For each axis, the simulator vibration system will usually provide more
than one noise source of slightly different frequency content and several
periodic frequency drives. As a minimum, these will probably be
low-frequency drive (0-4 Hz), mid-frequency drive (4-10 Hz) and
high-frequency drive (10-30 Hz), though there may be more than these
for some simulators or from some simulator manufacturers. The
predominant frequency in each of these frequency ranges will have
been chosen from the flight test data and programmed into the vibration
The characteristics of the motion system may well be such that 'cross
coupling' of the effects is evident, particularly at medium to high
frequencies. The effect is such that, when driving high levels of vibration in
one axis (e.g. the lateral axis, 'Y'), a moderate amount of vibration in
another axis (e.g. the vertical axis, 'Z') will be induced. This will be
noticed predominantly on the spectral plots. Given the nature of a flight
simulator and its motion system this may be unavoidable and should
therefore be taken into consideration when reviewing the results.
Note: For some aeroplanes, there may be no data available for some of
the conditions. Under these circumstances no objective comparison can
be made, but simulator plots can still be useful as a reference point for
future evaluations, once the effect has been subjectively assessed as
acceptable.
There may well be occasions when pilots are not happy with the
amplitudes at a particular QTG check point. Assuming that the
conditions and scalings have been checked objectively against the time
histories and power spectral density plots, the best that can be done is
to tune the drives high or low as far as permitted within the limitations of
achieving a reasonable match.
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
TOLERANCES NONE
))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
Position the simulated aeroplane at the prescribed point within the appropriate
motion buffet regime. A typical period for which this condition would have to be
maintained is 20 to 30 seconds. The motion system platform vibrations should
be plotted in all three linear axes and this will almost certainly entail the use of
a spectral analyser. Understanding exactly how the analyser functions is not
necessary for the evaluation of the test results. Results should be hard-copied
for comparison with aeroplane data. Ideally the scales should be the same for
both aeroplane and simulator data sets.
EXAMPLE
Several sets of test results for both time histories and power spectral density
(in two axes) are shown on the following pages in Figures 3g-4 to 3g-9
inclusive. Note that these are only representative, and should not be taken as
valid for any aeroplane, either for flap buffet or any other flight condition.
What they are intended to illustrate is that the evaluation of such results must
primarily look at the trends of the recordings, along with certain of the ‘spikes’
that should be present at or near the frequencies at which they appear in the
aeroplane data.
Figure 3g-4
Example of Simulator Test Results for Flap Buffet Amplitude Time History (Y- and Z-
Axes Only)
Figure 3g-5
Example of Aeroplane Manufacturer’s Data for Flap Buffet - Amplitude Time History
Figure 3g-6
Example of Simulator Test Results for Flap Buffet - PSD Plots (Y- and Z-Axes Only)
Figure 3g-7
Example of Aeroplane Manufacturer’s Data for Flap Buffet - PSD Plots
Figure 3g-8
Example of PSD Plot Obtained from Stand-alone Test Equipment
Figure 3g-9
Example of Time History Plot Obtained from Stand-alone Test Equipment
SECTION 4
VISUAL SYSTEM
Most, if not all, advanced flight simulators are built with a visual system
which employs computer generated imagery. The testing of these systems
entails the use of methods and techniques with which a simulator training
captain or evaluation pilot may not be very familiar. In particular, technical
terms such as foot-lamberts (or candelas per square metre) are not
encountered on an everyday basis and as a result may well be alien to him
- at least initially. Nevertheless, an understanding of these and other terms
and the way in which they are applied to the evaluation of flight simulator
visual systems is fundamental if the evaluator is to perform his function
properly.
The requirements on modern visual systems, and especially those for which
qualification is being sought, are stringent and well defined. For a lower
level qualified device, the visual must provide dusk and night scenes,
whereas for the higher level daylight scenes are necessary. The overall
brightness capability of any system must be such that realistic simulation of
aeroplane landing lights and lights on the ground, as well as significant
topographical features, is provided. Naturally, the field of view from each
pilot station and the portrayal of the general environment must be fully
compatible with the aeroplane being simulated, with special emphasis being
placed on the visible ground segment on approach.
Visual system computers, whilst powerful in their own right, are normally run
as slaves to the simulator host computer via some kind of electronic
communications link (usually an Ethernet). This hardware set-up works
well, but has the obvious limitation that a delay is introduced between the
response of the simulated aeroplane and the response of the visual system
image generator. Limits have been set on the maximum allowable delay,
but in all flight regimes this delay should be small enough to remain
unnoticed by the flight crew.
SECTION 4a
4A.1 INTRODUCTION
The test technique used should be designed to measure only those delays
introduced by the purely simulator aspects, and not the response of the
aerodynamic model or of control features which would also be found on the
aeroplane itself. For example, there will always be a small time delay in the
aeroplane between the control column being displaced and the movement
of the elevator surface and also between the movement of the elevator
surface and a resultant change in pitch angle. It is not the purpose of the
latency requirement contained in the ICAO Manual to use complicated flight
test equipment to measure these delays on the aeroplane so that they can
be accurately reproduced on the simulator. The response of the simulator
model compared to the aeroplane is checked in the relevant QTG validation
tests.
Figure 4a-1
Example of Simulator System Response (Latency) Results
4A.3 TRANSPORT DELAY
The transport delay is defined as the total simulator system processing time
required for an input signal from a pilot primary flight control until motion
system, visual system or instrument response. It is the overall time delay
incurred from signal input until output response. It does not include the
characteristic delay of the aeroplane simulated.
The results obtained by this method of testing for simulator response times
are more clearly defined than those from conventional latency tests and can
be measured more easily, with less chance of ambiguity. The method gives
a direct measurement of latency without having to subtract the aeroplane
response, and as such no aeroplane flight test data are required. The tests
ensure that all the computing elements in the critical path are executed in
the optimum order.
The following method is typical of that used to perform the transport delay
checks:
With the software executing normally, a force demand is injected into the
primary control causing it to move. The control position signal passes along
the normal controls path to the host computer where the change is detected.
This is used to initialise a discrete index counter which is incremented by
each of the software modules in turn as the critical path is executed. The
index in a given module can only be updated if it contains the index number
of the module designated to run immediately before it. This allows the
software path to be traced with a signal which is not degraded by the
simulated model and checks that each element of the critical path is
executed in the correct order. The time overhead introduced by the
incrementing of the discrete index is negligible.
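The index-counter scheme might be sketched as follows; the module names and their order are purely illustrative, not taken from any actual simulator software:

```python
class CriticalPathTracer:
    """Discrete index counter: each module in the critical path may update
    the index only if it currently holds the index of the module designated
    to run immediately before it, so any out-of-order execution is caught."""

    def __init__(self, expected_order):
        self.expected_order = expected_order  # module names, in required order
        self.index = 0

    def visit(self, module_name):
        expected = self.expected_order[self.index]
        if module_name != expected:
            raise RuntimeError(
                f"critical path broken: ran {module_name}, expected {expected}")
        self.index += 1   # incrementing the index is a negligible overhead

    def complete(self):
        return self.index == len(self.expected_order)

# Illustrative critical path from control input through to the outputs.
path = ["controls_io", "flight_controls", "flight_model",
        "motion_cueing", "visual_interface"]
tracer = CriticalPathTracer(path)
for module in path:       # modules executing in the correct order
    tracer.visit(module)
```

Any module running out of sequence leaves the index unchanged and raises an error, which is exactly how the scheme verifies the execution order of the critical path.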
The signals then pass through the normal computing path to generate the
visual picture, motion deflection and instrumentation response. A recording
device is used to plot the deflections and the time delay between the onset
of the control deflection and the change in state of the visual, motion and
instruments gives a direct measurement of the transport delays in the
system.
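A sketch of how the recorded time histories might be reduced to transport delays by detecting the onset of each channel (the threshold value, channel names and timings are illustrative assumptions):

```python
def first_response_time(times, signal, threshold):
    """Return the time at which 'signal' first departs from its initial
    value by more than 'threshold' (the onset of a response)."""
    baseline = signal[0]
    for t, s in zip(times, signal):
        if abs(s - baseline) > threshold:
            return t
    return None

def transport_delays(times, control, responses, threshold=0.05):
    """Delay between control-deflection onset and the onset of each
    response channel (visual, motion, instruments)."""
    t0 = first_response_time(times, control, threshold)
    return {name: first_response_time(times, sig, threshold) - t0
            for name, sig in responses.items()}

# Illustrative step responses sampled at 1 kHz: the control moves at
# 1.02 s, the visual responds 80 ms later and the motion 120 ms later.
dt, n = 0.001, 2000
times = [i * dt for i in range(n)]
def step(onset_index):
    return [0.0 if i < onset_index else 1.0 for i in range(n)]
delays = transport_delays(times, step(1020),
                          {"visual": step(1100), "motion": step(1140)})
```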
Figure 4a-2
Example of Simulator System Response (Transport Delay) Results
It should be borne in mind that the example given has been "styled" to show
the type of results which would be expected. The exact appearance of the
results may well differ among the different simulator manufacturers. The
important aspect is the time delay between the movement of the flight
controls and the corresponding movement of the other signals. The
direction of movement displayed on this type of plot is not usually of any
significance and will depend on various factors such as the type of recording
device used for these tests and the sign conventions used for the plotted
parameters.
There are some inherent difficulties in the acquisition of aeroplane flight test
data for information which needs to be measured over a very short duration
(i.e. less than 1 second in this particular case). Therefore most modern
simulators use only the transport delay method for accomplishing these
tests. Note that the reason only 3 tests are required for the transport delay
method, versus 9 for the conventional latency method, is that the differences
in transport delay between the takeoff, cruise and approach or landing flight
conditions are assumed to be negligible. This assumption is broadly correct,
in that the critical path computations are essentially the same whichever axis
is being tested.
There has been a move away from use of external plotting equipment,
largely because of cost and availability/maintainability, but many older
simulators do still use such methods. They are however much more labour
intensive and present results that are generally more difficult to interpret
than those generated through the main automatic test systems. With
experience though, this does not present a problem.
The above has been merely a cursory treatment of the subject of simulator
transport delay methodology. For further information the reader is referred
to Appendix 5 to ACJ No.1 to JAR-STD 1A.030 (part of Reference 20) which
covers what is and is not acceptable in considerably more detail.
)))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
)))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))))
MANUAL TESTING
Manual testing for this requirement is possible, but because the control input
must be carefully coordinated with the measurement of the simulator
responses, and because what occurs is almost always too fast for a pilot to
observe the effects on the simulated aeroplane, there is not usually much to
be gained from such an approach. Nevertheless, with good coordination
between the pilot operating the controls and the person or team operating
the recording equipment, reasonable results can be achieved, though
usually not at the first attempt.
EXAMPLE
The plots shown in Figures 4a-1a and 4a-1b form half of a set of results
obtained using the transport delay method. These first two sets of data show
nothing untoward. The control movement was at approximately 1.02 seconds
on the timescale, and so to meet the tolerance the responses must all occur
before 1.17 seconds. The instrument and visual responses are both within this
limit, but reference to Figure 4a-1b below shows that the motion system did not
exhibit any movement at all. The reason for this could be obvious (e.g. the
motion platform was not engaged when the test was run!), or it may indicate a
transducer failure or other problem associated with the hardware or
electronics. It is unlikely to be a software fault, because the driving signal
clearly worked for the visual and instrument.
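The arithmetic behind such a check is simple enough to sketch in Python; the readings below are assumed values of the kind taken from plots like Figures 4a-1a/b, not data from the figures themselves:

```python
# Illustrative sketch of the transport delay tolerance check; the input and
# response times are invented readings, not taken from the example plots.
TOLERANCE_S = 0.150  # maximum allowable transport delay

def transport_delay(input_time_s, response_time_s):
    """Delay between the control input and the onset of a cue response."""
    return response_time_s - input_time_s

def within_tolerance(input_time_s, response_time_s, tol=TOLERANCE_S):
    return transport_delay(input_time_s, response_time_s) <= tol

# Control input at 1.02 s; assumed cue response times:
for system, t_resp in {"instrument": 1.10, "visual": 1.13}.items():
    delay_ms = transport_delay(1.02, t_resp) * 1000
    verdict = "PASS" if within_tolerance(1.02, t_resp) else "FAIL"
    print(f"{system}: {delay_ms:.0f} ms ({verdict})")
```

A missing trace, as with the motion response above, would of course produce no reading at all rather than a failed tolerance.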
Figure 4a-1a
Example of Simulator Test Results for Transport Delay (Yaw) Part 1
Figure 4a-1b
Example of Simulator Test Results for Transport Delay (Yaw) Part 2
SECTION 4b
EVALUATION NOTES That the visual system field of view is adequate
can be verified by use of a theodolite. Set-up of
such equipment can be elaborate, and the
measurement will generally already have been
performed during initial system test. A visual check of
the capability can be performed more quickly;
see the 'Manual Testing' section below.
MANUAL TESTING
A typical procedure for this test is to provide a grid pattern of lines or light
points subtending 5° per line/point, allowing the number of squares to
be counted to demonstrate the required field of view. The squares should
appear square, not rectangular. If there is doubt about the angle subtended, a
theodolite should be used to prove the exact field of view.
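The counting arithmetic can be made explicit with a minimal sketch; the square counts used here are invented for illustration:

```python
# Each grid square subtends 5 degrees, so counting squares across the
# display gives the field of view directly. The counts are invented.
DEG_PER_SQUARE = 5.0

def field_of_view(n_squares_horizontal, n_squares_vertical):
    """Horizontal and vertical field of view in degrees."""
    return (n_squares_horizontal * DEG_PER_SQUARE,
            n_squares_vertical * DEG_PER_SQUARE)

h_fov, v_fov = field_of_view(15, 8)
print(h_fov, v_fov)  # 75.0 40.0
```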
EXAMPLE
See below.
Figure 4b1-1
Example of Spherical Grid Test Pattern
(Front Channel with Partial Side Channels)
DEMONSTRATION Using a test pattern which fills the entire visual scene
(all channels) with a matrix of black and white 5°
squares or a 5° grid, evaluate the display system
geometry as presented.
MANUAL TESTING
The procedure for this test is to display a 5° grid to show the angular spacing.
Most systems are provided with a fixed grid projected from a slide or similar
device. If the visual system image can be shown to align within the tolerances
specified, it may be assumed that the requirements are met, based on the initial
acceptance of the visual system. Records of these checks should be available.
In the event of any doubt about the observations or original checks, a
theodolite should be used to demonstrate compliance.
EXAMPLE
See below.
(Example angular measurements: 5° ± 1° and 10° ± 1.5°)
Figure 4b2-1
Example of Spherical Grid Test Pattern with example angular measurement
(Front Channel with Partial Side Channels)
MANUAL TESTING
The evaluator should be situated in the cockpit with a 1° photometer and the
visual demonstration model or test pattern loaded. The test pattern may be
raster drawn and should fill the entire visual scene (at least three channels),
consisting of a matrix of 5° squares with a white square in the centre of each
channel.
Measurement shall be made on the centre bright square for each channel
using the photometer. The minimum brightness of this square shall be 7 cd/m2
(2 foot-lamberts). Any adjacent dark squares should then be measured and the
contrast ratio found by dividing the bright square value by the dark square
value, the minimum acceptable ratio being 5:1.
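A minimal sketch of this calculation, with assumed photometer readings:

```python
# Surface contrast check as described above; the luminance readings used
# in the printed examples are invented.
def contrast_ratio(bright_cd_m2, dark_cd_m2):
    """Bright-square to adjacent dark-square luminance ratio."""
    return bright_cd_m2 / dark_cd_m2

def surface_contrast_passes(bright_cd_m2, dark_cd_m2,
                            min_brightness=7.0, min_ratio=5.0):
    """Both the 7 cd/m2 minimum brightness and the 5:1 ratio must be met."""
    return (bright_cd_m2 >= min_brightness and
            contrast_ratio(bright_cd_m2, dark_cd_m2) >= min_ratio)

print(surface_contrast_passes(8.0, 1.5))  # True  (8/1.5 = 5.33:1)
print(surface_contrast_passes(8.0, 2.0))  # False (4:1 is below 5:1)
```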
EXAMPLE
(Pattern extent: −35° to +35° horizontally, −25° to +20° vertically)
Figure 4b3-1
Example of Surface Contrast Checkerboard Pattern
MANUAL TESTING
The evaluator should be situated in the cockpit with a 1° photometer and
the visual demonstration model or test pattern loaded. The test pattern may be
raster drawn and should fill the entire visual scene (at least three channels),
consisting of a matrix of 5° squares with a white square in the centre of each
channel. A highlight is superimposed on the centre white square of each
channel and its brightness measured using the 1° photometer. Note that
the use of lightpoints is not acceptable, but it is acceptable to use calligraphic
capabilities to enhance raster brightness. (The highlight square should appear
of reasonably even brightness.) Verify that the visual system display brightness,
as measured by the photometer, is at least 20 cd/m2 (6 foot-lamberts) on the display and
17.5 cd/m2 (5 foot-lamberts) at an approach plate positioned at the pilot's knee.
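A sketch of this check, reading the two quoted figures as minima (an interpretation) and using invented photometer values:

```python
# Highlight brightness check; treating 20 cd/m2 (display) and 17.5 cd/m2
# (approach plate) as minimum acceptable readings is an assumption here.
def highlight_brightness_passes(display_cd_m2, plate_cd_m2,
                                min_display=20.0, min_plate=17.5):
    return display_cd_m2 >= min_display and plate_cd_m2 >= min_plate

print(highlight_brightness_passes(22.4, 18.1))  # True
print(highlight_brightness_passes(22.4, 16.0))  # False
```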
EXAMPLE
(Pattern extent: −35° to +35° horizontally, −25° to +20° vertically)
Figure 4b4-1
Example of Highlight Brightness Checkerboard Pattern
MANUAL TESTING
The preferred method of demonstrating that the visual system meets this
requirement is described above. As always, the cockpit lighting should be
switched off and the eyes allowed to adjust before attempting to perform this
test.
EXAMPLE
Figure 4b5-1
Example of Surface Resolution Test Pattern
MANUAL TESTING
EXAMPLE
See below.
Figure 4b6-1
Example of Lightpoint Size Test Pattern
(Note: For the sake of clarity, not all lights are shown)
MANUAL TESTING
The evaluator should be situated in the cockpit with a 1o photometer and the
visual demonstration model or test pattern loaded. The test pattern should
EXAMPLE
The light array measures 20 ft lamberts and the background measures 0.5 ft
lamberts; the contrast ratio is therefore 20/0.5 = 40:1.
(40 × 40 dot array on a black surface)
Figure 4b7-1
Example of Lightpoint Array Test Pattern
SECTION 4c
EVALUATION NOTES This test will typically be automatically set up with the
simulated aeroplane trimmed for landing at 100 feet
above the runway height and on the glideslope. The
manufacturer will have produced a chart showing
calculations which set out what runway lights are
visible from that point with a specified Runway Visual
Range (RVR) (either 1000 or 1200 feet). Clearly, the
aeroplane result is dependent on both the aeroplane
geometry and pitch angle when trimmed for a final
approach, and the pilot or other evaluator performing
this test must be properly seated in the cockpit. The
furthest lights may only just be visible, so the
simulator cab lighting should be switched off and
time taken to allow the eyes to adjust if necessary.
MANUAL TESTING
Select a scene that has a fully marked and lighted runway and collect the
following data for that runway:
In addition, use the aeroplane manufacturer data for the following parameters:
Select the reduced visibility conditions, either 1000 or 1200ft RVR (depending
on the regulatory authority) and trim the aeroplane for landing on the glideslope
at a radio altitude of 100ft. Then use the formulas at the end of this section to
compute these positions:
Graphically show the aeroplane position, near and far ends of the VGS on a
map of the selected runway that shows the approach, centreline, edge and
runway threshold lights (see Figure 4c-3 for an example).
Either fly the simulated aeroplane on the glideslope and freeze at 100ft radio
altitude or select the pre-computed position. Verify that the simulated
aeroplane's pitch angle is correct, since the near end of the VGS is very
sensitive to pitch attitude.
From the pilot eyepoint observe the near and far ends of the VGS along the
runway extended centreline. Compare the VGS to the plot. The 20%
tolerance of the VGS is applied at the far end of the VGS. If the calculations
show that the runway threshold is visible when in the same position in the
aeroplane, then the threshold in the simulator visual system must also be
visible. Perform the test in the day, twilight and night modes as necessary to
confirm the results. No instrumentation is required to perform this test.
CALCULATIONS
Given the waterline and station of the Main Gear (WMG, SMG) and the Pilot's
Eyepoint (WEP, SEP) we can find the horizontal and vertical distances of the
Pilot's Eyepoint from the Main Gear when the aeroplane is level (EP_MG_Xo,
EP_MG_Zo):
Similarly the distances of the Glideslope Antenna from the Main Gear
(GA_MG_Xo, GA_MG_Zo) can be easily found.
With the aeroplane pitched through an angle θ, these level-attitude distances
(Xo, Zo) become:
Xθ = Xo cos θ − Zo sin θ
Zθ = Xo sin θ + Zo cos θ
Figure 4c-1
Visual Ground Segment Horizontal and Vertical Distances - Pilot/Glideslope
Antenna/Main Gear
Since the radar altimeter reading is the altitude of the Main Gear, AMG, then the
altitude of the Pilot's Eyepoint, AEP, and the Glideslope Antenna, AGA, are:
Figure 4c-2
Visual Ground Segment Horizontal and Vertical Distances - Aeroplane to Ground
Refer to Figure 4c-2. With the aeroplane on a glideslope which has an angle of
γ, the horizontal distance from the glideslope antenna to the point at which the
glideslope intersects the runway centreline, DXMIT_GA_HZ, is:
DXMIT_GA_HZ = AGA/tan(γ)
Using the published Threshold Crossing Height (TCH), the distance from the
glideslope transmitter to the threshold, DXMIT_TH, is given by:
DXMIT_TH = TCH/tan(γ)
With the transmitter at a distance DXMIT_TH from the threshold of the runway, the
horizontal distance from the Pilot's Eyepoint to the runway, DR is given by:
With a Runway Visual Range of RVR feet, treated as a slant range from the
pilot's eyepoint, the horizontal distance that will be visible on approach is:
VDHZ = sqrt(RVR² − AEP²)
The distance down the runway that is visible for a given RVR is:
DV = VDHZ - DR
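The chain of calculations so far can be sketched in Python. The names follow the text, but the slant-range treatment of RVR, the default 3° glideslope and all the input values are assumptions for illustration; DR is supplied as an input because its derivation is not reproduced here:

```python
import math

def visible_segment(a_ga_ft, a_ep_ft, tch_ft, rvr_ft, d_r_ft,
                    glideslope_deg=3.0):
    """Evaluate the visual ground segment relations given in the text.

    a_ga_ft : glideslope antenna altitude, AGA
    a_ep_ft : pilot eyepoint altitude, AEP
    tch_ft  : published Threshold Crossing Height
    rvr_ft  : Runway Visual Range, treated as a slant range (assumption)
    d_r_ft  : horizontal distance from pilot eyepoint to runway, DR
    """
    g = math.radians(glideslope_deg)
    dxmit_ga_hz = a_ga_ft / math.tan(g)        # antenna to glidepath ground point
    dxmit_th = tch_ft / math.tan(g)            # transmitter to threshold
    vd_hz = math.sqrt(rvr_ft**2 - a_ep_ft**2)  # horizontal visible distance
    d_v = vd_hz - d_r_ft                       # visible runway segment length
    return dxmit_ga_hz, dxmit_th, vd_hz, d_v

# Illustrative (invented) values only:
print(visible_segment(100.0, 120.7, 50.0, 1200.0, 900.0))
```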
Given a cut-off angle of α, measured with a zero pitch, the closest distance that
the pilot can see, DMIN, is:
Figure 4c-3
Visual Segment Diagram Example 1
Visual Segment for a Slant Range Visibility of 1200 feet and a Radio Altitude of 100
feet. The visual segment is 875 ft long and starts 826 ft before the threshold.
Aircraft type: A797-83
Aircraft pitch: 2.35 deg
Cockpit Cutoff Angle @ 100ft: 20.79 deg
Calibrated Airspeed: 128.0 kts
Pilot's eye height above ground: 120.7 ft
Figure 4c-4
Visual Ground Segment Diagram Example 2
SECTION 5
SOUND SYSTEMS
5a JET AEROPLANES
5b PROPELLER AEROPLANES
5c SPECIAL CASES
5e FREQUENCY RESPONSE
The purpose of the simulator sound system tests is to confirm that what is
being heard by the flight crew in the cockpit correlates well with what they
would hear in the aeroplane. Clearly, the requirements here are such that
abnormal sounds (eg compressor stall) must be simulated as well as
normal ones such as engine whine and aerodynamic noise. A further
complication is that some of the sounds heard will be extremely transient
in nature (eg engine seizure or landing gear uplock) and will not always be
supported by quality data.
For this level of qualification, the requirement is that the sounds should be
demonstrated as being representative of those heard in the aeroplane.
Since objective data is not required, the sounds are subjectively evaluated
and accepted by experienced persons. See Volume 2 of this Handbook for
more information on Functions and Subjective Tests involving sound cues.
There is, therefore, still some subjective content, in that it is not really
possible to devise an objective test to demonstrate the coordination aspect,
but the major workload with these tests consists of setting up and using a
Frequency Response Analyser (FRA) to determine the simulator
compliance. All tests in this section should be presented using an
unweighted 1/3-octave band format from band 17 to 42 (50 Hz to 16 kHz).
A minimum 20 second average should be taken at the location
corresponding to the aeroplane data set. This is usually, though not
necessarily, close to the position of the captain’s right ear. Obviously, the
aeroplane and simulator results should be produced using comparable methods.
The ICAO Manual requires objective testing for the highest qualification
level to be carried out to confirm that the simulation of sounds in various
phases of ground and flight operations correspond well with the data
obtained during those manoeuvres in the aeroplane. The main item of test
equipment used for these tests is a sound analyser, which typically will be
a hand-held device capable of various types of sound measurement, but
including 1/1- and 1/3-octave frequency analysis and broadband statistical
distributions. Often the data collected is then transferred to a PC where
data can be displayed using off-the-shelf spreadsheet software and hard
copies made for inclusion in the QTG.
The unit has many modes of operation. For the purposes of obtaining
simulator sound test results the analyser should be used in an
instantaneous mode, which gives an immediate readout of the sound
pressure level in the simulator. This gives a good indication of the
stability of the sound pressure levels should there be any doubt.
Once all the readings have been recorded and stored they may be
transferred to a PC (using RS232, etc.) for display and hard copy.
The way in which a particular piece of equipment is used for these tests will
vary, but some generic guidance, based on experience with such a device,
is given below.
2. Ensure the crew seats are in a normal flying configuration, that all
doors to the flight deck and all air vents are closed.
3. Set the sound level to maximum at the simulator IOS. Leave Flight
Freeze on for the moment (or a similar control which turns the
simulator sound off).
5. With the analyser now calibrated, the background sound level in the
cockpit needs to be checked to make sure that it has no impact on
the sound levels when the simulator sound system is switched on.
7. The main sound tests can now be run. Set the simulated aircraft up
in each of the specified configurations and initiate the recordings.
During recordings, all flight deck occupants must maintain strict
silence.
8. When the analyser has completed each set of readings, store the
data and proceed to the next test. Once all tests are complete,
transfer the sets of data to the PC and convert the data to
spreadsheet format and then use the appropriate method of plotting
the results on a printer connected to the PC.
The evaluation of the results must compare simulator SPL results against
flight test data, which will have been supplied by the aeroplane
manufacturer or data provider and processed in the same way as the data
is gathered on the simulator.
The tolerances suggested in the ICAO Manual are based on the ability of
the human ear to perceive sounds - especially in the 1/3 Octave band that
is most prominent to human hearing, and this has attracted a tolerance
level of 5 decibels, based on the standard dBA correction curve that is
used to calculate human noise perception in any noisy environment.
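A band-by-band comparison of this kind can be sketched as follows; the base-2 band-numbering convention (band 30 = 1 kHz) is an assumption consistent with the quoted band 17 to 42 range, and the SPL values are invented:

```python
# 1/3-octave comparison against flight test data with the 5 dB tolerance
# mentioned above. Band numbering and SPL values are illustrative.
def band_centre_hz(band_no):
    """Nominal centre frequency of a 1/3-octave band (band 30 = 1 kHz)."""
    return 1000.0 * 2 ** ((band_no - 30) / 3.0)

def out_of_tolerance(sim_db, aircraft_db, tol_db=5.0):
    """Return the bands where the simulator SPL deviates by more than tol_db."""
    return [band for band, ref in sorted(aircraft_db.items())
            if abs(sim_db[band] - ref) > tol_db]

sim = {20: 68.0, 30: 74.0, 40: 52.0}       # invented simulator readings (dB)
aircraft = {20: 65.0, 30: 70.5, 40: 45.0}  # invented aircraft data (dB)
for band in out_of_tolerance(sim, aircraft):
    print(f"band {band} ({band_centre_hz(band):.0f} Hz) exceeds tolerance")
```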
Prior to collecting the data for the specified simulator tests, some
preliminary work is required to ascertain the ambient noise levels in the
simulator flight deck.
For readings taken in a quiet room with a low noise fan running in the
background, note the steady reduction in level between 1 kHz and 5 kHz
(Figure 5-1), the latter value being approximately 25.5 dB. In contrast, the
one example of aeroplane data gathered with the aeroplane parked and all
systems switched off has a broadly similar trend, but the reduction is much
greater, with the final 5 kHz value being approximately 11 dB, which is an
extremely low value for a reading taken with any kind of noise (such as a
fan) present. In the simulator, it is therefore to be expected that the higher
frequencies will show increasingly higher SPL values than does the aircraft
data.
Two tests were then conducted on the simulator itself, firstly with the
simulator air conditioning off, then with it on. Both tests were with the sound
system turned off, but with the simulation otherwise running normally (i.e.
avionics/electrical equipment internal cooling noises were present).
Figure 5-1
Example of Simulator Calibration Results - Quiet Room with Low Noise Fan
The results, given below, show that the ambient simulator state, even with
the air conditioning off, gives a greater SPL than is indicated by the
aeroplane data for an equivalent condition. It is clear therefore that some
allowance needs to be made to adjust the readings taken on the simulator
to enable a fair comparison to be made with the aircraft data.
Figure 5-2 helps to quantify the effect of the simulator air conditioning
system on the sounds heard in the cabin when the simulator sound system
is switched off. Figure 5-3 shows directly the effect of the simulator cab air
conditioning when applied to recorded results for the engines at idle with
the sound system switched on.
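One common way of making such an allowance (an assumption here, not a method prescribed by the text) is to subtract the ambient energy from the measured level before converting back to decibels:

```python
import math

# Energetic subtraction of a known background level from a combined reading.
def remove_background_db(measured_db, ambient_db):
    """Level of the wanted sound alone, given combined and ambient levels."""
    residual = 10 ** (measured_db / 10) - 10 ** (ambient_db / 10)
    if residual <= 0:
        raise ValueError("ambient level at or above the measured level")
    return 10 * math.log10(residual)

# e.g. a 70 dB reading taken with 60 dB of air-conditioning noise present
print(round(remove_background_db(70.0, 60.0), 2))  # 69.54
```

Note that when the ambient level approaches the measured level the adjustment becomes very sensitive, which is why a quiet cab matters most at the higher frequencies.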
Figure 5-2
Example of Simulator Calibration Results - Air Conditioning On versus Off
Figure 5-3
Example of Simulator Calibration Results - Adjustment for Air Conditioning
1. That for most configurations the simulator results will be affected by
the simulator cab air conditioning, and that, at higher frequencies,
there will probably be a greater deviation.
SECTION 5a
JET AEROPLANES
EXAMPLE
Figure 5a-1 shows a test result for the gear down, landing flap condition which
has been transferred to a PC and plotted from within a spreadsheet program.
Figure 5a-1
Example of Simulator Test Results for Landing Condition Sound Test
SECTION 5b
PROPELLER AEROPLANES
The above conditions may be set by use of an automatic test driver system, but
usually the results will then be taken manually, using a sound analyser. All
personnel should be silent during the 30 seconds or so that the readings are
being taken.
SECTION 5c
SPECIAL CASES
SECTION 5d
EXAMPLE
See the discussion in section 5.2 above. The ICAO Manual is the source for the
following diagram (Figure 5d-1). Figure 5-2 above allows a comparison to be
made with a real simulator test result for background noise.
Figure 5d-1
Recommended Maximum Simulator Background Noise
SECTION 5e
FREQUENCY RESPONSE
EVALUATION NOTES The object of this test is to ascertain that the sound
pressure levels and frequency distribution for the
specified flight conditions remain as implemented at
the original qualification and do not significantly
degrade over time. The test must be performed for the
Master QTG prior to the initial evaluation and the
results retained as a 'baseline' against which future
recurrent tests can be compared, when the tolerance
below will be applied.
EXAMPLE
The example shown in figure 5e-1 is taken from the ICAO Manual.
Figure 5e-1
Example of Recurrent Frequency Response Test Tolerance
APPENDIX A
The requirements for aerodynamic data accuracy and fidelity for the
evaluation of advanced simulators are greater than almost any other
application in the aerospace industry, including aircraft certification. Final
design data are based on predicted/wind tunnel aerodynamic data. Autopilot
and stability augmentation systems can be designed and built to less accurate
aerodynamic data because prototypes of the "electronic boxes" are usually
manufactured to be finely tuned to the exact aerodynamic characteristics of
the particular aircraft in flight test. Aircraft certification flight test data often
demonstrates that the aircraft meets some particular characteristic rather than
defines that characteristic. For example, an aircraft must only demonstrate,
for certification, that it possesses a minimum positive longitudinal static
stability about a trim point, rather than define the value of that stability. A
qualified simulator, on the other hand, must duplicate the same level of static
longitudinal stability as the aircraft. Therefore, not all aircraft certification flight
test data are acceptable for simulator design and evaluation. Only flight test
data obtained using normal flight test standards and procedures, where a
sufficient number of parameters, including all pertinent aircraft configuration,
trim, flight and atmospheric conditions, are recorded and properly
documented, will suffice for these purposes. A simulator must
replicate every performance and stability and control characteristic of the
aircraft. Validation of this fact requires that these particular characteristics of
the aircraft be determined accurately and that the simulator be evaluated by
comparison with these aircraft data.
Recognising the cost and difficulty of acquiring flight test data that meets
these requirements for simulator validation, the regulatory authorities' position
is that the airframe manufacturer should be the primary source of these data.
This position is based on the concept that the manufacturer has the greatest
familiarity with its aircraft and can best identify representative data which
accurately defines it. The manufacturer usually correlates the flight test data
with his predicted or wind tunnel design data in order to prove or improve
analytical skills. He may also employ every possible independent method
within reason to verify the validity of the final flight test data realising the
liability incurred upon the publication of the data. Further, design loads data
are based upon the manufacturer's final predicted or wind tunnel data and any
significant differences between these data and the final flight test data must be
addressed in terms of airframe structural load limits, limit speeds, centre of
gravity limits, etc. Clearly the manufacturer has a vested interest in the
Data pack information and explanations of initial conditions, test methods and
data acquisition systems often fall short of what is needed to fully support
simulator model development and accurate validation of simulator responses.
Too often, assumptions have to be made to explain what happened during the
flight test, and additional questions then have to be submitted to the
manufacturer for clarification.
Manufacturers must also provide design data where no other data exist,
usually when a simulator is needed for initial instructor and crew training on a
new aircraft for which flight data are not yet available. The problem with design
data is that they may not reflect all of the changes made to the aircraft since
the data were released, and they may embody simplifications which do not
impair their use as a design tool but which may not be accurate enough for
simulator use. Unfortunately, this is unknown until actual test data become
available. However, this risk is minimised by close coordination and
cooperation with the manufacturer and by review of preliminary flight test data
prior to final approval of initial simulator training.
To achieve the level of quality needed in the flight test data for simulator
support, the following requirements should be observed:
a) The aircraft should be maintained at the trim condition prior to the start
of the test for sufficient time to ensure that it has reached stabilised
flight.
which can be used to collect data for simulator validation, taking into
consideration the items discussed above.
APPENDIX B
B1.0 INTRODUCTION
Aeroplane stability is subdivided into two main types. The first describes the
change of forces and moments on an aeroplane due to a slight displacement
and is termed static stability. The second involves the subsequent history of
the motion and the changing values of forces and moments and is termed
dynamic stability. Both terms apply to deviations in any motion, longitudinal
and lateral, and refer to the behaviour of the aeroplane in a disturbance
without the interference of pilot-operated control surfaces. Consequently, for
an aeroplane to be defined as stable, it must possess inherent stability without
the aid of an external impetus in the form of an independent control. When
applied to computer controlled aeroplanes, it follows that the basic airframe
need not necessarily possess inherent stability provided the automatic
manipulation of the control surfaces produces an effective stability. In practice,
most if not all transport category aircraft which are computer controlled (to
whatever extent) have airframes which to a greater or lesser degree are
inherently stable as well. Typically the use of various control computers on
these aeroplane types is for reasons other than to compensate for an unstable
(though probably more manoeuvrable) airframe design.
few seconds - the time taken is immaterial to the definition. There are several
possible motions, and these are perhaps best visualised with the aid of time
histories, albeit stylised and simplified.
The plot (which could for example be bank angle during a spiral stability
manoeuvre) in Figure B-1 indicates that the aeroplane is both statically and
dynamically stable in this mode. This is because of the tendency to return to
the original state after the disturbance has been removed (e.g. a pulsed input
of, say, control wheel). The damping can be described as being at its critical
value, allowing no overshoot.
(Plot: amplitude against percentage timescale)
Figure B-1
Example of Critical Damping (Statically and Dynamically Stable)
In Figure B-2 below, the aeroplane is also both statically and dynamically
stable, even though the parameter plotted (e.g. pitch rate or sideslip angle)
shows some overshoot. The damping is still positive however, and the
tendency is still for the aeroplane to return to its original state. The
characteristics required to be measured in this case would be period and time
to half amplitude (or damping coefficient).
(Plot: amplitude against percentage timescale)
Figure B-2
Example of Positive Damping (Statically and Dynamically Stable)
(Plot: amplitude against percentage timescale)
Figure B-3
Example of Negative Damping (Statically Stable but Dynamically Unstable)
(Plot: amplitude against percentage timescale)
Figure B-4
Example of Simple Divergence (Statically and Dynamically Unstable)
[Plot omitted: Amplitude (e.g. pitch rate - deg/sec) against time]
Figure B-5
Method of Determination of Time to Half Amplitude of a Second Order Oscillation
Take Figure B-5 to be a typical plot output of a parameter resulting from the
Phugoid test. Most of the information shown here will probably be seen on the
simulator output, but the two amplitudes, A1 and A2, and the times at which
they were recorded by the host computer will probably not be shown. For
manual verification of the computed results for period, time to half (or double)
amplitude and/or damping coefficient, the method described below should be
used.
The periodic time is found by simply measuring the time difference between
two successive peaks (or troughs) in the plotted value. A more generalised
method, however, involves measuring the time spanned by, say, five
successive peaks or troughs and dividing by the four complete periods they
contain, giving an average period, since the period may differ very slightly
between any two individual maxima or minima. It is important to ignore the
first peak or trough, which may
not be part of the aeroplane free response, since the influence of the initial
control input or other disturbance may still be present. Measurement should
therefore not begin close to the time origin of the plot, although for a
phugoid test it may not be possible to delay the start of measurement by more
than a minute into the test.
In the equations below, the times at which amplitudes A1 and A2 occur are
denoted T1 and T2 respectively. Tp is the periodic time (in seconds) of the
oscillations, measured from the time history. Other definitions are given in the
equations.
Hence the only parameters which need to be measured are the two
amplitudes, A1 and A2, which should be spaced as far apart as is reasonable,
along with the times at which those amplitudes occur and also the periodic
time, Tp. The tolerances for the period, time to half amplitude and damping
coefficient are found in the ICAO Manual.
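Although the equations themselves are not reproduced here, the underlying relations are the standard ones for an exponentially decaying second-order oscillation. As a sketch (assuming the usual envelope A(t) = A0·e^(−σt), an assumption of this example rather than a formula quoted from the Handbook), the manual check can be coded as:

```python
import math

def oscillation_characteristics(a1, t1, a2, t2, tp):
    """Derive decay characteristics from two measured peak amplitudes
    A1 (at time T1) and A2 (at time T2) and the periodic time Tp,
    assuming an exponential envelope A(t) = A0 * exp(-sigma * t)."""
    sigma = math.log(a1 / a2) / (t2 - t1)      # decay rate, 1/s
    t_half = math.log(2.0) / sigma             # time to half amplitude, s
    omega_d = 2.0 * math.pi / tp               # damped frequency, rad/s
    zeta = sigma / math.sqrt(sigma**2 + omega_d**2)  # damping coefficient
    return t_half, zeta

# Illustrative values: amplitude halves over 40 s with a 20 s period
t_half, zeta = oscillation_characteristics(4.0, 10.0, 2.0, 50.0, 20.0)
```

For a divergent oscillation (A2 greater than A1), sigma comes out negative and ln 2 / |sigma| is the time to double amplitude. Spacing A1 and A2 as far apart as is reasonable, as the text advises, reduces the sensitivity of sigma to small reading errors.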
APPENDIX C
C1.0 INTRODUCTION
In Appendix 1 of the ICAO Manual there are many items for which a Statement
of Compliance (SOC) is required. There is no definition of precisely what an SOC
should contain, and there have probably been many different versions across the
spectrum of simulators and simulator manufacturers. The purpose of this
appendix is to provide guidance as to what is required in an SOC by way of four
examples. For the other items which require an SOC, the reader is referred to the
ICAO Manual itself.
OBJECTIVE
TEST PROCEDURES
METHOD OF COMPLIANCE
OBJECTIVE
METHOD OF COMPLIANCE
OBJECTIVE
(1) Dry
(2) Wet
(3) Icy
(4) Patchy Wet
(5) Patchy Icy
(6) Wet on Rubber Residue in Touchdown Zone
INITIAL CONDITIONS
TEST PROCEDURES
MANUAL:
(1) Use the QTG utility to run the patchy wet case manually. Autotest will
set the runway condition for each case automatically.
(2) Position flap and gear levers as required above.
(3) When terminal indicates automatic setup is complete press the
autopilot disconnect (APD) switch to disengage the backdrives.
(4) <CR> to start the test.
(5) After 1 second apply and hold full brakes until a full stop has been
reached.
(6) Repeat steps 1 to 5 for:
- patchy ice
- dry
- rubber residue on wet runway
- wet
- icy
AUTOMATED:
RECORDING DETAILS
DATA EVALUATION
For DRY, WET and ICY results see tests 1.E.1, 1.E.3 and 1.E.4
SIMULATOR ....
FAA ...................
TOLERANCE
Subjective
METHOD OF COMPLIANCE
Simulator compliance with requirements (4), (5) and (6) of this section of
reference [1] is demonstrated by performing the above tests in a manual
or automated mode. In addition, the following objective tests should be
performed:
1.E.1 Stopping Time and Distance, Wheel Brakes, Dry Runway for
requirement (1)
1.E.3 Stopping Time and Distance, Wheel Brakes, Wet Runway for
requirement (2)
1.E.4 Stopping Time and Distance, Wheel Brakes, Icy Runway for
requirement (3)
OBJECTIVE
INITIAL CONDITIONS
TEST PROCEDURES
Tyre
MANUAL:
AUTOMATED:
RECORDING DETAILS
DATA EVALUATION
Verify that the time (sec) and distance (ft) to stop the A/C increase with
increasing brake temperature.
#1 #2 #3
SIMULATOR ....
FAA ....................
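The evaluation above amounts to a simple monotonicity check across the three brake-temperature cases. A minimal sketch, with placeholder run values rather than flight-test data:

```python
# Hypothetical (time to stop [s], distance to stop [ft]) for runs
# #1..#3, ordered by increasing brake temperature; values invented.
runs = [(24.1, 3050.0), (25.6, 3240.0), (27.9, 3555.0)]

def strictly_increasing(values):
    """True when each value exceeds the one before it."""
    return all(b > a for a, b in zip(values, values[1:]))

stop_times = [t for t, _ in runs]
stop_distances = [d for _, d in runs]
assert strictly_increasing(stop_times)
assert strictly_increasing(stop_distances)
```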
TOLERANCE
Subjective
METHOD OF COMPLIANCE
2) Tyre drag forces and side forces for failed tyres are based on
aeroplane manufacturer ref. [3]. The friction coefficient for a blown
tyre with a free rolling wheel is set at 0.03. The friction coefficient for
all tyres failed is 0.05.
3) Brake torque limits for each landing gear assembly are adjusted for
tyre failures. When a single tyre fails, the torque limit for that gear
assembly is reduced by 50%.
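The failure adjustments above can be sketched as two small functions. The function names and the nominal unfailed friction value are illustrative assumptions; only the 0.03, 0.05 and 50% figures come from the SOC text:

```python
def rolling_friction_coefficient(tyres_failed, tyres_per_gear, nominal=0.015):
    """Free-rolling friction coefficient for one gear assembly.
    The nominal (no-failure) value is an assumed placeholder."""
    if tyres_failed == 0:
        return nominal
    if tyres_failed < tyres_per_gear:
        return 0.03   # blown tyre with a free rolling wheel
    return 0.05       # all tyres on the assembly failed

def brake_torque_limit(nominal_limit, tyres_failed):
    """Torque limit for one gear assembly; the SOC text only
    specifies the single-tyre-failure case (50% reduction)."""
    return 0.5 * nominal_limit if tyres_failed == 1 else nominal_limit
```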
APPENDIX D
D1.0 INTRODUCTION
No specific requirements have been set down in the ICAO Manual for
the minimum specification of a simulator motion system. However,
during the International Standards Working Group deliberations on
motion system requirements it was deemed appropriate to further
discussions on this subject by setting up a motion performance sub-
group. This sub-group met on two occasions during 1992 and
formulated the information given in section 3.5 below. The content of
section 3.5 has not been included as such in the ICAO Manual and
thus the information it contains cannot be said to have the force
implied by that document. Nevertheless, it is considered that the
figures given form the basis for realistic guidelines on motion system
performance capability.
a) Introduction
b) Procedures
ii) Only the outputs of the six actuator position transducers and test
accelerometer(s) should be used to verify compliance.
c) Motion Envelope
e) Leg Balance
degrees.
f) Turn Around
APPENDIX E
E1.0 INTRODUCTION
This appendix examines the use of math pilots or closed-loop controllers in the
development and validation of simulator models and discusses the reasons why
a math pilot can be a useful and appropriate tool. Comparisons of simulation
data with actual flight test results are shown for the purpose of validating the use
of closed-loop controllers. Also provided is a hypothetical example of the misuse
of closed-loop controllers which might result in a close match to the tolerance
parameters, but mask a deficiency in the simulation model.
Also mentioned is the possible need for regulatory standards for the use of math
pilots. Additional requirements, in terms of parameters and their tolerances, are
examined and discussed. If the validation tests are intended to provide an
objective comparison between flight and simulation results, then the use of
closed-loop controllers needs to be recognised and addressed.
E2.0 BACKGROUND
The use of math pilots or closed-loop controllers has become common practice,
primarily to address three issues:
Although improvements are continually being made to the quality of flight test
data used for model development and validation, uncertainties still exist. Math
pilots can compensate for uncertainties such as small measurement errors or
small, unmeasurable atmospheric disturbances.
Figure E-1 provides a time history match of a flap change. The match was driven
with the elevator position, stabiliser deflection and flap handle. For this test
the tolerances specified by the ICAO Manual are airspeed (± 3 knots), altitude
(± 100 feet) and pitch attitude (± 1.5°). The figure presents the tolerance
parameters and bank angle. During the configuration change the flight test is
stimulated by an asymmetry that results in the aeroplane rolling off. The change
in bank angle is left unchecked by the pilot. The simulator response is not
excited by the same asymmetry and does not roll off. The result is that, prior
to the required 15 second interval after completion of the configuration change,
the simulator match exceeds the tolerances on airspeed, altitude and pitch
attitude. If only the longitudinal parameters for this match were examined, one
might conclude that a shortcoming existed in the model. The match demonstrates
how a disturbance or slight asymmetry in the secondary (lateral) axis can impact
the primary axis and the tolerance parameters. To compensate for the missing
disturbance in the simulator response, a math pilot can be used during the
match.

Figure E-1
Flap Change Match - Open Loop Roll Axis
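In its simplest form, such a math pilot is just a feedback law that closes the loop on the secondary-axis parameter. The sketch below uses a proportional controller driving a crude first-order roll model; the model, gain and time constant are invented for illustration and are not taken from the Handbook:

```python
DT = 0.1   # integration step, s
K = 2.0    # math-pilot gain, deg of wheel per deg of bank error

def closed_loop_match(flight_bank, roll_gain=0.5, time_constant=1.0):
    """Drive a crude first-order roll model so that its bank angle
    tracks the flight-test bank-angle time history; returns the
    wheel activity the math pilot injected, plus the final bank."""
    bank, rate, wheel_trace = 0.0, 0.0, []
    for target in flight_bank:
        wheel = K * (target - bank)                      # correction
        rate += DT * (roll_gain * wheel - rate) / time_constant
        bank += DT * rate
        wheel_trace.append(wheel)
    return wheel_trace, bank

# Track a steady 5-degree flight-test bank angle for 10 seconds
wheel_trace, bank = closed_loop_match([5.0] * 100)
```

The recorded wheel_trace is precisely the controller activity that engineering judgement must then assess: it should be small, centred about zero and not sustained.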
Figure E-2
Flap Change Match - Closed Loop Roll Axis
For matches of short duration this has not been an issue. However, for time
histories of considerable length, small errors are liable to integrate into
large ones, which may result in exceeding the tolerances on a parameter for a
given test. A review of the most recent data packages produced shows that many
of the conditions in the table below are also some of the longest time history
matches.
Wind-up Turns ✓
Speed Stability ✓ ✓
Vmca ✓
Engine-out Trim ✓ ✓
Steady Sideslip ✓ ✓
Ground Effect ✓ ✓
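The sensitivity of long matches can be illustrated with one line of arithmetic: a constant rate error that is invisible over a short match integrates through a tolerance over a long one. The 0.02 kt/s bias is an invented figure; the ± 3 knot airspeed tolerance is the one quoted earlier in this appendix:

```python
def accumulated_error(rate_error, duration):
    """Single integration of a constant bias over the match length."""
    return rate_error * duration

short_match = accumulated_error(0.02, 20.0)    # 20 s match
long_match = accumulated_error(0.02, 180.0)    # 3 min match
# 0.4 kt stays well inside a +/-3 kt tolerance; 3.6 kt exceeds it
```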
Figure E-3
Open Loop Landing Match
Figure E-4
Closed Loop Landing Match
The result is an excellent match of the tolerance parameters. The error, whether
it lies in the flight test data or in the mathematical models, is accounted for
in the difference between the flight test and simulator elevator position. But
is this an acceptable match? Engineering judgement must be used to determine
whether the difference between the simulator and flight test elevator position
is acceptably small. As seen in Figure E-5, the elevator error remains centred
about zero, is small in magnitude, and is not sustained for any considerable
length of time.
Figure E-5
Closed Loop Elevator Difference
(Simulator - Flight Test)
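The three judgement criteria (centred about zero, small in magnitude, not sustained) lend themselves to simple metrics on the error trace. A sketch, with an invented threshold and sample values:

```python
def error_metrics(trace, dt, threshold):
    """Mean error, peak magnitude, and the longest continuous
    interval (s) during which |error| exceeds the threshold."""
    mean = sum(trace) / len(trace)
    peak = max(abs(e) for e in trace)
    longest = run = 0.0
    for e in trace:
        run = run + dt if abs(e) > threshold else 0.0
        longest = max(longest, run)
    return mean, peak, longest

# Illustrative elevator-error samples (deg) at 0.5 s intervals
mean, peak, sustained = error_metrics(
    [0.1, -0.2, 0.15, -0.1, 0.05], dt=0.5, threshold=0.5)
```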
Figure E-6
Open Loop Landing Match Modified Function
Figure E-7
Comparison of Pitching Moment Ground Effect Coefficient Increment
E6.0 ABUSE
The next example in this series illustrates the improper use of a closed-loop
controller. Again the landing condition is matched closed-loop, but this time the
elevator control effectiveness is artificially reduced by 20 percent. Figure E-8
shows excellent correlation between the simulator and flight results for the
tolerance parameters (airspeed, angle of attack and altitude). The match of the
tolerance parameters is as good as the match generated with the baseline
aerodynamic model presented previously in Figure E-4.
Figure E-8
Closed Loop Landing Match 20% Reduction in Elevator
Effectiveness
One might conclude that the modified model is adequate for training, when in fact
it contains a significant deficiency. The error in the model only becomes apparent
when a comparison of the elevator position is made (Figure E-9).
The elevator trace with the modified database shows a sustained error occurring
at the end of the match. With no tolerance on the elevator deflection, the
acceptability of the match must be determined by engineering judgement, and
for this example, such judgement should lead to the conclusion that the match
is unacceptable.
Figure E-9
Elevator Error for Closed Loop Match 20% Reduction in
Effectiveness
Another potential concern with the use of math pilots is the distribution of error
among multiple parameters. The proper use of math pilots assumes that the
parameters being closed upon are matched closely, so that nearly all the error
will be accounted for in the difference between the flight test and simulation
controller position. However, the gains on the closed-loop controller could be
relaxed such that more error appears in the tolerance parameters and less in the
controller position error. In the preceding example of the landing match with the
20% reduction in elevator effectiveness, allowing more error in the airspeed,
altitude and pitch attitude traces would reduce the elevator error. In such a
case, engineering judgement is again needed to assess the acceptability of the
match – and to ensure the use of closed-loop controllers is not being abused.
Instances have been noted in the past of open-loop simulator QTG tests which
were initially run by simulation engineers in a closed-loop manner, with the
resultant primary control input then fed back into the simulation as if the
test were being run open-loop. This is a clear abuse of math pilot methodology,
since it leaves unclear whether the simulation and/or the test is actually
valid.
The use of math pilots can, however, be abused and can mask a deficiency in the
model if sound engineering judgement is not applied. So is engineering
judgement enough? If it is not, then additional parameters and their associated
tolerances need to be defined; but for which tests, for which parameters, and
with what magnitude of tolerance?
The performance and handling qualities tests used to qualify flight crew training
simulators can be divided into three categories:
For the “free response” tests the pilot is not actively in the loop. The pilot
initiates the manoeuvre and allows the aeroplane to respond freely. Tests
categorised as free
responses would include: small control inputs, configuration changes,
longitudinal trims, phugoid, short period, roll response, roll overshoot, spiral
stability, rudder response and dutch roll. For these tests math pilots on the
secondary axis would be allowed. However in the primary axis math pilot activity
would be inappropriate. The configuration change example previously discussed
in the paper would be an example of the proper use of closed-loop controllers for
a “free response” test.
For the tracking task tests the pilot is continuously in the loop. Tests categorised
as tracking tasks would include: takeoffs, dynamic engine failure after takeoff
(during the recovery), climbs, descents, longitudinal manoeuvring stability,
longitudinal static stability, stalls, minimum control speed - air, engine-out trims,
steady sideslip, landings, go-around, directional control with asymmetric reverse
thrust and ground effect. Like the free response tests, math pilots would be
allowed on the secondary axis; however, in contrast to the free response tests,
math pilots would also be allowed on the primary axis. Since the math pilot
will force a match of some of the tolerance parameters, its use during these
“tracking task” tests requires sound engineering judgement in assessing the
controller input, or additional tolerances on the controllers.
The question of additional requirements has been partially addressed. The use
of math pilots on the secondary axes would be allowed for both “free response”
and “tracking task” tests. The magnitude of the inputs on the secondary axis
needs to be minimised to ensure that the inputs do not affect the primary axis.
The use of a math pilot on the primary axis would only be allowed during a
“tracking task” test with additional tolerances on the controllers. The additional
tolerances on the controller inputs are an attempt to replace engineering
judgement with a quantifiable standard.
But what magnitude should the tolerance be and on which controller input: pilot
control deflection, pilot control force or control surface deflection? With regard
to the magnitude of the tolerance some guidance can be found in the current
tolerances on surface deflections or controller force. The static control tests and
some handling qualities tests (longitudinal manoeuvring stability, longitudinal
static stability, longitudinal trim, ground effect, steady sideslip, engine out trim)
include tolerances on controller inputs noted in the table below.
Roll ± 2° aileron
± 5° spoiler (or 10%)
± 3° spoiler
Some aeroplane manufacturers have used the surface position tolerances from
the handling qualities tests for the magnitude of the “tolerances” on the
closed-loop controllers. The tolerances from the static checks were found to be
too lenient for the elevator and rudder control deflections. Cockpit controller
deflections or forces were not used, because of greater confidence in the
flight test surface position measurement and a better grasp of the relevance of
the control surface error to the aerodynamic model.
Some additional internal development guidelines suggest that the error should
be centred about zero and that inputs approaching the controller “tolerance”
should only be of short duration. The controller, and the parameters that are
being closed on, must be clearly identified in the test setup. These guidelines
are not absolute and still require engineering judgement in assessing the
controller error. This
leads to the original quandary: the validation tests are intended to be a
quantitative assessment of a flight simulator, yet when closed-loop matching
techniques are used on the primary axis, engineering judgement is required in
lieu of tolerances to determine the acceptability of the match.
E8.0 CONCLUSION
APPENDIX F
F1.0 INTRODUCTION
The publishing world has moved into the electronic era, with thousands of
documents being electronically published every day, using well-established
and widely available technology. The acceptance of electronic media is
widespread, and its application to the supply of technical documentation is
now almost universal.
The content and production process of the QTG makes it ideally suited to
electronic publication. Moving forward to an electronic media will facilitate
considerable improvements to quality and revision control, through the
increased use of automation. The electronic document can be distributed on
CD-ROM or DVD; with copies costing a fraction of that of the paper version,
multiple copies become practical, and the onerous task of reviewing a
document can now be performed in parallel by several individuals.
F1.1 REQUIREMENTS
The principal requirement for the electronic QTG (eQTG) is simply to provide
the paper document in an electronic format, but it is important to understand
that the eQTG concept embraces more than just the use of electronic storage.
The electronic version must be a worthy and practical successor to its paper
prototype if it is to gain acceptance as a complete substitute. The document
has to be globally portable, easy to access, and comprehensive. These three
features are undeniable attributes of the paper version, but they are not
automatically bestowed upon an electronic counterpart. These, and a few other
attributes beyond electronic storage alone, are required to make the electronic
version as useable as the paper prototype, and they must be understood to be
essential elements of the eQTG concept.
All electronic formats require the reader to have access to a suitable
computer and software in order to read the document, so the choice of
electronic format clearly influences its portability. It should be remembered
that a simulator has an expected service life in excess of twenty years, and
the eQTG must be reviewed and recreated throughout the device's lifetime.
Paper documents, carefully stored, can still be legible a thousand years
later; electronic media do not stand the test of time so well, and in
particular file formats rapidly become out of date. An acceptable
eQTG system should employ a file format and storage media that provide
reasonable insurance against becoming unsupportable within the life of the
device. Portable Document Format (PDF), introduced by Adobe in 1993 for use
with the Acrobat Reader, has been adopted by the industry as the de facto
standard format for the eQTG.
The QTG for a typical simulator, including reference data and supporting
materials, can constitute a document spanning 10 volumes. If the simulator
has been designed to support more than one variant of the aircraft, such as
an alternative engine fit, then the size of the QTG increases proportionally.
When such a document is transferred into electronic format, a single linear
arrangement of the 5,000 or so pages is not manageable: a reader would need
to scroll through many thousands of pages to reach any particular item. Quite
clearly, an acceptable eQTG must be given additional structure and indexing
to assist the reader. Any practical eQTG will require a hierarchical
structure of automatic links to the major sections of the document. To be
intuitive to the first-time reader, an acceptable eQTG should ideally be
arranged with a single ‘point of entry’ leading the reader into a simple,
well indexed method of navigation throughout the document.
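The single ‘point of entry’ and hierarchical links can be pictured as a nested index. The volume grouping and page targets below are invented for illustration; only the 1.E.1 test title echoes this Handbook:

```python
# Invented hierarchical eQTG index: section -> subsection -> page target
eqtg_index = {
    "Volume 1": {
        "Performance": {
            "1.E.1 Stopping Time and Distance, Wheel Brakes, Dry Runway": 412,
        },
        "Handling Qualities": {"Phugoid": 1037},
    },
}

def find_page(index, title):
    """Depth-first search for a test title; returns its page target,
    or None when the title is absent."""
    for key, value in index.items():
        if key == title and isinstance(value, int):
            return value
        if isinstance(value, dict):
            page = find_page(value, title)
            if page is not None:
                return page
    return None

page = find_page(eqtg_index, "Phugoid")
```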
We should also consider the problems posed by the need for the eQTG to be
comprehensive, and easy to extend or adapt as the simulator device evolves
during its life cycle. Any test result, or supplementary data that is required to
be included into the QTG, must also be added to the eQTG. For example, if it
becomes necessary during the life of the simulator to include a piece of
supplementary data alongside a QTG test, then it can be added to the paper
copy after no more processing than the simple operation of a suitable hole
punch. An acceptable eQTG system should also be capable of embracing
data from all external sources, via a process that demands the very minimum
level of technical skill.