
Avia6120 Essay Allan Bradley

Trimester Two, 2010 Task 6 C 313 5319

ASSIGNMENT COVER SHEET


(Use as the first page of your assignment)

Student Details
Family Name: Bradley
Given Name: Allan
Student Number: c 313 5319

Course Details
Course Name: Crew Resource Management
Course Code: Avia6120

Assignment Details
Task Number: 6
Task Title: Essay: CRM/TEM Training

PLEASE NOTE
All assignments are the responsibility of the student.
Ensure you keep a copy of your assignment before
submitting.

DECLARATION:
I have read and understand the University of Newcastle’s Policy for the Prevention and Detection of
Plagiarism Main Policy Document, which is located at:
http://www.newcastle.edu.au/policy/academic/general/plagiarism.htm

I declare that, to the best of my knowledge and belief, this assignment is my own work, all sources have
been properly acknowledged, and the assignment contains no plagiarism. This assignment or any part
thereof has not previously been submitted for assessment at this or any other University.
I acknowledge that the assessor of this assignment may, for the purpose of assessing this assignment:
• Reproduce this assessment item and provide a copy to another member of the Faculty;
and/or
• Communicate a copy of this assessment item to a plagiarism checking service (which may
then retain a copy of the item on its database for the purpose of future plagiarism checking).
• Submit the assessment item to other forms of plagiarism checking

By attaching this cover sheet I am affirming the above declaration.


Allan Bradley


An effective team approach to threat and error management is the aim of CRM training. How
can this be best achieved and monitored? Refer to relevant literature on the topic.

Developing an Effective Team Approach to Threat and Error Management in an Aviation Organization
By Allan Bradley

Abstract
Crew Resource Management (CRM) training has developed in response to acknowledgement
by the aviation industry that human factors contribute to more than half of all aviation
accidents and incidents. In the thirty years since CRM courses have existed, they have evolved
through six generations to the current Threat and Error Management (TEM) model. CRM has
expanded from its original client base of flight deck crew to the great majority of aviation
industry personnel. In doing so, factors such as the types of cultures in which people work
have to be taken into consideration. Line Operations Safety Audits (LOSA) were developed to
provide a real-time, objective and contextual record of how crews managed threats and errors
on a flight deck. Data obtained from LOSA can be used to improve not only the way flight
crews manage threats and errors, but also to help develop a just safety culture within the
organization.

Introduction
During World War Two, Allied bomber pilots flying from England to Africa were regularly
issued flight plans which routed them due south over the Atlantic Ocean for several hours
before turning east, heading for Africa’s west coast. Beaty (1995) reveals some bombers lost
during these flights were reported to have reached the point where they should have turned left
(east) for Africa, but instead turned right (west) for the mid-Atlantic. Eventually these aircraft
would have run out of fuel and ditched mid ocean. Several aircraft that survived the flight only
did so because at the turning point, when the navigator reported that it was time to turn left, the
captain turned right and refused to listen to his crew’s exhortations to alter course. Eventually
these crews overpowered their captain and turned the aircraft around. Subsequent debriefing
revealed that the captain just got ‘bloody minded’ and refused to listen to anyone, regardless of
how much sense they made. Beaty (1995) offers other examples of aircraft accidents caused
by captains making decisions that defied logic, but it was apparent that these erroneous
decisions could not be explained as ‘one off, freak events’. There were too many to ignore.
There were human factors in aviation incidents and accidents that needed understanding and
mitigating.

Human Factors (HF) in aviation has now developed into a serious field of academic research
and Hawkins (1987) explains that in 1978, KLM (Royal Dutch Airlines) provided the first
Human Factors Awareness Course. HF has since developed several branches of study
including one identified by Ruffell Smith (1979) and originally labeled by Lauber (1980) as
Cockpit Resource Management, now Crew Resource Management (CRM). As Hawkins
(1987) explains, CRM can be thought of as “the management and utilization of all the people,
equipment and information available to the aircraft”. Since then, CRM has evolved through six
generations. The current (sixth) generation is termed Threat and Error Management (TEM).

Aviation authorities in many countries now mandate that companies have, as an integral part of
their structure, a Safety Management System (SMS). The role of the SMS is to use tools such
as CRM and TEM to “establish robust defences to ensure that errors do not result in incidents
or accidents” (CASA 2004). Maurino (2000) explains that until the year 2000, human factors
had been concentrated on the Flight Deck, but since then the idea has expanded to include
ground, security, ramp operations and cabin safety staff. As part of this effort, ICAO
encourages the implementation of Standards and Recommended Practices (SARPS) as
described in ICAO documents 8168 and 4444.

The Flight Safety Foundation (2010) reports that between 1990 and 1994 there were 1.32
serious airliner accidents per million departures. By the end of the century, the number of
serious airliner accidents had dropped to 1.06 per million departures. In the ten years since, the
number of serious airliner accidents has dropped to 0.55 per million departures. Learmount
(2010) suggests that this almost halving of the accident rate can be attributed to several factors
including: rapid development of computer and communication technology making gathering
and sharing data quicker and easier, development of systems such as Enhanced Ground
Proximity Warning Systems (EGPWS) and Traffic Collision Avoidance Systems (TCAS),
and establishment of organizations such as The US Commercial Aviation Safety Team (CAST)
whose task is “to identify safety priorities and create an action plan”. The implementation in
1999 of ICAO’s Universal Safety Oversight Audit Programme also held member states
individually accountable for aviation safety oversight in their country.
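As a quick arithmetic check (a sketch using only the rates quoted above; the period labels are approximate), the overall decline can be computed:

```python
# Serious airliner accidents per million departures, as cited above;
# the period labels are approximate groupings, not official ones.
rates = {"1990-1994": 1.32, "late 1990s": 1.06, "2000s": 0.55}

def percent_decline(earlier: float, later: float) -> float:
    """Percentage fall from an earlier accident rate to a later one."""
    return (earlier - later) / earlier * 100

print(f"{percent_decline(rates['1990-1994'], rates['2000s']):.0f}%")  # prints "58%"
```

The fall from 1.06 to 0.55 is roughly 48 percent, which is the "almost halving" the text refers to.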

While technological advances must have contributed to aviation safety, it is recognized that
60% of large jet transport accidents have flight crew errors as a causal factor (Duke 1991).
Therefore, to significantly reduce the number of aircraft accidents, work had to be done to
identify and mitigate flight crew errors.

By the beginning of this century, CRM was evolving through its third and fourth generation.
By the year 2000, CRM was moving out of the Flight Deck into the cabin and beyond.
Helmreich and Foushee (2010) suggest that the behaviours that exemplified effective CRM
were being identified and highlighted. Acceptance and implementation of CRM training,
whether as a result of ICAO or State mandate or through voluntary embrace of its principles, is
becoming the norm in the aviation industry. As CRM moves out of the Flight Deck, the size and complexity
of the team increases and identifying ways to achieve and monitor an effective team approach
to threat and error management needs to be considered.

The Team
Shortly after Lauber (1980) first used the term Cockpit Resource Management, it was
realized that CRM was relevant beyond the cockpit, and CRM became Crew Resource
Management. Cabin Crew were the first team members outside the cockpit to be included in
CRM. The importance of Cabin Crew as a resource was highlighted when they weren’t used
during a B737–400 accident in Kegworth, England in 1989. The Department of Transport
(1990) noted that attempts by cabin crew to inform the Captain of their observations regarding
the remaining engine were summarily dismissed. The same report noted the role of Air Traffic
Control who, in their attempts to help, became distractions to the pilots. Although it is
conjecture, it could be argued that if the Captain had listened to the Cabin Crew, or if ATC had
not been a constant source of distraction, the accident might have been averted. Utilization of
all available resources was not optimized. As CRM evolved it began to include more
personnel. It became apparent that “hazards and errors can occur at all levels of an
organization, from the cockpit or the shop floor right through to the boardroom. Seemingly
minor errors or hazards in one area can combine with others to result in an incident or
accident” (Reason 2000). It therefore follows that any effort to identify and mitigate or
negate threats and errors must include as many members of the aviation community as
possible. While ideally this would include everyone involved in aviation, on a more practical
level, it should include everyone within an aviation organization and other organizations they
deal with. An example would be an airline dealing with other organizations such as ATC,
caterers, refuellers, cleaners, security and so on.

Threat and Error Management
Maurino (2005) defines threats as “events or errors that occur beyond the influence of the flight
crew, increase operational complexity, and which must be managed to maintain margins of
safety”. Helmreich, Klinect and Wilhelm (1999) suggest threats can be classified as expected,
unexpected or external errors. An example of an expected threat could be high terrain
surrounding an airfield. The terrain is well documented and procedures developed to mitigate
this threat. An example of an unexpected threat could be an in-flight aircraft system
malfunction. In this situation the pilots must use their knowledge and skill to achieve a
successful outcome. An external threat could be a hidden or latent system shortcoming such as
an electronic flight instrument that becomes unreadable in bright sunlight. Maurino (2005)
prefers to classify threats as either environmental or organizational. Regardless of how a threat
is classified, a measure of the effectiveness of a team is their ability to anticipate and manage a
threat.

Maurino (2005) defines errors as “actions or inactions by the flight crew that lead to deviations
from organizational or flight crew intentions or expectations”. Helmreich et al. (1999) have
categorized errors into five types and offer three possible responses to any error. One response
is that the error is trapped, that is, identified and managed before it is of any
consequence. Another response is that an error could be exacerbated because although it is
detected, the flight crew’s actions lead to an undesirable outcome. Finally the flight crew could
fail to respond to the error at all. Outcomes to these three responses are listed as
inconsequential, undesired aircraft state and additional error.
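The response-to-outcome pairing described above can be sketched as a simple lookup table; the mapping below is an illustrative reading of Helmreich et al. (1999), not code from any cited source:

```python
# Hypothetical mapping of the three error responses to the outcomes
# listed in the text; the key names are illustrative only.
RESPONSE_OUTCOME = {
    "trap": "inconsequential",                 # detected and managed in time
    "exacerbate": "undesired aircraft state",  # detected, but crew actions worsen it
    "fail to respond": "additional error",     # error goes unaddressed
}

def outcome_of(response: str) -> str:
    """Look up the outcome associated with a crew's response to an error."""
    return RESPONSE_OUTCOME[response]

print(outcome_of("trap"))  # prints "inconsequential"
```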

When it comes to threat and error management (TEM), Gunther (2003) proposes that threats
and errors are initially handled by strategies such as corporate culture, SOPs and personally
developed techniques. If these strategies fail, the threat or error is resisted by hardware and
software such as GPWS and TCAS. It is only when the resistance is defeated that humans are
called upon to resolve the threat or error. In the examples of threats listed above, anticipating
and managing high terrain could include using full power takeoffs and terrain avoidance
departure and arrival procedures. Anticipating and managing in-flight malfunctions could be
achieved by Abnormal Procedures Checklists. For the example of an unreadable flight
instrument, good CRM that utilizes the other crew member’s instruments could be a strategy to
manage the threat. Errors can be avoided or resolved by crew members by being proficient in
their job, being vigilant, monitoring and challenging, exercising leadership skills and taking
advantage of previous experience (Gunther 2003).

As Gunther (2003) says “because flying is our business, threats must be identified and
reduced/eliminated while errors must be avoided and managed”.

Culture
Sincere efforts to reduce threats or errors may mean changes being made within an
organization both structurally, such as reorganizing departments, and in the way personnel
interact, such as modifying Power Distance gradients. Helmreich (2003) notes that resistance
to such changes can be due to cultural issues. Helmreich and Merritt (1988) identify three
culture types influencing flight crew: national, organizational and professional.

The national culture in which an organization and individual exists will inevitably exert
influence in the workplace. While Lonner (1980) identifies seven psychological universals
common to all cultures, Hofstede (1983) identified four cultural dimensions which vary
between nations. While little can be done by an organization to change a national culture, it
should be cognizant of the effect national culture has on the way threats and errors are
managed and develop appropriate strategies. If a national culture places great value in safe
work practices, then the organization should reward and encourage this. If the national culture
is otherwise, it should develop strategies to modify such norms within the organization.

An organization’s culture will have a major effect on how threats and errors are managed. The
highest levels of management must be committed to installing and visibly supporting a system
which exists to minimize or negate threats and errors. If they are not, any attempt by
subordinate officers to install such a system will not be taken seriously by co-workers. A
senior management which does not take safety seriously will create a culture that suggests
‘safety isn’t important’, and this will permeate through all ranks. Numerous organizations,
including CASA (2004) and ICAO-IATA (2003), confirm this.

Professional culture is well described by Helmreich (2002), who acknowledges that individual
professions have norms, patterns of behaviour and other characteristics which make each
profession unique. Sometimes a professional culture can be a positive influence for improving
safety, such as refusing to accept incomplete or substandard work. Alternatively a professional
culture might not be such a positive influence, for instance supporting an attitude of ‘near
enough is good enough’.

If all three aspects of culture (national, organizational and professional) work together to
encourage and support safe work practices, then a Safety Culture can establish itself within
the organization. At the airline I work for, efforts are ongoing to create an effective safety
culture. Last week I attended an in-house workshop where it was stressed that the company’s
safety culture needed to be seen as a ‘just culture’. Characteristics of a ‘just culture’ include:
accepting that safety is everyone’s responsibility, any staff member can report threats and
errors free from fear of reprisal, confidentiality of reporting is respected, there is mutual trust
between a reporter and the person to whom they report, and acceptance by everyone that
mistakes happen but it is everyone’s responsibility to try to minimize them (RBA 2010).

Line Operations Safety Audits
Even with a determined and sincere desire within an organization to develop and improve a
just and effective safety culture, some staff may be engaging in unsafe practices or making
decisions which erode safety without being aware of it. There are already some devices
available to the aviation industry which can highlight problem areas. Gunther (2003) lists some
as: accident and incident reports, Quick Access Recorder (QAR) downloads for Flight
Operations Quality Assurance (FOQA), regulatory mandates or rule changes, and regular
simulator assessments of pilots. The issue with all of these tools is that they are ‘backward
looking’ and do not necessarily provide context for a situation. All of the tools listed explain
‘what happened’. None of them explain ‘what is happening’. Maurino (1998) explains that
accident investigations rarely provide data which can be applied in future training and while
they can tell us what went wrong, they give little or no opportunity to identify what pilots do
right. Furthermore, accident investigation reports often identify procedures or systems that
failed, but rarely have the ability to report human conditions such as confusion, forgetfulness,
distraction or fatigue, which may be a result of training errors, flawed technology-human
interfaces, poorly designed procedures, corporate pressures or a poor safety culture. In order to
give real time data which can be constructively used in the future to improve flight safety, the
Line Operations Safety Audit (LOSA) was developed.

LOSA is a tool for helping to develop an effective safety culture within the aviation industry.
FAA Advisory Circular 120-90 (2006) describes LOSA “as a formal process that requires
expert and highly trained observers to ride the jumpseat during regularly scheduled flights to
collect safety-related data on environmental conditions, operational complexity and flightcrew
performance. Confidential data collection and non-jeopardy assurance for pilots are
fundamental to the process.”

Escuer (2003) stresses that for any LOSA to succeed, pilots involved must be confident that
any data gathered during a flight will be confidential and not used as evidence for punitive
action by the company. Trust between the pilots and the LOSA observers in the confidentiality
of any data gathered is essential.

The FAA Advisory Circular 120-90 (2006) explains in detail the process for planning and
conducting a LOSA and then goes on to summarize the ten operating characteristics of LOSA.
The unique characteristic of LOSA which makes it so useful is the trained observer. The
observer is able to identify threats which would otherwise remain unnoticed such as overloaded
radio frequencies or regularly missed callouts or checklists. The observer can not only identify
threats and errors, but also record how the crew manage them. If a LOSA identifies threats
occurring, errors or violations committed, or normalization of deviance from Standard
Operating Procedures (SOPs), then an organization with an effective safety culture should take
action to improve safety. Any changes made should, as the FAA AC says ‘be action-focused
and data driven.’ As well as being able to identify threats and errors, LOSA can also identify
positive behaviours such as techniques pilots have developed to manage regular threats or
frequent errors. Kriechbaum and Alai (2003) note that by conducting a LOSA, a ‘snapshot’ of
the current situation within an organization can be developed; improvements can then be
devised and implemented. Further LOSAs will provide data to identify whether any
changes have been beneficial and whether further changes are needed.
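The ‘snapshot’ comparison across successive audits might be sketched like this; the record format and figures are invented for illustration, not drawn from any LOSA dataset:

```python
from collections import Counter

# Invented LOSA observation records: (threat_type, managed_well) pairs.
audit_year1 = [("congested frequency", False), ("missed callout", False),
               ("missed callout", True), ("high terrain", True)]
audit_year2 = [("missed callout", True), ("high terrain", True),
               ("congested frequency", True)]

def mismanaged(audit):
    """Count, per threat type, the observations the crew did not manage well."""
    return Counter(t for t, ok in audit if not ok)

before, after = mismanaged(audit_year1), mismanaged(audit_year2)
for threat, n in before.items():
    print(f"{threat}: {n} -> {after.get(threat, 0)}")
```

Falling counts between audits would suggest the changes made after the first snapshot were beneficial.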

Conclusion
For more than sixty years the aviation industry has been aware that regardless of the reliability,
intelligence, simplicity or complexity of any equipment, the human component must also be
considered. Humans do not always behave logically or rationally and they can make errors.
Evidence confirms that more than half of all aircraft accidents and incidents have human errors
as a causal factor. Developing systems and techniques to cope with the human component in a
system has been an ongoing field of research.

Efforts by the aviation industry to make flight crews aware of human factors and develop
strategies to cope with them have been present for more than thirty years. Originally the
courses were provided solely for flight deck crew (pilots, flight engineers and, in some cases,
navigators and radio operators); however, it soon became apparent that the team members
beyond the Flight Deck needed to be included in human factors awareness and coping
strategies. The result was the development of Crew Resource Management (CRM) which has
as a basic premise, “the management and utilization of all the people, equipment and
information available to the aircraft” (Hawkins 1987).

An important consideration when dealing with people is the culture in which they live and
work. National, organizational and professional cultures all influence personnel. Culture can be
an obstacle to good CRM; for example, a national culture which encourages high Power
Distance (PD) gradients across a flight deck inhibits a First Officer from challenging a
Captain. National culture may also enhance CRM if it encourages attention to detail and
adherence to procedures. An organization may have a culture which encourages safe practices
such as assertiveness from junior staff when they notice threats or errors within the
organization. Alternatively, an organization’s culture might be so profit driven, that safe
practices are not considered as important as a balance sheet with no red ink. Professional
personnel such as flight crew may also have a culture of their own. In a positive light, a
professional culture may hold high standards of work in high esteem. Alternatively it might
impose a negative influence such as belittling attempts to upgrade or improve procedures.
Whichever cultural aspect is considered, because people are involved, CRM skills are an
important tool that can be used to help manage threats and errors.

CRM has evolved over the years and is now thought to be in its sixth generation which is
identified as Threat and Error Management (TEM). Threats are considered to be “events or
errors that occur beyond the influence of the flight crew, increase operational complexity, and
which must be managed to maintain margins of safety” (Maurino 2005). Errors are considered
to be “actions or inactions by the flight crew that lead to deviations from organizational or
flight crew intentions or expectations” (Maurino 2005). Initial countermeasures to threats and
errors are strategies already developed such as a corporate safety culture and SOPs. If these
strategies are defeated, then the threat or error is resisted by hardware or software such as
EGPWS or TCAS. It is only when strategies and resistance have been overcome that the flight
crew are called upon to resolve the situation using techniques such as vigilance, monitoring
and challenging, assertiveness and leadership skills. As Gunther (2003) says “because flying is
our business, threats must be identified and reduced/eliminated while errors must be avoided
and managed”.

Some of the issues that must be addressed when considering TEM are: knowing what threats
and errors are actually occurring, why they are occurring and how they are being resolved.
Accident and Incident reports or Flight Data Recorder downloads only offer an ‘after the
event’ perspective and do not give much opportunity for context. In an attempt to overcome
these shortcomings, the Line Operations Safety Audit (LOSA) was developed. During a
LOSA, an independent, objective observer sits on the Flight Deck jumpseat recording threats
and errors as they occur and how the flight crew manages them. The advantage of a LOSA is
that it can give a real-time analysis of what is happening and why. It can identify what the
pilots get wrong (ignore or miss threats, commit violations, make errors) but, just as important,
it can also identify what the pilots get right.

By conducting LOSA on a regular basis, the data obtained can be used to make changes for the better. Weaknesses
can be identified and strengthened, faults corrected and good ideas incorporated. Strategies
can be improved or put in place, levels of resistance can be enhanced and the ability of humans
to resolve threats or errors developed. Cultural issues can be addressed in such a way as to
develop a just, safety culture within the organization. In this way an effective team approach to
threat and error management can be achieved and monitored.

References

Australian Government (2004) Safety management systems an aviation business guide Civil
Aviation Safety Authority. Canberra, Australia

Beaty, D. (1995) The naked pilot: the human factor in aircraft accidents. Shrewsbury,
England: Airlife Publishing

Department of Transport, Air Accidents Investigation Branch (1990) Report on the accident to
Boeing 737-400 G-OBME near Kegworth, Leicestershire on 8 January 1989. Farnborough:
Author (AAIB 04/90)

Lauber, J. K (1980) Resource management on the flightdeck. In Cooper, G. E., White, M. D.,
&. Lauber, J. K (Eds). (1980). Proceedings of a NASA/industry workshop (NASA CP-
2120). Moffett Field, CA: NASA-Ames Research Center. Retrieved 27 July 2010 from
World Wide Web
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19800013796_1980013796.pdf

Duke, T. A. (July 1991). Just what are flight crew errors? Flight Safety Digest, 10 (7), 1-15

Escuer, R.A. (November 2003). LOSA experience within Futura. Proceedings of the first
ICAO-IATA LOSA & TEM conference. Dublin 2003 Retrieved 28 July 2010 from
World Wide Web http://www.icao.int/anb/humanfactors/icaojournalist.htm

Federal Aviation Administration (27 April 2006) Advisory Circular: Line operations safety
audits. Washington, DC: Author. Director, Flight Standards Service. Retrieved 28 July
2010 from World Wide Web
http://rgl.faa.gov/Regulatory_and_Guidance_Library/rgAdvisoryCircular.nsf/list/AC
%20120-90/$FILE/AC%20120-90.pdf

Flight Safety Foundation (2010) Serious accident rates. In Learmount, D. Global airline
accident review. FlightGlobal.com. Retrieved 30 July 2010 from World Wide Web
http://www.flightglobal.com/articles/article.aspx?liArticleID=336920&printerFriendly

Gunther, D. (2003) The safety change process following Line Operations Safety Audits
(LOSA). Proceedings of the first ICAO-IATA LOSA & TEM conference. Dublin 2003
Retrieved 28 July 2010 from World Wide Web
http://www.icao.int/anb/humanfactors/icaojournalist.htm

Hawkins, F., (1993) Human factors in flight, Aldershot, U.K. Ashgate

Helmreich, R.L. and Foushee, H.C. (2010) Why CRM? Empirical and Theoretical Bases of Human
Factors Training. In Kanki, B.G., Helmreich, R.L. & Anca, J. (Eds). 2010. Crew
Resource Management. San Diego, CA: Academic Press

Helmreich, R. L., Klinect, J.R. and Wilhelm, J.A.(1999) Models of threat, error and CRM in
flight operations. In Proceedings of the Tenth International Symposium on Aviation
Psychology (pages 677 – 682). Columbus, Ohio: The Ohio State University

Helmreich, R.L. and Merritt, A. C. (1988) Culture at work in aviation and medicine: National,
organizational and professional influences. Aldershot, UK: Ashgate

Helmreich, R.L. (2002). Culture, threat, and error: Assessing system safety. In Safety in
Aviation: The Management Commitment: Proceedings of a Conference. London: Royal
Aeronautical Society

Helmreich, R. L. (2003) Ten years of change – crew resource management 1989-1999.
Proceedings of the first ICAO-IATA LOSA & TEM conference. Dublin 2003 Retrieved
28 July 2010 from World Wide Web
http://www.icao.int/anb/humanfactors/icaojournalist.htm

Hofstede, G. (1983) The cultural relativity of organizational practices and theories. Journal of
International Business Studies. Fall 1983.

ICAO (2010) Universal safety oversight audit programme. Retrieved 29 July 2010 from
World Wide Web http://www.icao.int/cgi/goto_anb.pl?soa

ICAO-IATA (2003) Proceedings of the first ICAO-IATA LOSA & TEM conference. Dublin
2003 Retrieved 28 July 2010 from World Wide Web
http://www.icao.int/anb/humanfactors/icaojournalist.htm

Kriechbaum, C. and Alai, J. (2003) Air New Zealand’s LOSA Programme. Proceedings of the
first ICAO-IATA LOSA & TEM conference. Dublin 2003 Retrieved 28 July 2010 from
World Wide Web http://www.icao.int/anb/humanfactors/icaojournalist.htm

Lonner, W.J. (1980) A decade of cross-cultural psychology: JCCP, 1970 – 1979 Journal of
Cross Cultural Psychology. March 1980 11: 7 - 34

Maurino, D. E. (1998) Human factors training would be enhanced by using data obtained from
monitoring normal operations. ICAO Journal January / February 1998 pages 17, 18,
23, 24

Maurino, D. (2000 ) ICAO human factors programme expands scope beyond the flight deck
and ATC facility. The ICAO Journal. January / February 2000. pages 15, 16, 17

Maurino, D. (April 2005) Threat and error management. Canadian Aviation Safety Seminar.
Vancouver, Canada: Retrieved 28 July 2010 from World Wide Web
http://flightsafety.org/archives-and-resources/threat-and-error-management-tem

Reason, J. (2000) Human error models and management. British Medical Journal (320) pages
768 -770

Royal Brunei Airlines (2010) Safety management workshop notes. Bandar Seri Begawan,
Brunei Darussalam. Author: Senior Vice President: Quality, Safety, Security &
Environment.

Ruffell Smith, H. P. (January 1979) A simulator study of the interaction of pilot workload with
errors vigilance and decisions. NASA Technical Memorandum 78482. Retrieved 27
July 2010 from World Wide Web
http://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/19790006598_1979006598.pdf

Submitted 12:52 BST


Thursday 05 August 2010

