

Review of a Pivotal Human Factors Article: Humans and Automation: Use, Misuse, Disuse, Abuse
John D. Lee, Department of Mechanical and Industrial Engineering, University of Iowa, Iowa City
Objective: This paper considers the influence of Humans and Automation: Use, Misuse, Disuse, Abuse and examines how it relates to the evolving issue of human-automation interaction. Background: Automation presents important practical challenges that can dramatically affect satisfaction, performance, and safety; philosophical challenges also arise as automation changes the nature of work and human cognition. Method: Papers cited by and citing Humans and Automation were reviewed to identify enduring and emerging themes in human-automation research. Results: Humans and Automation emerges as an important node in the network of automation-related papers, citing many and being cited by many recent influential automation-related papers. In their article, Parasuraman and Riley (1997) integrated previous research and identified differing expectations across designers, managers, and operators regarding the need to support operators as a source of automation problems. They also foresaw and inspired research that addresses problems of overreliance and underreliance on automation. Conclusion: This pivotal article and associated research show that even though automation seems to relieve people of tasks, automation requires more, not less, attention to training, interface design, and interaction design. The original article also alludes to the emergence of vicious cycles and dysfunctional meta-control. These problems reflect the coevolution of automation and humans, in which both adapt to the responses of the other. Application: Understanding this coevolution has important philosophical implications for the nature of human cognition and practical implications for satisfaction, performance, and safety.

Over the past 50 years, automation has challenged the human factors community with both pragmatic and philosophical issues (Asher & Post, 1964; Edwards & Lees, 1974; Parsons, 1985). Pragmatic issues concern the frequent failure of automation to achieve the promised benefits. Philosophical issues concern how automation redefines the role of humans in complex systems and even the nature of human cognition (Hancock, 1996; Sheridan, 2002). Automation, defined as technology that performs a function that was previously carried out by a human (Parasuraman & Riley, 1997), touches activities from the monumental to the mundane. At one extreme, automation in the form of decision support systems
influences corporate strategies and monetary policy; at the other extreme, automation helps filter unwanted e-mail messages. In both cases and many in between, imperfect automation can undermine or enhance satisfaction, performance, and safety. Automation extends the physical and cognitive capacity of people to achieve what might otherwise be impossible, but only if its design considers the characteristics of the joint cognitive system that emerges from the combination of humans and automation (Roth, Bennett, & Woods, 1987). The highly influential article Humans and Automation: Use, Misuse, Disuse, Abuse (Parasuraman & Riley, 1997) identified several

Address correspondence to John D. Lee, Department of Mechanical and Industrial Engineering, University of Iowa, 2130 Seamans Ctr., Iowa City, IA 52242; jdlee@engineering.uiowa.edu. HUMAN FACTORS, Vol. 50, No. 3, June 2008, pp. 404-410. DOI 10.1518/001872008X288547. Copyright 2008, Human Factors and Ergonomics Society.



issues that explain why automation often fails to deliver its promised benefits and inspired substantial research to address these issues.

Parasuraman and Riley (1997) reviewed 120 papers related to human interaction with automation. This review included several of the most influential papers in the field of human factors engineering. Table 1 shows 10 of the most frequently cited of these papers, which consider a broad range of issues, including the role of heuristics and biases, trust in automation, representation aiding, adaptive automation, and analytic techniques to assess human-automation interaction. These papers represent some of the 71 papers published in Human Factors over the past 50 years that directly address the issue of automation. These papers consider the effects of automation across domains that include aircraft cockpits, command and control, manufacturing, maritime operations, medicine, driving, process control, and air traffic control. Humans and Automation has also been very influential, garnering 126 citations since publication and 31 in 2007 alone. Of the papers citing Humans and Automation, several have also begun to emerge as influential, as indicated by the citations shown in Table 1. Of the five most frequently cited papers, three concern the role of decision biases and trust in guiding reliance, and two describe qualitative and quantitative models that can guide automation design. Humans and Automation represents an important node in the network of automation-related research.

As suggested by its title, Humans and Automation: Use, Misuse, Disuse, Abuse describes why automation often fails to perform as expected. Automation designers frequently fail to account for how people adapt to the introduction of automation, leading to unanticipated negative consequences concerning satisfaction, performance, and safety. Such consequences were documented more than a decade before Humans and Automation in papers such as Flight-Deck Automation: Promises and Problems (Wiener &
Curry, 1980) and Ironies of Automation (Bainbridge, 1983). Fundamental ironies identified by these papers are that automation designed to enhance safety does not always enhance safety and that as automation becomes more sophisticated, the role of the human becomes more, rather than less, important. Humans and Automation builds on these insights and points to how differing expectations across designers, managers, and operators lead to a failure to support operators in using the automation effectively (Parasuraman & Riley, 1997). Humans and Automation describes a mismatch of inappropriate expectations and requirements regarding the support of operators. An important contribution of this article was to characterize these mismatches in terms of misuse, disuse, and abuse. This categorization makes useful distinctions that continue to guide automation-related research, as reflected in a recent review (Sheridan & Parasuraman, 2006). Use of automation refers to operators engaging automation to perform functions they might otherwise perform manually. Appropriate use of automation can enhance safety and performance. Automation use depends on a complex interaction of factors that include workload, cognitive overhead, trust in automation, self-confidence, and risk. Although one might expect automation to reduce workload and be engaged by operators to mitigate high-workload situations, this is often not the case. Clumsy automation often simplifies easy tasks and accentuates the difficulties of hard tasks rather than smoothing workload peaks and valleys (Wiener, 1989). One reason for this outcome is the overhead associated with engaging and disengaging automation, as documented by the model-based analysis of Kirlik (1993). Another reason is that the apparent simplicity of automation masks its actual complexity, which is only revealed in demanding situations (Woods, 1994). Misuse concerns situations in which operators rely on automation when the automation performs poorly.
In these situations, operators might overtrust the automation (Lee & See, 2004), use heuristics to engage automation in situations that are often but not always appropriate, or fall prey to automation biases that make them less attentive to contradictory information (Mosier, Skitka, & Korte, 1994; Skitka, Mosier, & Burdick, 1999, 2000). One important aspect of misuse concerns monitoring failures, in which operators tend to



TABLE 1: Ten of the Most Cited Articles Cited by Humans and Automation and 5 of the Most Cited Articles Citing Humans and Automation

Papers Cited by Humans and Automation (citation count in parentheses):
Tversky and Kahneman (1974) Judgment under uncertainty: Heuristics and biases (4,050)
Lee and Moray (1992) Trust, control strategies and allocation of function in human-machine systems (95)
Bainbridge (1983) Ironies of automation (93)
Weick (1988) Enacted sensemaking in crisis situations (83)
Lee and Moray (1994) Trust, self-confidence, and operators' adaptation to automation (76)
Bennett and Flach (1992) Graphical displays: Implications for divided attention, focused attention, and problem solving (69)
Rouse (1988) Adaptive aiding for human/computer control (49)
Yeh and Wickens (1988) Dissociation of performance and subjective measures of workload (44)
Parasuraman, Mouloua, and Molloy (1996) Effects of adaptive task allocation on monitoring of automated systems (28)
Kirlik (1993) Modeling strategic behavior in human-automation interaction: Why an aid can (and should) go unused (23)

Papers Citing Humans and Automation (citation count in parentheses):
Parasuraman, Sheridan, and Wickens (2000) A model for types and levels of human interaction with automation (71)
Moray, Inagaki, and Itoh (2000) Adaptive automation, trust, and self-confidence in fault management of time-critical tasks (32)
Lee and See (2004) Trust in technology: Designing for appropriate reliance (29)
Mosier, Skitka, Heers, and Burdick (1998) Automation bias: Decision making and performance in high-tech cockpits (17)
Parasuraman (2000) Designing automation for human use: Empirical studies and quantitative models (16)

Note. As reported by the ISI Web of Knowledge through October 2006, which does not include book chapters or technical reports.

neglect automation breakdowns. The likelihood of these monitoring failures is inversely related to the frequency of automation failures: the rarer the failures, the more likely operators are to miss them. These failures are also more likely to occur when the manual task load is high and the failures lack a salient cue. Parasuraman and Riley (1997) cite the case of the Royal Majesty, in which the GPS signal to the electronic chart failed without a salient indication, leading to a monitoring failure
that led the crew to neglect other sources of information until the ship ran aground (National Transportation Safety Board [NTSB], 1997). Such examples of misuse focus on situations that might be best described as overuse, frequently induced by overtrust. Although not directly addressed in Humans and Automation, other examples of misuse include situations in which people use automation in ways the designers did
not anticipate. For example, some drivers have misused the antilock braking system (ABS) by driving faster and following more closely, undermining the anticipated safety benefit of the automation (Sagberg, Fosser, & Saetermo, 1997). Disuse concerns situations in which operators fail to engage automation when it could enhance performance. Operators are often slow to accept automation because it threatens their way of life, they have not developed trust in its capability, or the automation lacks the needed functionality (Lee, 2006). Warnings represent a specific type of automation that has a long history of disuse in process control, aviation, and driving (F. P. Lees, 1983; Woods, 1995). Humans and Automation describes how the high rate of false alarms in systems with a low base rate of situations deserving a warning is likely to undermine acceptance and lead to disuse. The paper describes a highly sensitive system that misses only 1 of every 1,000 hazardous events and has a relatively low false alarm rate of .0594. Even so, if the base rate of a hazardous condition is low (e.g., .001), then the system will generate 59 false alarms for every true warning. This rate is comparable with the 28 true alerts and 980 false alarms in a recent field test of an automotive forward collision warning system, a rate of approximately 1 alert per hundred miles and 35 false alarms for every true warning (Najm, Stearns, Howarth, Koopmann, & Hitz, 2005). How false alarms affect acceptance is not clear, but recent evidence suggests that not all false alarms are perceived as being equally useless. Weather forecasters' near misses are considered differently than forecasts that are off by a wider margin (Barnes, Gruntfest, Hayden, Schultz, & Benight, 2007). Similarly, collision warnings that drivers can associate with the traffic situation do not lead to the degree of distrust and disuse associated with alerts that drivers view as being random failures of the system (M. N. Lees & Lee, 2007).
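The base-rate arithmetic behind the 59-to-1 figure follows directly from Bayes' rule and can be sketched in a few lines of Python (a minimal illustration of the calculation described in the passage; the function name and structure are mine, not from the article):

```python
def false_alarms_per_true_warning(hit_rate, false_alarm_rate, base_rate):
    """Expected number of false alarms for each true warning.

    hit_rate: P(alarm | hazard); false_alarm_rate: P(alarm | no hazard);
    base_rate: P(hazard).
    """
    true_warnings = hit_rate * base_rate               # P(alarm and hazard)
    false_alarms = false_alarm_rate * (1.0 - base_rate)  # P(alarm and no hazard)
    return false_alarms / true_warnings

# Figures from Parasuraman and Riley (1997): the system misses only 1 in
# 1,000 hazardous events (hit rate .999), has a false alarm rate of .0594,
# and the hazard base rate is .001.
print(round(false_alarms_per_true_warning(0.999, 0.0594, 0.001)))  # 59
```

The same ratio computed directly from the field-test counts in the passage, 980 false alarms to 28 true alerts, gives 980/28, or approximately 35 false alarms per true warning.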
Parasuraman and Riley (1997) describe automation misuse and disuse as a reflection of a complex interaction of many factors, such as risk, workload, and self-confidence. In high-workload situations in which the operator has little confidence in his or her capacity to respond, misuse is more likely than disuse. Of the many factors that influence misuse and disuse, trust has emerged as a particularly important factor. Overtrust leads to misuse and undertrust leads to disuse (Lee & Moray, 1992; Muir,
1987). Trust tends to reflect the capacity of the automation and so often leads to appropriate use, but many factors unrelated to the capacity of the automation can influence trust (Lee & See, 2004). As an example, trust in an online banking system was greater when the interface used cool pastel colors rather than warm primary colors (Kim & Moon, 1998). Understanding misuse and disuse may depend on understanding the complex array of factors that lead to overtrust and undertrust. Abuse concerns situations in which automation is designed and implemented without paying sufficient attention to its effects on the operators. Abuse often occurs when designers or managers believe that replacing unreliable people with reliable automation will reduce errors and enhance efficiency. For example, weight-on-wheels sensors have been installed on aircraft to prevent pilots from inadvertently deploying thrust reversers while in flight. When this sensor fails, it prevents pilots from engaging the thrust reversers while landing, leading to a new and serious failure (Parasuraman & Riley, 1997). The replacement fallacy that underlies the assumption that failures can be avoided by automating the operator's role ignores the role people play in accommodating the unexpected and in maintaining a safe and efficient system (Woods & Dekker, 2000). Automation aimed at replacing the human often has the ironic effect of undermining performance because it leaves people unsupported in accommodating the situations that the automation cannot accommodate. Automation abuse often occurs because designers create automation that has a high degree of authority and autonomy (Sarter & Woods, 1994), leading to surprises as the automation responds in ways that the operator does not expect (Sarter, Woods, & Billings, 1997).
Additional feedback, such as haptic vibration that signals mode transitions of flight control systems, can mitigate this abuse (Sklar & Sarter, 1999); however, the fundamental problem rests in a technology-centered design and management philosophy. The most frequently cited contribution of Parasuraman and Riley's (1997) paper is the characterization of automation use, misuse, and abuse, but the paper also includes two less broadly recognized themes. These themes cut across the issues of misuse, disuse, and abuse: vicious cycles and the balance of control among the designer, the manager, and the operator.



Vicious cycles occur when a negative outcome leads to a response that leads to further negative outcomes. Such vicious cycles can exacerbate an automation-related problem as the problem is magnified over repeated interactions. As an example, Parasuraman and Riley (1997) describe the potential for highly autonomous automation to give operators little time to exercise their skills. Inadequate skills may lead operators to overrely on automation. This misuse might then diminish the operators' skills further and lead to greater misuse, which in turn could undermine skills, creating a vicious cycle that greatly increases the prevalence of misuse. A similar pattern can occur with abuse. Parasuraman and Riley describe how automation abuse can lead to misuse and disuse of automation by operators. If this results in managers implementing additional high-level automation, "further disuse or misuse by operators may follow, and so on, in a vicious circle" (p. 284). In this way, managers' and designers' desire to insert automation to eliminate human error may distance the operator from the system and make errors more likely, leading designers and management to further substitute automation for the operator and further marginalize the operator. The vicious cycle involving escalating automation abuse illustrates a fundamental challenge regarding the balance of control between the designer and the operator in the meta-control of a system. Although this issue of meta-control is not directly addressed in Humans and Automation, it underlies many instances of misuse, disuse, and abuse. To the extent that the process is well understood and precise and repeatable actions are needed, automation designers and management may be in a better position than operators to enhance performance and safety. To the extent that unanticipated variability and creative solutions are required, operator discretion concerning when and how to engage automation must be supported.
This meta-control problem must balance the benefits of top-down control, in which designers and managers define the role of technology, with the benefits of bottom-up control, in which operators define when and how automation is used. Poor feedback and poor understanding of the process being controlled make this a difficult control problem. Designers and managers have limited feedback, often constrained to the failures
caused by human error, and operators' successful adaptations mask the complexity and unanticipated situations that the automation fails to accommodate (Woods & Dekker, 2000). As noted by Parasuraman and Riley (1997), many problems with automation stem from the mismatch of the expectations that designers and managers have regarding the need to support the operator managing the automation. Mismatched expectations are one symptom of dysfunctional meta-control.

Many metaphors have been used to describe automation, including that of a tool, prosthesis, supervisee, and agent (Lee, 2006). With any of these metaphors, automation does not simply replace the human in performing a function; rather, automation transforms the work as operators adapt to the new opportunities it affords (Parasuraman & Riley, 1997). For example, electronic charts transform the navigation work of a mariner from an active interaction involving estimating positions, extrapolating the path, and planning waypoints to passive monitoring of the progress along a route (Lee & Sanquist, 2000). Anticipating the consequences of this transformation and operators' adaptation to it poses a substantial challenge in the design and management of automation, particularly because it is so often neglected (Klein, Woods, Bradshaw, Hoffman, & Feltovich, 2004). As automation begins to mediate the cooperation and coordination between operators, these consequences may become even more critical and difficult to anticipate (Gao & Lee, 2006). Even relatively mundane technology can have profound effects on human cognition as people adapt to it. Parasuraman and Riley (1997) note that external storage of information, such as books and written communication, as opposed to internal storage through rote memorization, has played a central role in the evolution of human cognition (see also Tomasello, 1999). Likewise, the concept of distributed cognition describes the important influence external representations have on human cognition (Hollan, Hutchins, & Kirsh, 2000; Hutchins, 1995). As an example, Hutchins (1995) argues that a cockpit, composed of the pilots and the instrumentation, remembers its speed rather than just the pilot. Memory is partly
an external representation developed when pilots move speed bugs, which are adjustable pointers on the bezel of the airspeed indicator. Pilots adapt the simple technology of speed bugs to provide an external memory that indicates when the flaps and slats of the plane must be adjusted as the airspeed changes. Compared with speed bugs, automation represents a much more powerful extension to human cognition because operators adapt to it and it can adapt to operators (Byrne & Parasuraman, 1996; Rouse, 1988). A metaphor of automation as a cybernetic organism, or cyborg, captures the tendency for operators to adapt to automation and for automation in turn to adapt to them. Cyborg describes a hybrid organism in which the technology and the human become so intertwined that the capacity of humans cannot be meaningfully described without considering their technological appendages (Clark, 2003; Haraway, 1991). Although not described in Humans and Automation, the cyborg metaphor suggests a coevolution and adaptation of operators and technology that could result in profound cultural and cognitive changes that go well beyond the ability of automation to transform work. With the cyborg metaphor, the boundary separating the operator and the automation blurs. Such an intimate coupling of operators and automation greatly magnifies the consequences of use, misuse, and abuse and also poses philosophical concerns regarding the teleology of technology (Hancock, 1996). As the boundary between automation and the operator blurs, it becomes increasingly critical that designers recognize that they engineer relationships and not simply technology (Sheridan, 2002). Humans and Automation will continue to be useful because issues related to use, misuse, and disuse will become increasingly important as automation plays an increasingly important role in our society.
The evolution of technology, even in the past decade, demonstrates that the interactions between humans and automation are becoming increasingly important and intimate.

This paper greatly benefited from the comments of Linda Boyle and Bobbie Seppelt.
Asher, J. J., & Post, R. I. (1964). The new field-theory: An application to postal automation. Human Factors, 6, 517-522.

Bainbridge, L. (1983). Ironies of automation. Automatica, 19, 775-779. Barnes, L. R., Gruntfest, E. C., Hayden, M. H., Schultz, D. M., & Benight, C. (2007). False alarms and close calls: A conceptual model of warning accuracy. Weather and Forecasting, 22, 1140-1147. Bennett, K. B., & Flach, J. B. (1992). Graphical displays: Implications for divided attention, focused attention, and problem solving. Human Factors, 34, 513-533. Byrne, E. A., & Parasuraman, R. (1996). Psychophysiology and adaptive automation. Biological Psychology, 42, 249-268. Clark, A. (2003). Natural-born cyborgs: Minds, technologies, and the future of human intelligence. New York: Oxford University Press. Edwards, E., & Lees, F. (1974). The human operator in process control. London: Taylor & Francis. Gao, J., & Lee, J. D. (2006). A dynamic model of interaction between reliance on automation and cooperation in multi-operator multi-automation situations. International Journal of Industrial Ergonomics, 36, 512-526. Hancock, P. A. (1996). Teleology of technology. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance (pp. 461-498). Mahwah, NJ: Erlbaum. Haraway, D. J. (1991). Simians, cyborgs, and women: The reinvention of nature. New York: Routledge. Hollan, J., Hutchins, E., & Kirsh, D. (2000). Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Transactions on Computer-Human Interaction, 7, 174-196. Hutchins, E. (1995). Cognition in the wild. Cambridge, MA: MIT Press. Kim, J., & Moon, J. Y. (1998). Designing towards emotional usability in customer interfaces: Trustworthiness of cyber-banking system interfaces. Interacting With Computers, 10, 1-29. Kirlik, A. (1993). Modeling strategic behavior in human-automation interaction: Why an aid can (and should) go unused. Human Factors, 35, 221-242. Klein, G., Woods, D. D., Bradshaw, J. M., Hoffman, R. R., & Feltovich, P. J. (2004).
Ten challenges for making automation a "team player" in joint human-agent activity. IEEE Intelligent Systems, 19(6), 91-95. Lee, J., & Moray, N. (1992). Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 35, 1243-1270. Lee, J. D. (2006). Human factors and ergonomics in automation design. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (pp. 1570-1596). Hoboken, NJ: Wiley. Lee, J. D., & Moray, N. (1994). Trust, self-confidence, and operators' adaptation to automation. International Journal of Human-Computer Studies, 40, 153-184. Lee, J. D., & Sanquist, T. F. (2000). Augmenting the operator function model with cognitive operations: Assessing the cognitive demands of technological innovation in ship navigation. IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, 30, 273-285. Lee, J. D., & See, K. A. (2004). Trust in technology: Designing for appropriate reliance. Human Factors, 46, 50-80. Lees, F. P. (1983). Process computer alarm and disturbance analysis: Review of the state of the art. Computers and Chemical Engineering, 7, 669-694. Lees, M. N., & Lee, J. D. (2007). The influence of distraction and driving context on driver response to imperfect collision warning systems. Ergonomics, 50, 1264-1286. Moray, N., Inagaki, T., & Itoh, M. (2000). Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of Experimental Psychology: Applied, 6, 44-58. Mosier, K. L., Skitka, L. J., Heers, S., & Burdick, M. (1998). Automation bias: Decision making and performance in high-tech cockpits. International Journal of Aviation Psychology, 8, 47-63. Mosier, K. L., Skitka, L. J., & Korte, K. J. (1994). Cognitive and social issues in flight crew/automation interaction. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 191-197). Hillsdale, NJ: Erlbaum. Muir, B. (1987).
Trust between humans and machines, and the design of decision aids. International Journal of Man-Machine Studies, 27, 527-539. Najm, W. G., Stearns, M. D., Howarth, H., Koopmann, J., & Hitz, J. (2005). Evaluation of an automotive rear-end collision avoidance system (No. DOT HS 810 569). Washington, DC: Research and Innovative Technology Administration, Advanced Safety Technology Division.

National Transportation Safety Board (NTSB). (1997). Marine accident report: Grounding of the Panamanian passenger ship ROYAL MAJESTY on Rose and Crown Shoal near Nantucket, Massachusetts, June 10, 1995 (No. NTSB/MAR-97/01). Washington, DC: Author. Parasuraman, R. (2000). Designing automation for human use: Empirical studies and quantitative models. Ergonomics, 43, 931-951. Parasuraman, R., Mouloua, M., & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665-679. Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230-253. Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans, 30, 286-297. Parsons, H. M. (1985). Special issue preface: Automation and the individual: Comprehensive and comparative views. Human Factors, 27, 1-2. Roth, E. M., Bennett, K. B., & Woods, D. D. (1987). Human interaction with an intelligent machine. International Journal of Man-Machine Studies, 27, 479-526. Rouse, W. B. (1988). Adaptive aiding for human/computer control. Human Factors, 30, 431-443. Sagberg, F., Fosser, S., & Saetermo, I. A. F. (1997). An investigation of behavioural adaptation to airbags and antilock brakes among taxi drivers. Accident Analysis and Prevention, 29, 293-302. Sarter, N. B., & Woods, D. D. (1994). Decomposing automation: Autonomy, authority, observability and perceived animacy. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 22-27). Hillsdale, NJ: Erlbaum. Sarter, N. B., Woods, D. D., & Billings, C. E. (1997). Automation surprises. In G. Salvendy (Ed.), Handbook of human factors and ergonomics (2nd ed., pp. 1926-1943). New York: Wiley. Sheridan, T. B. (2002). Humans and automation. New York: Wiley. Sheridan, T. B., & Parasuraman, R. (2006).
Human-automation interaction. In R. Nickerson (Ed.), Reviews of human factors and ergonomics (Vol. 1, pp. 89-129). Santa Monica, CA: Human Factors and Ergonomics Society. Skitka, L. J., Mosier, K., & Burdick, M. D. (2000). Accountability and automation bias. International Journal of Human-Computer Studies, 52, 701-717.


Skitka, L. J., Mosier, K. L., & Burdick, M. (1999). Does automation bias decision-making? International Journal of Human-Computer Studies, 51, 991-1006. Sklar, A. E., & Sarter, N. B. (1999). Good vibrations: Tactile feedback in support of attention allocation and human-automation coordination in event-driven domains. Human Factors, 41, 543-552. Tomasello, M. (1999). The cultural origins of human cognition. Cambridge, MA: Harvard University Press. Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131. Weick, K. E. (1988). Enacted sensemaking in crisis situations. Journal of Management Studies, 25, 305-317. Wiener, E. L. (1989). Human factors of advanced technology (glass cockpit) transport aircraft (NASA Contractor Rep. 177528). Moffett Field, CA: NASA-Ames Research Center. Wiener, E. L., & Curry, R. E. (1980). Flight-deck automation: Promises and problems. Ergonomics, 23, 995-1011. Woods, D. D. (1994). Automation: Apparent simplicity, real complexity. In M. Mouloua & R. Parasuraman (Eds.), Human performance in automated systems: Current research and trends (pp. 1-7). Hillsdale, NJ: Erlbaum. Woods, D. D. (1995). The alarm problem and directed attention in dynamic fault management. Ergonomics, 38, 2371-2393. Woods, D. D., & Dekker, S. W. A. (2000). Anticipating the effects of technological change: A new era of dynamics for human factors. Theoretical Issues in Ergonomics Science, 1(3), 272-282. Yeh, Y. Y., & Wickens, C. D. (1988). Dissociation of performance and subjective measures of workload. Human Factors, 30, 111-120.

John D. Lee is a professor in the Department of Mechanical and Industrial Engineering at the University of Iowa and is the director of human factors research at the National Advanced Driving Simulator. He received his Ph.D. in mechanical engineering from the University of Illinois in 1992.

Date received: January 22, 2008
Date accepted: April 18, 2008