Human factors is the study of human perceptual and cognitive capabilities and limitations and the role these play in the effective use of consumer products, machines, computers, and large-scale systems.
Our human factors research currently focuses on the effects of different types and levels of automation on human attention, decision-making, vigilance, and other aspects of cognition. An important research thrust over the past decade has been the investigation of pilot and air-traffic controller performance with advanced automation and the validation of a new approach to automation implementation called adaptive automation.
We also have a major research program in human-robot interaction, in which we are examining the efficacy of different interface types for human supervision of multiple, semi-autonomous robotic vehicles, including networked systems with large numbers of agents. See Automation and Robotics.
The human factors research methods we use include:
Performance tests on laboratory multi-task platforms
Performance studies using cockpit, air traffic control, military command and control, and multiple-robot simulations
Computational modeling of human performance (Bayesian analysis, fuzzy signal detection theory, task network models)
Eye movement analysis
In addition, in recent years we have used neuroergonomic
methods to examine human factors issues.
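As an illustration of the signal detection theory modeling mentioned above, here is a minimal sketch of computing classic (crisp) SDT sensitivity (d') and response criterion (c) from hit and false-alarm counts. The function name and the loglinear correction choice are illustrative, not taken from the lab's own code; fuzzy SDT extends this framework with graded rather than binary event membership.

```python
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """Return (d', c) from response counts in a yes/no detection task."""
    # Loglinear correction (add 0.5 to each cell count) avoids infinite
    # z-scores when observed hit or false-alarm rates are 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)          # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))  # response bias
    return d_prime, criterion

# Example: 45 hits / 5 misses, 5 false alarms / 45 correct rejections
d, c = sdt_measures(45, 5, 5, 45)
```

With symmetric hit and false-alarm performance, as in the example, the criterion c is zero (unbiased responding) while d' reflects the observer's sensitivity.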
Harnessing Complexity in Human-Machine Systems, Air Force Office of Scientific Research (AFOSR) Multidisciplinary University Research Initiative (MURI) (subcontract from Carnegie Mellon University)
Adaptive Delegation Interfaces for Human-Robot Teaming (Army Research Laboratory, subcontractor to Perceptronics Solutions)
Delegation Interfaces for Human Supervision of Multiple Unmanned Vehicles (Army Research Laboratory)
Recent Human Factors Publications
Parasuraman, R., Cosenzo, K., & de Visser, E. (2009). Adaptive automation for human supervision of multiple uninhabited vehicles: Effects on change detection, situation awareness, and mental workload. Military Psychology, 21.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2008). Situation awareness, mental workload, and trust in automation: Viable, empirically supported cognitive engineering constructs. Journal of Cognitive Engineering and Decision Making, 2, 141-161.
Parasuraman, R., & Wickens, C. D. (2008). Humans: Still vital after all these years of automation. Human Factors, 50, 511-520.
Sanquist, T. F., Minsk, B., & Parasuraman, R. (2008). Cognitive engineering in radiation screening for homeland security. Journal of Cognitive Engineering and Decision Making, 2, 204-219.
Sanquist, T. F., Doctor, P., & Parasuraman, R. (2008). A threat display concept for radiation detection in homeland security cargo screening. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications, 38, 856-860.
Miller, C., & Parasuraman, R. (2007). Designing for flexible interaction between humans and automation: Delegation interfaces for supervisory control. Human Factors, 49, 57-75.
Rovira, E., McGarry, K., & Parasuraman, R. (2007). Effects of imperfect automation on decision making in a simulated command and control task. Human Factors, 49, 76-87.
Metzger, U., & Parasuraman, R. (2006). Effects of automated conflict cueing and traffic density on air traffic controller performance and visual attention in a datalink environment. International Journal of Aviation Psychology, 16, 343-362.
Sheridan, T., & Parasuraman, R. (2006). Human-automation interaction. Reviews of Human Factors and Ergonomics, 1, 89-129.
Metzger, U., & Parasuraman, R. (2005). Automation in future air traffic management: Effects of decision aid reliability on controller performance and mental workload. Human Factors, 47(1), 35-49.
Parasuraman, R., Galster, S., Squire, P., Furukawa, H., & Miller, C. (2005). A flexible delegation interface enhances system performance in human supervision of multiple autonomous robots: Empirical studies with RoboFlag. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 35(4), 481-493.
Parasuraman, R. (2000). Designing automation for human use: Empirical studies and quantitative models. Ergonomics, 43, 931-951.