Human Supervision of Automated Systems and the Implications of Double Loop Learning

A.S. White
DOI: 10.4018/jitsa.2013070102

Abstract

This paper describes the problem of human monitoring of automation. It reviews the approaches involved in mental models and compares them with the ideas of double loop learning. These approaches, together with limited experimental evidence, are combined into a more complete model of the learning involved in developing human supervisory control, with proposed strategies for its development.

Introduction

Over the last 60 years automation has played an increasing role in our society. As direct human control of systems has declined, supervision of automated systems has become common; the rise of the computer has made this almost inevitable. Computer-controlled systems are more reliable than operator-controlled ones. However, when a computer system does fail, as it eventually must, society demands that some backup human control, or at least human monitoring, exists.

In December 1972, Eastern Air Lines Flight 401 was preparing to land when the crew became preoccupied with an undercarriage problem and the autopilot disengaged. Neither officer noticed the resulting descent because of their preoccupation with the landing procedures. The report (NTSB, 1992) concluded that the crew had become complacent and had not monitored the instruments effectively.

This crash, and many other similar events, exhibits pilot over-reliance on automation (Lee & Moray, 1992; Mosier & Skitka, 1994; Riley, 1994). On June 30th 1994 an Airbus A330 crashed while on a test flight. The aircraft was being flown to ascertain how well the autopilot could control an engine-out situation under different loading conditions. A later investigation concluded that the crew were overconfident and did not intervene early enough to prevent the accident. It was alleged that if they had responded 4 seconds earlier, the accident could have been avoided.

Parasuraman (1993) gives a pertinent example of a system whose staff are poorly trained, poorly paid, and have little opportunity for advancement: the staff who monitor X-ray apparatus at airports to detect weapons and hazardous or explosive materials. Despite long hours and poorly presented information, they have an excellent detection record, exceeding 90%.

The accident at the Three Mile Island nuclear plant is another example of a critically complex system where the monitoring process failed. Bignell and Fortune (1984) show that the failure of one valve, and the way that information was presented to the operators, was the overwhelming cause of the disaster. The confusion in the data and the number of signals to be monitored were too great for effective intervention.

Sheridan (1997) describes the whole process by which human interaction with control systems can be categorised. His description of the stages of supervisory control is reproduced here for clarity. There are five components in Sheridan's description: a sensor, an actuator, a display, a controller, and a computer (Figure 1).

Figure 1. Supervisory control

He details the five stages required for supervisory control (a minimal sketch of this loop as code follows the list):

  • Planning off-line what tasks to do;

  • Teaching or programming the computer;

  • Monitoring the automatic action on-line to detect failures;

  • Intervening, i.e., taking over control in emergencies;

  • Learning from experience to do better in future.
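
To make these stages concrete, the following minimal Python sketch wires them into a single loop. The class name `SupervisoryLoop`, the method signatures, and the tolerance-based failure check are illustrative assumptions made here, not part of Sheridan's formulation.

```python
from dataclasses import dataclass, field

@dataclass
class SupervisoryLoop:
    """Hypothetical sketch of Sheridan's five supervisory stages."""
    plan: list = field(default_factory=list)  # tasks decided off-line (stage 1)
    log: list = field(default_factory=list)   # experience kept for learning (stage 5)

    def teach(self, tasks):
        """Stage 2: program the computer with the off-line plan."""
        self.plan = list(tasks)

    def monitor(self, reading, expected, tolerance):
        """Stage 3: watch the automatic action on-line for failures."""
        failed = abs(reading - expected) > tolerance
        self.log.append((reading, expected, failed))
        return failed

    def intervene(self):
        """Stage 4: take over control in an emergency."""
        return "manual control engaged"

    def learn(self):
        """Stage 5: review logged experience to do better in future."""
        failures = sum(1 for _, _, failed in self.log if failed)
        return failures / len(self.log) if self.log else 0.0

loop = SupervisoryLoop()
loop.teach(["hold altitude 2000 ft"])                       # stages 1-2
if loop.monitor(reading=1850, expected=2000, tolerance=100):
    print(loop.intervene())                                 # stage 4
print(f"observed failure rate: {loop.learn():.0%}")         # stage 5
```

The point of the sketch is that monitoring and intervening are distinct activities: the loop can detect a failure without anyone being ready to take over, which is exactly the gap the accident examples above illustrate.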

In his seminal paper on trends in man-machine systems, Sheridan (1985) relates how the operator's mental model (Figure 2) affects all parts of the control process.

Figure 2. Mental model modified from Sheridan
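
One rough way to read Figure 2 is that the operator carries an internal prediction of the plant and intervenes when the displayed state diverges from it. The short sketch below illustrates that reading; the first-order update rule, the gain, and the divergence threshold are assumptions for illustration only, not Sheridan's model.

```python
def mental_model_step(estimate, observed, gain=0.3, threshold=0.5):
    """Revise the operator's internal estimate toward the observation
    and flag a mismatch large enough to warrant intervention."""
    error = observed - estimate
    return estimate + gain * error, abs(error) > threshold

estimate = 0.0
for reading in [0.1, 0.2, 0.9, 1.8]:  # successive displayed plant outputs
    estimate, alarm = mental_model_step(estimate, reading)
    if alarm:
        print(f"mismatch at reading {reading}: operator should intervene")
```

On this reading, a poorly calibrated mental model (too low a gain, too wide a threshold) is one way to describe the complacent monitoring seen in the Flight 401 and Three Mile Island cases.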
