The classic aim of automation is to replace human manual control, planning and problem solving by automatic devices and computers. This paper [emphasizes] the irony that the more advanced a control system is, so the more crucial may be the contribution of the human operator. article, sciencedirect, pdf
Lisanne Bainbridge, Department of Psychology, University College London. Automatica, 1983, 19, 775-779. (journal article)
There are two general categories of task left for an operator in an automated system. They may be expected to monitor that the automation is operating correctly. If it is not, they may be expected to call a more experienced operator or to take over themselves.
Unfortunately, physical skills deteriorate when they are not used. A formerly experienced operator who has been monitoring an automation may now be an inexperienced one.
Manual takeover is needed when there is something wrong with the automation, so unusual actions will be needed to control it. The operator will therefore need to be more skilled, not less, and less loaded than average, in order to take on this unusual control.
Major problems may be raised for an operator who is highly practised at using [analytics dashboards] if these are no longer available in an emergency.
Catastrophic breaks to failure are relatively easy to identify. Unfortunately automatic control can camouflage system failure by controlling against the variable changes, so that trends do not become apparent until they are beyond control.
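A toy simulation makes this concrete. This is a sketch of my own, not an example from the paper, and every name and constant in it is an assumption: a PI controller holds a tank level at its setpoint while a leak slowly grows, so the monitored level looks healthy until the inflow valve saturates, at which point the failure appears suddenly and is already beyond control.

```python
# Minimal illustration (not from the paper) of feedback control camouflaging
# a developing fault: a tank level is held at setpoint by a PI controller
# while a leak slowly grows. The monitored variable (the level) stays flat;
# the trend hides in the controller effort until the valve saturates.
# All constants here are assumptions chosen for the sketch.

SETPOINT = 50.0       # desired level
KP, KI = 0.8, 0.05    # proportional and integral gains (chosen to be stable)
VALVE_MAX = 10.0      # physical limit of the inflow actuator

level = SETPOINT
integral = 0.0

for t in range(0, 401):
    leak = 0.03 * t                              # fault: a slowly growing leak
    error = SETPOINT - level
    integral += error
    inflow = min(VALVE_MAX, max(0.0, KP * error + KI * integral))
    level = max(0.0, level + inflow - leak)      # one-step tank model

    if t % 50 == 0:
        # The level trace looks healthy long after the fault has started;
        # only the controller output reveals the trend, and the collapse
        # comes quickly once inflow is pinned at VALVE_MAX.
        print(f"t={t:3d}  level={level:6.2f}  inflow={inflow:5.2f}  leak={leak:5.2f}")
```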
It is inadequate to expect the operator to react to unfamiliar events solely by consulting operating procedures. These cannot cover all the possibilities, so the operator is expected to fill in the gaps. However, it is ironic to train operators in following instructions and then put them in the system to provide intelligence.
.
This article is profound in its reach and relevance.
I recently learned that Lisanne Bainbridge is updating a web site in her retirement.
.
Looking at Google's recommendations for SRE reveals the relevance of these findings from 1983. The foundation of automation is monitoring, followed by incident response, and then learning from incidents. Incidents are where humans must take over because the automations have failed, and learning from those incidents is how humans adapt to these ironies of automation and, further, how they adapt the automations themselves.
SRE Reliability Hierarchy (source)
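To make that loop concrete, here is a minimal sketch, assuming a hypothetical service and hypothetical helpers rather than any real SRE tooling: monitoring detects a failed check, automated remediation is tried first, a human is paged only when the automation has failed, and the incident is recorded so it can be learned from.

```python
# A minimal sketch, entirely my own illustration rather than Google's SRE
# tooling, of the monitoring -> incident response -> learning loop described
# above. Every function and name here is a hypothetical stand-in.
import random
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Incident:
    service: str
    detected_at: str
    automated_fix_worked: bool
    notes: list[str] = field(default_factory=list)   # raw material for the postmortem

def check_health(service: str) -> bool:
    return random.random() > 0.3          # placeholder for real monitoring probes

def attempt_automated_fix(service: str) -> bool:
    return random.random() > 0.5          # placeholder for automated remediation

def page_oncall(service: str, context: Incident) -> None:
    print(f"PAGE: {service} needs a human; the automation already failed. {context}")

def record_postmortem(incident: Incident) -> None:
    incident.notes.append("What should we automate, or un-automate, next time?")

def handle_check(service: str) -> Incident | None:
    """Monitoring first; automation next; a human only when the automation fails."""
    if check_health(service):
        return None                       # nothing to do: monitoring says healthy
    incident = Incident(service=service,
                        detected_at=datetime.now(timezone.utc).isoformat(),
                        automated_fix_worked=attempt_automated_fix(service))
    if not incident.automated_fix_worked:
        # The irony: the operator is pulled in precisely when the automation
        # has failed, so they need context and practice, not just an alarm.
        page_oncall(service, context=incident)
    record_postmortem(incident)           # learning from incidents closes the loop
    return incident

if __name__ == "__main__":
    print(handle_check("payments"))
```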
.
Contrast two xkcd cartoons. The first is the conventional math of automation replacing human effort; a rough break-even sketch follows the two links.
Is it worth the time? (xkcd 1205)
Automation (xkcd 1319)
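That conventional math is just a break-even calculation: spend no more time automating a task than the automation will save over some horizon (the comic uses five years). A quick sketch, with example numbers of my own rather than figures from the comic:

```python
# A rough sketch of the break-even arithmetic behind xkcd 1205: the most time
# it is worth spending on an automation is the total time it will save over
# some horizon. The example numbers below are mine, not the comic's.
def worth_spending_hours(seconds_saved_per_run: float, runs_per_day: float,
                         horizon_years: float = 5.0) -> float:
    """Total hours saved over the horizon, i.e. the break-even effort."""
    seconds_saved = seconds_saved_per_run * runs_per_day * 365 * horizon_years
    return seconds_saved / 3600

# Shaving 30 seconds off a task done 5 times a day buys about 76 hours
# over five years -- roughly two working weeks of automation effort.
print(f"{worth_spending_hours(30, 5):.0f} hours")
```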
See also Collaborative Automation
~
STRAUCH, Barry, 2017. Ironies of automation: Still unresolved after all these years. IEEE Transactions on Human-Machine Systems. Vol. 48, no. 5, pp. 419–433.
> The concerns regarding the effects of automated systems on operator performance identified in the paper are as relevant today, when autonomous highway vehicles are being developed for large-scale use, as when the paper was written.
> Over three and a half decades later, issues with automated systems that Bainbridge had raised still have not been resolved. This is the case despite the considerable research it has inspired, the lessons learned from the widespread implementation of automated systems, and numerous investigations of accidents caused, at least in part, by operator errors through interactions with automated system operations.
> […] Not only does Bainbridge’s paper continue to be cited in scholarly works, its influence on our understanding of the field has been substantial as well. Its impact can be seen in studies of out-the-loop performance [6], automation bias [7], [8], automation complacency [9], mode awareness [10], automation-related errors [11], adaptive automation [12], automation-related skill degradation [13], and the operator role in automated system operations [14], among others.