1 Introduction

With the development of technology, various kinds of work previously conducted by human operators have been replaced by automation. The replacement can be made by machines, systems, or even artificial intelligence, as different forms of automation. Despite progress in developing more “intelligent” machines, sociotechnical systems still face the problem of integrating human operators and automatic systems while leaving humans in the position of monitoring and supervision. A great many incidents that have been categorized as “human errors” in fact originate in a lack of concern for the human operators’ role in the system. At present, autonomous systems do not possess the capability of full automation in complex and unpredictable situations. Effective teaming between humans and automation therefore requires considerable attention in the design and operation of sociotechnical systems.

In the nuclear industry, advanced digital technology has been applied to assist operating crews in managing the power plant. Automation functions, ranging from information gathering, information filtering, analysis, and diagnosis to decision making and action implementation, can be found in various system design features. Computerized operating procedures and operator support systems developed in recent years can interactively or independently execute series of functions, and can be considered autonomous agents to some extent. However, the influence of automation on the team performance of operating crews, as well as how the related automation design could benefit crew performance, has not been systematically discussed for nuclear power plant operations.

2 Issues Introduced by Automation

Introducing automation into a system results in unexpected problems with human-automation interaction [1]. Because automation does not have a uniform effect on workload, it redistributes rather than reduces workload, creating new demands for communication and coordination and causing the problem of “clumsy automation”. The operator’s task shifts from active control to supervisory control and monitoring, with an occasional need to revert to manual control. These changes impose new cognitive demands on operators: they are required to know more about the system in order to understand, predict, and manipulate its behavior. New knowledge of the interface and of interaction with the automation functions must be learned, and misunderstanding or miscalibration of the system is possible. If gaps and misconceptions exist in an operator’s mental model of the system, there can be breakdowns in attention allocation and loss of situation awareness. Breakdowns in mode awareness result in “automation surprises”. As advanced technology increases the complexity of modes, increases feedback delays in system behavior, and changes modes based on system conditions, it becomes more difficult for operators to detect and recover from errors and to maintain awareness of the active modes.

Creating a partially autonomous agent is like adding a new team member to the system, which induces new demands for coordination. If the intentions and activities of the machine agent are difficult to see, human operators will not be able to coordinate with it. In decompensation incidents, for example, the automation’s capacity to compensate for abnormal influences becomes exhausted over time, yet human team members may not be prepared to take over control, either because the situation is misunderstood or because it has progressed too far [2]. In addition, the lower level of cognitive engagement that accompanies passive monitoring of automation decreases operators’ situation awareness and increases the likelihood of Out Of The Loop (OOTL) errors if unexpected transitions occur [3].

2.1 Use, Misuse, Disuse, and Abuse of Automation

Parasuraman and Riley (1997) summarized factors that influence the use, misuse, disuse, and abuse of automation, defining misuse as overreliance on automation, disuse as underutilization of automation, and abuse as inappropriate application of automation by designers or managers [4]. Trust, mental workload, and risk can influence automation use, with interactions between factors and large individual differences. Misuse can result in monitoring failures or decision biases. Factors affecting the monitoring of automation include workload, automation reliability and consistency, and the saliency of automation state indicators. Disuse is commonly caused by alarms that activate falsely. Automation abuse by designers tends to define the operator’s role as a by-product of the automation, which can in turn promote misuse and disuse by human operators.

2.2 Trust and Complacency

Trust is a key element in developing an effective human-technology relationship, and the human, the technology partner, and the environment all affect its development. Schaefer et al. (2016) conducted a meta-analysis of factors influencing the development of trust in automation [5]. The results grouped the influencing factors into three categories: 1) human-related emotive, cognitive, and demographic factors; 2) automation-related factors such as automation capability, behavior, error, feedback, and automation features; and 3) task type, namely cognitive, control, and perceptual tasks.

Since the attention people allocate to automation tends to decrease as trust increases, trust is related to the problem of complacency, or overreliance [3]. Complacency about automation can result in monitoring failures or decision biases. Lower reliability is associated with lower trust [6]. Attention allocation strategies also influence monitoring: compared with stable reliability, changing automation reliability promotes better monitoring of the system and easier detection of automation failures [7]. As for the characteristics of the automation itself, higher transparency promotes higher trust; even with lower automation reliability, human operators were more trusting if they were aware of that reliability [8]. Hoff and Bashir (2015) stated that an operator’s dependence on automation can be mitigated by the degree to which the operator is able to independently assess system performance, the complexity of the automation, the novelty of the situation, the operator’s ability to perform the task manually, and the operator’s decision freedom [9].

The application of automation may also influence human-human reliance within a team. In a computer-based scenario in which participants interacted with a human aid and an automated tool simultaneously, Lyons and Stokes (2012) found reduced reliance on the human aid during high-risk decisions, although there were no reported differences in intentions to rely on either source [10].

2.3 Levels of Automation and Related Models

One way to explore the influence of automation on human performance is to categorize automation into different levels and study each level separately. Sheridan, Verplanck and Brooks (1978) developed a taxonomy of ten levels of automation in man-computer decision making for undersea teleoperation, based on the authority in system operations, the information provided to the user by the system, and the agent who implements an action [11]. This 10-level taxonomy was later expanded into a model of types and levels of human interaction with automation [12]. Parasuraman, Sheridan and Wickens (2000) proposed a four-stage model based on human information processing, comprising information acquisition, information analysis, decision and action selection, and action implementation; each of these four functions can be automated to a different level [12]. For the automation of decision and action selection, a 10-level classification was suggested (Table 1).

Table 1. Levels of automation of decision and action selection [12]

1. The computer offers no assistance; the human must take all decisions and actions.
2. The computer offers a complete set of decision/action alternatives, or
3. narrows the selection down to a few, or
4. suggests one alternative, and
5. executes that suggestion if the human approves, or
6. allows the human a restricted time to veto before automatic execution, or
7. executes automatically, then necessarily informs the human, and
8. informs the human after execution only if asked, or
9. informs the human after execution only if it, the computer, decides to.
10. The computer decides everything and acts autonomously, ignoring the human.
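For researchers who encode such taxonomies in analysis scripts, the ordered levels map naturally onto an enumeration. The sketch below is purely illustrative; the class and member names are our own labels, not part of the cited model.

```python
from enum import IntEnum

class DecisionAutomationLevel(IntEnum):
    """Ten levels of automation of decision and action selection,
    after Parasuraman, Sheridan and Wickens (2000) [12]."""
    NO_ASSISTANCE = 1          # human takes all decisions and actions
    FULL_ALTERNATIVES = 2      # computer offers a complete set of alternatives
    NARROWED_ALTERNATIVES = 3  # computer narrows the selection to a few
    SINGLE_SUGGESTION = 4      # computer suggests one alternative
    EXECUTE_IF_APPROVED = 5    # executes the suggestion if the human approves
    VETO_WINDOW = 6            # restricted time to veto before automatic execution
    EXECUTE_THEN_INFORM = 7    # executes automatically, then informs the human
    INFORM_IF_ASKED = 8        # informs the human after execution only if asked
    INFORM_IF_DECIDED = 9      # informs the human only if the computer decides to
    FULL_AUTONOMY = 10         # computer decides and acts, ignoring the human

# Example: levels at or below 5 keep the human in the approval loop.
def human_approves_actions(level: DecisionAutomationLevel) -> bool:
    return level <= DecisionAutomationLevel.EXECUTE_IF_APPROVED
```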

Wickens et al. (2010) suggested that both the level of automation and the information processing stage should be considered in determining the degree of automation [13]. Integrating these two dimensions makes the degree of automation a continuum. Based on this assumption, a trade-off model was proposed that describes the relationships between the degree of automation and routine performance, failure performance, workload, and loss of situation awareness (Fig. 1) [13, 14]. According to this trade-off model, as the degree of automation increases, routine performance improves, but failure performance does not; a higher degree of automation implies a steeper drop in performance during automation failures. Workload tends to decrease as the degree of automation increases, but the attention allocated to the automated tasks is also reduced, resulting in a loss of situation awareness. Onnasch et al. (2014) conducted a meta-analysis of 18 empirical studies on automation to examine this trade-off model. The results indicated that a) automation benefits routine system performance with increasing degree of automation, b) there is a similar but weaker pattern for workload when automation functions properly, and c) a higher degree of automation has a negative impact on failure system performance and situation awareness. The analysis also found that the negative consequences of automation become most likely when the degree of automation moves across a critical boundary [14].

Fig. 1. Illustration of the trade-off model [13, 14]
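To make the qualitative shape of the trade-off concrete, the sketch below encodes the directional trends described above as simple monotone functions of a normalized degree of automation. The functional forms are arbitrary placeholders of our own; only the directions follow the cited trade-off model [13, 14], which is qualitative rather than quantitative.

```python
def tradeoff_trends(doa: float) -> dict:
    """Illustrative directional trends for a normalized degree of
    automation (0 = fully manual, 1 = fully automated). The linear
    forms are placeholders; only the directions follow [13, 14]."""
    assert 0.0 <= doa <= 1.0, "degree of automation must be in [0, 1]"
    return {
        "routine_performance": doa,        # improves with higher DOA
        "failure_performance": 1.0 - doa,  # degrades with higher DOA
        "workload": 1.0 - doa,             # decreases with higher DOA
        "situation_awareness": 1.0 - doa,  # attention withdrawn from the task
    }

# Near the upper end of the continuum, routine performance is high
# while failure performance and situation awareness are low.
print(tradeoff_trends(0.8))
```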

Endsley and Kaber (1999) proposed a hierarchy of levels of automation applicable to dynamic cognitive and psychomotor control tasks, formed from the combination of human and computer performance across four task stages: 1) monitoring and information presentation; 2) generation of options; 3) decision making/selection of a course of action; and 4) implementation of actions [15]. The concept of situation awareness was later incorporated, resulting in the following levels: manual control, information cueing, situation awareness support, action support/tele-operation, batch processing, shared control, decision support, blended decision making (management by consent), rigid system, automated decision making, supervisory control (management by exception), and full automation [3]. This taxonomy of levels of automation is part of the Human-Autonomy System Oversight (HASO) model (Fig. 2) [3]. The HASO model depicts the key system design features that influence the human cognitive processes involved in successful oversight of, intervention in, and interaction with automated systems. It consists of system design features, environment/system features, emergent system characteristics, cognitive constructs, and decisions, as well as the relationships among them. System design features include the automation interaction paradigm, automation interface, automation robustness, and automation reliability. The corresponding emergent system characteristics include workload, engagement, and complexity, while the important cognitive constructs include situation awareness, mental model, attention allocation, and automation trust.

Fig. 2. Human-Autonomy System Oversight (HASO) model [3]

3 Team and Automation

3.1 Models on Team Performance

Bowers et al. (1996) applied an input-process-outcome Team Effectiveness Model (TEM) to organize the discussion of how the characteristics of team effectiveness interact with automation (Fig. 3). The model includes organizational and environmental characteristics, individual characteristics, team characteristics, task characteristics, work characteristics, and team processes [16]. Advanced technology systems reduce team size, which has an impact on team performance. At the same time, team composition and staffing are influenced in that the authority gradient can be reduced, and team members take on different roles and responsibilities after automation has been introduced. Shared expectations and team norms can likewise change with automated systems. From the perspective of team processes, automation significantly influences team communication and coordination: automatic systems appear to change the pattern of communication as a result of redistributed workload, and the structure of communication is also affected because human operators are required to interact actively with the automation. Nonverbal information transmission may be modified as well, since automation reduces operators’ control actions. These findings suggest that the nature of effective team processes may be quite different in automated systems.

Fig. 3. Team effectiveness model [16]

O’Neill et al. (2020) conducted a review and analysis of empirical studies on human-autonomy teaming [17]. The coding framework of the review was based on an Inputs-Mediators-Outcomes (I-M-O) model of team effectiveness, which is similar to the TEM mentioned above (Fig. 4). Although autonomy is defined differently from automation, the corresponding variables in human-autonomy teaming are also applicable to team performance in automated systems. The inputs of the I-M-O model include autonomy characteristics, team composition, task characteristics, human individual differences, and training. Specifically, autonomy characteristics include level of autonomy, transparency, and reliability, while task characteristics cover complexity and interdependence levels. For the mediators, “transition, action, and interpersonal processes” include planning, communication, coordination, and conflict management, whereas “affective and cognitive emergent states” include trust, shared mental models, situation awareness, and workload. In this model, the outcomes contain not only individual and team task performance but also team viability and individual learning, development, and need satisfaction.
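A structure like the I-M-O framework lends itself to a typed coding record when tabulating studies. The sketch below paraphrases the variable groups named above into a dataclass; the field names are our own labels for illustration, not the authors’ published coding schema.

```python
from dataclasses import dataclass, field

@dataclass
class IMOCoding:
    """Sketch of a study coding record following the I-M-O framework
    of O'Neill et al. (2020) [17]. Field names are our own labels."""
    # Inputs
    autonomy_level: str = ""
    transparency: str = ""
    reliability: str = ""
    team_composition: str = ""
    task_complexity: str = ""
    interdependence: str = ""
    training: str = ""
    # Mediators
    processes: list[str] = field(default_factory=list)        # e.g. planning, coordination
    emergent_states: list[str] = field(default_factory=list)  # e.g. trust, workload
    # Outcomes
    task_performance: str = ""
    team_viability: str = ""
    individual_development: str = ""
```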

Fig. 4. Inputs-Mediators-Outcomes (I-M-O) model of team effectiveness [17]

Strater et al. (2007) presented a preliminary framework illustrating how automation may influence team cognition and team coordination in complex operational environments [18]. The framework covers the effects of information exchange and updating between humans and automation on lower-level and higher-level cognitive processes, as well as on teams’ higher-order “metacognitive” processes.

3.2 Team Process and Performance Measurement

Team coordination and collective thinking are important for good team performance, and a number of models have been developed to describe the process. The crew resource management model has been successfully applied in aviation [19]. Other models include the teamwork model [20], the meta-cognitive and macro-cognitive model of team collaboration [21], team sensemaking [22], and the mutual belief model [23].

These models and reviews indicate that it is important for team members to know what others on the team are thinking and doing (situation awareness about team processes), as well as to know the system and the surrounding context (global situation awareness). Maintaining good global situation awareness without good situation awareness about team processes is not enough for good team performance. Moreover, operators’ habits, assumptions, complacency, and reliance on established conduct-of-operations standards can contribute to losing situation awareness of other team members. Such a situation is harmful to team performance and can lead to teamwork errors such as failing to perform peer checks or independent verifications [24].

Mathieu et al. (2000) tested the influence of teammates’ shared mental models on team processes and performance in a series of missions on a personal-computer-based flight combat simulation [25]. The study distinguished between teammates’ task-based and team-based mental models. Mental model convergence, or “sharedness”, was measured using individually completed paired-comparison matrices and analyzed using a network-based algorithm. The results indicated that both shared team-based and shared task-based mental models related positively to subsequent team processes and performance, and that team processes fully mediated the relationship between mental model convergence and team effectiveness.
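As a rough illustration of how “sharedness” can be quantified, the sketch below correlates two teammates’ paired-comparison ratings. Mathieu et al. used a network-based algorithm, so this Pearson correlation of the off-diagonal entries is a simplified stand-in for their procedure, not a reimplementation; the function name and example data are hypothetical.

```python
import numpy as np

def mental_model_sharedness(ratings_a: np.ndarray, ratings_b: np.ndarray) -> float:
    """Correlate two teammates' paired-comparison matrices as a crude
    index of mental model convergence. A simplified stand-in for the
    network-based algorithm used by Mathieu et al. [25]."""
    iu = np.triu_indices_from(ratings_a, k=1)  # off-diagonal upper triangle
    return float(np.corrcoef(ratings_a[iu], ratings_b[iu])[0, 1])

# Hypothetical relatedness ratings (0-3) among four task concepts:
a = np.array([[0, 3, 1, 2], [3, 0, 2, 1], [1, 2, 0, 3], [2, 1, 3, 0]])
b = np.array([[0, 3, 2, 2], [3, 0, 1, 1], [2, 1, 0, 3], [2, 1, 3, 0]])
print(mental_model_sharedness(a, b))  # closer to 1.0 = more shared
```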

Wright and Kaber (2005) investigated the effects of automation on team performance in a simulated Theater Defense Task, using the four-stage information processing model described above [26]. Four automation conditions, applied to realistic combinations of information acquisition, information analysis, and decision selection functions, were simulated across two levels of task difficulty, and team effectiveness and team coordination were measured. Results indicated that an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low task difficulty, but at the cost of higher workload.
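The coordination index named in that finding reduces to a simple count ratio. A minimal sketch is shown below; the function and variable names are our own, and in practice the counts would come from coded communication logs.

```python
def info_transfer_ratio(transferred: int, requested: int) -> float:
    """Ratio of information transferred to information requested, a
    simple coordination index in the spirit of Wright and Kaber [26]."""
    if requested == 0:
        return float("nan")  # undefined when nothing was requested
    return transferred / requested

# Example: 18 items transferred against 24 requested gives 0.75.
print(info_transfer_ratio(18, 24))
```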

4 Team-Automation Model

Based on this review, a preliminary team-automation model is proposed, as illustrated in the following diagram (Fig. 5). The elements within the diagram have been discussed in the previous sections, and the features of the corresponding elements are listed in a dashed frame. The model follows the basic concept of the input-process-outcome team effectiveness model while taking the mediators of the I-M-O model into consideration. Although the cognitive processes of team coordination are not specifically depicted, the model illustrates the relationships among the major elements.

Fig. 5. Team-automation model

The concept of “team” in this model refers to a team of multiple human operators as generally organized in nuclear power plants. In the main control room of a nuclear power plant, the operating crew, including for instance a reactor operator, a turbine operator, and a shift supervisor, is required to work together to achieve safe and effective operation of the plant. Monitoring, detection, situation assessment, response planning, and response implementation are the functional roles of operators in supervising the plant [27], and they are also the major functions that the design of automation should support. In the design of a complex system such as a nuclear power plant, the level of automation depends in part on the results of functional requirement analysis and function allocation, as well as the subsequent task analysis. The characteristics of tasks and automation are therefore interrelated, which means that automation cannot be separated from task requirements. As for the automation interface, O’Hara and Higgins (2010) provided review guidance and a technical basis for human-system interfaces to automatic systems [28]. Trust and complacency, as important cognitive constructs, play a critical role in human-automation interaction; they can be affected by automation, task, and team characteristics, and have strong effects on other team process variables. Proper training of operators can change team characteristics such as cohesiveness and familiarity, improve team process skills, and result in better team performance. Training and re-training of operators should therefore consider the team’s shared mental model and human-automation interaction, as well as skills in coordination, communication, decision making/problem solving, attention allocation, and workload management. Finally, the feedback of team performance to automation cannot be ignored, because interdependent automation functions require inputs from both the system and the human operators.

5 Conclusions

Issues with automation have been extensively examined by researchers. Problems such as “clumsy automation”, “automation surprises”, and Out Of The Loop (OOTL) errors can be introduced by the application of automation. As automation becomes more versatile, designers of complex sociotechnical systems should pay more attention to human-automation interaction and its influence on system performance. Despite the many automation functions applied in modern nuclear power plants, the influence of automation on operating crews has been less examined. Based on this review of the automation and team performance literature and models, a team-automation model is proposed and discussed. The review aims to provide an outline of the team-automation relationship and to support related future research.