Abstract
Recent developments in robotics and the growing presence of robots in society generate diverse feelings, opinions, and reactions among people. This situation calls for a careful analysis of the field, since multiple reported facts indicate that the actual state of affairs differs from public opinion. This paper provides a detailed analysis of the broad area of human–robot interaction (HRI). It delivers an original classification of HRI with respect to human emotion, technical means, human reaction prediction, and the general cooperation–collaboration field. The analysis was performed by sorting reference outcomes into separate groups, each presented in its own table. Finally, the analysis culminates in a big picture of the field, with strong points and general tendencies outlined. The paper concludes that HRI still lacks a methodology and training techniques for the initial stage of human–robot cooperation. Instrumentation for HRI is also analyzed; the main bottlenecks remain an incomplete understanding of the interaction process, the lack of an intuitive interface, and the absence of formulated HRI rules, all of which are suggested for future work.
Introduction
People’s fears about robots, whether as devices, as a social phenomenon, or as a phase of industrial development, take many forms. These fears are associated with the robots themselves, stemming from concerns about their capabilities, impacts, and the potential consequences of their integration into society (Porpora, 2021). They reflect the uncertainty and unknown outcomes associated with rapidly advancing robotic technologies and their increasing presence in daily life. Notably, robots’ technical and aesthetic aspects have only a marginal influence on robophobia (Davey, 1997). The particular fear of losing a job to the robotization of industry mainly arises from the differences between humans and robots in emotional and intellectual activity (Porpora, 2021). Humans harbor doubts and uncertainties about the artificial intelligence employed by robots. Paradoxical as it may sound, acceptance of robot decisions results in better mental achievements in humans (Hayashi & Wakabayashi, 2018). When a robot’s decisions look reasonable, the public accepts them more readily, although people’s social cultures and habits are very distinct and significantly impact robot acceptance (Maccarini, 2021). Humans can train their behavior toward, and acceptance of, robots. Therefore, in areas such as nursing and medicine, a methodology softening extreme human emotional reactions to robots is necessary (Archer, 2021), as well as a special robot control methodology preventing humans from losing social competencies due to long-lasting human–robot relations (Kempt, 2022).
The “pragmacentric”, practical behavior-based approach to robot action acceptance yields higher-quality acceptance of robot decisions with respect to human opinion (Kempt, 2020). In circumstances where robot decisions and actions cause serious outcomes, like medical injections or nursing actions, the human reaction becomes tenser and, depending on human experience, knowledge, feelings, and emotional state, varies from strongly positive to negative (Bhattacharya, 2021). Thus, it is necessary to focus on developing comprehensive social, educational, and technological solutions for human–robot interaction (HRI) to minimize unreasonable fears.
Robophobia, however, largely overlaps with standard human caution toward unknown objects or their actions: typically, humans are conservative toward new and unknown processes or objects. Human reaction to robots depends on many factors, including geographical location. For instance, societal agreement to use robots in Greece (57%) is much lower than in Denmark (95%) (Hofstede Insights, 2023).
Research on human–robot interaction is often considered closely related to human–computer interaction. Still, in contrast to general computer science and its human interfaces, robotics involves many technical, psychological, and even social aspects besides electronics, electrical engineering, software, and artificial intelligence (AI), which also gives rise to robophobia. The recent emergence of social robotics as a separate product development field has sparked interest and opened new niches of investigation, bringing important questions about HRI to new light.
This research aims to highlight the progress of human–robot interaction and to qualify the levels of such cooperation and their limits. The paper proposes a system of HRI evaluation and reveals the complexity of the area, while a systematic approach is maintained from the engineering point of view.
In our review, we hypothesize:
1. Human–robot cooperation and collaboration can be effective in industry and possibly in other partially predefined environments;
2. Communication and messaging in HRI progress slowly and lack a systematic approach;
3. HRI reveals new socio-psychological phenomena;
4. The general acceptance of robots across the entire population is mixed and underexplored;
5. Emotional communication is very effective between humans, but better understanding and classification are necessary for improved human–robot interaction.
Materials and methods
This overview of research conducted in the area of human–robot interaction provides a multi-criteria analysis covering research questions related to available hardware and software limitations, methodological issues, and humans’ social and psychological reactions to robots. The analysis was conducted on 116 scientific research papers selected from the Google Scholar, ScienceDirect, and IEEE Xplore databases during a three-stage inclusion process. In the first stage, more than 560 publications from the last five years (except a few older ones containing fundamental statements) were selected according to title and keywords. The following keywords were used to filter the articles: Human–robot interaction, Human emotions definition, Instrumental emotions detection methods, Human safety, Psychological comfort, Robophobia, Human motion detection/prediction, Human–machine communication, and Human reaction to robot/machine. In the second stage, after the screening, almost 320 papers were excluded from the analysis as unsuitable due to out-of-scope research problems, lack of validation, or low quality. In the third stage, after removing duplicate records, 116 papers were classified into four categories (human–robot collaboration—60 papers; human–robot communication—17; human emotions and physical state evaluation in HRI—11; human perception of robots—28) and analyzed in detail. The main criteria for including a paper were: clear formulation of the research problem and proposed solution, the applicability of the results in the HRI area, and the reliability of the provided results.
Review outcomes
The analysis delivered a vast amount of data; therefore, the outcomes of our research require structured presentation. We split these findings into four fields: human–robot collaboration; human–robot communication; human emotion and physical state evaluation as input for robot controls; and human perception of robots. This classification covers both technical and psychological issues.
Human–robot collaboration
Human–robot collaboration covers various research areas, many of which are well-explored in the scientific literature. In the majority of analyzed references, two typical approaches can be noticed:
- Study of the possibilities of human–robot collaboration in specific fields such as medicine, agriculture, machine manufacturing, shipbuilding, waste management, etc.;
- Analysis of the characteristics of human–robot collaboration not tied to a particular field. For example, it is common to think that a robot can perform simple repetitive actions; still, applying robots in areas such as shipbuilding is challenging due to many unique designs and technical solutions (Zacharaki et al., 2022).
The intensity of human–robot collaboration is commonly evaluated with a widely recognized classification into five levels: (i) no collaboration; (ii) coexistence; (iii) synchronization; (iv) cooperation; (v) collaboration (Dzedzickis et al., 2021). The quality of human–robot collaboration in terms of human understanding or psychological status stays outside this evaluation; the cited paper focuses on technical solutions in various aspects of HRI. Human behavior evaluation belongs to HRI realization, which is covered in many studies.
The classification presented in Fig. 1 evaluates the possibilities of sharing workspaces, work objects, and tasks. The lowest HRI level is no collaboration—the robot remains inside a closed work cell, and workspace sharing is strictly forbidden. The second level is coexistence—a case when closed cells are removed, but workspaces between humans and robots remain strictly separated. The third level is synchronization, where robots and humans can share part of the workspace and work objects, but never simultaneously. The fourth level is cooperation—shared tasks and workspace are acceptable, but physical interaction is forbidden. Collaboration is the highest level of interaction when physical interaction and common operations between humans and robots are allowed.
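The five levels described above can be captured as a simple decision rule over workspace- and task-sharing properties. The sketch below is our own illustrative mapping of the classification, not code from the cited work; the property names are assumptions chosen for readability:

```python
from enum import IntEnum

class HriLevel(IntEnum):
    """Five levels of human-robot collaboration (after Dzedzickis et al., 2021)."""
    NO_COLLABORATION = 1   # robot inside a closed cell, workspace sharing forbidden
    COEXISTENCE = 2        # cells removed, but workspaces strictly separated
    SYNCHRONIZATION = 3    # shared workspace and objects, never simultaneously
    COOPERATION = 4        # shared tasks and workspace, physical contact forbidden
    COLLABORATION = 5      # physical interaction and common operations allowed

def classify_hri(closed_cell: bool, shared_workspace: bool,
                 simultaneous_sharing: bool, physical_contact: bool) -> HriLevel:
    """Map an installation's sharing properties onto the five-level scale."""
    if closed_cell:
        return HriLevel.NO_COLLABORATION
    if not shared_workspace:
        return HriLevel.COEXISTENCE
    if not simultaneous_sharing:
        return HriLevel.SYNCHRONIZATION
    if not physical_contact:
        return HriLevel.COOPERATION
    return HriLevel.COLLABORATION
```

For example, a fenceless cell where the human and robot take turns at a shared fixture would classify as synchronization, while direct hand-overs of a workpiece would classify as collaboration.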
Various types and levels of human–robot interaction in manufacturing were examined to develop a typical robot-human interaction methodology according to established conditions (Malik & Bilberg, 2019). The authors provide a synthesis of a human–robot collaboration architecture based on three aspects: team composition, engagement level, and safety, allowing them to describe collaboration using a 3-dimensional reference scale. A review presented by (Li et al., 2023) provides a detailed analysis of safety standards and methods ensuring human safety in HRI.
Despite different attempts to classify the intensity of human–robot collaboration, it remains one of the fundamental factors affecting the required features of HRI. Collaboration intensity typically correlates with the technological development level of HRI: the higher the collaboration level, the more advanced the required HRI.
General issues in the field of human–robot interaction
The research on HRI for common operations between robots and humans faces many scientific uncertainties. From the research reports of the last five years, we determined that the main research interests are general HRI issues, the possibility of adapting robots to individual humans, and the robotization of specific industries or research areas. An extensive literature analysis provided by (Faccio et al., 2023) revealed the five most important human factors impacting the success of long-term human–robot interaction. According to the authors, the main factors are physical ergonomics, mental workload, trust, acceptance, and usability. In addition, the presence of complex machines and their relation to the technological process also impacts human–robot collaboration. Balancing robotic assembly lines in which complex machines, robots, and humans share a common workpiece has remained a pressing problem for over 30 years (Chutima, 2022; Bänziger, 2020).
Another active HRI research area is human–robot interaction in unexpected situations. The paper (Gualtieri et al., 2022) presents a virtually simulated solution for HRI in unforeseen situations and provides guidelines that effectively support non-expert users in designing and improving collaborative assembly systems from a safety perspective. The authors state that minimizing human motion amplitude and optimizing the assembly process could reduce the risk of accidents by 33%. Situation awareness is crucial not only for humans but also for robots. Research (Müller et al., 2023) presents an attempt to develop a metric capable of evaluating the robot’s situation awareness using a digital twin. Research presented by (Kousi et al., 2019) introduces an augmented reality-based software suite to assist operators in manufacturing systems using mobile robots, where predicting cooperation between a robot and a human is challenging due to the variety of tasks. The developed tool was tested in a case study inspired by the automotive industry, showing that it can facilitate communication between humans and mobile robots, increasing the work quality of human operators and supporting the assembly operations that involve them. A study presented by (Murata et al., 2017) analyzes robots’ ability to train each other using a special neural network that evaluates error probabilities.
Table 1 provides a summarized overview of research on general HRI issues reported in the last five years.
Adaptive human–robot interaction
Robots’ capability to adapt to individual needs and human emotional or physical states is a widely studied issue (Umbrico et al., 2022). Robot adaptation can facilitate human physical work and perform the social functions of robots, but on the other hand, it results in complex structures requiring specialized hardware and software. Figure 2 provides an example of the architecture of an adaptive human–robot interaction case.
Adaptive robotic solutions are especially preferred in medical or rehabilitation applications. One example is exoskeletons used to ease human physical exertion when lifting heavy objects. Their purpose is to duplicate the movements of human body parts while exerting a correspondingly greater force than those body parts produce themselves.
Research provided by (Huang et al., 2019) examines such issues as the suitability of the exoskeleton for people of various body types, the comfort issues of the exoskeleton, and the ability of the exoskeleton control algorithms to interact with the human harmoniously. The research authors noted that relevant parameters, such as height, mass, and body mass index, could describe human body composition. Therefore, they developed an exoskeleton model that evaluates human complexity and provides data for exoskeleton adaptation.
Another issue of exoskeleton adaptation is its ergonomics. Research performed by (Ballen-Moreno et al., 2022) described a method that quantifies the difference in orientation between a user’s limb and the exoskeleton joint. This method brings a better understanding of human–robot interactions in implementing exoskeletons. In addition, the method proposed in the article determines the performance indicator of the physical interfaces of the exoskeleton.
Apart from physical adaptation, the question of perception is also relevant. In various applications, there is a problem of communication between humans and robots when the need to provide the necessary tools or equipment to humans in time arises. An experimental study providing a method of how a human can use gestures to request one or another tool for the assembly operation was conducted by (Neto et al., 2019). In the presented approach, the data captured by the robot is divided into static and dynamic blocks that are recognized using unsupervised machine learning. The proposed method demonstrated 98% accuracy in recognizing eight static and four dynamic gestures.
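Gesture pipelines of this kind typically first route each captured sample to a static-pose or dynamic-motion branch before recognition. The sketch below illustrates that split with a simple motion-energy gate on tracked hand keypoints; it is a simplified sketch of the general idea, with a hypothetical threshold, and is not the method of Neto et al.:

```python
import math

def motion_energy(trajectory):
    """Sum of frame-to-frame displacements of a tracked keypoint (x, y)."""
    return sum(math.dist(a, b) for a, b in zip(trajectory, trajectory[1:]))

def split_static_dynamic(samples, threshold=0.05):
    """Route gesture samples to the static or dynamic recognition branch.

    samples: list of trajectories, each a list of (x, y) keypoint positions.
    threshold: motion-energy cutoff (hypothetical value) separating poses
    held still from gestures that involve movement.
    """
    static, dynamic = [], []
    for trajectory in samples:
        branch = dynamic if motion_energy(trajectory) > threshold else static
        branch.append(trajectory)
    return static, dynamic
```

Each branch can then apply its own recognizer, e.g. clustering of hand shapes for static poses and sequence models for dynamic gestures.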
Cooperation between humans and robots need not be based on targeted physical tasks; it can also be based on psychological reasoning. The robot can partially perform the social function of a colleague, friend, or pet. In such cases, the social characteristics of a person and how the robot adapts to that person become essential for successful human–robot interaction. The paper (Lavit Nicora et al., 2021) describes an interaction model proving the possibility of creating software that facilitates a robot’s adaptation to a person’s characteristics. Meanwhile, the research provided by (Oliveira et al., 2021) analyzes the difference in HRI characteristics between a single person and a group of people. The authors suggest avenues and future methodological trends that can help assess human behavior in human–robot interaction by expanding the ways these interactions are assessed in groups. (Bajcsy et al., 2018) presented a model for human–robot interaction that evaluates extraneous physical factors in a case where two robots interact with two humans. The case when a few human operators interact with one collaborative robot is also possible (Boschetti, 2021). Research provided by (Cacace et al., 2023) reveals issues of interactive physical cooperation between humans and collaborative robots. Their approach is based on the idea that robots should be able to estimate human intentions and adjust initially defined tasks and motions. About 80% of 40 undergraduate students with different levels of experience with robots stated that interaction with the collaborative system seemed safe and straightforward.
Studying robots’ global and local information needs is another research topic. In (Yu et al., 2022), the modeling of an information network to provide rational information for the robot is described. Addressing the problem of robot reaction speed deficiency, (Mugisha et al., 2022) presented an experimental study of improving the movement prediction time and reducing the time required for the robot to reach the desired position. The experimental results of this study revealed that eye gaze-based prediction significantly improved system performance. The detection time was reduced by 37%, and the time required to reach the target was reduced by 27%.
A summary of the research focused on the robots’ adaptation to the individual humans’ needs in the last five years period is provided in Table 2.
Summarizing the information described above, several questions deserve emphasis: the robot’s need for global and local information, human–robot interaction under extraneous physical factors, the limits imposed by insufficient robot reaction speed, and the ability of robots to train each other. The question of whether a typical human–robot interaction methodology can be created for established conditions also demands special attention. Answering it requires studying the development of HRI research methodologies.
Human–robot interaction in specific application areas
The number of areas in which robots can be applied is neither fixed nor finite. New application areas are constantly appearing, and the research question of whether one or another process can be robotized becomes ever more relevant. There are several application areas where robots are not widely used or have yet to be fully explored. Despite advances in medical robots and assistive technologies, the application of robots in healthcare and education is still in its infancy. The application of robots in agriculture is still limited, although there is increasing interest in using robots for tasks such as crop monitoring and harvesting.
Moreover, robots have yet to be widely adopted in the service industry, although certain customer service and hospitality tasks show growing demand for them. Additionally, environmental monitoring and pollution cleanup are promising fields, as the potential of using robots in environmental applications remains largely untapped. Figure 3 shows a few examples of different levels of human–robot collaboration in various applications.
Shipbuilding is one of the industrial areas in which questions about the possibility of process automation often appear. Major challenges in shipbuilding are the large variety of weights and dimensions of the elements to be installed, from several kilograms to tens of tons, and the diversity of required movement precision. One robot cannot be adapted to such a wide range of needs. The authors of (Zacharaki et al., 2022) proposed an algorithm for grouping operations, creating robot operation zones, and ensuring the safety of their interaction with humans while considering the above-mentioned operations.
Another problematic field from the point of view of robot-human cooperation is the field of equipment and components utilization. On the one hand, a lot of routine work can be automated in this field. On the other hand, the equipment used is diverse and creates many limitations (Hjorth & Chrysostomou, 2022; Qu, 2023). The main difficulties identified by the authors are high variability of the technical conditions of the used parts; insufficient information about the recycled products; increasing complexity of recycled products; short product life cycle and a large variety of products; increasing quality requirements for regenerated materials, components and varying requirements for utilization efficiency.
In contrast to shipbuilding or equipment utilization, agriculture involves much repetitive work, so the application of robots here should be effective. Operations such as land plowing, harrowing, and fertilizing should be mechanized and sometimes even robotized. Harvesting is more complicated and thus attracts attention in the scientific literature (Vasconez et al., 2019). The paper evaluates the complexity of various tasks in terms of human–robot cooperation. Operations such as grain crop harvesting are massive and conventionally agreed to be relatively simple processes. However, operations with fruits and vegetables are much more complicated: there is considerable variety among them (for example, some with pits, others with stems), different sizes and masses, and different handling requirements. Pruning and thinning of fruit trees is a separate issue. According to the article’s authors, these operations can be performed by combining robot and human abilities. However, such integration requires special algorithms and software for the robots.
Another problematic area, similar to agriculture, is forestry, characterized by large areas and predictable hazard scenarios. One of them is forest fires. Among the most effective measures to facilitate extinguishing forest fires is detection at an early stage, before the fire has covered large areas. A research paper (Lim et al., 2021) discusses optimizing the distribution of decision-making between humans and artificial intelligence by implementing a robotic control module.
A similar issue of optimizing decision-making processes is reported in medicine, where robots are used not only in treatment and rehabilitation processes but also to help organize the work of a medical facility, such as dispensing drugs or tools (Lestingi et al., 2021). Also, in a medical institution, prioritization of work is very important. Research presented in (Wan et al., 2020) examines a developed algorithm that allows a medical robot to distinguish priority tasks from others.
Robot implementation for treatment and rehabilitation also remains a relevant research topic—for example, using a robot to treat autism. Paper (Katsanis & Moulianitis, 2021) presents a taxonomy of child-robot interactions in autism interventions, explaining its entire framework. Interactions are modeled according to this taxonomy, where an interaction case is used to define the structure of the interaction. Based on this, a safety architecture is proposed to be integrated into the robot controller. Scientific articles also address the suitability of robots for rehabilitating human limbs (Shi et al., 2021). The study proposes a human-centered adaptive control of a lower limb rehabilitation robot based on a dynamic human–robot interaction mode. A dynamic human–robot system model is developed based on the HRI model. An equivalent spring model in three-dimensional space is proposed.
A summary of the research focused on the robots’ implementation issues in the specific application areas in the last five years period is provided in Table 3.
Human motion detection and prediction/behavior prediction
Human–robot collaboration is impossible without specific methods preventing rough or dangerous interactions during operation. Human motion detection and trajectory prediction are among the most preferred and researched methods in HRI. They are typically used to solve two major issues in robotics: preventing contact between the robot and the human, or, conversely, synchronizing motion for smooth joint action. However, unequivocally assigning a research task to one case or the other is not simple, since both include a motion-detection component. Methodology and equipment for human motion detection should account for humans’ accidental reactions caused by fear and general robophobia. Therefore, classification based on the implemented technique seems more reliable. The following parts provide a detailed analysis of research on human motion detection and prediction, classifying the proposed approaches according to the implemented methods. An overview of available reports revealed three main approaches used for motion detection and prediction: implementation of predefined algorithms (Table 4), use of physical sensors (Table 5), and application of machine learning algorithms (Table 6).
As seen from Table 4, human motion detection and prediction based on prescribed algorithms require initial references such as posture, eye contact, or workspace distribution into smaller parts. The main problem with using prescribed algorithms is existing application constraints due to the possible unexpected human reaction. Nevertheless, there are cases where human motion prediction is impossible without implementing prescribed algorithms. Analysis and prediction of human gait is a complicated process due to the complex neuromusculoskeletal system and cannot be performed without using predictive models and experimental data. A detailed review conducted by (De Groote & Falisse, 2021) presents modern methods and models suitable for human gait analysis, motion, and trajectory prediction. Such methods are essential when it is necessary not only to avoid contact with a person but also to synchronize the movements of the human and the device precisely.
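A minimal example of a prescribed predictive model is constant-velocity extrapolation of the last observed displacement, which often serves as a baseline before more elaborate gait or trajectory models are applied. The helper below is illustrative and not taken from the cited review:

```python
def predict_position(positions, horizon=1):
    """Constant-velocity extrapolation of a tracked point.

    positions: chronological list of (x, y) observations of, e.g., a wrist
    or ankle marker; at least two observations are required.
    horizon: number of time steps ahead to predict.
    """
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = x1 - x0, y1 - y0  # displacement per time step
    return (x1 + horizon * vx, y1 + horizon * vy)
```

Such a rule is cheap enough to run inside a safety monitor at every control cycle, but it fails for abrupt direction changes, which is exactly where model-based gait prediction becomes necessary.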
Human motion detection based on physical sensors is more precise than methods based on prescribed algorithms, but it comes with its own issues. The major one is the restriction created by contact sensors: they limit humans’ ability to move freely and can even distract their attention from the main focus point, as the sensors cause discomfort. An alternative is the application of non-contact methods, for example, those based on computer vision. However, computer vision-based methods are less accurate and reliable. Moreover, they require stable environmental conditions (light intensity, position of the light source, etc.) and higher computational power than methods based on contact sensors, for example, electromyography.
The most advanced approach nowadays to detecting and predicting human motion is the implementation of various machine learning algorithms (Table 6). It combines the advantages of both previously described methods and simultaneously allows us to avoid their drawbacks. Such algorithms can be trained using accurate data collected using various sensors in predefined/controlled conditions and later implemented in systems equipped only with basic sensors. For example, algorithms can be trained using precise data about motion obtained from accelerometers or electromyography and implemented in systems equipped with average-resolution vision sensors.
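The train-precise/deploy-cheap idea can be reduced to its simplest form: learn, offline, a correction from noisy vision-based estimates to precise reference readings, then apply that learned model to vision-only data at runtime. The closed-form least-squares fit below is a deliberately minimal sketch of this scheme, with invented sample data, and does not represent any specific system from the cited works:

```python
def fit_linear(noisy, precise):
    """Least-squares fit of precise = a * noisy + b, learned offline from
    paired reference (e.g., accelerometer) and vision-based readings."""
    n = len(noisy)
    mx = sum(noisy) / n
    my = sum(precise) / n
    sxx = sum((x - mx) ** 2 for x in noisy)
    sxy = sum((x - mx) * (y - my) for x, y in zip(noisy, precise))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def correct(reading, model):
    """Apply the learned correction to a new vision-only reading."""
    a, b = model
    return a * reading + b
```

Real systems replace this one-dimensional fit with neural networks over full pose trajectories, but the division of labor is the same: accurate sensors are needed only during training, not deployment.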
In terms of complexity in human motion detection and prediction, the most challenging tasks require motion synchronization (Table 7). A successful solution to such problems must include human motion detection, prediction models (Vianello et al., 2023), and real-time trajectory generation capabilities.
Concerning Table 7, the most recent research on human–robot motion synchronization indicates that it is more applicable in home appliances and general service robots than in industrial ones. This could be explained by the low popularity of collaborative robots in industry and strict work-safety regulations. Nevertheless, pressure in the labor market drives an increasing number of collaborative robot installations in industry, fostering interest in the highest degree of human–robot collaboration. An excellent example of human–robot collaboration is provided in (Rahman, 2021), where a robotic manipulator determines the weight of the box lifted by a human.
Human–robot communication
A proper understanding of the situation and of future actions plays an essential role in the lives of humans, intelligent living beings, and intelligent equipment alike. Communication between humans and robots is artificial; therefore, intuitive understanding of signals and signaling back requires additional effort. The advantages of such communication are substantial: remote control, fewer technical breaks and inter-operation stops, and enhanced human comfort, allowing the operator to control a robot with a lower stress level. While communication between robots and humans is still artificially driven, robot behavior toward humans is conveyed by technical means. Depending on the implemented technique, three main types of human–robot communication can be distinguished: speech-based communication (Table 8), sensor-based communication (Table 9), and symbolic language/gesture-based communication (Table 10).
The voice and speech recognition-based technique (Richards & Matuszek, 2021) promises a powerful tool for robot control and expression of natural human reactions. The proposed speech recognition technique reaches recognition accuracy up to 90.3% using a dataset containing more than 7000 descriptions of 300 items.
Research provided by (Maggioni, 2023) proves the importance of verbal reaction. Authors experimentally defined that implementing verbal functions to the robot makes it possible to achieve a higher robot acceptance ratio since it becomes more attractive for interaction.
Efforts to teach robots natural language are the most promising from the point of view of intuitive control, but they involve many obstacles and technicalities. First, the anchoring of language understanding differs from language to language. Second, the operator’s lexicon should be limited to a set of keywords. Training the language with a particular operator (Higgins et al., 2021) solves a more constrained task, but the result is not directly transferable to another operator. This technique works well in individual cases, but operator changes bring extra expenses and resources in the long run. Voice recognition in a noisy environment remains complicated, especially when the signal and noise power and spectra are in a similar range. An interesting selective voice recognition approach is described in (Fukumori et al., 2022), where a Doppler-effect sensor distinguishes voice signals from environmental noise. Miscommunication with robots due to fear of accidents or general robophobia can be detected by instrumental methods, since the speed and type of human movement differ from the standard operation mode (Richardson, 2020).
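A keyword-limited lexicon can be handled with a very small command grammar. The verbs, object names, and synonym table below are hypothetical, meant only to show how a restricted vocabulary keeps recognition tractable and lets out-of-vocabulary speech be rejected rather than misinterpreted:

```python
# Hypothetical restricted lexicon: synonyms collapse onto canonical actions.
VERBS = {"pick": "PICK", "grab": "PICK", "place": "PLACE", "put": "PLACE",
         "stop": "STOP", "halt": "STOP"}
OBJECTS = {"wrench", "screwdriver", "bolt"}

def parse_command(utterance):
    """Map a recognized utterance onto an (action, object) robot command.

    Returns None when the utterance falls outside the restricted lexicon,
    so unrecognized speech never produces a spurious robot motion.
    """
    words = utterance.lower().split()
    action = next((VERBS[w] for w in words if w in VERBS), None)
    if action == "STOP":
        return ("STOP", None)  # emergency stop needs no object
    target = next((w for w in words if w in OBJECTS), None)
    if action is None or target is None:
        return None
    return (action, target)
```

Transferring such a system to another operator then reduces to extending the synonym table, which is exactly the per-operator training cost the text describes.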
Currently, sensors-based human–robot communication is one of the fastest emerging research trends in human–robot interaction. Such an increase is mainly caused by the increasing need for more advanced communication methods and significant developments in sensing, signals acquisition, and data processing technologies. Simultaneously, with the development of techniques intended to transform human action or reaction into a robot control command, much attention is dedicated to the inverse case—feedback from the robot to the human as a response to the actual action. Haptic devices that use force, vibration, or motion to create a sense of touch are popular in the HRI (Kuhail, 2023). Their main development trend is optimizing design and enhancing functionality to achieve more realistic and intuitive responses from the machine. The robot’s appearance also plays a significant role in human–robot communication (Song, 2022). It could provide feedback for the human, for example, by changing the robot’s eye color or providing associative images on the robot control screen.
Communication between robots and humans using human expressions such as facial mimicry and hand and body gestures holds particular promise for human–robot cooperation and collaboration. For this purpose, most solutions define gesture language as a special command language (Canuto et al., 2022), which is not intuitive and requires operator training. Nevertheless, such a language generates better and clearer commands due to its deliberate character. Special gestures, which differ significantly from natural human reactions, were developed and presented in (Mazhar et al., 2019).
Another way of human–robot communication lies in the robot's understanding of natural human gestures. It is a more technically complicated option, but it eliminates the need to educate operators or bystanders about special gestures. One of these methods translates gestures into control commands (Obo & Takizawa, 2022). Another trains robots to understand gestures (Caccavale et al., 2019). Both approaches are promising, but their reliability and dependence on human personality require additional research. Robots misunderstanding human gestures in cases of unnatural fear or general robophobia would require a separate robot operation mode, but no dedicated research is available. Social behavior related to fear of robots is better analyzed (Fraune et al., 2019).
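The first method above, translating recognized gestures into control commands, can be sketched as a lookup with a confidence gate, so that an uncertain detection never triggers robot motion. The gesture labels, command names, and the 0.8 threshold are illustrative assumptions rather than details from Obo & Takizawa (2022).

```python
# Hedged sketch of gesture-to-command translation with confidence gating.
# Labels, commands, and threshold are hypothetical examples.
GESTURE_TO_COMMAND = {
    "open_palm": "pause_motion",
    "thumbs_up": "confirm_task",
    "wave": "request_attention",
}

def gesture_to_command(label, confidence, threshold=0.8):
    """Map a recognized gesture to a robot command; reject low-confidence
    detections so ambiguous gestures cause no action."""
    if confidence < threshold:
        return None
    return GESTURE_TO_COMMAND.get(label)
```

Rejecting low-confidence gestures is one plausible way to address the reliability concern raised above, since a missed gesture is usually safer than a wrongly executed command.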
Human emotion and physical state evaluation as input for robot controls
Emotions and subconscious body language play a crucial role in inter-human communication, as they are the fundamental modes of communication in the animal world. Natural emotional messaging is unconscious and is known as emotional contagion (Hatfield et al., 2014). Despite the traditional belief that robots generally are ill-suited for emotional communication, there is plenty of ongoing research about computer recognition of human emotions and the emotional appearance of the robots (Kulke et al., 2020; Lim et al., 2020; Noroozi et al., 2021; Park & Whang, 2022; Ruhland et al., 2015; Toichoa Eyam et al., 2021; Weis & Herbert, 2022). While the latter aspect is more relevant for social and service robots, reading human operators’ emotional contagion and non-verbal cues is becoming an important control input if a human–robot collaborative environment is established.
Safety is supposed to be a priority over other human–robot collaborative aspects, such as production effectiveness or speed. Boredom, fatigue, and stress are human-specific variables that can lead to physical and psychological accidents during human–robot interaction. Biometric artificial intelligence methods, such as facial, speech, and body language recognition, can be applied to read these variables. Commonly, reading the electrical signals of the human body by electrocardiography (ECG), electromyography (EMG), and electroencephalography (EEG) remains the fundamental set of non-invasive methods for emotional state detection (Dzedzickis et al., 2020). Brain activity measurement by EEG was recently shown to be adequate for measuring the emotional state of a human during joint work with a collaborative robot in an assembly process (Toichoa Eyam et al., 2021). Moreover, the detected emotional state of the human operator (such as stress level) was used as an input for robot velocity adjustment. It was found that emotional feedback to the robot increased the trust and comfort levels of the human operator but reduced engagement as the work settled into routine. Also, the authors admit that EEG is the most accurate and reliable technique of human emotion measurement among the other widely known techniques.
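The stress-to-velocity adjustment mentioned above can be sketched as a simple mapping from a normalized stress estimate to a velocity scale factor. The linear mapping and the 0.2 lower bound are illustrative assumptions, not values reported by Toichoa Eyam et al. (2021).

```python
# Minimal sketch of using a detected operator stress level to adjust robot
# velocity: the calmer the operator, the faster the robot may move.
def velocity_scale(stress, floor=0.2):
    """Map a normalized stress level (0 = calm, 1 = highly stressed) to a
    velocity scale factor, never dropping below a minimum speed."""
    stress = min(max(stress, 0.0), 1.0)  # clamp noisy EEG-derived estimates
    return max(1.0 - stress, floor)

def adjusted_velocity(nominal_mm_s, stress):
    """Scale a nominal end-effector velocity by the current stress level."""
    return nominal_mm_s * velocity_scale(stress)
```

Clamping the input and enforcing a floor illustrates a safety-oriented design choice: a spurious stress reading slows the robot but never stops or reverses production entirely.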
Still, EEG can be uncomfortable and even unacceptable in many practical situations. Therefore, biometric emotion measurement methods are gaining importance, especially as they are backed by the rapid development of artificial intelligence and augmented reality algorithms. The human face is one of the most important instruments of emotional communication, and the eyes are central to conveying emotional information between humans (Ruhland et al., 2015). There are plenty of parameters to be tracked to utilize the eyes for emotion detection: eyeball position and movements, eyelid position and movements, pupil diameter and its variation, fixation duration, saccade, and many others (Lim et al., 2020; Ruhland et al., 2015). The taxonomy of emotion recognition using eye-tracking is shown in Fig. 4. Desktop and wearable (mounted onto glasses) eye trackers were recently created to read most of the important eye parameters, and they are often used in augmented reality applications.
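The eye parameters listed above are typically condensed into a feature vector before being passed to an emotion classifier. The specific statistics chosen below (mean and variation of pupil diameter, mean fixation duration, saccade rate) are an illustrative assumption about such a front end, not a description of any cited tracker.

```python
# Sketch of turning a window of eye-tracking measurements into a feature
# vector for a downstream emotion classifier. Feature choices are assumed.
from statistics import mean, pstdev

def eye_features(pupil_diameters_mm, fixation_durations_ms, saccade_count, window_s):
    """Condense one tracking window into four features:
    mean pupil diameter, pupil diameter variation,
    mean fixation duration, and saccade rate."""
    return [
        mean(pupil_diameters_mm),
        pstdev(pupil_diameters_mm),
        mean(fixation_durations_ms),
        saccade_count / window_s,
    ]
```

Such a vector could feed the multimodal systems discussed below, where eye tracking is one input channel among several.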
Although there is some basic knowledge about the relationship between the above-mentioned eye parameters and particular emotions, there is an obvious lack of up-to-date experimental research demonstrating any practical application of eye tracking for emotional feedback to robots. At present, eye tracking is seen only as a component of multimodal emotion measurement systems (Lim et al., 2020). Lewandowska et al. (2022) employed eye-tracking technology to gauge user focus and emotional reactions to both negative and positive webpage content. However, it is worth noting that their findings do not offer any real-time feedback based on the eye-tracker data.
Rapidly developing AI-backed facial movement and facial expression analysis algorithms, based on the Facial Action Coding System proposed by the American psychologist Paul Ekman (Ekman, 1992), are another trend in the recognition of basic human emotions, such as “happy”, “angry”, and “neutral”. Recently, it was found that AI-detected emotional states correlate well with the results of parallel emotion measurement done by interpreting EMG data (Kulke et al., 2020). However, the conditions under which this research was performed are quite far from the practical situations of human–robot collaborative environments, since the participants of the study (twenty students) were explicitly instructed to imitate emotions.
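A simple rule-based back end over Facial Action Coding System (FACS) action units (AUs) can illustrate how such algorithms map detected facial movements to basic emotions. The two AU combinations below are commonly cited prototypes (e.g., happiness as cheek raiser plus lip corner puller); treat the table and the first-match rule as an assumption, not as Ekman's full system.

```python
# Sketch of rule-based emotion lookup over detected FACS action units.
# Prototype AU sets are commonly cited combinations, used here as assumptions.
PROTOTYPES = {
    "happy": {6, 12},        # AU6 cheek raiser + AU12 lip corner puller
    "angry": {4, 5, 7, 23},  # brow lowerer, lid raiser, lid tightener, lip tightener
}

def classify_aus(detected_aus):
    """Return the first prototype emotion whose required AUs are all
    present in the detection, else 'neutral'."""
    detected = set(detected_aus)
    for emotion, required in PROTOTYPES.items():
        if required <= detected:
            return emotion
    return "neutral"
```

Real systems replace this lookup with learned classifiers, but the sketch shows why AU detection quality directly bounds emotion recognition quality.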
Similarly, AI-backed classification of body postures and their relationships with intense emotions are currently being researched by behavioral and technology scientists. The pipelined concept of automatic emotion detection from body postures is illustrated in Fig. 5. It involves several detection stages: capturing the person in a video stream or a photo, estimating the body pose based on a part-based skeletal or kinematic model of the human body, and recognizing emotion based on one of several emotion models: categorical, dimensional, and componential. Still, the system’s output remains of limited reliability and doubtful value, since many personal, cultural, and gender-related disturbances may cause serious misinterpretations (Noroozi et al., 2021).
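The pipeline stages above can be sketched as composed functions. Both the pose estimator and the categorical emotion rule are stubbed placeholders under stated assumptions; a real system would run a skeletal or kinematic pose model in the first stage and a trained classifier in the second.

```python
# Sketch of the capture -> pose estimation -> emotion recognition pipeline.
# Both stages are toy stand-ins for the real models described in the text.
def estimate_pose(frame):
    """Stub pose estimator: we assume the frame already carries
    precomputed joint angles instead of running a skeletal model."""
    return frame["joint_angles"]

def recognize_emotion(joint_angles):
    """Toy categorical rule (assumption): strongly raised arms -> 'excited'."""
    if joint_angles.get("shoulder_elevation_deg", 0) > 120:
        return "excited"
    return "neutral"

def posture_pipeline(frame):
    """Compose the stages exactly as in the pipelined concept."""
    return recognize_emotion(estimate_pose(frame))
```

Structuring the system as independent stages makes it easy to swap a better pose model or emotion model without touching the rest, which is one reason the pipelined design is common.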
The topic of emotional feedback from humans to robots remains at the basic research and demonstration level. Most authors admit the lack of a more general understanding of the relationships between human emotions and the parameters of robot control programs. The measurement of emotions, in general, remains technologically complicated and of low reliability, since variability between subjects is one of the unsolved challenges in this field. Recognition of negative or neutral emotions also brings more challenges than the measurement or classification of positive ones. Furthermore, emotion recognition often leads to big data issues, and a successful solution requires substantial computing resources, whether at the edge or in centralized clouds. On the other hand, such an approach raises cybersecurity issues (Yao et al., 2022). A summary of research on human emotion evaluation data as an input variable for HRI is presented in Table 11.
Human perception of robots
Human perception of robots defines human reaction to the robot from psychological, social, and economic perspectives.
Socializing with a robot is not science fiction nowadays; several social robots with empathetic functionalities have been released to the market. The most widely known models are SoftBank’s Pepper and NTT Disruption’s Jibo, which were created almost a decade ago. Several startups followed with similar products, loading them with valuable functions ranging from remote presence and entertainment to shopping assistance. Still, despite many early promises and heavy marketing, social robots lack true popularity. One of the most probable reasons is limited human trust in robots and artificial intelligence in general. Sometimes this is expressed as an anxiety disorder called robophobia, which affects almost 20% of the world’s population, according to (Davey, 1997). Much greater numbers of humans are possibly deeply biased against robots in many ways, not only because of experiences of limited efficiency and performance during previous application and/or collaboration attempts but also because of some irrational disbelief. This is presently identified as a social problem preventing humankind from more efficient technological development. Therefore, there are a number of ongoing research efforts directed towards exploring the perceptional, cognitive, behavioral, cultural, existential, or economic concerns of humans related to the application of robots and artificial intelligence (Tables 12, 13, 14 and 15).
The term “empathy” is employed to summarize, understand, and explore the affective (emotional, primitive) and cognitive (associated with the ability to understand the mental state or perspective of another person) parameters of human–robot interaction (HRI) (Park & Whang, 2022). A model of empathy in HRI is illustrated in Fig. 6. Generally, the study revealed the absence of the cognitive responses of humans during interaction with robots, while affective response dominates. The authors note the importance of sophisticated emotional models in robotic software for improved human perception and admit the lack of research that would disclose the models with the potential of a more positive perception of the robot by a human user.
One of the few more or less successful emotional models for improved HRI is related to dog-like (canine) social robots, such as Sony’s Aibo (De Visser et al., 2022; Krueger et al., 2021). While investigating the emotional reactions of humans to robots (Table 12), the authors hypothesize that framing a robot with a corresponding appearance and dog-like behavior as a learning puppy has a higher potential for positive acceptance than framing it simply as a robot, such as Spot by Boston Dynamics (De Visser et al., 2022). The authors identify the “uncanine valley” phenomenon in the graph of emotional reaction versus dog-likeness (Fig. 7). Experiments with the appearance of a canine robot, dressing it in fur, showed that the presence or absence of fur changed the emotional reactions of human participants. However, these changes depend on whether the robot is framed as a puppy or just a robot. Overall, it was concluded that framing a canine robot as a learning puppy leads to a richer interactive pattern and human perception in HRI, and this experience can be elaborated for human–robot collaborative environments and situations.
Social touch is another research track of HRI and robot social acceptance improvement. The phenomenon of social touch is investigated mainly in behavioral sciences, and it is widely known that it can elicit a vast range of emotional and behavioral responses. Recently it was shown that a social touch from a robot could be perceived positively by a human participant, reducing stress and improving the sense of intimacy between a human and a robot (Willemse & van Erp, 2019). Still, this research focuses on human psychology, and robot operation was only imitated via the master–slave configuration involving a human moderator at the master controls.
Present research on human reactions to robots has introduced new concepts or paradigms known as anthropomorphism (Fischer, 2022) and “computers are social actors” (J.-E. R. Lee & Nass, 2010). Anthropomorphizing behavior is a specific psychological phenomenon in which people tend to attribute human-like traits to robots. It has been observed in many psychological and physiological studies, but the phenomenon still lacks a more generalized explanation. Systematic qualitative research (Fischer, 2022) demonstrated significant intra- and interpersonal variation in the responses of human participants to identical robot behavior patterns, with easy switching from anthropomorphizing behavior to technical behavior (treating the robot as a machine). These observations are taken as arguments that the paradigm “computers are social actors” does not hold in general, since anthropomorphizing behavior is temporary and differs between persons. However, the conditions of this particular study were not universal, since the appearance and behavior of the robot used in the experiment were far from anthropomorphic.
Furthermore, human reactions and behavior with respect to robots strongly depend on the actual robot implementation use case (Etemad-Sajadi et al., 2022) and on the robot’s structure and appearance (Jørgensen et al., 2022). Questioning respondents after they reviewed Pepper robots in various actions showed that robot acceptance by humans is mainly affected by safety and trust, as well as by robot behavior scenarios. Evidently, the more similar to humans a robot is, the higher the acceptance that can be achieved (Etemad-Sajadi et al., 2022). However, according to (Van Maris et al., 2021), higher acceptance of robots leads to positive consequences only if it is related to broader implementations of the robots. In other cases, it can lead to stronger emotional attachment, dependence on robots, and self-isolation, especially when service robots are used to assist older people. A similar involvement in affection is demonstrated in (Lee & Liang, 2019), where the authors proved that a foot-in-the-door strategy (smaller requests followed by larger ones) can be successfully implemented in HRI to persuade humans to perform required actions.
The study by (Jørgensen et al., 2022) found that human reaction and behavior patterns differ between interactions with conventional and soft robots. Nevertheless, the experiment participants could not specify which robot type seemed more natural. Such behavior proves the complexity of human perception and the need for further extensive multifactorial research in this field.
Human trust and safety in robotic installations were analyzed in many references, which are summarized in Table 13. Direct evaluation of human trust in robot operation is a complicated task; experimental research on the indirect determination of trust level is provided in (Story et al., 2022) and (Horstmann & Krämer, 2022). In addition to experimental methodology, there are simulations of the robot’s impression on the examined person in the working environment (Akalin et al., 2022). Modern augmented reality methods and game environments are useful for human trust research and bring outstanding results (Alarcon et al., 2021). The game environment can attract young people to robotic and industrial settings and help promote industrial careers.
A theoretical analysis of human behavior and comfort level in HRI, using meta-analysis and a broad view of the available literature, is presented in (Hancock et al., 2011). Meta-analysis can predict HRI at an early stage and provide fast results for frequent cases, but local influences (human habits, societal opinion, etc.) leave a broad space for further analysis.
Human acceptance of the robot environment develops a certain human reaction to robots, which can be expressed as acceptance of different degrees. Evaluation of this degree was analyzed in (Roesler et al., 2022). It can be based on the level of anthropomorphism (Xiao et al., 2022), interaction mode (Páez & González, 2022), or general sociality (Londoño et al., 2022).
Another issue limiting the implementation of emotional models into HRI is related to the challenges of evaluating real human emotions: subconscious (instrumental) and conscious (questionnaires) evaluation often provide opposite or noncorrelating responses. Therefore, it is necessary to foster the development of instrumental human emotions evaluation models suitable for HRI (Table 14).
Recent research (Y. Hu et al., 2022) involving 35 participants proved the existence of relationships between humans’ physical and psychological data and their age, gender, perception, and personality during HRI. Such dependencies are vital in developing adaptive HRI frameworks capable of responding to the physical and mental state of the robot operator. Furthermore, it has been experimentally proved that humans misunderstanding robot intentions is one of the major issues (Khoramshahi & Billard, 2019). The authors performed experiments on developing a task-based HRI, where the robot must recognize the human intention and switch to a corresponding task or adjust motion parameters. Nevertheless, it was found that humans sometimes falsely assume that the robot has recognized their intention and, as a result, disturb the process themselves. Therefore, reliable adaptive HRI requires not only instrumental evaluation of human intentions but also feedback from the robot. Furthermore, this study found that different data update rates are required to ensure stable operation: 1 kHz for the haptic device, 200 Hz for the lightweight robotic arm, and 125 Hz for the mobile robot. The benefits of feedback from the robot to the human are discussed in (Willemse & van Erp, 2019). The researchers found that a touch initiated by a service robot has a similar positive effect to a friendly human touch: it minimizes psychological stress, creates stronger bonds between robots and humans, and could extend non-verbal communication capabilities.
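The different update rates above (1 kHz haptic, 200 Hz robotic arm, 125 Hz mobile robot) can be served by one loop running at the fastest rate, with slower subsystems updated only on the ticks that fall on their period. The subsystem names come from the study; the tick-based scheduling scheme itself is our illustrative assumption.

```python
# Sketch of a multirate scheduler: the master loop runs at the fastest
# subsystem rate, and each subsystem updates every (base_rate / its_rate) ticks.
RATES_HZ = {"haptic": 1000, "arm": 200, "mobile": 125}
BASE_HZ = 1000  # fastest subsystem defines the base tick rate

def due_subsystems(tick):
    """Return the subsystems whose update falls on this base-rate tick."""
    return [name for name, hz in RATES_HZ.items()
            if tick % (BASE_HZ // hz) == 0]
```

Here the arm updates every 5th tick and the mobile robot every 8th, so all three rates are met without running three separate real-time loops.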
Implementing robots in social applications brings other HRI issues (Table 15). Human reactions to robots are largely hidden due to their complexity and the frequent possibility of avoiding undesired interactions. Notable research (Choi et al., 2020) on lay people’s social behavior reveals reactions to robots serving in a hotel. Interviews with hotel guests reveal notably strong reactions to the presence of robots in the hotel; a study with 400 participants showed that human staff services are perceived as having slightly higher interaction quality than the services of service robots. Theoretical research on human expectations of robots and robotic technologies is provided in the review (Saunderson & Nejat, 2019), where a survey of specialists indicates the main directions in the acceptance of robot non-verbal behavior. Special social conditions of robot operation among aged persons were defined experimentally (Moro et al., 2019) using a very intelligent robot, Casper. Based on the findings, it was confirmed that the capability to express human-like emotions raises engagement in HRI and brings a general positive effect.
To summarize, all the recent research on human reactions to robots is at the beginning of a quest to discover the complexity of human-to-robot reactions and the corresponding behavior models. The complexity of certain technological developments can make it difficult to fully understand their causes and consequences, and such a lack of understanding can lead individuals or society as a whole to resist or accept these technologies. However, it is also important to note that other factors, such as ethical issues, privacy concerns, or social impact, may contribute to this resistance. Scientists and technology developers must establish open and transparent communication with individuals, communities, and society to better understand their concerns and find solutions that meet everyone’s needs. This may include research to evaluate the technology, developing ethical guidelines to ensure its responsible use, or considering alternative methods that address problems and promote acceptance.
The theoretical and experimental research provided in HRI opens space for measuring the psychological impact of robots on humans. Another big issue is the improvement of human attitudes toward robots and their installations. These issues will stimulate new research and the development of new technologies in the future.
Discussion
The recent research on HRI and related sociopsychological phenomena, such as robophobia and anthropomorphism, is at the beginning of the process of disclosing the vast complexity of its context and indicates the necessary contributions from a wide variety of disciplines. Here, engineering is taking just a minor part of the whole, while psychology, sociology, humanities, and design are becoming equally important.
In summary, our review supports the following statements:

1. Human–robot cooperation and collaboration can be effective in industry and possibly in other partially predefined environments;

2. Communication and messaging in HRI are progressing slowly and lack a systematic approach;

3. HRI reveals new socio-psychological phenomena;

4. The general acceptance of robots over the entire population is mixed and underexplored;

5. Emotional communication is very effective between humans, but better understanding and classification are necessary for improved human–robot interaction.
The dynamic intrusion of robots into everyday life and the limited success of social robotics during the last decade are significant motivators for investments in further HRI research. In our review, we found confirmation that HRI reveals some new socio-psychological phenomena. As our review has shown, not many publications describe successful collaboration between the different disciplines that define the context of such HRI research. On the contrary, technology research continues to progress rapidly at its own pace, while psychological and social research is often organized with outdated and obsolete technical equipment, without explicit practical value.
Robophobia, sometimes referred to as Grimwade’s Syndrome, exists even among the wide public. The relations between robophobia and access to robots are multifaceted. The performed analysis discovered that robophobia mostly affects the broad public with minimal access to robots. Regarding the levels of HRI, the first level (an isolated robotic cell) is mostly safe yet still causes fear in untrained people. Higher levels of interaction typically involve trained personnel, with minimal tendencies to robophobia. On the other hand, this research does not reveal the cause of this phenomenon: possibly Grimwade’s Syndrome can be cured by contact with robots, or personnel with this condition simply avoid robots.
Our review found a sound positive backing for our hypothesis about the effectiveness of human–robot cooperation and collaboration in industry and other environments. The collaborative approach between humans and robots is being demonstrated as productive in industry and medicine, although reliable and unambiguous feedback from humans to robots remains an issue. This also supports our hypothesis about the slow progress of communication and messaging in HRI. While several works demonstrate promising results in applying and developing tactile sensors, others target the development of gesture languages. We found that high adaptivity based on effective machine learning of various types and corresponding AI is essential when humans and robots share the workplaces, synchronize their work, cooperate, and collaborate.
We found support for the hypothesis about emotional feedback as another promising field of HRI research. However, we identify it as still remaining at the basic research and demonstration level. There appears to be a lack of a more general understanding of human emotions as important feedback inputs for robot algorithms. Also, the measurement of emotions remains technologically complicated and of low reliability, mainly because of the variability between human subjects.
In the current discussion, we would like to emphasize the complexity of the human perception of robots. We found support for our hypothesis about the mixed acceptance of robots over the entire population. Although early work identified computers as potential social actors, more recent research has demonstrated mixed and easily switchable social perceptions and acceptance of robots. There are situations in which the same robot can be perceived as an anthropomorphic entity, while it can be treated as just a machine if the situation changes. For example, a robot created for entertainment purposes, such as a humanoid or quadrupedal robot that can dance or play games, may be perceived by audiences as an anthropomorphic or zoomorphic entity. In contrast, a robot for industrial use, such as a robotic arm performing assembly tasks, may be perceived by operators and maintenance personnel as just a machine.
Future research in HRI will develop intuitively predictable robot signaling to humans, disclosing intended actions in advance rather than only factual operations; this will add some trust to robot perception by laymen interacting with robots or robotic complexes. Human emotion evaluation by the robotic system will add flexibility to robot operation, especially in the mobile mode. All these enhancements require a new HRI concept backed by hardware and software. Purely emotional factors of robot perception, such as design, coloration, and sound, significantly impact HRI, so a broad area is open for new activities and design findings.
Social science and psychology have their own challenges: early robophobia detection and prevention, suggestive phobia treatment, or socially preventing phobia-affected people from accessing robots at a physical level. Psychologists should first develop questionnaires for robophobia detection, as already exist for agoraphobia, social phobia, and other types of phobia. Special cases of children’s phobias call for education methodologies or even social animation material, including graphic games. Reworking existing games with proper visualization of robots could reduce fear of robots in general, so that the maturing young generation is not infected with mysteriously born phobias. The treatment of specific phobias also lacks a methodology for robotic phobia; there are no references pointing to a robophobia treatment methodology so far.
Additionally, people’s perceptions of robots can be influenced by factors such as their design, the way they interact with humans, and the tasks they perform. The same holds on the human side, where different perceptions can result, for example, from differences in education. These interpersonal variations are similar in principle to those mentioned in the emotional feedback context and, therefore, similarly challenge robot developers. While emotional feedback can be a highly effective data source for improving collaborating robots, our hypothesis about the need for better understanding and interpretation of human emotions has also found good support.
Conclusions
Many areas of daily life show potential for robotization; however, many factors must be assessed and successfully addressed for robots to work alongside humans seamlessly. Different intensity levels of robot–human interaction offer their own benefits and drawbacks, with relational complexity increasing as we move up the intensity scale. To achieve efficient human–robot collaboration, much work on defining guidelines, safety measures, and design is yet to be done. Considering these factors, further research on human trust, reaction, and response must be conducted to provide more data for generalizations.
There are several recent demonstrations of successful human–robot cooperation with sensory and emotional feedback. However, the extension of these achievements to wider application areas beyond industry and medicine remains limited.
Effective human-to-robot communication remains an issue despite several examples of gesture language, AI-backed speech and face recognition, and emotion recognition algorithms. New gesture, body language, or speech recognition features for robots are required, but such research is not available in public sources.
New socio-psychological phenomena such as robophobia and sporadic anthropomorphism are gaining importance in HRI research. In contrast to other phenomena, these are still new and can affect the human psyche even in the absence of robots.
Emotional feedback from humans to robots is assumed to be the preferable adaptive input for cooperative/collaborative robot behavior. However, realistic human emotion recognition remains a big issue and has low reliability. Moreover, the general acceptance of robots and the emotions they cause require deeper analysis.
Human perception of robots is very complex and presently can be regarded as sporadic, i.e., hardly predictable, easily spreading over an audience, and having strong interpersonal variability. Emotional human–robot communication remains limited, although forthcoming progress will gradually reduce these limits.
Future HRI will intensify and cover a broader public, which implies some challenges in this field. The authors suppose that industrial robots should be classified into more than the two currently existing groups (robots and cobots), according to operation intensity. As a result, safety standards describing all categories of robots will appear, developing optical and other markings to distinguish the safety levels of a robot, defining human behavior in robot environments, and defining safety levels of areas that service robots may enter. Society at large will likely need preparation for robotic safety rules, much like the rules of behavior in street traffic; therefore, all these changes must be naturally understandable and clear to everybody. Specialized production areas will keep no-entry zones for the public, and that level of robot interaction will still require training and education. Ultimately, the authors would encourage leaving space for humans in HRI and for unlimited robot development toward a comfortable human life, unshadowed by massive robophobia.
References
Akalin, N., Kristoffersson, A., & Loutfi, A. (2022). Do you feel safe with your robot? Factors influencing perceived safety in human-robot interaction based on subjective and objective measures. International Journal of Human-Computer Studies, 158, 102744. https://doi.org/10.1016/j.ijhcs.2021.102744
Alarcon, G. M., Gibson, A. M., Jessup, S. A., & Capiola, A. (2021). Exploring the differential effects of trust violations in human-human and human-robot interactions. Applied Ergonomics, 93, 103350. https://doi.org/10.1016/j.apergo.2020.103350
Archer, M. S. (2021). Can humans and AI robots be friends? In Post-Human Futures: Human Enhancement, Artificial Intelligence and Social Theory (pp. 132–152). Taylor and Francis. https://doi.org/10.4324/9781351189958-7
Bajcsy, A., Herbert, S. L., Fridovich-Keil, D., Fisac, J. F., Deglurkar, S., Dragan, A. D., & Tomlin, C. J. (2018). A scalable framework for real-time multi-robot, multi-human collision avoidance. In Proceedings - IEEE International Conference on Robotics and Automation, 2019-May, 936–943.
Ballen-Moreno, F., Bautista, M., Provot, T., Bourgain, M., Cifuentes, C. A., & Múnera, M. (2022). Development of a 3D relative motion method for human-robot interaction assessment. Sensors, 22(6), 2411. https://doi.org/10.3390/s22062411
Bänziger, T., Kunz, A., & Wegener, K. (2020). Optimizing human–robot task allocation using a simulation tool based on standardized work descriptions. Journal of Intelligent Manufacturing, 31, 1635–1648. https://doi.org/10.1007/s10845-018-1411-1
Bhattacharya, S. (2021). A note on robotics and artificial intelligence in pharmacy. Applied Drug Research, Clinical Trials and Regulatory Affairs, 8(2), 125–134. https://doi.org/10.2174/2667337108666211206151551
Bi, L., & Feleke, A. (2019). A review on EMG-based motor intention prediction of continuous human upper limb motion for human-robot collaboration. Biomedical Signal Processing and Control, 51, 113–127. https://doi.org/10.1016/j.bspc.2019.02.01
Boschetti, G., Bottin, M., Faccio, M., & Minto, R. (2021). Multi-robot multi-operator collaborative assembly systems: A performance evaluation model. Journal of Intelligent Manufacturing, 32, 1455–1470. https://doi.org/10.1007/s10845-020-01714-7
Buerkle, A., Eaton, W., Lohse, N., Bamber, T., & Ferreira, P. (2021). EEG based arm movement intention recognition towards enhanced safety in symbiotic Human-Robot Collaboration. Robotics and Computer-Integrated Manufacturing, 70, 102137. https://doi.org/10.1016/j.rcim.2021.102137
Cacace, J., Caccavale, R., Finzi, A., & Grieco, R. (2023). Combining human guidance and structured task execution during physical human–robot collaboration. Journal of Intelligent Manufacturing, 34, 3053–3067. https://doi.org/10.1007/s10845-022-01989-y
Caccavale, R., Saveriano, M., Finzi, A., & Lee, D. (2019). Kinesthetic teaching and attentional supervision of structured tasks in human–robot interaction. Autonomous Robots, 43(6), 1291–1307. https://doi.org/10.1007/s10514-018-9706-9
Canuto, C., Freire, E. O., Molina, L., Carvalho, E. A. N., & Givigi, S. N. (2022). Intuitiveness level: Frustration-based methodology for human-robot interaction gesture elicitation. IEEE Access, 10, 17145–17154. https://doi.org/10.1109/ACCESS.2022.3146838
Caporaso, T., Grazioso, S., & di Gironimo, G. (2022). Development of an integrated virtual reality system with wearable sensors for ergonomic evaluation of human-robot cooperative workplaces. Sensors, 22(6), 2413. https://doi.org/10.3390/s22062413
Choi, Y., Choi, M., Oh, M., & Kim, S. (2020). Service robots in hotels: Understanding the service quality perceptions of human-robot interaction. Journal of Hospitality Marketing & Management, 29(6), 613–635. https://doi.org/10.1080/19368623.2020.1703871
Chutima, P. (2022). A comprehensive review of robotic assembly line balancing problem. Journal of Intelligent Manufacturing, 33(1), 1–34. https://doi.org/10.1007/s10845-020-01641-7
Davey, G. (1997). Phobias: A Handbook of Theory, Research, and Treatment. Wiley.
De Groote, F., & Falisse, A. (2021). Perspective on musculoskeletal modelling and predictive simulations of human movement to assess the neuromechanics of gait. Proceedings of the Royal Society B: Biological Sciences, 288(1946), 20202432. https://doi.org/10.1098/rspb.2020.2432
De Visser, E. J., Topoglu, Y., Joshi, S., Krueger, F., Phillips, E., Gratch, J., Tossell, C. C., & Ayaz, H. (2022). Designing man’s new best friend: Enhancing human-robot dog interaction through dog-like framing and appearance. Sensors. https://doi.org/10.3390/S22031287
Desideri, L., Ottaviani, C., Malavasi, M., di Marzio, R., & Bonifacci, P. (2019). Emotional processes in human-robot interaction during brief cognitive testing. Computers in Human Behavior, 90, 331–342. https://doi.org/10.1016/j.chb.2018.08.013
Duarte, N. F., Rakovic, M., Tasevski, J., Coco, M. I., Billard, A., & Santos-Victor, J. (2018). Action anticipation: Reading the intentions of humans and robots. IEEE Robotics and Automation Letters, 3(4), 4132–4139. https://doi.org/10.1109/LRA.2018.2861569
Dzedzickis, A., Kaklauskas, A., & Bucinskas, V. (2020). Human emotion recognition: Review of sensors and methods. Sensors, 20(3), 592. https://doi.org/10.3390/s20030592
Dzedzickis, A., Subačiūtė-Žemaitienė, J., Šutinys, E., Samukaitė-Bubnienė, U., & Bučinskas, V. (2021). Advanced applications of industrial robotics: New trends and possibilities. Applied Sciences, 12(1), 135. https://doi.org/10.3390/app12010135
Ekman, P. (1992). An argument for basic emotions. Cognition and Emotion, 6(3–4), 169–200. https://doi.org/10.1080/02699939208411068
Etemad-Sajadi, R., Soussan, A., & Schöpfer, T. (2022). How ethical issues raised by human-robot interaction can impact the intention to use the robot? International Journal of Social Robotics, 14(4), 1103–1115. https://doi.org/10.1007/S12369-021-00857-8
Faccio, M., Granata, I., Menini, A., Milanese, M., Rossato, C., Bottin, M., Minto, R., Pluchino, P., Gamberini, L., Boschetti, G., & Rosati, G. (2023). Human factors in cobot era: A review of modern production systems features. Journal of Intelligent Manufacturing, 34(1), 85–106. https://doi.org/10.1007/s10845-022-01953-w
Fisac, J. F., Bajcsy, A., Herbert, S. L., Fridovich-Keil, D., Wang, S., Tomlin, C. J., & Dragan, A. D. (2018). Probabilistically safe robot planning with confidence-based human predictions. Robotics: Science and Systems. https://doi.org/10.48550/arxiv.1806.00109
Fischer, K. (2022). Tracking anthropomorphizing behavior in human-robot interaction. ACM Transactions on Human-Robot Interaction, 11(1), 1–28. https://doi.org/10.1145/3442677
Fraune, M. R., Sherrin, S., Šabanović, S., & Smith, E. R. (2019). Is human-robot interaction more competitive between groups than between individuals? In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 104–113). IEEE. https://doi.org/10.1109/HRI.2019.8673241
Fukumori, T., Cai, C., Zhang, Y., el Hafi, L., Hagiwara, Y., Nishiura, T., & Taniguchi, T. (2022). Optical laser microphone for human-robot interaction: Speech recognition in extremely noisy service environments. Advanced Robotics, 36(5–6), 304–317. https://doi.org/10.1080/01691864.2021.2023629
Gaggioli, A., Chirico, A., di Lernia, D., Maggioni, M. A., Malighetti, C., Manzi, F., Marchetti, A., Massaro, D., Rea, F., Rossignoli, D., Sandini, G., Villani, D., Wiederhold, B. K., Riva, G., & Sciutti, A. (2021). Machines like us and people like you: Toward human-robot shared experience. Cyberpsychology, Behavior, and Social Networking, 24(5), 357–361. https://doi.org/10.1089/cyber.2021.29216.aga
Gomez Chavez, A., Ranieri, A., Chiarella, D., Zereik, E., Babić, A., & Birk, A. (2019). CADDY underwater stereo-vision dataset for human-robot interaction (HRI) in the context of diver activities. Journal of Marine Science and Engineering, 7(1), 16. https://doi.org/10.3390/jmse7010016
Gualtieri, L., Rauch, E., & Vidoni, R. (2022). Development and validation of guidelines for safety in human-robot collaborative assembly systems. Computers & Industrial Engineering, 163, 107801. https://doi.org/10.1016/j.cie.2021.107801
Gui, L.-Y., Zhang, K., Wang, Y.-X., Liang, X., Moura, J. M. F., & Veloso, M. (2018). Teaching robots to predict human motion. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, 562–567. https://doi.org/10.1109/IROS.2018.8594452
Hancock, P. A., Billings, D. R., Schaefer, K. E., Chen, J. Y. C., De Visser, E. J., & Parasuraman, R. (2011). A meta-analysis of factors affecting trust in human-robot interaction. Human Factors: The Journal of the Human Factors and Ergonomics Society, 53(5), 517–527. https://doi.org/10.1177/0018720811417254
Hatfield, E., Bensman, L., Thornton, P. D., & Rapson, R. L. (2014). New perspectives on emotional contagion: A review of classic and recent research on facial mimicry and contagion. Interpersona an International Journal on Personal Relationships, 8(2), 159–179. https://doi.org/10.5964/ijpr.v8i2.162
Hayashi, Y., & Wakabayashi, K. (2018). Influence of robophobia on decision making in a court scenario. In Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, 121–122. https://doi.org/10.1145/3173386.3176988
Hellström, T., & Bensch, S. (2018). Understandable Robots. Paladyn, 9(1), 110–123. https://doi.org/10.1515/PJBR-2018-0009
Hentout, A., Aouache, M., Maoudj, A., & Akli, I. (2019). Human–robot interaction in industrial collaborative robotics: A literature review of the decade 2008–2017. Advanced Robotics, 33(15–16), 764–799. https://doi.org/10.1080/01691864.2019.1636714
Higgins, P., Kebe, G. Y., Berlier, A. J., Darvish, K., Engel, D., & Ferraro, F. (2021). Towards Making Virtual Human-Robot Interaction a Reality. https://doi.org/10.13016/M2LHCH-CUZP
Hjorth, S., & Chrysostomou, D. (2022). Human–robot collaboration in industrial environments: A literature review on non-destructive disassembly. Robotics and Computer-Integrated Manufacturing, 73, 102208. https://doi.org/10.1016/j.rcim.2021.102208
Horstmann, A. C., & Krämer, N. C. (2022). The fundamental attribution error in human-robot interaction: An experimental investigation on attributing responsibility to a social robot for its pre-programmed behavior. International Journal of Social Robotics, 14(5), 1137–1153. https://doi.org/10.1007/S12369-021-00856-9
Hu, H., & Fisac, J. F. (2022). Active Uncertainty Reduction for Human-Robot Interaction: An Implicit Dual Control Approach. http://arxiv.org/abs/2202.07720
Hu, Y., Abe, N., Benallegue, M., Yamanobe, N., Venture, G., & Yoshida, E. (2022). Toward active physical human-robot interaction: Quantifying the human state during interactions. IEEE Transactions on Human-Machine Systems, 52(3), 367–378. https://doi.org/10.1109/THMS.2021.3138684
Huang, R., Cheng, H., Qiu, J., & Zhang, J. (2019). Learning physical human-robot interaction with coupled cooperative primitives for a lower exoskeleton. IEEE Transactions on Automation Science and Engineering, 16(4), 1566–1574. https://doi.org/10.1109/TASE.2018.2886376
Innes, J. M., & Morrison, W. B. (2021). Experimental studies of human-robot interaction: Threats to valid interpretation from methodological constraints associated with experimental manipulations. International Journal of Social Robotics, 13(4), 765–773. https://doi.org/10.1007/s12369-020-00671-8
Hofstede Insights. (2023). Retrieved December 4, 2023, from https://www.hofstede-insights.com/country-comparison-tool?countries=denmark,greece
Jørgensen, J., Bojesen, K. B., & Jochum, E. (2022). Is a soft robot more “Natural”? Exploring the perception of soft robotics in human-robot interaction. International Journal of Social Robotics, 14(1), 95–113. https://doi.org/10.1007/s12369-021-00761-1
Kaonain, T. E., Rahman, M. A. A., Ariff, M. H. M., Yahya, W. J., & Mondal, K. (2021). Collaborative robot safety for human-robot interaction in domestic simulated environments. IOP Conference Series: Materials Science and Engineering, 1096(1), 012029. https://doi.org/10.1088/1757-899X/1096/1/012029
Katsanis, I. A., & Moulianitis, V. C. (2021). An architecture for safe child-robot interactions in autism interventions. Robotics, 10(1), 20. https://doi.org/10.3390/robotics10010020
Kempt, H. (2020). Social Reverberations. In Social Reverberations (pp. 137–173).
Kempt, H. (2022). Social Integration. In Synthetic Friends (pp. 163–183). Berlin: Springer.
Khairuddin, I. M., Sidek, S. N., Majeed, A. P. P. A., Razman, M. A. M., Puzi, A. A., & Yusof, H. M. (2021). The classification of movement intention through machine learning models: The identification of significant time-domain EMG features. PeerJ Computer Science, 7, 1–15. https://doi.org/10.7717/PEERJ-CS.379/SUPP-2
Kitagawa, R., Liu, Y., & Kanda, T. (2021). Human-inspired motion planning for omni-directional social robots. In ACM/IEEE International Conference on Human-Robot Interaction, 34–42. https://doi.org/10.1145/3434073.3444679
Kousi, N., Stoubos, C., Gkournelos, C., Michalos, G., & Makris, S. (2019). Enabling Human robot interaction in flexible robotic assembly lines: An augmented reality based software suite. Procedia CIRP, 81, 1429–1434. https://doi.org/10.1016/j.procir.2019.04.328
Krueger, F., Mitchell, K. C., Deshpande, G., & Katz, J. S. (2021). Human–dog relationships as a working framework for exploring human–robot attachment: A multidisciplinary review. Animal Cognition, 24(2), 371–385. https://doi.org/10.1007/S10071-021-01472-W
Kuhail, M. A., Berengueres, J., Taher, F., Alkuwaiti, M., & Khan, S. Z. (2023). Haptic systems: Trends and lessons learned for haptics in spacesuits. Electronics, 12, 1888. https://doi.org/10.3390/electronics12081888
Kulke, L., Feyerabend, D., & Schacht, A. (2020). A comparison of the affectiva iMotions facial expression analysis software with emg for identifying facial expressions of Emotion. Frontiers in Psychology, 11, 329. https://doi.org/10.3389/FPSYG.2020.00329
Lai, Y., Paul, G., Cui, Y., & Matsubara, T. (2022). User intent estimation during robot learning using physical human robot interaction primitives. Autonomous Robots, 46(2), 421–436. https://doi.org/10.1007/S10514-021-10030-9
Lavit Nicora, M., Ambrosetti, R., Wiens, G. J., & Fassi, I. (2021). Human-robot collaboration in smart manufacturing: Robot reactive behavior intelligence. Journal of Manufacturing Science and Engineering, Transactions of the ASME. https://doi.org/10.1115/1.4048950/1089694
Lee, J.-E. R., & Nass, C. I. (2010). Trust in Computers. In Trust and Technology in a Ubiquitous Modern Environment (pp. 1–15). IGI Global. https://doi.org/10.4018/978-1-61520-901-9.ch001
Lee, S. A., & Liang, Y. J. (2019). Robotic foot-in-the-door: Using sequential-request persuasive strategies in human-robot interaction. Computers in Human Behavior, 90, 351–356. https://doi.org/10.1016/j.chb.2018.08.026
Leichtmann, B., Nitsch, V., & Mara, M. (2022). Crisis ahead? Why human-robot interaction user studies may have replicability problems and directions for improvement. Frontiers in Robotics and AI, 9. https://doi.org/10.3389/frobt.2022.838116
Lestingi, L., Askarpour, M., Bersani, M. M., & Rossi, M. (2021). A deployment framework for formally verified human-robot interactions. IEEE Access, 9, 136616–136635. https://doi.org/10.1109/ACCESS.2021.3117852
Lewandowska, A., Rejer, I., Bortko, K., & Jankowski, J. (2022). Eye-tracker study of influence of affective disruptive content on user’s visual attention and emotional state. Sensors, 22(2), 547. https://doi.org/10.3390/s22020547
Li, G., Li, Z., & Kan, Z. (2022). Assimilation control of a robotic exoskeleton for physical human-robot interaction. IEEE Robotics and Automation Letters, 7(2), 2977–2984. https://doi.org/10.1109/LRA.2022.3144537
Li, J., Lu, L., Zhao, L., Wang, C., & Li, J. (2021). An integrated approach for robotic Sit-To-Stand assistance: Control framework design and human intention recognition. Control Engineering Practice, 107, 104680. https://doi.org/10.1016/j.conengprac.2020.104680
Li, W., Hu, Y., Zhou, Y., & Pham, D. T. (2023). Safe human–robot collaboration for industrial settings: A survey. Journal of Intelligent Manufacturing. https://doi.org/10.1007/s10845-023-02159-4
Lim, J. Z., Mountstephens, J., & Teo, J. (2020). Emotion recognition using eye-tracking: Taxonomy, review and current challenges. Sensors, 20(8), 2384. https://doi.org/10.3390/s20082384
Lim, Y., Pongsakornsathien, N., Gardi, A., Sabatini, R., Kistan, T., Ezer, N., & Bursch, D. J. (2021). Adaptive human-robot interactions for multiple unmanned aerial vehicles. Robotics, 10(1), 12. https://doi.org/10.3390/robotics10010012
Liu, Z., Lyu, K., Wu, S., Chen, H., Hao, Y., & Ji, S. (2021). Aggregated multi-GANs for controlled 3D human motion prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 35(3), 2225–2232.
Londoño, L., Röfer, A., Welschehold, T., & Valada, A. (2022). Doing Right by Not Doing Wrong in Human-Robot Collaboration. https://doi.org/10.48550/arxiv.2202.02654
Maccarini, A. M. (2021). The social meanings of perfection: Human self-understanding in a post-human society. In What is Essential to Being Human?: Can AI Robots Not Share It? (pp. 197–213). Taylor and Francis. https://doi.org/10.4324/9780429351563-10
Maggioni, M. A., & Rossignoli, D. (2023). If it looks like a human and speaks like a human. Communication and cooperation in strategic Human-Robot interactions. Journal of Behavioral and Experimental Economics, 104, 102011. https://doi.org/10.1016/j.socec.2023.102011
Malik, A. A., & Bilberg, A. (2019). Developing a reference model for human–robot interaction. International Journal on Interactive Design and Manufacturing (IJIDeM), 13(4), 1541–1547. https://doi.org/10.1007/s12008-019-00591-6
Maroger, I., Ramuzat, N., Stasse, O., & Watier, B. (2021). Human trajectory prediction model and its coupling with a walking pattern generator of a humanoid robot. IEEE Robotics and Automation Letters, 6(4), 6361–6369. https://doi.org/10.1109/LRA.2021.3092750
Matheson, E., Minto, R., Zampieri, E. G. G., Faccio, M., & Rosati, G. (2019). Human-robot collaboration in manufacturing applications: A review. Robotics, 8(4), 100. https://doi.org/10.3390/ROBOTICS8040100
Mazhar, O., Navarro, B., Ramdani, S., Passama, R., & Cherubini, A. (2019). A real-time human-robot interaction framework with robust background invariant hand gesture detection. Robotics and Computer-Integrated Manufacturing, 60, 34–48. https://doi.org/10.1016/j.rcim.2019.05.008
Melchiorre, M., Scimmi, L. S., Mauro, S., & Pastorelli, S. P. (2021). Vision-based control architecture for human–robot hand-over applications. Asian Journal of Control, 23(1), 105–117. https://doi.org/10.1002/asjc.2480
Moro, C., Lin, S., Nejat, G., & Mihailidis, A. (2019). Social robots and seniors: A comparative study on the influence of dynamic social features on human-robot interaction. International Journal of Social Robotics, 11(1), 5–24. https://doi.org/10.1007/s12369-018-0488-1
Mugisha, S., Guda, V. K., Chevallereau, C., Zoppi, M., Molfino, R., & Chablat, D. (2022). Improving haptic response for contextual human robot interaction. Sensors, 22(5), 2040. https://doi.org/10.3390/s22052040
Müller, M., Ruppert, T., Jazdi, N., & Weyrich, M. (2023). Self-improving situation awareness for human–robot-collaboration using intelligent Digital Twin. Journal of Intelligent Manufacturing. https://doi.org/10.1007/s10845-023-02138-9
Murata, S., Yamashita, Y., Arie, H., Ogata, T., Sugano, S., & Tani, J. (2017). Learning to perceive the world as probabilistic or deterministic via interaction with others: A neuro-robotics experiment. IEEE Transactions on Neural Networks and Learning Systems, 28(4), 830–848. https://doi.org/10.1109/TNNLS.2015.2492140
Neto, P., Simão, M., Mendes, N., & Safeea, M. (2019). Gesture-based human-robot interaction for human assistance in manufacturing. The International Journal of Advanced Manufacturing Technology, 101(1–4), 119–135. https://doi.org/10.1007/s00170-018-2788-x
Noroozi, F., Corneanu, C. A., Kaminska, D., Sapinski, T., Escalera, S., & Anbarjafari, G. (2021). Survey on emotional body gesture recognition. IEEE Transactions on Affective Computing, 12(2), 505–523. https://doi.org/10.1109/TAFFC.2018.2874986
Obo, T., & Takizawa, K. (2022). Analysis of timing and effect of visual cue on turn-taking in human-robot interaction. Journal of Robotics and Mechatronics, 34(2), 55–63.
Oliveira, R., Arriaga, P., & Paiva, A. (2021). Human-robot interaction in groups: Methodological and research practices. Multimodal Technologies and Interaction, 5(10), 59. https://doi.org/10.3390/mti5100059
Páez, J., & González, E. (2022). Human-robot scaffolding: An architecture to foster problem-solving skills. ACM Transactions on Human-Robot Interaction, 11(3), 1–17. https://doi.org/10.1145/3526109
Park, S., & Whang, M. (2022). Empathy in human-robot interaction: Designing for social robots. International Journal of Environmental Research and Public Health. https://doi.org/10.3390/IJERPH19031889
Pathi, S. K., Kiselev, A., & Loutfi, A. (2022). Detecting groups and estimating F-formations for social human-robot interactions. Multimodal Technologies and Interaction, 6(3), 18. https://doi.org/10.3390/mti6030018
Porpora, D. (2021). On robophilia and robophobia. In What Is Essential to Being Human?: Can AI Robots Not Share It? (pp. 26–39). https://doi.org/10.4324/9780429351563-2
Qian, K., Xu, X., Liu, H., Bai, J., & Luo, S. (2022). Environment-adaptive learning from demonstration for proactive assistance in human–robot collaborative tasks. Robotics and Autonomous Systems, 151, 104046. https://doi.org/10.1016/j.robot.2022.104046
Qu, W., Li, J., Zhang, R., Liu, S., & Bao, J. (2023). Adaptive planning of human–robot collaborative disassembly for end-of-life lithium-ion batteries based on digital twin. Journal of Intelligent Manufacturing. https://doi.org/10.1007/s10845-023-02081-9
Rabb, N., Law, T., Chita-Tegmark, M., & Scheutz, M. (2022). An attachment framework for human-robot interaction. International Journal of Social Robotics, 14(2), 539–559. https://doi.org/10.1007/s12369-021-00802-9
Rahman, S. M. M. (2021). Machine learning-based cognitive position and force controls for power-assisted human-robot collaborative manipulation. Machines, 9(2), 28. https://doi.org/10.3390/machines9020028
Richards, L. E., & Matuszek, C. (2021). Learning to Understand Non-Categorical Physical Language for Human Robot Interactions. https://doi.org/10.13016/m2lbuq-ulee
Richardson, S. (2020). Affective computing in the modern workplace. Business Information Review, 37(2), 78–85. https://doi.org/10.1177/0266382120930866
Roesler, E., Naendrup-Poell, L., Manzey, D., & Onnasch, L. (2022). Why context matters: The influence of application domain on preferred degree of anthropomorphism and gender attribution in human-robot interaction. International Journal of Social Robotics, 14(5), 1155–1166. https://doi.org/10.1007/S12369-021-00860-Z
Ruhland, K., Peters, C. E., Andrist, S., Badler, J. B., Badler, N. I., Gleicher, M., Mutlu, B., & McDonnell, R. (2015). A review of eye gaze in virtual agents, social robotics and HCI: Behaviour generation, user interaction and perception. Computer Graphics Forum, 34(6), 299–326. https://doi.org/10.1111/cgf.12603
Sanders, T., Kaplan, A., Koch, R., Schwartz, M., & Hancock, P. A. (2019). The relationship between trust and use choice in human-robot interaction. Human Factors: The Journal of the Human Factors and Ergonomics Society, 61(4), 614–626. https://doi.org/10.1177/0018720818816838
Saunderson, S., & Nejat, G. (2019). How robots influence humans: A survey of nonverbal communication in social human-robot interaction. International Journal of Social Robotics, 11(4), 575–608. https://doi.org/10.1007/s12369-019-00523-0
Schydlo, P., Rakovic, M., Jamone, L., & Santos-Victor, J. (2018). Anticipation in human-robot cooperation: A recurrent neural network approach for multiple action sequences prediction. IEEE International Conference on Robotics and Automation (ICRA), 2018, 1–6. https://doi.org/10.1109/ICRA.2018.8460924
Shi, D., Zhang, W., Zhang, W., Ju, L., & Ding, X. (2021). Human-centred adaptive control of lower limb rehabilitation robot based on human–robot interaction dynamic model. Mechanism and Machine Theory, 162, 104340. https://doi.org/10.1016/j.mechmachtheory.2021.104340
Song, C. S., & Kim, Y. K. (2022). The role of the human-robot interaction in consumers’ acceptance of humanoid retail service robots. Journal of Business Research, 146, 489–503. https://doi.org/10.1016/j.jbusres.2022.03.087
Song, S., Kidziński, Ł., Peng, X. B., Ong, C., Hicks, J., Levine, S., Atkeson, C. G., & Delp, S. L. (2021). Deep reinforcement learning for modeling human locomotion control in neuromechanical simulation. Journal of NeuroEngineering and Rehabilitation, 18(1), 126. https://doi.org/10.1186/s12984-021-00919-y
Spatola, N., & Wudarczyk, O. A. (2021). Implicit attitudes towards robots predict explicit attitudes, semantic distance between robots and humans, anthropomorphism, and prosocial behavior: From attitudes to human-robot interaction. International Journal of Social Robotics, 13(5), 1149–1159. https://doi.org/10.1007/S12369-020-00701-5
Story, M., Webb, P., Fletcher, S. R., Tang, G., Jaksic, C., & Carberry, J. (2022). Do speed and proximity affect human-robot collaboration with an industrial robot arm? International Journal of Social Robotics, 14(4), 1087–1102. https://doi.org/10.1007/S12369-021-00853-Y
Strazdas, D., Hintz, J., Khalifa, A., Abdelrahman, A. A., Hempel, T., & Al-Hamadi, A. (2022). Robot system assistant (RoSA): Towards intuitive multi-modal and multi-device human-robot interaction. Sensors, 22(3), 923. https://doi.org/10.3390/s22030923
Toichoa Eyam, A., Mohammed, W. M., & Martinez Lastra, J. L. (2021). Emotion-driven analysis and control of human-robot interactions in collaborative applications. Sensors, 21(14), 4626. https://doi.org/10.3390/s21144626
Umbrico, A., Orlandini, A., Cesta, A., Faroni, M., Beschi, M., Pedrocchi, N., & Makris, S. (2022). Design of advanced human–robot collaborative cells for personalized human–robot collaborations. Applied Sciences, 12(14), 6839. https://doi.org/10.3390/app12146839
Van Maris, A., Zook, N., Dogramadzi, S., Studley, M., Winfield, A., & Caleb-Solly, P. (2021). A new perspective on robot ethics through investigating human-robot interactions with older adults. Applied Sciences, 11(21), 10136. https://doi.org/10.3390/app112110136
Vasconez, J. P., Kantor, G. A., & Auat Cheein, F. A. (2019). Human–robot interaction in agriculture: A survey and current challenges. Biosystems Engineering, 179, 35–48. https://doi.org/10.1016/j.biosystemseng.2018.12.005
Vianello, L., Ivaldi, S., Aubry, A., & Peternel, L. (2023). The effects of role transitions and adaptation in human–cobot collaboration. Journal of Intelligent Manufacturing. https://doi.org/10.1007/s10845-023-02104-5
Vianello, L., Mouret, J.-B., Dalin, E., Aubry, A., & Ivaldi, S. (2021). Human posture prediction during physical human-robot interaction. IEEE Robotics and Automation Letters, 6(3), 6046–6053. https://doi.org/10.1109/LRA.2021.3086666
Wan, S., Gu, Z., & Ni, Q. (2020). Cognitive computing and wireless communications on the edge for healthcare service robots. Computer Communications, 149, 99–106. https://doi.org/10.1016/j.comcom.2019.10.012
Wang, W., Chen, Y., Li, R., & Jia, Y. (2019). Learning and comfort in human-robot interaction: A review. Applied Sciences, 9(23), 5152. https://doi.org/10.3390/app9235152
Weis, P. P., & Herbert, C. (2022). Do I still like myself? Human-robot collaboration entails emotional consequences. Computers in Human Behavior, 127, 107060. https://doi.org/10.1016/j.chb.2021.107060
Willemse, C. J. A. M., & van Erp, J. B. F. (2019). Social touch in human-robot interaction: Robot-initiated touches can induce positive responses without extensive prior bonding. International Journal of Social Robotics, 11(2), 285–304. https://doi.org/10.1007/s12369-018-0500-9
Xiao, C., Fan, Y., Zhang, J., & Zhou, R. (2022). People do not automatically take the level-1 visual perspective of humanoid robot avatars. International Journal of Social Robotics, 14(1), 165–176. https://doi.org/10.1007/s12369-021-00773-x
Xiong, J., Chen, J., & Lee, P. S. (2021). Functional fibers and fabrics for soft robotics, wearables, and human-robot interface. Advanced Materials, 33(19), 2002640. https://doi.org/10.1002/adma.202002640
Yao, X., Ma, N., Zhang, J., Wang, K., Yang, E., & Faccio, M. (2022). Enhancing wisdom manufacturing as industrial metaverse for industry and society 5.0. Journal of Intelligent Manufacturing, 35(1), 235–255. https://doi.org/10.1007/s10845-022-02027-7
Yu, J., Gao, H., Chen, Y., Zhou, D., Liu, J., & Ju, Z. (2022). Deep object detector with attentional spatiotemporal LSTM for space human-robot interaction. IEEE Transactions on Human-Machine Systems, 52(4), 784–793. https://doi.org/10.1109/THMS.2022.3144951
Zacharaki, N., Dimitropoulos, N., & Makris, S. (2022). Challenges in human-robot collaborative assembly in shipbuilding and ship maintenance, repair and conversion (SMRC) industry. Procedia CIRP, 106, 120–125. https://doi.org/10.1016/j.procir.2022.02.165
Funding
This project has received financial support from the Research Council of Lithuania (LMTLT), Nr. P-LLT-21–6, the State Education Development Agency of Latvia, and the Ministry of Science and Technology (MOST) of Taiwan.
Author information
Contributions
Conceptualization, AD. and VB; methodology, VB and DV; validation, AD, GV and VB; formal analysis, DV and VB; investigation, GV and KL; resources, VB; data curation, GV and KL; writing—original draft preparation, DV; AD; KL.; writing—review and editing, DV, VB. and AD; visualization, AD; supervision, VB; funding acquisition, VB. All authors have read and agreed to the published version of the manuscript.
Ethics declarations
Conflict of interest
The authors have no conflicts of interest to declare that are relevant to the content of this article.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Dzedzickis, A., Vaičiūnas, G., Lapkauskaitė, K. et al. Recent advances in human–robot interaction: robophobia or synergy. J Intell Manuf (2024). https://doi.org/10.1007/s10845-024-02362-x