Social Robotics is an interdisciplinary playground where researchers from sociology, psychology, neuroscience, and robotics meet to design one kind of future robot: the social robot. The challenge is simple to describe but complex to achieve. A social robot can be a classical robot or, more generally, any machine with sufficient resources and capabilities to receive and exchange information with humans, with the ultimate goal of helping people in daily-life activities. Exchanges should be intuitive and easy: these two key characteristics strongly define the sociability of artificial agents.

The system formed by a human and a social robot is hybrid by nature. How should we consider its communication and interactions compared with a human-human system? Humans use all available sensory and motor modalities to communicate and establish social relations. Human-human interaction has a vast literature base that continues to be expanded by communication scientists, psychologists, and neuroscientists, among others. In human-robot systems, however, the inherent asymmetries prevent researchers from directly borrowing classical human-human approaches. Nevertheless, these efforts can inspire the development of tools and techniques for observing and evaluating human-robot interaction (HRI). The fundamental question is whether the same, or similar, mechanisms that drive human-human interaction also drive human-robot interaction.

Although the controllable, measurable, and observable nature of robots provides researchers with a potentially infinite set of experimental conditions, the field of HRI is constrained by a shortage of objective measurement methods for evaluating the nature and quality of communication between the elements of such systems.

Current approaches for evaluating interaction data include overt measures such as gestures, gesticulations, facial expressions, inter-person distance, speech, eye gaze, body language, and impulsive movements. An arguably more difficult challenge is inferring the more covert affective states, emotional reactions (trust, sense of safety), and non-verbal messages, as well as autonomic responses such as body temperature, skin pressure, and other biological signals, including the electrical activity of the brain. The ultimate objective, of course, is to transcend the many, often isolated, data-gathering efforts and combine the various approaches through classification and interpretation, using tools from fields such as neuroscience, signal processing and modeling, and psychology (including survey data regarding non-technophobia, reciprocal behaviors, and positive ratings), in the hope of answering the following research questions: Can we trust a robot? Can a robot interpret human mental states? Can we understand the information that the robot is trying to display? Can biological signals be used to predict human actions towards a robot? Are biological signals directly interpretable, to the point of allowing artificial systems to make inferences about human behavior towards robots? More generally, which physiological and behavioral features should be considered when investigating the perception-action loop between a human and a robotic agent?

With this perspective in mind, we sought contributions that address methodologies and techniques developed by, or used within, the social robotics community to measure human-robot synergy.

The first paper, Measuring human-robot interaction through motor resonance by Alessandra Sciutti, Ambra Bisio, Francesco Nori, Giorgio Metta, Luciano Fadiga, Thierry Pozzo and Giulio Sandini, provides an overview of findings from the motor-resonance literature that could be exploited to measure involvement in interactions with robots. Motivated by the fact that the physiological measurements adopted so far are not tightly related to the mechanisms underlying social interaction, the authors show that assessing motor resonance may be a promising approach for evaluating interactions. Indeed, robotic agents have been shown to evoke activity in the Mirror Neuron System similar to that evoked by humans, in both neurophysiological and neuroimaging measures. Such activity gives rise to a series of resonant behaviours: humanoid robots seem to generate motion interference on human gestures only when moving according to the biological laws of motion, and humans trained on a motor task first demonstrated by an agent behaved similarly regardless of the nature of the motor primer (human vs. robotic). When comparing physiological and behavioural cues of motor resonance, the former are present even when a non-biological agent moves with non-biological kinematics, whereas the latter apparently require a higher degree of human resemblance or attributed animacy. How phenomena such as the Uncanny Valley are assessed therefore matters. The authors propose to measure two natural phenomena linked with motor resonance, proactive gaze and automatic imitation, to improve the ecological validity of HRI protocols.
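
The motor-interference phenomenon mentioned above lends itself to a simple quantification. The following Python sketch (ours, for illustration only; the data and names are hypothetical placeholders, not the authors' protocol) computes a common interference index, the variance of hand positions orthogonal to the instructed movement direction, compared between congruent and incongruent observation conditions:

```python
import numpy as np

def interference_index(trajectory: np.ndarray, movement_axis: int = 0) -> float:
    """Variance of hand positions orthogonal to the instructed movement axis.

    trajectory: (n_samples, 2) array of hand positions in the movement plane,
    e.g. from a motion-capture marker on the wrist (hypothetical data format).
    """
    orthogonal_axis = 1 - movement_axis
    return float(np.var(trajectory[:, orthogonal_axis]))

# Placeholder trajectories standing in for real motion-capture recordings.
rng = np.random.default_rng(0)
congruent = rng.normal(0.0, 0.5, size=(500, 2))    # observed motion matches own
incongruent = rng.normal(0.0, 0.9, size=(500, 2))  # observed motion is orthogonal

# A larger index under incongruent observation indicates motor interference,
# i.e. the observed (human or robot) movement "leaks" into the performed one.
print(interference_index(congruent), interference_index(incongruent))
```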

In the second article of this issue, You want me to trust a ROBOT? The Development of a human-robot interaction trust scale, by Rosemarie E. Yagoda and Douglas J. Gillan, the authors address a basic and natural question most naïve users of complex technologies ask, at least unconsciously: “Should I trust a robot?” Here, however, the trust issue is addressed in a social robotics context: the authors propose a methodology for building a scale that allows objective trust measurements in different human-robot team configurations. Inspired by evaluations of human-machine interaction, the proposed two-phase technique measures a person's expectation that the behavior, statements (verbal or written), or promises of the robot can be relied upon. The authors start from a complete multidimensional representation of trust and, from that, retain a subset of the initial items to build the final trust measurement tool.
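
Although the authors' item-retention procedure is their own, scale development of this kind typically checks the internal consistency of candidate items. As a hedged illustration (a standard technique, not the paper's specific method; the data and names below are hypothetical), Cronbach's alpha can be computed as follows:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of a scale.

    items: (n_respondents, n_items) matrix of Likert responses.
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical responses: 50 participants rating 10 candidate trust items
# on a 1-7 Likert scale. Low-consistency items would be dropped before
# recomputing alpha on the retained subset.
rng = np.random.default_rng(1)
responses = rng.integers(1, 8, size=(50, 10)).astype(float)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```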

In the third article, Facial Communicative Signals: Valence Recognition in Task-Oriented Human-Robot Interaction, Christian Lang, Sven Wachsmuth, Marc Hanheide and Heiko Wersing propose to take the perspective of a robot during learning sequences: they investigate naïve users' reactions when they try to teach a robot to name objects. This investigation has two aims: (1) to use a robot to elicit the teachers' facial communicative signals (FCS) in learning situations, and (2) to let a robot classify the recorded FCS to measure the valence of the human collaborator's feedback. During learning, the robot reacts according to different predetermined schemes, signalling to the teacher whether or not it understood. In a second step, the relevant FCSs are combined with an active appearance model (AAM) to train a classifier that interprets the human teacher's feedback in such learning situations.
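
As a rough illustration of this second step (a minimal sketch under our own assumptions, not the authors' pipeline; the AAM feature vectors and labels below are placeholders), one could train a valence classifier on per-frame AAM parameters:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: one AAM parameter vector per video frame of the
# teacher's face, labelled with the valence of the teaching situation
# (1 = robot responded correctly, 0 = robot responded incorrectly).
rng = np.random.default_rng(42)
aam_features = rng.normal(size=(400, 30))   # 400 frames, 30 AAM parameters
valence = rng.integers(0, 2, size=400)      # placeholder labels

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, aam_features, valence, cv=5)
print(f"cross-validated valence accuracy: {scores.mean():.2f}")
```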

In the fourth paper, Telerobotic Pointing Gestures Shape Human Spatial Cognition, John-John Cabibihan, Wing-Chee So, Sujin Saj and Zhengchen Zhang consider two aspects of evaluating interactions: tele-operation as a practical way to sidestep issues of robot autonomy, and multimodal interaction as a key approach in HRI. In their experiment, a tele-operated robot is used to measure the effectiveness of speech and gestures in helping organize a user's spatial memory. Spatial information about given objects is conveyed through a multimodal interface combining gestures and/or speech. The results are in line with findings in neuroscience, demonstrating that spatial organization is enhanced when the description of the world is delivered through all available sensory channels, which can disambiguate confusing messages.

The fifth article, Evaluation of the Robot Assisted Sign Language Tutoring using video-based studies by Hatice Kose-Bagci, Rabia Yorganci, Hatice Esra Algan and Dag S. Syrdal, reports on methodologies for comparing the behaviors of human teachers and a humanoid robot simulator during the teaching of Turkish sign language. The authors designed specific protocols and performed several studies to cross-validate the utility of relying on a robotic platform to achieve an effective social activity. Large surveys covering all targeted populations and real-use conditions were performed to demonstrate the effectiveness of the robot-mediated teaching activity.

The sixth article, Measuring the Improvement of the Interaction Comfort of a Wearable Exoskeleton: a multi-modal control mechanism based on force measurement and movement prediction, by Michele Folgheraiter, Mathias Jordan, Sirko Straube, Anett Seeland, Su Kyoung Kim and Elsa Andrea Kirchner, proposes a novel, measurable approach to safe HRI and control: it integrates surface EEG signals into the indirect control of an exoskeleton. EEG signals are used to predict movement intention via machine learning techniques, in order to smooth the transitions between periods of use and non-use of the exoskeleton. In other words, the exoskeleton is able to prepare for the movement, thereby reducing the effort the user needs to initiate it; the observed reduction in interaction force provides additional evidence of this effect. This passive Brain-Machine Interface is tested in teleoperation mode using an immersive virtual environment. The authors show that it preserves the safety of the operator in any working modality, while reducing effort and ensuring functionality and comfort even in the case of misclassified EEG instances.
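
As a hedged illustration of such intention prediction (a minimal sketch under our own assumptions, not the authors' pipeline; the data, window sizes, and feature choices below are placeholders, and a real system would add band-pass filtering and artifact rejection), one could classify short EEG windows by whether a movement onset follows them:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical data: 1-second EEG windows (8 channels x 128 samples),
# labelled 1 if a movement onset follows the window, else 0.
rng = np.random.default_rng(7)
windows = rng.normal(size=(300, 8, 128))
labels = rng.integers(0, 2, size=300)

# Simple feature: log band power per channel (variance over each window).
features = np.log(windows.var(axis=2))

# A linear classifier is a common choice for passive BMIs of this kind.
clf = LinearDiscriminantAnalysis()
print(f"intention-prediction accuracy: {cross_val_score(clf, features, labels, cv=5).mean():.2f}")
```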

A Methodological Outline and Utility Assessment of Sensor-based Biosignal Measurement in Human-Robot Interaction, written by Wilma A. Bainbridge, Shunichi Nozawa, Ryohei Ueda, Kei Okada and Masayuki Inaba, closes this special issue. It investigates whether human behavioural and biological signals correlate with the classical surveys used to assess positive or negative attitudes towards a humanoid. Readings from temperature and tactile sensors mounted on a robot hand, together with face-distance measurements from the robot head, were correlated with both survey and behavioral measurements reflecting positive attitudes towards a robot (including non-technophobia, reciprocal behaviors, and positive ratings) during two interaction modes: handshaking and teaching the robot how to play a rock-paper-scissors game.

The authors found similar correlation trends between the sensor and survey data and between the sensor and behavioural data. Specifically, there appears to be a link, consistent with psychological research on human-human interaction, between positive attitudes towards robots and higher temperatures, steeper temperature increases, higher tactile pressures, and greater face distances. The experiments involved both controlled and rather unstructured environments, with participants of different genders, ages, and cultural backgrounds.
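
The core analysis, correlating a sensor-derived measure with a survey score across participants, can be illustrated with a minimal sketch (ours, not the paper's code; the variable names and data below are hypothetical placeholders):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-participant measurements: mean palm temperature during
# a handshake and a questionnaire score of positive attitude towards robots.
rng = np.random.default_rng(3)
attitude = rng.normal(0.0, 1.0, size=40)                     # survey score
palm_temp = 33.0 + 0.4 * attitude + rng.normal(0, 0.3, 40)   # degrees C

# A positive, significant r would mirror the trend reported in the paper.
r, p = pearsonr(palm_temp, attitude)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```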

In this special issue the reader will find methodologies and techniques for measuring the relationships between humans and robots. Measurement in HRI contexts is inherently complex because many of the significant human state variables are hidden. The selected contributions address different facets of the problem, including the cognitive evaluation of the robot's shape or behavior, the measurement of natural, unconscious drivers of human motor actions, the analysis of the unconscious effects of observing robotic actions, and the effect of physical contact with a robotic agent.

Generally speaking, the contributions in this special issue suggest that a joint analysis of behavioral and physiological signals, together with the robot-generated stimuli to which they may be correlated, can be effective in measuring HRI. Moreover, such analysis must be performed in a safe, natural, and minimally invasive setup to avoid artifacts due to biased human behaviour.

This special issue also highlights the use of the robot itself as an ideal tool to measure the interaction with a human counterpart: this perspective-taking attitude closely resembles what humans do as they interact with other humans. In this sense, a robot capable of continuously measuring and modeling a human, i.e. able to “have its own idea” of its partner, is by definition a good candidate to be called a “social robot”.

Guest editors

Luca Brayda

Ryad Chellali