Abstract
Nonverbal communication (NVC) can benefit from any human sense. The sense of touch, or haptic sense, is often used as a channel of NVC. This paper organizes existing knowledge on the use of technology to communicate nonverbally through the haptic sense (HNVC). An analysis of reported work and ongoing projects in the area during the last five years has resulted in an initial taxonomy with three major dimensions derived from the intent of the messages exchanged: interpretive, affective, and active communication. Thus, haptic devices are used to convey meaning in interpretive HNVC, to convey or generate emotional reactions in affective HNVC, or to request specific actions or tasks in active HNVC. We characterize existing work using these major categories and various subcategories. This initial organization provides a general overview of the areas that benefit from technology as an NVC channel. The analysis shows that using the haptic sense as a communication means still offers many open research areas and potential new applications, and that it has proven to date to be an effective mechanism for communicating most of what humans would want to: messages, emotions, and actions.
Keywords
- Nonverbal communication
- Haptic technology
- Technology mediated communication
- Nonverbal interaction mediated through haptic technology
1 Introduction
Even though technology has been changing the ways in which people exchange information, the essence of human communication remains the same. Human communication is composed mainly of language and nonverbal communication (NVC). Any signal other than speech or writing is considered NVC, and it is important because it can complement, repeat, accent, contradict, regulate, and even substitute language entirely. Arguably, just being human involves the constant use and interpretation of nonverbal signals. NVC was not considered a formal research subject until the 1950s, when it was recognized as a new field [1].
NVC is thus a fundamental component of human communication, and it must be investigated further from the perspective of Human-Computer Interaction. The role of various technologies in supporting or enabling new forms of NVC deserves special attention, from machine vision and motion sensing to wearable and haptic technologies. We present an extensive literature review of existing work in the specific area of NVC mediated by haptic devices, as we consider this technology to have great potential to support and enhance NVC.
Our work provides a perspective on NVC mediated by haptic technologies by following specific criteria (Sect. 2) and organizing knowledge in the field through a taxonomy (Sect. 3) that considers dimensions derived from the intent of the messages exchanged: affective, active and interpretive communication. We use our taxonomy to contextualize salient projects in each category and to facilitate comprehension of their approaches and interrelationships. We discuss challenges, open issues and research directions for the community interested in advancing the field (Sect. 4). Finally, we present our ongoing work on haptic NVC and its applications (Sect. 5).
2 Criteria for Inclusion
Given the vast amount of work that has been undertaken on NVC, we adopted a few criteria in order to narrow down the field of study. Firstly, published work relevant to our research must involve some kind of NVC, even though authors may not mention it explicitly. Some examples of NVC include [1]: body movements (the study subject of kinesics), facial movements, gestures, eye movement (the study subject of oculesics), haptics, sounds different from speech (known as paralanguage), use of space (studied by proxemics), and use of time (studied by chronemics).
Secondly, works considered must focus on the haptic sense, either for sending or receiving purposes in the NVC. Haptics includes all touch behavior, and it is considered an area in need of research [1]. Haptics is also the term used for the investigation of human-machine communication through the sense of touch, both as input and output [2]. Touch can broadly be divided into discriminative and affective touch, the difference being whether stimuli are felt outside or inside the body, respectively [3]. Functionally speaking, touch can also be divided into the cutaneous and kinesthetic senses. Kinesthesia is related to body movement, information about the position of the body, muscular effort, and force-feedback. Haptic interfaces are generally associated with the kinesthetic sense. The cutaneous sense is more skin-centered and is generally addressed by temperature and vibrotactile technology. Vibrotactile refers to vibrations felt through the skin. The scope of our work also includes multisensory works, with the sole condition that the haptic sense must be present.
Thirdly, the communication means employed by the projects we have reviewed include some form of technology. Mediation through technology can happen in real time or asynchronously. This technology can be a robot, wearable technology, or grounded technology. Wearable technology refers to small body-worn devices. It implies a new form of human-computer interaction also known as a constant user interface [4], and thus it is of significant interest to this article.
Using haptic technology as a nonverbal medium to communicate has, theoretically, applications in every daily-life scenario. However, exactly to what extent haptic technology is being used has not been systematically studied. We provide a general overview of the different applications reported in the literature, organizing them according to their goals and applications. We propose a taxonomy that comprises every application found in our literature review, with the ultimate goal of fitting any new application into a category, or of adding new subcategories to the taxonomy.
Our literature search started by limiting publication years from 2015 to date. The search included all related terms: nonverbal, mediated, movements. Furthermore, results had to include at least one of the following terms: wearable, wearables, haptic, tactile. Using the variety of applications yielded by the search, we constructed the taxonomy from the bottom up, adding or removing words in the search to find area-specific works relevant to NVC mediated by technology.
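The inclusion criteria above can be summarized as a simple filter. The following sketch is illustrative only: the exact boolean structure of the query, the databases searched, and the matching granularity (whole record vs. title/abstract) are not specified in our methodology, so this function is an assumption about how a record would be screened.

```python
# Hypothetical screening filter for the literature search; the function
# name and the interpretation of "included all related terms" as a
# conjunction are our assumptions, not a verbatim query.
REQUIRED_TERMS = {"nonverbal", "mediated", "movements"}
HAPTIC_TERMS = {"wearable", "wearables", "haptic", "tactile"}


def matches_criteria(text: str, year: int) -> bool:
    """Return True if a record satisfies the review's inclusion criteria."""
    words = set(text.lower().split())
    recent = year >= 2015                  # publications from 2015 to date
    related = REQUIRED_TERMS <= words      # all related terms present
    haptic = bool(HAPTIC_TERMS & words)    # at least one haptic term present
    return recent and related and haptic
```

A record mentioning, say, a "haptic belt" alongside the related terms would pass, while one published before 2015 or lacking any haptic term would be excluded.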
3 Proposed Taxonomy
We have developed a taxonomy for haptic-based NVC mediated by technology (HNVC) that considers mainly the purpose of communication, focusing on reported uses and applications of haptic technology in daily human activities. The taxonomy is intended to classify the wide range of HNVC applications and to inspire researchers to fill any gaps in the depth of the taxonomy. As a starting point to classify research works in the area of HNVC, works are divided into three major categories: interpretive, affective and active (Fig. 1).
This classification differs from the work of MacLean et al. [5], who divided communication through touch into three main categories: signaling and monitoring, expression of affect, and sharing control with intelligent systems. Our classification is also inspired by the ultimate goal of communication, and is intended to further classify current works found in the literature, thus making the initial classification more general. If the goal of the research is mainly to help users understand some meaning, it falls under the interpretive area. If it is related to emotion, it falls under the affective area. And if the communication is intended to convey actions, it falls under the active area. An alternative perspective for this classification considers the most common purposes of verbal communication: sharing a story (interpretive), exchanging feelings towards something or someone (affective), or giving instructions to achieve something (active).
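The top-level decision rule of the taxonomy can be paraphrased in code. This is a minimal sketch for illustration: the keyword lists are invented, and in practice the classification is a judgment about a project's main goal rather than a string match.

```python
from enum import Enum


class HNVC(Enum):
    INTERPRETIVE = "interpretive"  # goal: understanding a meaning
    AFFECTIVE = "affective"        # goal: emotion, mood, or sentiment
    ACTIVE = "active"              # goal: conveying actions or tasks


def classify(goal: str) -> HNVC:
    """Illustrative top-level classification of an HNVC research goal.

    The keyword heuristics below are assumptions made for the sketch.
    """
    goal = goal.lower()
    if any(k in goal for k in ("emotion", "mood", "sentiment", "feeling")):
        return HNVC.AFFECTIVE
    if any(k in goal for k in ("action", "task", "control", "instruction")):
        return HNVC.ACTIVE
    return HNVC.INTERPRETIVE  # default: conveying meaning
```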
3.1 Interpretive HNVC
Interpretive HNVC refers to research work that focuses exclusively on using haptic devices for conveying meaning associated with nonverbal cues. Its main intention is that communication can be understood. Examples of interpretive HNVC include representing words or phrases with distinct haptic sensations. Interpretive HNVC can be further subdivided into two subcategories, depending on the granularity of the signals: word-oriented and phrase-oriented, as discussed below.
Word-Oriented HNVC.
Works in the literature that focus on transmitting some meaning using haptic stimuli analogous to a word in verbal communication fall under this category. The work by Enriquez et al. [6] investigates how to convey words related to plants and fruits to users through haptic feedback. They relate their research to phonemes rather than words, referring to a haptic phoneme as “the smallest unit of a constructed haptic signal to which a meaning can be assigned”. Their aim is to associate arbitrary meanings with haptic phonemes and to test how well users remember them. The technology they rely on is a grounded haptic knob. They used self-guided and enforced learning in sequence to train users, and posit that a smarter training method is needed to avoid some of the interpretation errors made by users. The work by Chen et al. [7] compared guided learning, self-guided learning and mnemonics for the acquisition of haptic words. They highlight the importance of an effective learning method for users to fully adopt haptic communication. Their chosen technology is a wearable device for the forearm, consisting of 24 actuators. They mapped 13 letters into vibrotactile stimuli, including six vowels and seven consonants. Their experiment involved 100 English words using the coded vowels and consonants, with native and non-native speakers. Their findings support the use of guided learning, reporting accuracy above 90% across different modes: incremental rehearsal, flashcards, explore, and video game. They discuss the need for more comprehensive training and a smaller device, and plan to extend the research to sentence comprehension, not just words. Brewster et al. [8] define tactile icons called tactons. Their work is relevant because tactons are building blocks of tactile communication, adjusting vibrotactile parameters such as intensity, frequency, waveform and body location to assign a meaning to the vibration.
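The parameters Brewster et al. [8] vary can be captured in a small data structure. The sketch below is illustrative: the field names, value ranges, and the two example tactons are our assumptions, not definitions from their paper.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Tacton:
    """A tactile icon defined by the parameters tacton designs vary.

    Field names and value ranges are illustrative assumptions.
    """
    intensity: float      # normalized amplitude, 0.0 to 1.0
    frequency_hz: float   # vibration frequency in hertz
    waveform: str         # e.g. "sine", "square", "sawtooth"
    body_location: str    # e.g. "wrist", "forearm", "back"


# Two hypothetical tactons: varying even one parameter (here intensity
# and waveform) yields a distinct stimulus that can carry its own meaning.
ALERT = Tacton(intensity=1.0, frequency_hz=250.0, waveform="square", body_location="wrist")
NOTIFY = Tacton(intensity=0.4, frequency_hz=250.0, waveform="sine", body_location="wrist")
```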
Phrase-Oriented HNVC.
When research aims to convey complex haptic meanings analogous to phrases or sentences, it falls under this category. The work by Oliveira et al. [9] focuses on directional cueing and obstacle detection; hence their meanings are more complex than just one word. They research different tactile vocabularies, with and without prefixation, to communicate effective navigation. Their technology is a belt with eight actuators arranged in a compass setting. They first research the perception of the vibrations, then interpretation, and finally the navigation task. They report that vibrotactile patterns presented in sequence cause perception difficulties for users. The work by Schelle et al. [10] involves the inclusion of people with cognitive disabilities, and uses a textile pillow that allows for communication between patients with dementia and their families or caregivers. A single pillow is used, and all participants in the communication process must be at the same physical location touching the pillow. One of their main aims is to study the effect of personalizing the haptic stimuli for the patients. Their goal is to establish a dialogue in an alternative bodily manner, in what is known as tangible interaction. Their experiment consisted of three sessions: mirroring and design of personalization, the personalized patterns, and mirroring and personalization. The work by Velázquez et al. [11] reports research on assisting visually impaired people in their situational awareness. They achieve this by testing the recognition of words coded with haptic stimuli, and then progress to more complex sentences (involving two, three and four words). Their aim is to understand the learning processes and memory capabilities of the subjects, defining three main tactile concepts: learning, language, and memory. The technology they use is a tactile display for the foot, including four vibrotactile actuators on the foot’s plantar surface.
They stress that their technology could be inserted into a shoe, concealing it completely and thus making it more wearable than other approaches. Their results include the feasibility of combining individual tactons to make sentences, achieving high recognition accuracy. They also discuss long-term memory, finding that a month after the experiment, not a single user could recall the meanings. Future work they consider includes means for users to concentrate on tactons regardless of noisy environments or crowded places.
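A compass-style actuator arrangement like that of Oliveira et al. [9] implies a simple mapping from a desired heading to the nearest actuator. The sketch below shows one way to compute it; the indexing convention (actuator 0 at the front, indices increasing clockwise) is our assumption, not a detail from their paper.

```python
# Map a navigation heading to one of n actuators arranged evenly around
# a belt. Assumed convention: actuator 0 faces forward (0 degrees),
# indices increase clockwise.
def heading_to_actuator(heading_deg: float, n_actuators: int = 8) -> int:
    """Return the index of the actuator closest to the given heading."""
    step = 360.0 / n_actuators                       # 45 degrees per actuator
    return round((heading_deg % 360.0) / step) % n_actuators
```

For eight actuators, a heading of 90 degrees (due right) activates actuator 2, and headings near 360 degrees wrap back to the front actuator.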
3.2 Affective Communication Mediated Through Haptic Technology
The affective communication category involves work with the aim of understanding, emulating, or even creating an emotional reaction in users. In turn, the affective area can be further subdivided into emotion, mood, and sentiment (Fig. 1). This classification is in line with the theory of Brave et al. [12]. According to them, emotion is object-directed, intentional and short-lived (seconds). Mood is nonintentional and undirected at an object, and is therefore diffuse and general. Mood influences the appraisal of external events, and is processed over a longer time (hours or days) than emotions. Sentiments last longer (persisting indefinitely) than both emotions and moods, and guide which situations and objects humans seek out or avoid [12]. Examples of the affective category include making people feel a certain emotion through vibration, exploring cultural differences in emotion representation, or using robots to elicit certain human emotions.
Emotion-Oriented HNVC.
In this first subcategory, only instantaneous affective perception of humans is considered. The work by Morrison et al. [13] enables users to interact through a vest and a wall, both of which are endowed with vibrotactile technology. The vest is wearable and has 32 vibrotactile actuators. Their goal was to make people feel a given emotion (relaxed, calm, and aware of danger). Their vibrotactile patterns were inspired by conversations with a therapist. Their experiment consisted of fitting the vest, training, interaction with the wall, and evaluation. They report that users lack the proper language to express vibrotactile sensations, and their future work suggests familiarizing users with the proper vocabulary. The research by Haritaipan et al. [14] presents a conceptual design of communication using tactile and gestural interactions. Their main objective is to identify differences between Japanese and French cultures. The interpersonal emotions they consider are love, gratitude and sympathy. Their technology includes hand-held and wrist-worn tactile devices. They conclude that cultural background does have an impact on the representation of emotions. The work by Hirano et al. [15] centers on the interaction between humans and robots, and how visual and haptic interaction affects the way the user feels. They used the robot Pepper for their research. They mention that human-robot interaction through touch can be divided into touching a robot, being touched by a robot, and mutual touch. Their findings favor the human initiating touch, rather than the robot. Mutual touch is the area most in need of further research. They point to future work researching gaze differences in different touch situations.
Mood-Oriented HNVC.
The work by Ahmed et al. [16] shows that users associate moods with the system, and said moods have an influence on them. They researched different forms of touch interaction, vibrotactile and force-feedback, to gain insight into which is preferred by users to convey affection. Their setting involves a user interacting with a 3D model of a virtual agent, which interacts through haptic signals sensed by the user’s hand. Their findings suggest that force-feedback interactions are preferred over vibrotactile ones for mood and affection purposes. The work by Mazzoni et al. [17] explores how a wearable vibrotactile glove can enhance mood while watching a film. Their experiment consists of selecting a movie clip associated with a mood, then designing vibrotactile stimuli and evaluating the users’ mood response, and finally pairing movie clips and vibrotactile stimuli. Their findings show that the intensity and frequency of the vibrotactile stimuli can produce anticipation, tension, and calmness.
Sentiment-Oriented HNVC.
The work of Goedschalk et al. [18] is a good example of eliciting a negative sentiment towards a virtual agent. They use haptic feedback to enhance the believability of a virtual “bad guy”, trying to inspire negative sentiments in the user. Their haptic feedback was provided by a push sensation on a wearable vest, thus mimicking a common nonverbal cue of aggression. Although they report no statistically significant effects, they highlight the importance of further researching this interaction scenario. Their planned future work includes the effects of speech loudness and of alternative and repeated haptic feedback. Bosse et al. [19] also research virtual “bad guys”, although no real threat was presented to users; only a threatening, non-functioning device was used. Users were made to believe the device could send a small electrical shock to their finger area. Their findings highlight that merely feeling a device that can potentially hurt raises the anxiety of users. They point to future research exploring these effects on different personalities, and reducing the predictability of the virtual agent.
3.3 Active Communication Mediated Through Haptic Technology
Active HNVC in our taxonomy refers to emotion-free messages intended not only to be understood, but to be performed as well. In this category users need to understand the message and then perform the required action. The active HNVC category is the one we have elaborated in more detail, and hence it involves more subcategories. It can first be divided, according to the messages’ main purpose, into perception-, control- or instruction-oriented HNVC (Fig. 2). Examples of the active category include navigation, control and interaction with virtual environments.
Perception-Oriented HNVC.
This category involves all applications focusing on sensing virtual objects, or on people sensing their own motions and reactions. Research in this category explores HNVC without regard to emotional or affective implications. The category can be further divided into sensing- and immersion-based HNVC.
Sensing-Based HNVC.
Sensing-based HNVC focuses on enhancing or complementing a human sense, and is of special interest for the inclusion of people with disabilities. Houde et al. [20] performed a review of what is known about how deaf people experience their own bodies, body-related abilities and body-related processing. Typical users of research in this sub-area are deaf and blind people. They advise future research relating deafness and its effects on body-related processing. Bălan et al. [21] focus on how visually impaired people can benefit from the auditory and haptic senses to construct a mental representation of their environment. Their findings include that visually impaired people successfully use other senses, including touch, to acquire and model spatial information.
Immersion-Based HNVC.
Any work that relies on HNVC aiming to remove users from their actual environment, whether in a presentation or in a virtual environment, falls under this subcategory. Of particular interest in this sub-section are virtual environments and games. Kruijff et al. [22] focus on navigation of particularly large virtual environments and games. They used three senses: haptic, visual, and auditory. The haptic sense is addressed through vibrotactile cues under the participant’s feet, using bass-shakers. They compared their approach with other techniques to navigate virtual environments, namely regular seated joystick and standing leaning locomotion. They conclude their multisensory method can enhance self-motion perception and involvement/presence. An interesting finding was that participants appreciated the benefits of walking-related haptic cues, independent of joystick or standing mode. Feng et al. [23] worked on enabling non-fatiguing walking in a virtual environment. They used tactile cues such as movement wind (the wind resistance felt while walking), directional wind (environmental wind), footstep vibration and footstep sound. They conclude that tactile cues significantly improve the activity. Bernardet et al. [24] developed an open-source software framework to construct real-time interactive systems based on movement data. They highlight the importance of recording, sensing, storing, retrieving, analyzing, understanding and displaying movement data through various senses, while all these components communicate with each other. They provide further justification for this field by stating: “to better understand humans, and/or to build better technology, we need to take into account the body, and with it, movement”.
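The four tactile cue channels described for Feng et al. [23] suggest a simple per-frame mapping from the walker's state to cue intensities. The sketch below is purely illustrative: the function name, scaling factor, and output representation are our assumptions, not details of their system.

```python
# Hypothetical per-frame mapping from a virtual walker's state to the
# four tactile cue channels described in the text. The 0.8 scaling of
# movement wind and the 0..1 intensity convention are invented.
def walking_cues(speed_mps: float, env_wind_mps: float, step_event: bool) -> dict:
    """Return intensities for each tactile cue channel."""
    return {
        "movement_wind": speed_mps * 0.8,       # wind resistance felt by walking
        "directional_wind": env_wind_mps,       # environmental wind
        "footstep_vibration": 1.0 if step_event else 0.0,
        "footstep_sound": 1.0 if step_event else 0.0,
    }
```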
Control-Oriented HNVC.
Any research focusing on controlling technology using the haptic sense, either as a control means or as feedback from the controlled technology, is included here. Distinctions in the type of technology lead to two sub-categories: logical and physical HNVC.
Logically-Based HNVC.
Any work that aims to control software is included here. Saunders et al. [25] focus on controlling desktop applications through the feet, using discrete taps and kicks. They used motion capture technology for their experiment. They propose ten design guidelines, the most general of which are: use tapping for frequent actions, consider both feet as dominant, prefer toe taps, and make sensing techniques robust. Their planned future work involves developing a fully working system in a real setting, researching other foot-related gestures, and including other objects for interaction with the foot. The work by Costes et al. [26] uses pseudo-haptic effects to virtually evoke haptic properties: stiffness, roughness, reliefs, stickiness, and slipperiness. They refer to pseudo-haptics as additional visual effects that enhance the touch experience on touchscreens. Their results show that haptic properties can be successfully conveyed to a user interacting with software. Their future work includes quantitative evaluation, thresholding methods, models to improve quality, different effects, and the use of haptic databases.
Physically-Based HNVC.
In contrast to logical HNVC, physical HNVC involves research on controlling robotic mechanisms or hardware in general. Bufalo et al. [27] focus on a disturbance rejection task of a controlled element. They considered three different modalities of haptic feedback: Variable Haptic Aid (VHA), Constant Haptic Aid (CHA), and No Haptic Aid (NoHA). The technology they used includes a display and a grounded, electrically control-loaded sidestick, movable only laterally. Their results show that VHA is the best modality to help users learn the task. They point to further research on changing both the amount of help given in the CHA modality and the stiffness of the control, so as to improve experimental results in this category. D’Intino et al. [28] focus on haptic feedback used to learn a compensatory tracking task. Their experiment uses two groups, one without any haptic feedback, and the other training with feedback and evaluating without it. They relied on a grounded sidestick with only lateral-axis movement. Their conclusion is that users of haptic feedback learn more rapidly. Van Oosterhout et al. [29] explored the possible effects of inaccuracies in haptic shared control. They found that inaccuracies do have a negative effect on haptic control tasks, and therefore any such system should be accurate to work properly. Xu et al. [30] used wearable technology to develop motion capture of arm movement. MYO armbands are worn on the upper arm and forearm, so the motion of the operator’s arm is obtained. They used this data to control a virtual robotic arm that mimics the movement of the operator.
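The two-armband capture scheme described for Xu et al. [30] can be sketched in a few lines: each band reports a segment orientation, and the relative orientation gives an elbow angle for a mimicking virtual arm. This is a simplified, planar (single-axis) sketch under our own assumptions; the actual system presumably uses full 3D orientations from the MYO IMUs.

```python
# Planar sketch of two-armband arm capture: the function names and the
# single-axis simplification are assumptions made for illustration.
def elbow_angle(upper_arm_deg: float, forearm_deg: float) -> float:
    """Elbow flexion as the difference between segment orientations."""
    return (forearm_deg - upper_arm_deg) % 360.0


def mimic(upper_arm_deg: float, forearm_deg: float) -> dict:
    """Joint commands for a hypothetical virtual arm with two joints."""
    return {
        "shoulder": upper_arm_deg % 360.0,
        "elbow": elbow_angle(upper_arm_deg, forearm_deg),
    }
```

With the upper arm at 30 degrees and the forearm at 120 degrees, the virtual arm would be commanded to a 30-degree shoulder angle and a 90-degree elbow flexion.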
Instruction-Oriented HNVC.
This category involves all works and research that include HNVC for learning or training. It includes acquisition, improvement, and assistance of both skills and knowledge. Instruction-oriented HNVC is divided into two main types of applications: concept- and skill-oriented. Concept-oriented work focuses on abstract knowledge, while skill-oriented research involves any physical ability.
Concept-Based HNVC.
This area does not deal with movement, but with teaching abstract ideas using tangible methods. Robotti [31] works on helping students with difficulties in algebra, especially those with developmental dyscalculia. Help is based on providing meaning to algebraic notations through visual, nonverbal, and kinaesthetic-tactile systems. They stress the importance of supporting abstract concepts with “visual non-verbal and kinaesthetic channels of access to information”. The work by Magana et al. [32] aims to improve the learning of electricity and magnetism concepts using a visuohaptic simulation approach, where visuohaptic refers to the use of both the visual and haptic senses. Their work compares visuohaptic simulation to visual-only simulation and instructional multimedia. Their results show that the visuohaptic approach did better, although not significantly. Their findings include that force feedback is a novelty for the majority of users, and therefore might contribute to overloading working memory. They point to future work on implementing guided-inquiry approaches, designing more haptic-centered materials for instruction and assessment, learning strategies for touch interaction, and personalized calibration of force feedback.
Skill-Based HNVC.
The skill-oriented HNVC category involves human-human interaction mediated through technology, with the purpose of teaching a new skill to the user. According to Yokokohji et al. [33], training is a skill mapping from an expert to a learner, also called trainer and trainee, respectively. Their work highlights the importance of skill transfer, which is time-consuming and thus not easy to achieve. Furthermore, motor skills add the challenge of being difficult to describe with language. Schirmer et al. [34] use a tactile interface for eyes-free pedestrian navigation in cities. They used only two actuators and divided their experiments into compass and navigation modes. Prasad et al. [35] use a vibrotactile vest to provide navigation and obstacle detection to the wearer. Their findings include that vibration produces less cognitive load and greater environmental awareness than other navigational approaches.
Narazani et al. [36] work on feet-related skill transfer. They centered on dance and gymnastics, using both tactile and visual feedback. They refer to their users as instructor and recipient, respectively. Their goal is to achieve a “real-time performance-oriented telexistence context”. Their methodology consists of three stages: First, the instructor performs a move that is interpreted and sent to the recipient. Second, the movement is displayed visually or haptically to the recipient, who attempts the directed action. Third, feedback on accuracy is provided to the recipient. They also mention that, in general, the process involves feedforward (sending the move) and feedback (on the accuracy of the move), both of which can be either visual or haptic. The work by Ros et al. [37] uses a robot as a dance tutor for children. Their focus is on how children accommodate to the tutor across interaction sessions.
Aggravi et al. [38] present communication between an instructor and a student. Their aim is to assist the student, who is visually impaired, in learning to ski. They achieve this with the help of two vibrating bracelets that improve communication between instructor and student. Valsted et al. [39] use a haptic wristband and chest belt that assist runners in achieving a technique called rhythmic breathing. Their results show that the runner's context must be taken into account when designing the interaction mode, and that, overall, runners prefer to feel the feedback while exhaling.
Yang et al. [40] work on a handwriting tutor-tutee scenario mediated by a robot, using haptic feedback for corrections. In their setting, the computer captures an expert’s motion skills, after which the robot can handle multiple tutees by itself. Pedemonte et al. [41] also work on transferring handwriting skills. They researched remote user communication with the goal of improving handwriting skills, focusing on human-human interaction mediated by a haptic device between novice and expert. Their findings encourage the use of haptic guidance in the handwriting learning process.
Rajanna et al. [42] developed a system based on wearable technology for long periods of rehabilitation after surgery, based on vibro-haptic feedback. Their system is able to set personal goals and track progress, and is easy to include in daily life. Their feedback works as follows: a vibro-haptic cue is delivered when the body part being tracked is lifted to a required angle of elevation [42]. Ostadabbas et al. [43] propose a system that can be controlled with the tongue for people recovering from a stroke. They implemented a novel tongue-based interaction system to operate an already existing rehabilitation system for the hand. Their goal is to improve the quality of life and the upper-limb movements of people post-stroke.
Choi et al. [44] study a simulator for closing a wound with suturing procedures. Their aim is focused not only on medicine, but on nursing as well. Trainees use both hands for the procedure and are aided by haptic feedback to learn the correct technique. Uribe et al. [45] study the feasibility of using robots to assist brain surgery. Their robot is able to differentiate tissues, thus greatly helping the surgeon with high-quality measurements. Guo et al. [46] designed a robotic catheter system to protect surgeons and improve the overall surgery; vascular interventional surgery is the specific application of the robot. They use force-feedback as the interaction with the surgeon. Their results include errors on the order of a few millimeters, reported to be acceptable for the surgery.
4 Research Directions
Based upon our review of the field, we have generated a taxonomy that provides perspective on existing work. In this section, we discuss directions for research in each of the three major categories of the taxonomy: affective, interpretive and active HNVC.
The affective category involves work that started with recognizing emotions, and then evolved to communicating and simulating them. Given the popularity and feasibility of using the haptic sense to convey emotions, recent research directions are shifting towards affective states that linger through time: moods and sentiments. It is expected that HNVC will be able to identify and change the affective states of humans even better than we as humans can. Because touch is strongly related to expressing emotions between human beings, it is important to continue to research the effects of artificial touch in interaction with robots, or with other humans mediated by technology.
The interpretive category has also evolved, from users being able to differentiate single words to understanding multiple words that form full sentences. This area has also managed to convey complex ideas and instructions with iconicity, much like the icons we use to save or open a document. It is expected to achieve a high level of communication between humans using only the tactile sense. There is still need for research on the differentiability of tactile patterns, the cognitive capacity to associate each pattern with a meaning, and whether the association persists over time. This research is fundamental to continued progress in both the affective and active categories: the more we know about the foundations of tactile interpretation and meaning, the more HNVC can be explored in diverse real-life scenarios.
The active category has the most variants and subcategories in our proposed taxonomy. This is only natural: humans perform a wide variety of activities in daily life, and the haptic sense cannot be turned off for any of them. In the control area, the main goal is to achieve full controllability over the technology, as naturally and error-free as possible. Future research therefore needs to focus on mutual understanding between human and technology. It is also important to explore different feedback modalities depending on users' current control capabilities, without users coming to rely heavily on the feedback. Perception strives to enhance our haptic sensory capabilities, whether to compensate for another sense or simply to enhance an experience. Future research directions include determining to what extent our different senses correlate and can compensate for one another, and which sensations help users feel a more realistic virtual immersion in games and simulators without causing dizziness or distraction. This research is central to the inclusion of people with sensory disabilities, and to learning more about how humans use and perceive touch. The instruction area is very open, because it deals with acquiring new skills using technology. It needs to build on the other areas, since it requires an understanding of interpretive HNVC as well as some degree of control between human and technology. The concept instruction subarea is still largely unexplored, as typical classroom settings do not currently use the haptic sense to help students understand abstract concepts. Many abstract, and even concrete, concepts would benefit from a better understanding of how human beings rely on the sense of touch to perceive and understand the world. Health applications have attracted attention because of the extensive training doctors require, and because of their importance for everyone's well-being. Research must focus on providing realistic tactile and haptic sensations to doctors in training, so that training resembles a real-life surgery or intervention as closely as possible.
HNVC has a wide range of application domains, from analyzing cultural backgrounds for understanding emotions to using robots to assist medical doctors in surgery. Understanding this diversity demands a rich taxonomy that encompasses all existing research and shows the full potential of the human haptic sense to communicate and to perform different tasks. The use of the haptic sense also makes applications more inclusive, because visually impaired and deaf people can use touch to compensate for deficiencies in their other senses. People with some form of cognitive deterioration (such as dementia) may also benefit from HNVC, because the haptic medium offers them a novel way to express themselves and to understand what others are trying to say. Everyone can benefit from advances in HNVC research, so there is a pressing need to continue work in this area. Formalizing HNVC protocols and languages is of general importance, because all other subareas will benefit from a solid basis in the general area.
5 Ongoing and Future Work
While our taxonomy provides a general overview of HNVC, specific details may have been overlooked. Although the proposed taxonomy focuses mainly on the purpose and applications of HNVC, we are aware of other transversal attributes that could further characterize it, such as the human senses or technologies involved. These attributes could refine the proposed taxonomy and potentially give insight into good practices for each area and subarea. They could also help identify under-researched technologies or senses within certain categories, further expanding the discussion of research directions. We are currently refining our taxonomy by considering these cross-category dimensions.
Since we have focused on finding diverse applications of HNVC, and this is a very active research area, more recent works may merit review and inclusion in the proposed taxonomy. We are therefore also working on an extended literature review, incorporating new works that fit our criteria for HNVC categorization. These works are expected to further enrich each subarea of the taxonomy and to provide more detailed research directions and good practices within each specific area. Overall, the more insight we gain into how humans use the haptic sense and HNVC, the better we will understand ourselves, improve our daily activities, become a more inclusive society, and develop technology that feels more natural and is easier to understand.
References
1. Schroeder, M.L.: A research framework for second nonverbal code acquisition. Globe: J. Lang. Cult. Commun. 5, 103–124 (2017). https://doi.org/10.5278/ojs.globe.v5i0.1943
2. Hayward, V., Astley, O.R., Cruz-Hernandez, M., Grant, D., Robles-De-La-Torre, G.: Haptic interfaces and devices. Sens. Rev. 24(1), 16–29 (2004). https://doi.org/10.1108/02602280410515770
3. Huisman, G.: Social touch technology: a survey of haptic technology for social touch. IEEE Trans. Haptics 10(3), 391–408 (2017). https://doi.org/10.1109/TOH.2017.2650221
4. Mann, S.: Humanistic computing: “WearComp” as a new framework and application for intelligent signal processing. Proc. IEEE 86(11), 2123–2151 (1998). https://doi.org/10.1109/5.726784
5. MacLean, K.E., Pasquero, J., Smith, J.: Building a haptic language: communication through touch. Computer Science Department, University of British Columbia TR-2005-29, pp. 1–16 (2005). ftp://ftp.cs.ubc.ca/local/techreports/2005/TR-2005-29.pdf. Accessed 11 October 2019
6. Enriquez, M., MacLean, K., Chita, C.: Haptic phonemes: basic building blocks of haptic communication. In: 8th International Conference Proceedings on Multimodal Interfaces, pp. 302–309. ACM, Canada (2006). https://doi.org/10.1145/1180995.1181053
7. Chen, J., Turcott, R., Castillo, P., Wahyudinata, S., Lau, F., Israr, A.: Learning to feel words: a comparison of learning approaches to acquire haptic words. In: 15th ACM Symposium Proceedings on Applied Perception, pp. 1–7. ACM, Canada (2018). https://doi.org/10.1145/3225153.3225174
8. Brewster, S., Brown, L.M.: Tactons: structured tactile messages for non-visual information display. In: Cockburn, A. (ed.) 5th Conference on Australasian User Interface. AUIC 2004 Proceedings, vol. 28, pp. 15–23. Australian Computer Society Inc. (2004)
9. de J. Oliveira, V.A., Maciel, A.: Assessment of tactile languages as navigation aid in 3D environments. In: Auvray, M., Duriez, C. (eds.) EUROHAPTICS 2014. LNCS, vol. 8619, pp. 104–111. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-662-44196-1_14
10. Schelle, K.J., Gomez, N.C., Bhömer, M.T., Tomico, O., Wensveen, S.: Tactile dialogues: personalization of vibrotactile behavior to trigger interpersonal communication. In: 9th International Conference Proceedings on Tangible, Embedded, and Embodied Interaction, pp. 637–642. ACM, USA (2015). https://doi.org/10.1145/2677199.2687894
11. Velázquez, R., Pissaloux, E.: Constructing tactile languages for situational awareness assistance of visually impaired people. In: Pissaloux, E., Velázquez, R. (eds.) Mobility of Visually Impaired People, pp. 597–616. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-54446-5_19
12. Brave, S., Nass, C.: Emotion in human-computer interaction. In: The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications, pp. 81–96. L. Erlbaum Associates Inc., Hillsdale (2003). https://doi.org/10.1201/b10368-6
13. Morrison, A., Knoche, H., Manresa-Yee, C.: Designing a vibrotactile language for a wearable vest. In: Marcus, A. (ed.) DUXU 2015. LNCS, vol. 9187, pp. 655–666. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-20898-5_62
14. Haritaipan, L., Mougenot, C.: Cross-cultural study of tactile interactions in technologically mediated communication. In: Rau, P.-L.P. (ed.) CCD 2016. LNCS, vol. 9741, pp. 63–69. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-40093-8_7
15. Hirano, T., et al.: Communication cues in a human-robot touch interaction. In: 4th International Conference Proceedings on Human Agent Interaction, pp. 201–206. ACM, USA (2016). https://doi.org/10.1145/2974804.2974809
16. Ahmed, I., Harjunen, V., Jacucci, G., Hoggan, E.: Reach out and touch me: effects of four distinct haptic technologies on affective touch in virtual reality. In: 18th International Conference Proceedings on Multimodal Interaction, pp. 341–348. ACM, USA (2016). https://doi.org/10.1145/2993148.2993171
17. Mazzoni, A., Bryan-Kinns, N.: Mood glove: a haptic wearable prototype system to enhance mood music in film. Entertain. Comput. 17, 9–17 (2016). https://doi.org/10.1016/j.entcom.2016.06.002
18. Goedschalk, L., Bosse, T., Otte, M.: Get your virtual hands off me! – developing threatening IVAs using haptic feedback. In: Verheij, B., Wiering, M. (eds.) BNAIC 2017. CCIS, vol. 823, pp. 61–75. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-76892-2_5
19. Bosse, T., Hartmann, T., Blankendaal, R.A.M., Dokter, N., Otte, M., Goedschalk, L.: Virtually bad: a study on virtual agents that physically threaten human beings. In: 17th International Conference Proceedings on Autonomous Agents and MultiAgent Systems, pp. 1258–1266. International Foundation for Autonomous Agents and Multiagent Systems, Richland (2018)
20. Houde, M.S., Landry, S.P., Pagé, S., Maheu, M., Champoux, F.: Body perception and action following deafness. Neural Plast. 2016, 1–7 (2016). https://doi.org/10.1155/2016/5260671
21. Bălan, O., Moldoveanu, A., Moldoveanu, F., Butean, A.: Auditory and haptic spatial cognitive representation in the case of the visually impaired people. In: 22nd International Congress Proceedings on Sound and Vibration, Italy, pp. 1–5 (2015)
22. Kruijff, E., et al.: On your feet!: enhancing vection in leaning-based interfaces through multisensory stimuli. In: 2016 Symposium Proceedings on Spatial User Interaction, pp. 149–158. ACM, Japan (2016). https://doi.org/10.1145/2983310.2985759
23. Feng, M., Dey, A., Lindeman, R.W.: An initial exploration of a multi-sensory design space: tactile support for walking in immersive virtual environments. In: 2016 IEEE Symposium Proceedings on 3D User Interfaces, pp. 95–104. IEEE, USA (2016). https://doi.org/10.1109/3dui.2016.7460037
24. Bernardet, U., et al.: m+m: a novel middleware for distributed, movement based interactive multimedia systems. In: 3rd International Symposium Proceedings on Movement and Computing, pp. 21–30. ACM, USA (2016)
25. Saunders, W., Vogel, D.: The performance of indirect foot pointing using discrete taps and kicks while standing. In: 41st Graphics Interface Conference Proceedings, pp. 265–272. Canadian Information Processing Society, Canada (2015)
26. Costes, A., Argelaguet, F., Danieau, F., Guillotel, P., Lécuyer, A.: Touchy: a visual approach for simulating haptic effects on touchscreens. Front. ICT 6, 1–11 (2019). https://doi.org/10.3389/fict.2019.00001
27. Bufalo, F., Olivari, M., Geluardi, S., Gerboni, C.A., Pollini, L., Bülthoff, H.H.: Variable force-stiffness haptic feedback for learning a disturbance rejection task. In: 2017 IEEE International Conference Proceedings on Systems, Man, and Cybernetics, pp. 1517–1522. IEEE, Canada (2017). https://doi.org/10.1109/smc.2017.8122829
28. D’Intino, G., et al.: Evaluation of haptic support system for training purposes in a tracking task. In: 2016 IEEE International Conference on Systems, Man, and Cybernetics Proceedings, pp. 2169–2174. IEEE, Hungary (2016). https://doi.org/10.1109/smc.2016.7844560
29. Van Oosterhout, J., et al.: Haptic shared control in tele-manipulation: effects of inaccuracies in guidance on task execution. IEEE Trans. Haptics 8(2), 164–175 (2015). https://doi.org/10.1109/TOH.2015.2406708
30. Xu, Y., Yang, C., Liang, P., Zhao, L., Li, Z.: Development of a hybrid motion capture method using MYO armband with application to teleoperation. In: 2016 IEEE International Conference Proceedings on Mechatronics and Automation, pp. 1179–1184. IEEE, China (2016). https://doi.org/10.1109/icma.2016.7558729
31. Robotti, E.: Designing innovative learning activities to face difficulties in algebra of dyscalculic students: exploiting the functionalities of AlNuSet. In: Leung, A., Baccaglini-Frank, A. (eds.) Digital Technologies in Designing Mathematics Education Tasks. MEDE, vol. 8, pp. 193–214. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-43423-0_10
32. Magana, A.J., et al.: Exploring multimedia principles for supporting conceptual learning of electricity and magnetism with visuohaptic simulations. Comput. Educ. J. 8(2), 8–23 (2017)
33. Yokokohji, Y., Hollis, R., Kanade, T., Henmi, K., Yoshikawa, T.: Toward machine mediated training of motor skills-skill transfer from human to human via virtual environment. In: 5th IEEE International Workshop Proceedings on Robot and Human Communication, pp. 32–37. IEEE, Japan (1996). https://doi.org/10.1109/roman.1996.568646
34. Schirmer, M., Hartmann, J., Bertel, S., Echtler, F.: Shoe me the way: a shoe-based tactile interface for eyes-free urban navigation. In: 17th International Conference Proceedings on Human-Computer Interaction with Mobile Devices and Services, pp. 327–336. ACM, USA (2015). https://doi.org/10.1145/2785830.2785832
35. Prasad, M., Taele, P., Olubeko, A., Hammond, T.: HaptiGo: a navigational ‘tap on the shoulder’. In: 2014 IEEE Haptics Symposium Proceedings, pp. 1–7. IEEE (2014). https://doi.org/10.1109/haptics.2014.6775478
36. Narazani, M., Seaborn, K., Hiyama, A., Inami, M.: StepSync: wearable skill transfer system for real-time foot-based interaction. In: 23rd Annual Meeting Proceedings, pp. 1–4. Virtual Reality Society of Japan (2018)
37. Ros, R., Coninx, A., Demiris, Y., Patsis, G., Enescu, V., Sahli, H.: Behavioral accommodation towards a dance robot tutor. In: 9th ACM/IEEE International Conference Proceedings on Human-Robot Interaction, pp. 278–279. ACM, USA (2014). https://doi.org/10.1145/2559636.2559821
38. Aggravi, M., Salvietti, G., Prattichizzo, D.: Haptic assistive bracelets for blind skier guidance. In: 7th Augmented Human International Conference Proceedings, pp. 1–4. ACM, USA (2016). https://doi.org/10.1145/2875194.2875249
39. Valsted, F.M., Nielsen, C.V.H., Jensen, J.Q., Sonne, T., Jensen, M.M.: Strive: exploring assistive haptic feedback on the run. In: 29th Australian Conference Proceedings on Computer-Human Interaction, pp. 275–284. ACM, USA (2017). https://doi.org/10.1145/3152771.3152801
40. Yang, C., Liang, P., Ajoudani, A., Li, Z., Bicchi, A.: Development of a robotic teaching interface for human to human skill transfer. In: 2016 IEEE/RSJ International Conference Proceedings on Intelligent Robots and System, pp. 710–716. IEEE, USA (2016). https://doi.org/10.1109/iros.2016.7759130
41. Pedemonte, N., Laliberté, T., Gosselin, C.: Bidirectional haptic communication: application to the teaching and improvement of handwriting capabilities. Machines 4(1), 1–15 (2016). https://doi.org/10.3390/machines4010006
42. Rajanna, V., et al.: KinoHaptics: an automated, wearable, haptic assisted, physio-therapeutic system for post-surgery rehabilitation and self-care. J. Med. Syst. 40(3), 1–12 (2016). https://doi.org/10.1007/s10916-015-0391-3
43. Ostadabbas, S., et al.: Tongue-controlled robotic rehabilitation: a feasibility study in people with stroke. J. Rehabil. Res. Dev. 53(6), 989–1006 (2016). https://doi.org/10.1682/JRRD.2015.06.0122
44. Choi, K.-S., Chan, S.-H., Pang, W.-M.: Virtual suturing simulation based on commodity physics engine for medical learning. J. Med. Syst. 36(3), 1781–1793 (2012). https://doi.org/10.1007/s10916-010-9638-1
45. Uribe, D.O., Schoukens, J., Stroop, R.: Improved tactile resonance sensor for robotic assisted surgery. Mech. Syst. Signal Process. 99, 600–610 (2018). https://doi.org/10.1016/j.ymssp.2017.07.007
46. Guo, J., Yu, Y., Guo, S., Du, W.: Design and performance evaluation of a novel master manipulator for the robot-assist catheter system. In: 2016 IEEE International Conference Proceedings on Mechatronics and Automation, pp. 937–942. IEEE, USA (2016). https://doi.org/10.1109/icma.2016.7558688
© 2019 Springer Nature Switzerland AG
Camarillo-Abad, H.M., Alfredo Sánchez, J., Starostenko, O. (2019). Organizing Knowledge on Nonverbal Communication Mediated Through Haptic Technology. In: Ruiz, P., Agredo-Delgado, V. (eds) Human-Computer Interaction. HCI-COLLAB 2019. Communications in Computer and Information Science, vol 1114. Springer, Cham. https://doi.org/10.1007/978-3-030-37386-3_20
Print ISBN: 978-3-030-37385-6
Online ISBN: 978-3-030-37386-3