Keywords

1 Introduction

Ultrasonic mid-air tactile stimulation has recently attracted the attention of researchers, as it can provide distinctive haptic feedback at a precisely specified location in three-dimensional (3D) space.

Previous research showed that interactive haptic maps efficiently create mental maps for people with visual impairment (VI) [3, 6]. Low-abstraction haptic maps comprising miniatures of real-world objects are usable even for older adults with VI [11]. Guiding users toward a specific location in the map space typically requires complex interaction methods such as hand tracking and voice output. Some methods even require special gloves that can interfere with touch sensitivity. Especially for older adults with VI, these methods can be inefficient and frustrating.

In this paper, we focus on the research question of whether ultrasonic mid-air stimulation (haptic cursor) in combination with a map comprising physical objects – miniatures of room equipment – can guide the user’s attention toward particular objects. Furthermore, we want to measure the objective (time to react, accuracy) and subjective (comfort, ease of use) properties of the proposed method.

2 Related Work

Ultrasonic tactile stimulation has been investigated for a long time; e.g., Dalecki et al. [4] investigated the force of acoustic radiation to determine the threshold for tactile perception in a human finger and the upper forearm. They found that the maximum tactile sensitivity of the fingers occurs at 200 Hz. Further development showed that mid-air tactile stimuli can be actuated at a 3D position by ultrasonic tactile stimulation based on phased-array ultrasound speakers [7, 10]. These devices can generate non-contact mid-air ultrasound tactile stimuli that can be sensed by various parts of human skin, or even the lips; however, providing stimuli to palms and fingers is the most common approach. To our knowledge, no existing method has focused on marking a specific location in 3D space by localized tactile stimuli in combination with physical objects.
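The focusing principle behind such phased arrays can be sketched in a few lines: each transducer's emission is phase-shifted so that all wavefronts arrive at the chosen focal point in phase, creating a localized pressure maximum. The following is a simplified illustration, not the control scheme of any particular device; the 40 kHz carrier frequency is a typical value for commercial ultrasonic arrays, assumed here for concreteness.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C
CARRIER_FREQ = 40_000.0  # Hz; typical ultrasonic-array carrier frequency (assumed)

def focal_phases(transducers, focus):
    """Phase offset (radians) for each transducer so that all emitted
    waves arrive at the focal point in phase, producing a pressure maximum."""
    wavelength = SPEED_OF_SOUND / CARRIER_FREQ
    phases = []
    for position in transducers:
        distance = math.dist(position, focus)
        # Advance the phase by the propagation delay (modulo one full cycle).
        phases.append((2 * math.pi * distance / wavelength) % (2 * math.pi))
    return phases
```

Transducers equidistant from the focus receive identical phase offsets, which is the sanity check for this construction.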

Marzo et al. [12] present an open-source system for mid-air ultrasound interaction based on the Arduino Mega. In [16], Suzuki et al. presented a scalable mid-air ultrasound haptic display. Their solution allows connecting multiple modules (similar to the module depicted in Fig. 1) via Ethernet. It allows individual control of the phase and amplitude of each of the connected transducers. The authors achieved a synchronization accuracy of \(0.1\,\mu s\), and the phase and amplitude can be specified with 8-bit resolution. This allows for covering larger portions of 3D space.
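The 8-bit resolution mentioned above amounts to mapping a continuous phase or amplitude onto 256 discrete levels. A minimal sketch of that quantization (our illustration, not Suzuki et al.'s actual firmware):

```python
import math

def quantize_phase(phase_rad, bits=8):
    """Map a phase in [0, 2*pi) to one of 2**bits discrete levels."""
    levels = 1 << bits
    return round((phase_rad % (2 * math.pi)) / (2 * math.pi) * levels) % levels

def quantize_amplitude(amplitude, bits=8):
    """Map a normalized amplitude in [0, 1] to one of 2**bits levels."""
    levels = (1 << bits) - 1
    return round(max(0.0, min(1.0, amplitude)) * levels)
```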

Hajas et al. [5] investigated the perception of 2D shapes rendered mid-air using ultrasonic arrays. They conducted two experiments to measure accuracy and confidence. The authors evaluated two methods for displaying 2D geometric shapes in mid-air – static and dynamic. The static method relies on the presentation of a full outline in mid-air, while the dynamic method relies on a haptic pointer (focal point) moving along the perimeter of the shapes. The results show that the participants identified dynamic shapes more accurately and with greater confidence. Moreover, the authors suggest that a short pause in the movement of the haptic pointer in corners of polygons can drastically improve shape recognition accuracy. Alakhawand et al. [1] propose a method to test mid-air haptics with a biomimetic tactile sensor. Their approach allows for producing detailed visualizations of mid-air sensations in 2D and 3D space.

Voudouris et al. [18] state that the perception of tactile stimuli presented on a moving hand is systematically suppressed, which could be attributed to the limited capacity of the brain to process task-irrelevant sensory information. The authors investigated whether humans can enhance relevant tactile signals in parallel movement when performing a goal-directed reach movement. The experiments carried out suggest that the participants were able to flexibly modulate tactile sensitivity by suppressing movement-irrelevant signals and enhancing movement-relevant signals in parallel when performing target-reaching tasks. Bensmaia et al. [2] investigate the effects of extended suprathreshold vibratory stimulation on the sensitivity of three types of neural afferents (slowly adapting type 1, rapidly adapting, and Pacinian). The results indicate that prolonged suprathreshold stimulation can result in substantial desensitization of all types of neural afferents. Juravle and Spence [9] investigated sensory suppression in complex motor tasks such as juggling. The experiment required participants to detect gaps in the continuous signal provided by different modalities (haptic, auditory). The authors stated that the participants were significantly less sensitive to detecting a gap in tactile stimulation while juggling. The results demonstrate movement-related tactile sensory suppression related to the decision component in tactile suppression.

Rakkolainen and Raisamo [14] surveyed possible advantages, problems, and applications of mid-air ultrasonic haptic feedback. They state that most methods use frequencies of 200 Hz to trigger lamellar (Pacinian) corpuscles, which are dense in the palm and are associated with sensing vibration and pressure. However, other mechanoreceptors can be used, such as Meissner corpuscles on the face, or Merkel cell disks and Ruffini corpuscles on the human upper body. In [8], Jingu et al. proposed a tactile notification system called LipNotif. It provides mid-air ultrasound tactile notifications that can be sensed using the lips.

In [13], Paneva et al. investigated the possibilities of using mid-air haptics for conveying Braille. The researchers tested three tactile stimulation methods: aligned temporally (constant), not aligned temporally (point-by-point), and a combination (row-by-row). They reached the highest average accuracy of 88 % using the point-by-point method. Suzuki et al. [15] investigate whether human subjects can move a hand along a path produced by ultrasound without visual information. The path is presented by switching ultrasound focal points in such a way that users perceive it as a line. Users can move their hand to the target position by tracing this line. The experiment showed that participants were able to trace the trajectory of a curved line with an average deviation of less than 40 mm.

Current research has shown that ultrasonic tactile stimulation can provide salient sensations at specific locations in 3D space. However, using it as a haptic cursor for physical objects would be a novel application.

Fig. 1.
figure 1

Interactive tactile map with integrated Ultrahaptics® mid-air ultrasonic array (left); experiment setup (right).

3 Interactive Modular Tactile Map with Mid-Air Haptics

The design of an interactive modular haptic map of rooms and the related interaction methods is primarily focused on older adults with vision impairments. The original design was described in detail in [11]. The experiments showed that participants used audio labels to identify objects on the map, and they revealed a potential need for guidance toward particular objects in more complex interaction scenarios. This paper focuses on an interaction method that uses mid-air tactile stimulation as a haptic cursor for passive physical objects.

We follow the come-as-you-are design constraint [17], so users are not required to attach any specific equipment to their body to use the method. Figure 1 shows the integration of the Ultrahaptics Stratos Explore® ultrasonic array with our interactive haptic map. Unlike typical setups, the ultrasonic array was mounted perpendicularly to the haptic map. Therefore, the mid-air tactile sensations are detectable primarily by the fingers rather than by the palm.

Our setup involved seven physical objects – miniatures of room equipment. The tactile sensations (haptic cursors) associated with each of the objects were prepared in advance. For all the haptic cursors, the focal point generated by the ultrasonic array formed a virtual square with a side of 30 mm. The only differences were the x, y offset and the height of the haptic cursors, which were set experimentally so that they appeared directly above the objects on the haptic map.
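The 30 mm square of focal points above an object can be described as a sampled closed path of 3D positions. The sketch below is our illustration of how such a path could be parameterized; how the device actually sequences the focal point along the square is not specified above and is an assumption here.

```python
def square_path(center_xy, height, side=0.03, n_points=64):
    """Sample a closed square path of focal-point positions (metres)
    with the given side length (30 mm in the described setup),
    centred at `center_xy` and at `height` above the array plane."""
    cx, cy = center_xy
    half = side / 2
    corners = [(cx - half, cy - half), (cx + half, cy - half),
               (cx + half, cy + half), (cx - half, cy + half)]
    points = []
    per_edge = n_points // 4
    for i in range(4):
        (x0, y0), (x1, y1) = corners[i], corners[(i + 1) % 4]
        for t in range(per_edge):
            f = t / per_edge  # linear interpolation along the edge
            points.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0), height))
    return points
```

Only the x, y offset (object position) and the height vary between objects, matching the description above.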

4 Experiment

Participants. We recruited 15 participants (P1–P15, five female; age \(MEAN=28.7\), \(SD = 6.5\), \(MIN = 21\), \(MAX = 41\)). All participants except one were right-handed. One participant reported a scar on the left thumb that could influence sensitivity in that area. One participant reported color blindness and poor vision in his left eye (he uses only his right eye).

Procedure. After a short ice-breaking session, participants received a consent form covering data collection, processing, and anonymization, followed by a brief introduction to the experiment. The setup of the experiment is depicted in Fig. 1. Each participant was instructed that the experiment involved physical objects and mid-air sensations (a haptic cursor) that they would feel through the skin of their hands/fingers as a slight vibration. The haptic cursor would be placed above one of the physical objects; their task was to locate the object where the haptic cursor was most noticeable and press it toward the underlying board, after which the haptic cursor would move above another object.

The total number of unique physical objects marked by a mid-air haptic cursor was seven. The experiment consisted of two phases – learning and measurement. In the learning phase, participants were exposed to 49 different locations of the haptic cursor and received confirmation of whether they had selected the correct object. In the measurement phase, there were 98 attempts. The order of objects/haptic cursors was determined using Latin squares (\(7\times 7\)) iterated in a zig-zag manner to counterbalance the learning effect. The Latin squares were randomly generated for each phase of the experiment. For the measurement phase, the same Latin square was iterated twice.
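One way to generate a randomized \(7\times 7\) Latin square and iterate it in a zig-zag (boustrophedon) manner could look as follows. The exact construction used in the experiment is not specified above, so this is an illustrative sketch: a cyclic Latin square with shuffled symbols and rows, read alternately left-to-right and right-to-left.

```python
import random

def latin_square(n, rng):
    """Randomly generated cyclic Latin square: every symbol appears
    exactly once in each row and each column."""
    base = list(range(n))
    rng.shuffle(base)
    rows = [[base[(i + j) % n] for j in range(n)] for i in range(n)]
    rng.shuffle(rows)  # row permutation preserves the Latin property
    return rows

def zigzag(square):
    """Concatenate rows alternately forwards and backwards,
    yielding one trial order over all n*n cells."""
    order = []
    for i, row in enumerate(square):
        order.extend(row if i % 2 == 0 else row[::-1])
    return order
```

For a 49-attempt learning phase, `zigzag(latin_square(7, rng))` yields one presentation per cell; iterating the same square twice gives the 98 measurement attempts.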

Interaction with a haptic cursor is not only detectable by touch but also audible. As a countermeasure, participants wore headphones playing white noise, through which they also received feedback when they pressed an object. In the learning phase, the feedback was “Correct” or “Not correct”, followed by the statement “Find the next item, please.”

Participants were asked to find and press the object marked by the haptic cursor. We also told them: “Be as fast and as precise as possible; precision is the priority.” Participants received no specific guidance on the strategy they should use for their exploration process.

Measures. In each session, we recorded which object had been marked by a haptic cursor and which object was selected by the participant. We also measured the time between object selections. This allowed us to construct confusion matrices and compute speed-related and error-rate-related statistics. During each session, we collected observational data on participant behavior and strategy. After the measurement session, we collected self-reported data on the subjective experience during the experiment, assessed on a five-level Likert scale (whether the haptic cursors were comfortable overall, and, for individual objects, whether they were easily noticeable, strong, and distinguishable from the others).
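The per-object false negative rate (FNR) reported below follows directly from the recorded (marked, selected) pairs. A minimal sketch, with hypothetical object names in the example:

```python
from collections import defaultdict

def false_negative_rates(trials):
    """Per-object FNR from (marked, selected) pairs: the fraction of
    trials in which the marked object was not the one pressed."""
    total = defaultdict(int)
    missed = defaultdict(int)
    for marked, selected in trials:
        total[marked] += 1
        if selected != marked:
            missed[marked] += 1
    return {obj: missed[obj] / total[obj] for obj in total}
```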

Table 1. Measurement phase - confusion matrix [%]
Table 2. Times and error rates per class

Results. All participants were able to complete both test phases (learning and measurement). Table 1 shows the confusion matrix of the measurement phase. The average false negative rate (FNR) was 23.0 % (\(SD = 7.2\,\%\)) for the learning phase, dropping to 14.4 % (\(SD = 7.7\,\%\)) in the measurement phase. As shown in Table 2, the highest FNR of 29.5 % (\(SD = 18.5\,\%\)) was recorded for the trash bin object (frequently confused with the nearby wardrobe). The lowest FNR of 4.3 % (\(SD = 8.0\,\%\)) was achieved for the table object in the measurement phase.

As shown in Table 2, the average time between object confirmations was \(6.97\,s\) (\(SD = 0.42\)) in the learning phase and \(5.83\,s\) (\(SD = 0.31\)) in the measurement phase.

We performed a one-way (single-factor) ANOVA for attempt groups and normalized reaction times (\(t_{norm}=t_{abs}/t_{avg}\), where \(t_{abs}\) is the actual value in seconds and \(t_{avg}\) is the average time of a participant in a particular experiment phase). It revealed a statistically significant difference in \(t_{norm}\) between at least two groups \((F(6,98)=6.53, p < 0.001)\) in the learning phase. Tukey’s post-hoc test for multiple comparisons revealed that the mean value of normalized time was significantly different between the attempt group 1–7 (\(M=1.29, SD=1.12\)) and the groups 22–28 (\(M=0.95, SD=0.10, q = 5.86\)), 29–35 (\(M=0.93, SD=0.2, q = 6.19\)), 36–42 (\(M=0.92, SD=0.15, q = 6.34\)), and 43–49 (\(M=0.82, SD=0.20, q = 7.97\)). A one-way ANOVA revealed no statistically significant differences in \(t_{norm}\) between the attempt groups in the measurement phase \((F(13,196)=0.78, p = 0.67)\), nor between the attempt groups and the number of errors in either the learning phase \((F(6,98) = 0.82, p = 0.54)\) or the measurement phase \((F(13,196) = 0.51, p = 0.91)\).
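The normalization and the F statistic used above can be reproduced in a few lines. This is a plain-Python sketch of one-way ANOVA for illustration; the degrees of freedom and Tukey post-hoc comparisons reported above are the authors' own analysis.

```python
def normalize_times(times_s):
    """Per-participant normalization: t_norm = t_abs / t_avg."""
    t_avg = sum(times_s) / len(times_s)
    return [t / t_avg for t in times_s]

def one_way_anova_f(groups):
    """F statistic of a one-way (single-factor) ANOVA over sample groups."""
    all_values = [x for g in groups for x in g]
    n, k = len(all_values), len(groups)
    grand_mean = sum(all_values) / n
    # Between-group sum of squares (group means vs. grand mean).
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (samples vs. their group mean).
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```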

A one-way ANOVA for object types and \(t_{norm}\) revealed no statistically significant difference between object types in either the learning phase \((F(6,98)=1.78, p=0.11)\) or the measurement phase \((F(6,98)=2.14, p=0.056)\). It also revealed no statistically significant difference between object types and the number of errors in the learning phase \((F(6,98)=0.82, p=0.055)\), but it revealed a statistically significant difference in the measurement phase \((F(6,98)=3.87, p=0.002)\). Tukey’s post-hoc test for multiple comparisons \((p=0.05, q_{crit}=4.31)\) revealed that the mean number of errors was significantly different between the following object pairs: table (\(M=0.60, SD=1.12\)) and chair (\(M=2.20, SD=2.18\)), \(q = 4.34\); table and wardrobe (\(M=2.20, SD=2.18\)), \(q = 4.34\); table and trash bin (\(M=4.13, SD=2.59\)), \(q = 9.58\); chair and trash bin, \(q = 5.12\); armchair (\(M=1.73, SD=2.15\)) and trash bin, \(q = 6.51\); drawer (\(M=1.73, SD=2.34\)) and trash bin, \(q = 6.51\); and bed (\(M=1.53, SD=1.96\)) and trash bin, \(q = 7.05\).

Fig. 2.
figure 2

Subjective assessment of haptic cursor sensations related to objects on the Likert scale.

Figure 2 shows the results of the subjective assessment of the haptic cursor. A majority of participants (87 %) agreed that interaction using the mid-air haptic cursor was comfortable. The following questions focused on individual objects: the bed was assessed as the most noticeable object, the haptic cursor associated with the table as the strongest, and the bed and the drawer as the most easily distinguishable from the other objects. In contrast, the trash bin was assessed as the least noticeable, strong, and distinguishable object.

Observations. Among the study group, 8 participants opted to use both hands for exploration, with two of them demonstrating a noticeable preference for their dominant hand. The remaining 7 participants used their dominant hand exclusively.

During the study, 7 participants reported experiencing symptoms of fatigue, ranging from tingling sensations in their fingers to numbness after a certain period of time. These sensations were perceived with varying intensity among the individuals. In some cases, the impact of fatigue was minimal, allowing participants to continue with little disruption. In more severe cases, participants who had previously used both hands used only one hand as they progressed, switching hands, rubbing their fingers or taking short breaks to relieve discomfort. The other 8 participants did not experience such symptoms.

Most of the participants (9) described the sensation of a haptic cursor as a stream of air. Two even described it as having different temperatures. All participants indicated that they experienced additional/false haptic cursors during the task, which increased the overall difficulty. However, they adapted their strategy by selecting the object where the haptic cursor was the strongest. These additional points perceived as haptic cursors can be explained by secondary focal points of the ultrasonic array.

5 Discussion

The experiment positively answered our research question – a combination of mid-air ultrasonic haptics and the physical environment is viable. After a short learning period, participants were able to distinguish with high accuracy which object was marked by the haptic cursor. However, the combination of ultrasonic mid-air haptics and a physical map, as depicted in Fig. 1, also involves specific issues. Most importantly, the haptic cursor is still present, but significantly weaker and less noticeable, for objects shielded from the ultrasonic array (trash bin) by other objects (wardrobe). We also observed that participants can shield the intended haptic cursor with their own hand(s), most often those who chose the bi-manual exploration strategy.

6 Conclusion and Future Work

The fusion of a map comprising physical objects and mid-air haptics in the role of a haptic cursor is an efficient method for guiding users toward the objects. Users can more easily match their hand’s 3D position with small-scale 3D objects positioned on a flat (2D) surface. An experiment with 15 participants showed that the average FNR was \(14.4\,\%\) (\(SD = 7.7\,\%\)). Considering that the experiment purposely involved edge cases of objects shielded by other objects, the accuracy could be considerably better for well-tuned setups.

Although the proposed method requires a non-standard orientation of the ultrasonic array, the results show that it is still very efficient in conveying information. This opens new possibilities for combining ultrasonic mid-air haptics with physical objects. In this paper, we focused on a low-abstraction (skeuomorphic) interactive haptic map; still, any use case involving a haptic cursor marking a specific area in 3D space could be considered. The community can use it to guide users toward a particular spot or, in contrast, to convey information about places where the presence of fingers is not wanted. In this way, it could be used, for example, to improve learning to play musical instruments such as the piano.

A more detailed investigation of other objectives, such as response times and subjective outcomes, is out of the scope of this format. Evaluation with older adults with vision impairments is the subject of imminent future work. We also plan to evaluate the method in other use cases involving different types of topographical maps. A comparative study focused on other guidance methods involving haptic interaction will provide more insights into the application of these methods for particular use cases.