Introduction

A central challenge in understanding multisensory integration is how the human brain combines spatial information from different modalities, each coded in a different reference frame. One theory is that a multimodal map integrates multisensory information into a single spatial representation. To transform between body coordinates and retinotopic coordinates, the brain must take into account the current posture of the body as well as either (1) the positions of the eyes and the head or (2) the position of gaze (since gaze is the sum of eye and head positions). If tactile information were coded in retinotopic coordinates, then any errors in the representation of eye or head position would cause systematic shifts in touch localization.

Errors in coding the position of the eyes (Harris and Smith 2008) and systematic shifts in tactile localization related to eye and head positions (Harrar and Harris 2009, 2010; Ho and Spence 2007) have previously been demonstrated, suggesting that touch is indeed coded in retinotopic coordinates. Eye position has been found to shift the perceived location of touch in the same direction as the eyes (Harrar and Harris 2009), whereas head position has been found to cause shifts in the opposite direction (Ho and Spence 2007). This would suggest that eye and head positions are accounted for separately in the sensory transformation, with the eye eccentricity signal being underestimated and the head eccentricity signal being overestimated. However, large methodological differences between these studies make their results difficult to compare: the eye position study used solenoid touches on the arm, whereas the head position study used vibration on the torso, and the procedures also differed substantially. The present study allows the effects of head and eye eccentricity on touch localization to be compared directly.

Errors due to eye and head positions have been found for auditory (Collins et al. 2010; Goossens and van Opstal 1999; Graziano 2001; Kopinska and Harris 2003; Lewald 1998; Lewald and Ehrenstein 1996a, b, 1998), visual (Harris and Smith 2008; Kopinska and Harris 2003; Wexler 2003), and proprioceptive (Fiehler et al. 2010; Lewald and Ehrenstein 2000) localization, suggesting that spatial information across all these senses may be integrated into a single common retinotopic reference frame.

The majority of this research has investigated either the effect of eye position or the effect of head position on spatial localization but, to our knowledge, only one study has directly compared the effects of eye and head positions and how they combine. Lewald and Ehrenstein (1998) reported that both head and eye position affected the perceived location of a sound, and that both effects were in the same direction (opposite to the direction of the head and eye) and of approximately the same magnitude. When the eyes and head were turned in opposite directions (e.g., eyes 30 degrees left and head 30 degrees right, such that gaze remained straight ahead), the effects appeared to cancel out, indicating that they combine linearly.

We investigated the effects of eye and head position on the perceived location of touch. Participants held their head to the left, right, or center relative to their body, and their eyes to the left, right, or center within their head, while a touch was applied to the arm. Participants then reported the position of the touch relative to a visual probe. They centered their eyes and head before the probe was presented, to avoid any possible effect of eye and head position on the perceived location of the probe.

Methods

Participants

Four women and six men with an average age of 32 years participated. One male participant was left handed, and all others were right handed. All had normal or corrected-to-normal vision. Experiments were approved by the York Ethics board.

Apparatus

The touch apparatus consisted of two solenoids encased in a box with their pins facing upwards. When power was delivered to a solenoid (amplified 5 V signals from a CED 1401 interface, Cambridge Electronic Design, Cambridge, UK, under PC control), its pin extended about 2 mm from the surface of the box for 50 ms. The pins were located approximately 6 and 11.5 cm from the subject’s wrist (or 2.5 degrees left and 3.5 degrees right of straight ahead) (see Fig. 1).

Fig. 1

Apparatus setup and experimental conditions. a Touch locations were at 6 degrees left and 7 degrees right from straight ahead, or 6 and 11.5 cm from a star on the box that encased the solenoids. The star was approximately aligned with the wrist crease. The screen displaying fixation points and a probe line was positioned directly behind the touch box, 5.2 cm behind the solenoids and 29 cm from the viewer. The bottom edge of the screen was level with the top of the solenoids. b For each of three head positions (15 degrees to the left, 15 degrees to the right, and centered), three eye positions were used, such that the eyes could be centered or at 15 degrees to the left or right in their orbits. The nine combinations of eye and head positions produced five different gaze positions: 30 degrees to the left and right, 15 degrees to the left and right, and centered. A laser mounted on the head allowed precise control of head position

A flat screen computer monitor (54 cm, resolution 1600 × 1200 pixels) was positioned vertically 5.2 cm behind the solenoids and 29 cm from the viewer. It was used to display gaze and head fixation points as well as a probe line used for comparison with the perceived location of the touch. The probe line was a 15 cm long, one pixel wide, red, vertical line positioned on the screen with its top 5.5 cm (10.7 degrees) below the gaze fixation points. The bottom of the probe line was at the bottom of the screen, which was at the same height as the arm when positioned over the solenoids.

Controlling head and eye position

Participants wore a baseball hat with a laser pointer attached to the rim. They aligned the laser beam with head fixation targets presented on the screen. Three head fixation points were used: −15° (left), 0° (straight ahead) and +15° (right). For each head fixation, three eye fixations were used: −15° (left in head), 0° (centered in head) and +15° (right in head). Thus, five gaze fixation points were needed (−30, −15, 0, +15, +30°, relative to the body straight ahead). Head fixation points were positioned at the approximate height of the laser point projected from the hat and gaze fixations were positioned 3 cm below the head fixations at eye height. Figure 1 shows the arrangement of the apparatus and the head and gaze fixation points.

Procedure

Participants were seated in front of the apparatus and wore headphones, to muffle the sound of the touches, and a baseball hat with a mounted laser pointer. The hat was adjusted so that the laser pointed directly at the “centered” head fixation point when the head was oriented straight ahead. Participants then positioned their arm across the touch box and aligned their wrist crease with a star on the box (see Fig. 1).

Each trial began with the head and eyes centered. A head fixation cross was displayed at one of the three head fixation locations, and the participant was allowed 1 s to turn their head and point the head-mounted laser at the cross. Next, a gaze fixation point was displayed, which the participant foveated. One second later, both the gaze and head fixation points were removed from the screen. The participant maintained their head and eye position while a touch was administered at one of the two locations on the arm. A central fixation point was presented 500 ms after the touch for a duration of 2 s, directing participants to recenter their head and eyes before responding. After the head and eyes were recentered, a vertical probe line was presented. Participants were allowed to move their eyes to the line to judge whether it was to the left or right of where they had been touched. The line remained visible until a response was made with a left or right foot press. The response initiated the next trial.

The position of the line probe was controlled by a best PEST adaptive procedure (Pentland 1980). For the first trial of each condition, the location of the reference line on the screen was chosen randomly. In subsequent trials, the reference line was moved to the left or right depending on the participant’s response to the previous occurrence of that condition. Step size was initially 100 mm and was halved after each reversal and doubled after three consecutive steps in the same direction. The minimum step size was 1 mm. Once the minimum step size was reached, the PEST staircase terminated for that condition and the final location of the probe line was taken as the perceived touch location. Staircases for each of the 18 conditions were interleaved and randomly selected during testing. The entire session lasted approximately 50 min.
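The staircase logic just described can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function and parameter names are assumptions, and a deterministic simulated observer stands in for a real participant.

```python
# Illustrative reconstruction of the best-PEST-style staircase described
# above (not the authors' code). The probe starts at a random position,
# the step halves after each reversal, doubles after three consecutive
# steps in the same direction, and the run ends at the 1 mm minimum step.
import random

def run_staircase(true_touch_mm, start_mm, step_mm=100.0, min_step_mm=1.0):
    probe = start_mm
    last_dir = 0   # +1 = probe last moved right, -1 = left, 0 = no move yet
    run = 0        # length of the current run of same-direction steps
    while True:
        # Simulated response: was the probe to the right of the touch?
        probe_is_right = probe > true_touch_mm
        direction = -1 if probe_is_right else +1  # move probe toward touch
        if direction == last_dir:
            run += 1
            if run == 3:        # three consecutive steps the same way
                step_mm *= 2
                run = 0
        else:
            if last_dir != 0:   # a reversal (not the very first step)
                step_mm /= 2
            run = 1
        if step_mm < min_step_mm:
            return probe        # final probe position = perceived location
        probe += direction * step_mm
        last_dir = direction

random.seed(1)
estimate = run_staircase(true_touch_mm=60.0, start_mm=random.uniform(0, 200))
```

With this deterministic observer the probe homes in on the touch location; in the experiment, each left/right response came from the participant, so the final probe position estimated the *perceived* touch location instead.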

Results

Figure 2 plots the effect of gaze on perceived touch location. A three-way repeated measures ANOVA was conducted with touch location, eye eccentricity, and head eccentricity as factors. Eye position significantly affected perceived touch location (F(2,18) = 4.37, P = .033, ηp² = .33). When the eyes were to the left in their orbits, the perceived location of the touch was displaced to the left, and when the eyes were to the right in their orbits, it was displaced to the right, relative to the perceived location when the eyes were straight ahead. Similarly, perceived touch location was significantly influenced by head position, with the perceived location of the touch displaced in the same direction as the head (F(2,18) = 6.03, P = .01, ηp² = .40). A significant effect of solenoid location (F(1,9) = 79.08, P < .001, ηp² = .90) confirmed that perceived touch location was related to the area of skin where the touch was administered and that the two touches could be discriminated. The eye by head interaction was not significant (F(4,36) = 0.82, P = .52), indicating that the effects of eye and head position were independent. The effects of eye (F(2,18) = 2.70, P = .09) and head (F(2,18) = 0.14, P = .87) position did not depend on touch location, and the three-way interaction was also not significant (F(4,36) = 1.04, P = .40).

Fig. 2

Effect of gaze position on perceived touch location on the right arm. Data points represent the results for each solenoid for each combination of head and eye position. Data for the eye positions at a particular head position are marked by different symbols (square for head 15 degrees left, circle for head centered, and triangle for head 15 degrees right). Error bars show one standard error of the mean. Regression lines are fitted to the entire data set for each solenoid. The regression equations are indicated on the figure. Dashed gray lines indicate the actual location of each solenoid. Larger numbers for gaze position and perceived touch location indicate positions further to the right and toward the elbow. Gaze shifted the perceived location of touches by 0.38 mm per degree of eccentricity

A direct comparison of the effects of eye and head positions is presented in Fig. 3, which plots the effect of eye position (collapsed across head positions) and the effect of head position (collapsed across eye positions) superimposed on each other. The regression lines in Fig. 3 show an average effect of +0.30 mm of touch displacement per degree of eye eccentricity and +0.48 mm per degree of head eccentricity. If the effects of eye and head positions differed, an interaction of body part and direction of eccentricity should be found. A three-way repeated measures ANOVA was conducted with body part (eye or head), direction of eccentricity (15° left, center, or 15° right), and touch location as factors. The body part by direction of eccentricity interaction was not significant (F(2,15) = 0.17, P = .80), and no significant main effect of body part was found (F(1,9) = 0.13, P = .73). Together, these results indicate that the eye and head effects were not significantly different.
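The mm-per-degree figures above are regression slopes of perceived touch location against eccentricity. As a minimal sketch of that computation (the data points below are invented purely to illustrate the arithmetic, not data from this study):

```python
# Ordinary least-squares slope of perceived location (mm) against
# eccentricity (degrees). The data points are hypothetical.
def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )

eccentricity_deg = [-15, 0, 15]          # eye or head position
perceived_mm = [55.5, 60.0, 64.5]        # invented perceived locations
print(slope(eccentricity_deg, perceived_mm))  # → 0.3 (mm per degree)
```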

Fig. 3

The effects of eye-in-head (filled symbols) and head-on-body (open symbols) on perceived touch location for the two touch locations (squares and triangles) were similar. Data for eye position were obtained by averaging across head position. Data for head position were obtained by averaging across eye positions. Error bars show one standard error of the mean. Regression lines were fit to the data points shown for the average effect of head and eye position for each solenoid. The effect of eye position was 0.37 and 0.23 mm per degree of eccentricity for solenoids at 11.5 and 6 cm from the wrist, respectively. The effect of head position was 0.53 and 0.44 mm per degree of eccentricity for the same two solenoids. Larger numbers for eye or head position and perceived touch location indicate positions further to the right and toward the elbow

Discussion

The perceived location of a mechanical touch on the arm was found to shift in the same direction as an eccentric eye or head position and by approximately the same amount in each case. The effects were independent of one another and appeared to be linear over the ±15-degree range tested. This is consistent with a gaze signal being used to convert between body and retinotopic coordinates. If a different reference frame were used, gaze position would not be necessary to compute the transform and gaze-related errors would not be expected. This conversion may be done in order to represent tactile and visual locations in the same coordinates.

Our findings add to converging evidence that touch location is coded in an egocentric visual reference frame. Research using transcranial magnetic stimulation indicates that the posterior parietal cortex remaps touch location from body to external or visual coordinates (Azañón et al. 2010; Bolognini and Maravita 2007). Also, blind people appear to code touch location differently from sighted individuals (Röder et al. 2004, 2008), supporting the idea that a visual coordinate system is normally used in tactile coding.

Shift of perceived touch location with head eccentricity

The perceived locations of sounds and lights have been found to depend on head position, but only one previous study has looked at the effect of head position on the perceived location of tactile stimuli. The head position–related shift we report here is at odds with the results of that study: Ho and Spence (2007) found that the perceived location of a touch did shift with head position, but in the direction opposite to the head. The magnitude of the shift we report is also larger than Ho and Spence observed. The head-related shifts in the present study were on the order of +0.48 mm per degree of head eccentricity, whereas the data of Ho and Spence yield an effect size of only −0.05 mm per degree.

The difference in the magnitude of the effects between the two studies may be partly due to the different head displacements used. Our head positions spanned only ±15 degrees, while Ho and Spence (2007) used head positions of ±90 degrees. If head position had a larger effect near center but the effect saturated at large eccentricities, this could explain the much smaller effect size reported by Ho and Spence. Such saturation might arise from the physiology of head movements: only about 50% of the total range of head rotation comes from the head rotating around the top two vertebrae, while the remaining 50% comes from rotation within the spine itself (Fielding 1964). If the signal for head rotation around the top two vertebrae were subject to systematic errors but the signal for rotation within the spine were not, the effect would be expected to asymptote once that range was exceeded.

The pattern of stimulation used is another factor that might contribute to the difference in magnitude between the two studies. Ho and Spence (2007) used vibrotactile stimulation at 250 Hz, while we used a 50 ms mechanical depression of the skin. These types of stimulation are encoded by different touch receptors. Our stimuli would optimally stimulate the slowly adapting Merkel receptors, which have the smallest receptive fields and are the most useful for tactile spatial localization. In contrast, the 250 Hz vibrotactile stimulation used by Ho and Spence would optimally stimulate the rapidly adapting Pacinian corpuscles, which are most sensitive to vibration and have large, diffuse receptive fields. The pathways from these receptors are anatomically distinct from the slowly adapting touch receptor system (Friedman et al. 2004) and may correspond to different cortical maps. Vibration-based maps are likely to be less precise, which might explain the smaller effect size reported by Ho and Spence.

Another possible explanation for the difference in magnitude might arise from the fact that Ho and Spence (2007) used blocked trials at each head position. Since the head remained at an extreme position for several minutes, adaptation could have occurred causing a shift of perceived straight-ahead toward the current head position. As little as 3 min of eccentric head position has been shown to cause a 10% adaptation in perceived straight ahead (Lackner 1973). This might cause a drastic reduction in the systematic errors caused by the head position signal, possibly even reversing them.

Finally, the body part where touches were administered could contribute to the different pattern of errors found. We applied touches to the skin of the forearm, whereas Ho and Spence (2007) used touches on the torso. It is possible that touches to the arm and the torso are coded in different ways, causing different patterns of errors. Perhaps the visual representation of the torso is left–right reversed (as we are more used to seeing our own torsos in a mirror).

Clearly, more thorough investigation into the effects of eye and head positions on touches stimulating different parts of the body and different receptor types is warranted.

Shift of perceived touch position with eye eccentricity

The eye position–related shift we report here confirms the finding of Harrar and Harris (2009) that eye eccentricity shifts the perceived location of mechanical touches on the arm in the same direction as the eyes. The present study found that eye eccentricity shifted the perceived location of touches by +0.38 mm per degree, whereas Harrar and Harris report a figure of +0.68 mm per degree. While both studies used the same single mechanical touch to the same part of the arm with interleaved eye position conditions, there are some differences between them. Harrar and Harris touched the left arm, whereas here the right arm was touched; the differing magnitudes could reflect an asymmetry related to arm dominance, with touch perhaps coded more accurately on the right arm than on the left. The method of response also differed. In the study by Harrar and Harris, participants responded by reading the location of the perceived touch off a ruler placed adjacent to the touch box; the eyes scanned the ruler and were not returned to a central position during the report. It is therefore possible that eye position affected the perceived position of the probe (the ruler) as well as the touch, magnifying the apparent effect. In the present study, the eyes were returned to center before responding, and the more psychophysically robust PEST method was used.

Shift of perceived touch position with attention

An alternative explanation for our results is that perceived touch position shifts with attention, rather than specifically with the eccentricity of the eyes and head. This hypothesis was tested by Harrar and Harris (2009). They had participants maintain centered eye and head positions while an LED flashed eccentrically, diverting attention in that direction. Participants received a touch on the arm while their attention was diverted and then indicated the location of the touch. The perceived location of the touch shifted in the direction of attention, but this accounted for only about 17% of the effect of eye position. We expect that attention played a similar role in the present experiment and contributed only a small amount to the magnitude of the shift we report.

Localization accuracy

While the perceived position of the touch closer to the elbow was accurate when fixating straight ahead, the touch closer to the wrist was always shifted toward the elbow as can be seen in Figs. 2 and 3. The pattern of touch being perceived as closer to the elbow is consistent with other findings that touches on the forearm are perceived proximally to their actual location (Cody et al. 2008).

Are conversions to head and retinal coordinates done in series or parallel?

The systematic errors in perceived location of touch due to eccentric eye and head positions reflect systematic underestimations that are made when accounting for eye and head positions during the reference frame conversion. How these conversions are accomplished is not well understood. It could be that the eye and head positions are accounted for separately, both causing small, independent effects on perceived touch location, reflecting a sequential conversion from body to head to retinal coordinates. Alternatively, eye and head positions could be combined into a gaze signal at an earlier stage of processing, so that only the position of gaze is needed to convert touch location into a visual reference frame. The superior colliculus codes desired gaze in a single signal, with contributions of head and eye position accounted for downstream (Freedman and Sparks 1997; Klier et al. 2001). Our data suggest that the neural code used in tactile-to-visual coordinate transformations uses a single gaze signal of that type, rather than individual signals for eye and head positions.
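The distinction between the two schemes can be made concrete with a toy model. The function names and gain values below are illustrative assumptions, not fitted to the data; the point is only that separately mis-scaled eye and head signals can produce different shifts, whereas a single mis-scaled gaze signal forces eye and head eccentricity to produce identical shifts, as observed here.

```python
# Toy model of the serial vs. gaze-signal conversion schemes (all gain
# values are arbitrary illustrations of a mis-scaled posture signal).
def perceived_touch_serial(t, eye, head, eye_gain=0.9, head_gain=0.8):
    # Body-to-retina transform uses the true postures; the conversion
    # back for report uses separately mis-scaled eye and head estimates.
    retinal = t - (eye + head)
    return retinal + eye_gain * eye + head_gain * head

def perceived_touch_gaze(t, eye, head, gaze_gain=0.85):
    # Here the conversion back uses a single mis-scaled gaze estimate.
    retinal = t - (eye + head)
    return retinal + gaze_gain * (eye + head)

# With a single gaze signal, 15 degrees of eye eccentricity and 15 degrees
# of head eccentricity mislocalize a touch at t = 0 by exactly the same
# amount; with separate signals, the two shifts generally differ.
```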

Conclusion

Gaze eccentricity caused a shift in the perceived location of touch. The effect was the same whether it was due to eye or head displacement. This supports the idea that touch location is transformed into retinotopic coordinates and that a gaze signal is used to compute the transformation.