Introduction

Tool use recalibrates the user’s spatial body representations, a phenomenon known as tool embodiment. In their seminal study, Iriki et al. (1996) found that the visual receptive fields of bimodal visuo-tactile neurons in the macaque parietal cortex extended to include the body of a rake after use (see also Hihara et al. 2006; Quallo et al. 2009). Tool embodiment has since been extensively observed in humans, particularly for body representations underlying action (Cardinali et al. 2009, 2012, 2016a; Baccarini et al. 2014; Bourgeois et al. 2014; Patané et al. 2016; Patané et al. 2017) and somatosensory perception (Yamamoto and Kitazawa 2001; Cardinali et al. 2009, 2011; Sposito et al. 2012; Canzoneri et al. 2013; Miller et al. 2014; Garbarini et al. 2015; Kilteni and Ehrsson 2017; Miller et al. 2017). However, despite two decades of research (for reviews, see Maravita and Iriki 2004; Martel et al. 2016), the rules governing tool embodiment are still largely unknown.

The majority of studies to date have aimed at investigating use-based constraints on embodiment. The most common finding is that embodiment is dependent on a change in the action capabilities of the user. For example, embodiment only occurs when the user has actively wielded the tool (Maravita et al. 2002; Farnè et al. 2005; Holmes et al. 2007; Serino et al. 2007; Kao and Goodale 2009; Brown et al. 2011; Cardinali et al. 2012; Rademaker et al. 2014; Anelli et al. 2015; Garbarini et al. 2015; Miller et al. 2017) or intends to do so (Witt et al. 2005; Costantini et al. 2011). Furthermore, the tool must significantly alter the reaching space of the actor (Sposito et al. 2012; Bourgeois et al. 2014; Patané et al. 2016, 2017). Sensory feedback from multiple modalities is known to be important for conveying this information to the user (Miller et al. 2014; Serino et al. 2015; Miller et al. 2017), with the somatosensory modalities appearing to play an especially crucial role (Sengül et al. 2013; Cardinali et al. 2016b).

Another type of constraint—which we call location-based constraints—determines where on the user’s body effects of tool use can be found. Do effects of embodiment spread to limbs that were not involved in wielding the tool? The role of anatomical proximity to the tool-using limb has recently received attention from researchers. For example, we recently found that using an arm-shaped mechanical grabber modulated the represented shape of the user’s forearm (Miller et al. 2014). However, using this tool did not modulate the shape of the user’s hand representation, despite the hand’s proximity to the forearm and its integral role in each action. The opposite pattern of results was found when participants used a hand-shaped tool. Several other studies have identified an identical dissociation for motor components of tool embodiment (Cardinali et al. 2009, 2016a). The results of these studies cast doubt on a constraint based on anatomical distance and suggest instead that the location of embodiment depends on the congruity between a body part and the tool’s shape (Miller et al. 2014) or functional characteristics (Cardinali et al. 2016a). Whereas embodiment does not spread to body parts that are anatomically close (e.g., the right hand and forearm), it may still spread to body part representations that are cortically close (e.g., the right hand and cheek within the primary somatosensory cortex).

The close cortical proximity between the face and hand representations in primary somatosensory cortex (SI) is one of its defining structural features (Penfield and Boldrey 1937; Yang et al. 1993; Manger et al. 1997). It is well documented that there is a strong relationship between hand and face representations at both the neural (Pons et al. 1991; Manger et al. 1997; Weiss et al. 2004; Muret et al. 2016) and perceptual levels (Ramachandran et al. 1992; Ramachandran and Rogers-Ramachandran 1996; Farnè et al. 2002; Paqueron et al. 2003; Serino et al. 2009). For example, it has been reported that phantom hands can be ‘re-animated’ by stroking the patient’s cheek (Ramachandran et al. 1992; Halligan et al. 1993), although this has recently been challenged (Makin et al. 2013, 2015); sensory effects of anesthetizing the hand often spread to the face (Gandevia and Phegan 1999); and improvements in tactile spatial acuity on the fingertips following repetitive somatosensory stimulation lead to concurrent improvements in acuity on the cheek and lips (Muret et al. 2014). Perhaps most relevant to the current study, viewing one’s own hand enhances tactile spatial acuity on both the viewed hand (Kennett et al. 2001) and the unseen ipsilateral cheek (Serino et al. 2009). Thus, the perceptual effects of visual–tactile interplay between the posterior parietal cortex (Konen and Haggard 2012; Beck et al. 2015) and the SI hand region (Haggard et al. 2007; Cardini et al. 2011; Longo et al. 2011) spread to the neighboring SI cheek representation.

In the present study, we explored whether using a hand-shaped tool would recalibrate tactile perception on both the user’s hand and the cheek. Using a within-subjects design, we measured tactile distance perception on the tool-using right hand and the ipsilateral right cheek. Changes in tactile distance perception have been shown to be a reliable measure of tool embodiment (Canzoneri et al. 2013; Miller et al. 2014, 2017). Given the previously mentioned findings with tactile spatial acuity (Serino et al. 2009; Muret et al. 2014), we hypothesized that tool use would indeed recalibrate tactile perception on both the hand and the cheek. Such a finding would suggest that tool embodiment depends upon mechanisms of plasticity in SI (Fang et al. 2002; Muret et al. 2016) similar to those underlying other multisensory phenomena, such as the visual enhancement of touch (Kennett et al. 2001; Serino et al. 2009; Cardini et al. 2011). A lack of an effect on the cheek would instead suggest that higher-level somatosensory regions—where the hand and the cheek have become functionally disentangled—underlie the lasting effects of recalibration. This would be consistent with several studies showing that tactile distance perception is separable from spatial acuity (Taylor-Clarke et al. 2004; Miller et al. 2016) and likely relies on body representations outside of SI (Spitoni et al. 2010; Longo and Haggard 2011; Spitoni et al. 2013).

Materials and Methods

Participants

Twenty right-handed participants (14 females; mean age 21.17 years, SD 1.29) took part in the experiment for course credit. The experiment was run under the ethical guidelines of the University of California, San Diego, and all participants gave informed consent before participating.

Tactile distance judgment task

Tactile distance perception was measured using a tactile distance judgment task (TDJ), a common task for implicitly measuring the morphology of tactile body representations (Green 1982; Anema et al. 2008; Spitoni et al. 2010; Longo and Haggard 2011; Knight et al. 2014; Longo et al. 2015; Miller et al. 2016; Calzolari et al. 2017). Importantly, several studies have demonstrated that the TDJ is a sensitive measure of representational plasticity (Taylor-Clarke et al. 2004; de Vignemont et al. 2005; Tajadura-Jiménez et al. 2012; Canzoneri et al. 2013; Longo and Sadibolova 2013; Miller et al. 2014; Bassolino et al. 2015; Tajadura-Jiménez et al. 2015; Miller et al. 2017). In the version of the TDJ used in the present study (Longo and Sadibolova 2013), participants verbally estimated the distance between two tactile points (separated by 4, 5, or 6 cm) applied manually to a target body surface with a stainless steel digital caliper. Tactile points were administered longitudinally to two target body surfaces—the dorsal surface of the right hand and the right cheek—in separate blocks. Stimulation lasted for approximately 1 s after which the participant verbally reported their distance estimation; we did not place any time constraints on how quickly after stimulation participants were required to make their verbal report. The experimenter manually entered each judgment into a computer once it was reported. Since the experiment was performed in the United States, many participants responded with judgments in inches (11 out of 20). These judgments were then converted offline into centimeters for all analyses. The time between tactile stimulation and the next trial (i.e., inter-trial interval) was dependent upon the time that it took the participant to make their judgment as well as the experimenter to prepare the stimulation apparatus. Each tactile distance was administered ten times for a total of 30 trials in each block; the order of presentation was pseudo-randomized. The body surface stimulated in each block was counterbalanced in ABBA fashion; the specific surface assigned to each condition was counterbalanced across participants. The latter blocks (i.e., BA) each occurred following separate instances of the tool use procedure (see below).
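As a concrete illustration of the block structure described above, the short sketch below generates one block’s trial list and converts a verbal judgment from inches to centimeters. This is not the software used to run the experiment: the exact constraints behind the pseudo-randomization are not reported, so a plain shuffle is assumed here, and the unit conversion simply applies the standard factor of 2.54 cm per inch.

```python
# Minimal sketch of one TDJ block (hypothetical helper code, not the authors' script).
import random

DISTANCES_CM = [4.0, 5.0, 6.0]   # caliper separations used in the task
REPS_PER_DISTANCE = 10           # 10 repetitions per distance -> 30 trials per block
CM_PER_INCH = 2.54               # judgments given in inches were converted offline


def make_block_trials(seed=None):
    """Return a shuffled list of 30 stimulus distances (cm) for one block.

    The paper describes the order as pseudo-randomized; the specific
    constraints are not given, so a simple shuffle stands in for them here.
    """
    rng = random.Random(seed)
    trials = DISTANCES_CM * REPS_PER_DISTANCE
    rng.shuffle(trials)
    return trials


def judgment_to_cm(value, unit="cm"):
    """Convert a verbal distance judgment to centimeters."""
    return value * CM_PER_INCH if unit == "in" else value


if __name__ == "__main__":
    print(make_block_trials(seed=1)[:6])   # first six trials of an example block
    print(judgment_to_cm(2.0, "in"))       # 2 inches -> 5.08 cm
```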

Tool use procedure

Participants wore a custom-built plastic hand-shaped tool that was modeled after a human hand (Fig. 1, left panel). Each finger of the tool was composed of three plastic “phalanges” connected via fully adjustable rubber “joints”. The participant’s fingers rested in leather straps attached to the tool’s fingers, allowing each tool finger to be controlled individually; movement of each strap led to a concurrent movement of the corresponding finger of the tool. This finger-tool connection ensured that the functional precision of the tool was similar to that of the user’s own hand. The tool was approximately 21 cm wide, as measured from the base of the index finger to the base of the little finger, and 45 cm long, as measured from the base of the tool to the tip of the middle finger.

Fig. 1 Hand-shaped tool and object-interaction task. Participants used the hand-shaped tool to pick up balloons and place them into a bucket, a task that we have previously used to study tool embodiment (Miller et al. 2014). This procedure was self-paced and occurred for a total of 8 min

The participants’ task with the tool (Fig. 1, right panels) was to grasp balloons (placed approximately 75 cm from the midline of their trunk) and transport them into a bucket (placed approximately 75 cm to the right of the body). The balloons were then moved back into position by the experimenter before the next action commenced. All participants were able to quickly master the use of the tool to perform this task. The task was self-paced, and participants were encouraged to take breaks if they felt fatigued. This procedure lasted for 8 min and was completed before each of the last two experimental blocks.

Goodness of fit

It is important that any changes that we observe in tactile perception are not due to fluctuations in the ability of participants to perform the TDJ task. We therefore set two exclusion criteria prior to conducting the experiment, which were assessed within each participant for each body part (hand and cheek) and time (pre- and post-tool use). First, we assessed whether participants’ judgments changed as a function of tactile distance by examining the slopes of regression lines fit to their judgments. All participants had positive slopes in all conditions (global mean 1.13). However, one participant had an extremely shallow slope in one condition (0.17) and was therefore excluded. Second, we used R² values as a measure of goodness of fit between estimated and actual tactile distances in each condition. We set an R² cutoff of 0.7 before beginning the experiment. One participant had a low goodness of fit in one condition (R² < 0.5) and was therefore excluded. The 18 remaining participants included in the analysis had high R² values in all conditions (global mean 0.97). Crucially, tool use did not affect the goodness of fit for judgments on the hand (p = 0.32) or cheek (p = 0.67). Furthermore, the exclusion of these two participants did not affect the counterbalancing of the experiment or the pattern of our results.
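A sketch of how these two checks could be computed is given below, assuming the judgments are stored in a long-format pandas DataFrame with hypothetical columns subject, body_part, time, distance_cm, and judged_cm. Only the R² cutoff of 0.7 is an explicit numeric criterion from the text; the slope check is shown here as a simple non-positive-slope flag for illustration, since no numeric slope cutoff is reported.

```python
# Goodness-of-fit screening per subject x body part x time condition (illustrative).
import pandas as pd
from scipy import stats

R2_CUTOFF = 0.7  # criterion fixed before the experiment


def fit_condition(cell: pd.DataFrame) -> pd.Series:
    """Regress judged distance on actual distance within one condition."""
    res = stats.linregress(cell["distance_cm"], cell["judged_cm"])
    return pd.Series({"slope": res.slope, "r2": res.rvalue ** 2})


def screen_participants(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row per condition with slope, R^2, and an exclusion flag."""
    fits = (df.groupby(["subject", "body_part", "time"])
              .apply(fit_condition)
              .reset_index())
    # Flag conditions whose judgments do not increase with distance or fit poorly.
    fits["flag"] = (fits["slope"] <= 0) | (fits["r2"] < R2_CUTOFF)
    return fits
```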

Quantifying embodiment

We quantified the magnitude of tool embodiment in two distinct ways. We first compared the actual verbal distance judgments for each body part before and after tool use. However, this method is potentially problematic because, assuming that the size of a body representation is scaled by a single value, the amount of modulation in verbal distance estimation will depend on stimulus distance. Therefore, to obtain a more meaningful measure of tool embodiment as it relates to the change in represented body part size, we extracted the percentage change in responses following tool use for each stimulus distance. This approach has two main advantages: first, it removes baseline individual differences in tactile distance perception between participants; second, it provides a meaningful quantification of embodiment, as it is more directly related to the magnitude of change in body representation size (Miller et al. 2014, 2017) and is independent of stimulus distance. Several previous studies have found that tool use compresses (i.e., decreases) tactile distance perception in the longitudinal orientation (Canzoneri et al. 2013; Miller et al. 2014, 2017). We therefore present percentage change in units of compression, where positive values indicate perceptual compression and negative values indicate perceptual expansion.
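The measure can be written out explicitly. A minimal sketch is shown below, assuming that compression is expressed relative to the pre-tool use judgment; positive values indicate compression, in line with the convention adopted here.

```python
# Percent compression of tactile distance judgments (illustrative formula).

def percent_compression(pre_cm: float, post_cm: float) -> float:
    """Percentage change from pre- to post-tool use, in units of compression.

    Assumes the pre-tool use judgment is the baseline (denominator); a positive
    value means the post-tool use judgment shrank (perceptual compression).
    """
    return 100.0 * (pre_cm - post_cm) / pre_cm


# Example: a mean judgment of 5.0 cm before tool use that drops to 4.4 cm
# afterwards corresponds to 12% compression.
assert abs(percent_compression(5.0, 4.4) - 12.0) < 1e-9
```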

Results

We first investigated the presence of tool-induced modulations in verbal distance estimates (Fig. 2) with a 3 (stimulus distance: 4, 5, 6 cm) × 2 (body part: hand, cheek) × 2 (tool use: pre, post) repeated measures ANOVA. We found a significant main effect of stimulus distance [F(1.35,22.92) = 241.43, p < 0.0001, \(\eta_{\text{p}}^{2}\) = 0.93], demonstrating that participants did indeed increase their judged distance as the actual stimulus distance increased. We also found significant main effects of tool use [F(1,17) = 8.00, p = 0.012, \(\eta_{\text{p}}^{2}\) = 0.32] and body part [F(1,17) = 9.36, p = 0.007, \(\eta_{\text{p}}^{2}\) = 0.36]. As the present study was concerned with whether tool embodiment is body-part specific, the interactions between factors provided the crucial tests of our hypotheses. Indeed, we found a significant interaction between body part and tool use [F(1,17) = 15.52, p = 0.001, \(\eta_{\text{p}}^{2}\) = 0.48], driven by a significant tool-induced compression in tactile distance perception on the hand [mean 0.57 cm, SEM 0.11; t(17) = 5.00, p < 0.001, \(d_{z}\) = 1.18] but not the cheek [mean −0.005 cm, SEM 0.13; t(17) = 0.04, p = 0.97, \(d_{z}\) < 0.01]. There were no other significant interactions (all p > 0.2).
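For readers who wish to reproduce this style of analysis, a sketch is given below using statsmodels and scipy; the authors’ analysis software is not stated, and the Greenhouse–Geisser correction implied by the reported degrees of freedom would need to be applied separately (e.g., with a package such as pingouin). The DataFrame layout (columns subject, distance, body_part, tool_use, judged_cm) is assumed for illustration.

```python
# 3 x 2 x 2 repeated measures ANOVA with follow-up paired t-tests (illustrative).
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM


def run_distance_anova(df: pd.DataFrame) -> None:
    """df: one mean judgment (judged_cm) per subject x distance x body_part x tool_use cell."""
    anova = AnovaRM(df, depvar="judged_cm", subject="subject",
                    within=["distance", "body_part", "tool_use"]).fit()
    print(anova)

    # Follow-up: pre vs. post tool use for each body part, collapsed across distance.
    cell_means = (df.groupby(["subject", "body_part", "tool_use"])["judged_cm"]
                    .mean().unstack("tool_use"))
    for part, means in cell_means.groupby(level="body_part"):
        t, p = stats.ttest_rel(means["pre"], means["post"])
        print(f"{part}: t = {t:.2f}, p = {p:.4f}")
```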

Fig. 2 Modulation of verbal distance estimates on hand and face. Using the hand-shaped tool recalibrated the perceived distance between two points of touch on the hand (left). The estimated distance post-tool use (green) significantly decreased by an average of 0.57 cm. In contrast, we did not find evidence for recalibration on the cheek (right). This is clearly demonstrated by the overlapping distance estimations pre- and post-tool use (dark blue) (color figure online)

We next investigated the percent recalibration in tactile distance perception (Fig. 3) with a 3 (stimulus distance: 4, 5, 6 cm) × 2 (body part: hand, cheek) repeated measures ANOVA. It should be noted that, because this analysis removed baseline differences in pre-tool use distance perception, a main effect of body part is analogous to the interaction between body part and tool use in the previous ANOVA. As expected, we indeed found a significant main effect of body part [F(1,17) = 14.74, p = 0.001, \(\eta_{\text{p}}^{2}\) = 0.46], replicating our previous analysis. However, we did not find a significant main effect of stimulus distance [F(1.98,33.58) = 0.23, p > 0.7, \(\eta_{\text{p}}^{2}\) = 0.01] or a significant interaction between body part and stimulus distance [F(1.59,27.04) = 0.93, p = 0.39, \(\eta_{\text{p}}^{2}\) = 0.05], demonstrating that the magnitude of perceptual recalibration did not depend upon the stimulus distance (Fig. 3a). This is expected if tool use led to a scalar recalibration in the represented hand size. We therefore collapsed across stimulus levels into a single measure of tool-induced recalibration for each body part (Fig. 3b). This revealed that tool use led to a significant compression in tactile distance perception on the hand (mean 13.25%, SEM 2.92) but not the cheek (mean −0.05%, SEM 3.54).
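A corresponding sketch for the percent-compression analysis is shown below, assuming a long-format DataFrame comp with columns subject, distance, body_part, and compression (one value per cell). pingouin is used purely for illustration because its repeated measures ANOVA reports sphericity-corrected statistics; it is not necessarily the package used by the authors.

```python
# Two-way (distance x body part) repeated measures ANOVA on percent compression.
import pandas as pd
import pingouin as pg


def analyze_compression(comp: pd.DataFrame) -> pd.DataFrame:
    """Test whether compression differs by body part and stimulus distance."""
    aov = pg.rm_anova(data=comp, dv="compression",
                      within=["distance", "body_part"], subject="subject",
                      detailed=True)
    print(aov)

    # Collapse across stimulus distance: one compression value per subject and body part.
    collapsed = comp.groupby(["subject", "body_part"])["compression"].mean().unstack()
    print(collapsed.agg(["mean", "sem"]))
    return collapsed
```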

Fig. 3 Percent compression of tactile distance perception. a Magnitude of compression at each stimulus level for touch on the hand (green) and face (dark blue). Compression did not vary systematically as a function of stimulus distance. Importantly, however, a significant difference between the perceptual recalibration on each body part is evident. This is also evident in b, where we have collapsed across stimulus distances into a single measure of compression. The absence of recalibration on the face provides evidence that tool embodiment is body-part specific. Error bars correspond to one SEM. ***p < 0.001 (color figure online)

The effect of perceptual compression (conceptualized as a percent compression above 0) on the hand was highly robust. When collapsed across stimulus levels—as we did in our analysis above—we found that compression was present to some extent in 17 out of 18 participants (Fig. 4a). This result was also found when looking at each stimulus level (4 cm: 16 out of 18 participants; 5 cm: 17 out of 18; and 6 cm: 15 out of 18). Similar proportions of participants exhibiting effects of tool embodiment have been reported previously (Cardinali et al. 2011). Conversely, positive levels of compression on the face were found in only 9 out of 18 participants. Similar results were found when looking at each stimulus level (4 cm: 10 out of 18 participants; 5 cm: 9 out of 18; and 6 cm: 9 out of 18). Furthermore, compression on the hand was larger than on the face for the majority of our participants (16 out of 18; Fig. 4b).
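The participant-level counts reported above can be reproduced with a few lines, again assuming the collapsed table from the previous sketch (rows are participants; hypothetical columns hand and cheek hold percent compression).

```python
# Counting participants who show any compression (> 0), per body part (illustrative).
import pandas as pd


def count_positive_compression(collapsed: pd.DataFrame) -> pd.Series:
    """Number of participants with compression above zero on each body part."""
    return (collapsed > 0).sum()


def count_hand_greater_than_cheek(collapsed: pd.DataFrame) -> int:
    """Number of participants with larger compression on the hand than the cheek."""
    return int((collapsed["hand"] > collapsed["cheek"]).sum())
```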

Fig. 4 Embodiment for each participant. a Rank-ordered magnitude of compression found on the hand (left) and face (right) for each participant. Only one participant (out of 18 total) did not show some degree of perceptual compression on the hand following tool use, attesting to the strength of the effect of tool embodiment. This pattern of results was not found on the face (right), where the mean compression was essentially zero. b Within-subject comparison of compression on the cheek (x-axis) and hand (y-axis). The majority of points (16 out of 18; green) were above the equality line (dashed gray line), meaning that these participants had larger compression on the hand than the cheek. Points below the equality line (i.e., greater compression on the cheek than the hand) are colored blue (color figure online)

Each analysis thus far has found no evidence that tool use recalibrated tactile distance perception on the cheek, although, as noted above, half of the participants did show some level of perceptual compression on the cheek. One possible explanation is that task-irrelevant individual differences (e.g., fatigue, mind-wandering, etc.) acted as sources of noise in our measurements of tactile distance perception between blocks. In that case, roughly half of the participants would be expected to appear to exhibit compression on the cheek even when none occurred. Alternatively, this pattern may instead reflect a real relationship between the magnitude of compression on the hand and the cheek. For example, if the relationship were positive, participants with greater compression on the hand would exhibit greater compression on the cheek. Such a finding would provide evidence that tool-induced modulations of cortical activity in the SI hand representation do—to some extent and in some conditions—spread to the neighboring cheek representation.

To adjudicate between these two possibilities, we performed two exploratory analyses on the relationship between compression on the hand and cheek. First, we investigated whether the magnitude of perceptual compression on each body part was correlated (Fig. 5a). This analysis revealed a weak relationship between the two body parts [r(16) = 0.438, p = 0.069] that was largely driven by a single participant who had high compression on both the hand and the cheek; the relationship became much weaker when this participant was removed [r(15) = 0.18, p = 0.49]. Second, we performed a median-split analysis, in which we divided participants into groups based on the magnitude of perceptual compression found on their hand and then statistically evaluated compression on their cheek. Participants were assigned to one of two groups: small compression on the hand (“small” group; mean 4.05%, SEM 3.02, range −18.8 to 10.9%) or large compression on the hand (“large” group; mean 22.45%, SEM 2.46, range 16.2–39.4%). The compression on the cheek observed in each group (Fig. 5b) was not significantly different from zero (both ps > 0.4). Furthermore, while there was a numerical difference between the small group (mean −3.31%, SEM 4.41) and the large group (mean 3.20%, SEM 5.70), this difference was not statistically significant [t(16) = 0.92, p = 0.37, d = 0.43]. These results provide further evidence that embodiment on the hand does not spread to the cheek. Task-irrelevant sources of noise are therefore the more likely explanation for the observed compression on the cheek in half of our participants.
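A sketch of both exploratory analyses is given below, using the same collapsed table as before. The rule used to single out the influential participant is illustrative (the paper identifies this participant from the scatter plot), and scipy’s standard correlation and t-test routines stand in for whatever software was actually used.

```python
# Exploratory analyses: hand-cheek correlation and median split (illustrative).
import pandas as pd
from scipy import stats


def correlation_with_and_without_outlier(collapsed: pd.DataFrame):
    """Pearson correlation between hand and cheek compression, then again after
    removing the participant with the largest combined compression (assumed rule)."""
    r_all, p_all = stats.pearsonr(collapsed["hand"], collapsed["cheek"])
    outlier = (collapsed["hand"] + collapsed["cheek"]).idxmax()
    reduced = collapsed.drop(index=outlier)
    r_red, p_red = stats.pearsonr(reduced["hand"], reduced["cheek"])
    return (r_all, p_all), (r_red, p_red)


def median_split_on_hand(collapsed: pd.DataFrame):
    """Split participants by hand compression and test cheek compression in each group."""
    large = collapsed["hand"] > collapsed["hand"].median()
    small_cheek = collapsed.loc[~large, "cheek"]
    large_cheek = collapsed.loc[large, "cheek"]
    one_sample = {"small": stats.ttest_1samp(small_cheek, 0.0),
                  "large": stats.ttest_1samp(large_cheek, 0.0)}
    between_groups = stats.ttest_ind(small_cheek, large_cheek)
    return one_sample, between_groups
```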

Fig. 5 Exploratory analyses searching for compression on the cheek. a Correlation between the perceptual compression on the cheek and the hand after tool use. We found a trend toward a significant relationship between the two variables (black line), largely driven by a single participant (red box) who had high compression on both body parts. When this participant was removed, the relationship became much weaker (gray line). b Median-split analysis: participants were divided into two groups, small (white) and large (gray), according to the magnitude of compression on the hand (see the main text for details). We statistically evaluated compression on the cheek for each group. Perceptual compression on the cheek was not significant for either group. Furthermore, the magnitude of compression did not differ significantly between the two groups. Error bars correspond to one SEM

Discussion

In the present study, we investigated whether perceptual recalibration following tool use remains isolated to the user’s hand or transfers to their cheek, body parts that are adjacently represented in SI. Using a well-established behavioral paradigm, we replicated our previous finding that a hand-shaped tool modulates tactile perception on the hand (Miller et al. 2014). The high proportion of participants exhibiting evidence of perceptual modulation (17 out of 18) further attests to embodiment being a robust sensorimotor phenomenon. In stark contrast, and despite our initial hypothesis, we found little to no evidence for a similar modulation on the user’s cheek. These findings provide further evidence that embodiment is body-part specific and give clues to the mechanisms of tool-induced perceptual recalibration in humans.

The body-part specificity of tool embodiment has received little attention in the literature. However, several recent studies have found that effects of tool use do not spread from the body part embodying the tool (e.g., the forearm) to an anatomically adjacent body part (e.g., the hand) (Cardinali et al. 2009; Miller et al. 2014; Cardinali et al. 2016a). This lack of spread is somewhat surprising, given that each body part tested in these studies played a role in the transport and manipulation of the tool. Therefore, factors other than anatomical proximity must be constraining which body parts are modulated by tool use.

The present study instead investigated the role of cortical proximity in the spread of embodiment effects, leveraging the small physical distance between the hand and face regions of SI. Previous psychophysical studies measuring tactile spatial acuity (Serino et al. 2009; Muret et al. 2014), perceptual changes following anesthesia (Gandevia and Phegan 1999; Paqueron et al. 2003) and reafferentation (Farnè et al. 2002), and phantom limbs (Ramachandran and Rogers-Ramachandran 1996)—all showing a close relationship between perception on the hand and cheek—provide precedent for the possibility that a modulation of the hand representation spreads to the neighboring cheek representation. Given the findings of these studies, we hypothesized that using a hand-shaped tool would recalibrate tactile distance perception on both the hand and the cheek. However, we found no evidence for this hypothesis; recalibration was specific to the user’s hand, whereas tactile perception on the cheek was essentially identical before and after tool use. In the context of our previous study (Miller et al. 2014), the results of the present study suggest that embodiment is body-part specific and independent of both the anatomical (e.g., hand vs. arm) and cortical (i.e., hand vs. cheek in SI) proximity of two body parts.

The present study found no evidence that tool use recalibrated tactile perception on the cheek. It is possible, however, that our failure to find an effect was due to our choice of task. The specific TDJ paradigm that we used only measures the perceived distance between two tactile points. It therefore provides a coarse characterization of tactile distance perception across the entire surface of the body part and will only be sensitive to rather uniform modulations in the underlying tactile body representation. Longo and Golubova (2017) recently developed a novel variant of the TDJ that can be used to map the tactile space of a body part on a fine spatial scale. In their paradigm, tactile distance perception is measured between all possible pairwise stimulations of an n by m grid (e.g., 4 × 4) drawn on the surface of a body part. Multidimensional scaling is then used to reconstruct the geometry of its tactile space. Similar tactile localization paradigms have also been used for fine-grained mapping of tactile space (Mancini et al. 2011; Ferre et al. 2013). Applying these methods to the cheek may provide a more sensitive test of whether tool use recalibrates a tactile representation of the cheek, an effect that is likely to be subtle if it exists.
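To make the cited approach concrete, the sketch below uses multidimensional scaling to recover a 2-D configuration of grid points from a matrix of judged pairwise distances, in the spirit of Longo and Golubova (2017). This is an illustration of the general technique, not a reimplementation of their analysis; the veridical toy distances are placeholders for real judgments.

```python
# Reconstructing the geometry of tactile space from pairwise distance judgments.
import numpy as np
from sklearn.manifold import MDS


def reconstruct_tactile_space(judged_distances: np.ndarray) -> np.ndarray:
    """Return 2-D coordinates of grid points from an n x n matrix of judged distances."""
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(judged_distances)


if __name__ == "__main__":
    # Toy example: a 2 x 2 grid with 1 cm spacing, judged veridically.
    pts = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    judged = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    print(np.round(reconstruct_tactile_space(judged), 2))
```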

If not anatomical or cortical proximity, what aspects of tool use constrain the extent of embodiment? It has recently been proposed that morpho-functional similarities between a body part and a tool may be one key component in this process (Miller et al. 2014; Cardinali et al. 2016a). For example, using a long mechanical grabber (>30 cm) modulates motor and tactile representations of the arm but not other body parts (Cardinali et al. 2009, 2011; Miller et al. 2014; Jovanov et al. 2015). Conversely, when there is a close morpho-functional correspondence between the control of a tool and the fingers—as was the case for the tool used in the present study (see “Materials and methods”)—embodiment targets the hand (Miller et al. 2014; Cardinali et al. 2016a). The results of the present study are in line with this hypothesized constraint; there was high morpho-functional similarity between the tool and the hand and high dissimilarity between the tool and the cheek. It should be noted that in all published studies to date, including the present study, this link is correlational. Further research should actively manipulate morpho-functional correspondences between the tool and body parts to test this constraint's importance in embodiment more rigorously.

It is a well-known feature of SI that neurons coding for touch on the hand closely neighbor those coding for touch on the face (Penfield and Boldrey 1937; Yang et al. 1993). Interaction between these neurons is one hypothesized reason that perceptual effects on the hand transfer to the face (Farnè et al. 2002; Serino et al. 2009; Muret et al. 2014). Despite some preliminary evidence that tool use modulates the hand representation in SI (Schaefer et al. 2004), we did not find a perceptual modulation on the tool user’s cheek. However, Schaefer and colleagues found a modulation of the hand representation during tool use but not at rest, whereas we measured tactile perception at rest after participants had finished the tool use task. This suggests, albeit indirectly, that the lasting perceptual recalibration that we observe is—at least partly—independent of changes in SI. Indeed, body representations in posterior parietal regions likely underlie tactile distance perception (Spitoni et al. 2010; Longo and Haggard 2011; Spitoni et al. 2013; Miller et al. 2014). This is consistent with the known dissociation between tactile distance perception and tactile spatial acuity (Taylor-Clarke et al. 2004; Miller et al. 2016). Furthermore, the results of Schaefer et al. (2004) and the present study suggest that the specific mechanisms of embodiment may change over different timescales, a hypothesis that has some recent empirical support (Ganesh et al. 2014).

The computational and neural mechanisms underlying the process of tool embodiment are not well characterized. There is strong multimodal evidence in macaques for the involvement of posterior parietal regions coding for multisensory body representations (Iriki et al. 1996; Hihara et al. 2006; Quallo et al. 2009); a recent study using repetitive transcranial magnetic stimulation has extended this finding to humans (Giglia et al. 2015). Given the hypothesized involvement of these multisensory body representations in the perception of tactile distance (Taylor-Clarke et al. 2004; de Vignemont et al. 2005; Tajadura-Jiménez et al. 2012; Longo and Sadibolova 2013; Tajadura-Jiménez et al. 2015; Miller et al. 2016, 2017), it is possible that the modulation of activity in posterior parietal regions (Spitoni et al. 2013) underlies the observed perceptual recalibration on the hand. Future neuroimaging work with humans is needed to determine the neural correlates of tool embodiment. However, the present findings suggest that the observed long-lasting perceptual recalibration of tactile distance perception occurs at a level of somatosensory processing outside of SI.

Conclusion

In conclusion, the present study provides further support that embodiment of a tool is body-part specific. We found that using a hand-shaped tool recalibrated tactile perception on the user’s hand, but not their cheek. Given the close proximity between hand and cheek neurons in SI, our results suggest that embodiment occurs at a level of somatosensory processing where the two body parts have become functionally disentangled. Future work is needed to understand the neural and computational mechanisms underlying tool embodiment and its body-part specificity.