Introduction

Social interaction is grounded in the human capability to recognize others' emotions: facial expressions are a primary means [1], and their discrimination is an automatic, unintentional, and unconscious process [2]. This capability promotes communication, empathy, and social cognition [1, 2]. Facial emotion recognition (FER) capability has been extensively investigated not only in clinical conditions such as autism spectrum disorders [3] and schizophrenia [4, 5], but also in the presence of clinical symptoms of anxiety [6, 7] and depression [7, 8]. However, only a few studies have investigated the FER capability of adults suffering from obesity, and they reported heterogeneous results. Bergmann et al. [9] observed no difference between women affected by obesity and healthy-weight women. In contrast, Cserjesi et al. [10] reported a reduced capacity to recognize negative facial expressions, and Giel et al. [11] suggested that individuals affected by obesity perceive negative facial emotions as less intense than healthy-weight individuals do. Individuals affected by obesity seem to report poor emotional awareness, difficulties in adopting emotion regulation strategies, and suppression of emotional expressions [12], with effects not only on social interactions and quality of life, but also on body weight and the long-term outcomes of weight-loss rehabilitation [13, 14]. Interestingly, the basic emotions of fear and anger seem to play a crucial role in dysfunctional eating behaviors [7]. However, these emotions are very different: the recognition of fear has been described as crucial for survival, since it serves to keep individuals safe by mobilizing them to cope with potential external danger [15,16,17]. On the other hand, the emotion of anger carries a critical social meaning when recognized on others' faces, since it signals a confrontation with someone who is expressing negative and potentially threatening feelings towards the observer [18].

FER capability is traditionally investigated by asking individuals to explicitly report the emotion expressed by another's face. However, with this experimental procedure, participants might become aware of the experimental question, increasing not only the risk of controlled responses but also the role of personal motivation and attitudes. To partly address this issue, implicit tasks, in which performance is rated from an automatic and unaware behavior of the participant, might be preferable. For example, Cserjesi et al. [10] used a task based on the implicit priming effect, which relies on the fact that the processing of an emotional target is generally influenced by an emotional stimulus presented before it (i.e., the prime). However, it should be noted that in Cserjesi and colleagues' work, schematic (and not real human) facial expressions were used as primes, while emotional words were used as targets; thus, it could be argued that this study did not investigate FER capability directly.

Recently, we presented a FER task grounded in the implicit “redundant target effect” [19]: individuals generally respond faster when two identical targets are presented simultaneously rather than when a single target is presented alone; moreover, the competing presence of a distractor (i.e., another emotion or a neutral expression) affects the correct recognition of the target. Here, we took advantage of this attentional mechanism to investigate the capability of individuals affected by obesity to recognize the facial emotions of fear and anger, in comparison with a sample of healthy-weight individuals. If the capability to recognize fear and anger were preserved in individuals affected by obesity, they should show the redundant target effect, as healthy-weight participants do.

Methods

Participants

This study was approved by the Ethics Committee of the I.R.C.C.S Istituto Auxologico Italiano (Italy). Subjects participated voluntarily; they gave written informed consent, were free to withdraw at will, and were naïve to the rationale of the experiment. Twenty female individuals affected by obesity (Ob) took part in this experiment. They were inpatients consecutively recruited at admission to the involved institution. All participants were right-handed. All individuals were nonsmokers and free from gastrointestinal, cardiovascular, psychiatric, or metabolic disorders or any concurrent medical condition not related to obesity. Weight and height were measured to the nearest 0.1 kg and 0.1 cm, respectively, using standard methods. BMI was computed as body mass (kg)/height (m)². Obesity was defined as a BMI over 30 kg/m².
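As a minimal illustration of the inclusion computation described here (function and variable names are hypothetical, and the healthy-weight range anticipates the control criterion reported in the next paragraph):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: body mass (kg) divided by squared height (m)."""
    return weight_kg / height_m ** 2

def group_label(b: float) -> str:
    """Map a BMI value onto the study's inclusion criteria."""
    if b > 30.0:
        return "Ob"        # obesity group: BMI > 30 kg/m^2
    if 18.5 <= b <= 24.9:
        return "He"        # healthy-weight controls: 18.5-24.9 kg/m^2
    return "excluded"      # outside both inclusion ranges

print(group_label(bmi(95.0, 1.65)))  # BMI ~ 34.9 -> "Ob"
```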

Twenty female healthy-weight individuals (He) were recruited as a control group. The same inclusion/exclusion criteria were used, except that BMI had to be in the range of 18.5–24.9 kg/m². Demographic and clinical data are reported in Table 1.

Table 1 Mean (M) and standard deviation (SD; in parentheses) of the demographic characteristics of age and education, the clinical measure of BMI, and the scores on the psychological questionnaires are reported for the two groups

Psychological questionnaires

In this study, self-report questionnaires were used to describe the psychological components of depression, anxiety, and alexithymia in our sample. Specifically, the level of depressive symptoms was measured through the Beck Depression Inventory [20; Italian version: 21]. It consists of 21 items; each item has a list of four or more statements arranged in increasing severity regarding a particular symptom of depression. The seminal article reported acceptable internal consistency (α = 0.86) as well as acceptable test–retest reliability (r = 0.93) [20]. Moreover, participants were asked to fill out the State–Trait Anxiety Inventory (STAI) [22; Italian version: 23]. This questionnaire assesses the current state of anxiety, asking how respondents feel “right now” with items measuring subjective feelings of apprehension, tension, nervousness, worry, and activation/arousal of the autonomic nervous system (i.e., the state scale). Moreover, it evaluates relatively stable aspects of “anxiety proneness,” including general states of calmness, confidence, and security (i.e., the trait scale). Overall, the questionnaire consists of 40 items, 20 for each scale. In terms of reliability, α = 0.90 was reported for the trait scale and α = 0.93 for the state scale; moreover, test–retest reliability ranged from 0.73 to 0.86 for the trait scale and from 0.16 to 0.62 for the state scale [22]. Finally, the level of alexithymia was measured through the Toronto Alexithymia Scale-20 (TAS-20) [24; Italian version: 25], which consists of 20 items. Respondents indicate the extent to which they agree or disagree with each statement on a five-point Likert scale. It provides a total score and three subscale scores: difficulty identifying feelings, difficulty describing feelings, and externally oriented thinking. The TAS-20 demonstrated acceptable internal consistency (α = 0.81); test–retest reliability was 0.77 [24].

Implicit facial emotion recognition task

The emotions of fear and anger were studied independently. We adopted the same task described in Scarpina et al. [18]. It was a go–no go recognition task of emotional visual stimuli grounded in the redundant target effect [26]. Stimuli consisted of photographs of the faces of one male and one female [27] with either an angry, a fearful, or a neutral expression, shown in four different conditions: (1) single: the target (i.e., the face expressing the target emotion) was presented on the right OR left of a fixation cross; (2) congruent: the target was presented simultaneously on the right AND left of the fixation cross; (3) emotional incongruent: the target was presented on the right OR left of the fixation cross along with a different emotion; (4) neutral incongruent: the target was presented on the right OR left of the fixation cross along with a neutral expression (Fig. 1).

Fig. 1
figure 1

Experimental task: examples of visual stimuli are shown for each of the four conditions, when fear is the target
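To make the four display types concrete, here is a minimal sketch of how each condition could map onto a left/right stimulus pair; the names and the side randomization are illustrative, not taken from the original task code, and "blank" stands for the empty side in the single condition:

```python
import random

def layout(condition: str, target: str, distractor: str) -> tuple[str, str]:
    """Return the (left, right) stimuli for one trial of the four
    conditions described above; 'neutral' is the neutral expression."""
    side = random.choice(["left", "right"])   # target side for unilateral displays
    if condition == "single":
        pair = (target, "blank")
    elif condition == "congruent":
        return (target, target)               # identical targets on both sides
    elif condition == "emotional_incongruent":
        pair = (target, distractor)           # distractor = the other emotion
    elif condition == "neutral_incongruent":
        pair = (target, "neutral")
    else:
        raise ValueError(condition)
    return pair if side == "left" else pair[::-1]

print(layout("neutral_incongruent", target="fear", distractor="anger"))
```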

Moreover, catch trials were randomly presented. In these trials, a distractor (a neutral expression in half of the trials and the contrasting emotion in the other half) was presented unilaterally, bilaterally, or paired with a neutral or another emotional stimulus; thus, in these trials, the target was never shown. During the task, participants were seated at a distance of ~ 60 cm from a computer screen whose vertical midline lay on the sagittal midplane of their trunk and head. Participants were required to respond as soon as they noticed the target. The stimuli remained on screen until the participant responded or for a maximum duration of 1500 ms. The inter-stimulus interval varied randomly between 650 and 950 ms. The emotions of fear and anger were studied independently in different blocks. For each experimental condition (single, congruent, neutral incongruent, emotional incongruent), 32 valid trials and 16 catch trials were presented in each of 4 blocks (ABBA: anger, fear, fear, anger). Overall, 768 trials were administered. There was a 2- to 3-min break between blocks. We computed the level of accuracy (number of hits − number of false alarms), expressed as a percentage, and the reaction time (RTs), in milliseconds (ms) from stimulus onset, for the valid trials (i.e., when the target emotion was correctly detected). Valid trials in which the response was given in under 50 ms were excluded from the statistical analyses.
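A minimal scoring sketch consistent with this description is given below. It assumes a hypothetical per-trial table with columns catch, responded, and rt_ms; splitting the accuracy index into hit and false-alarm percentages is our reading of the hits-minus-false-alarms formula.

```python
import pandas as pd

def score_block(df: pd.DataFrame) -> pd.Series:
    """Score one participant x emotion x condition x face-gender cell.

    Assumes hypothetical columns: 'catch' (bool), 'responded' (bool) and
    'rt_ms' (float, NaN when no response was given). Accuracy is the hit
    rate minus the false-alarm rate, both in percent; mean RT is computed
    over valid trials with a correct detection, discarding anticipatory
    responses (< 50 ms) as in the procedure above.
    """
    valid = df[~df["catch"]]
    catch = df[df["catch"]]
    hit_pct = 100 * valid["responded"].mean()   # go responses to the target
    fa_pct = 100 * catch["responded"].mean()    # go responses on catch trials
    hits = valid[valid["responded"] & (valid["rt_ms"] >= 50)]
    return pd.Series({"accuracy": hit_pct - fa_pct,
                      "mean_rt": hits["rt_ms"].mean()})

# Usage: one row of scores per analysis cell, e.g.
# scores = trials_df.groupby(["subject", "group", "emotion",
#                             "condition", "face_gender"]).apply(score_block)
```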

Analyses

Independent-sample t tests were run to verify any difference between groups in demographic characteristics as well as in the psychological questionnaires' scores. Regarding the FER task, we analyzed the data for the emotion of fear and the emotion of anger independently. For both RTs and accuracy, we ran a mixed ANOVA with the within-subject factors of condition (single, congruent, emotional incongruent, neutral incongruent) and gender of the face stimulus (male vs female) and the between-subject factor of group (obesity vs control). Estimated-marginal-mean Bonferroni-corrected comparisons were used as post hoc analyses in the case of a significant interaction.
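The mixed-design analysis might be sketched as follows with the pingouin library, continuing from the hypothetical scores table above. Note that pingouin's mixed_anova accepts a single within factor, so the sketch collapses over face gender; the full Condition × Gender × Group design reported here would need a tool supporting two within factors (e.g., R's afex) or a linear mixed model.

```python
import pingouin as pg

# 'scores' is the hypothetical per-cell table sketched above. Each emotion
# is analyzed independently; this sketch takes fear and averages over the
# face-gender factor before fitting the one-within-factor mixed ANOVA.
df = scores.reset_index()
fear = (df[df["emotion"] == "fear"]
        .groupby(["subject", "group", "condition"], as_index=False)
        .mean(numeric_only=True))

aov = pg.mixed_anova(data=fear, dv="mean_rt", within="condition",
                     subject="subject", between="group")
print(aov[["Source", "F", "p-unc", "np2"]])   # np2 = partial eta squared

# Bonferroni-corrected comparisons, as post hocs for a significant
# Condition x Group interaction
posthoc = pg.pairwise_tests(data=fear, dv="mean_rt", within="condition",
                            between="group", subject="subject",
                            padjust="bonf")
```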

Results

Demographical aspects

Participants affected by obesity were significantly older and less educated than controls. Moreover, as expected, they had a higher BMI than controls (Table 1).

Psychological questionnaires

The two groups reported similar scores on the psychological questionnaires, except for the “Externally oriented thinking” subscale of the TAS-20 [24], on which affected individuals scored higher than controls (Table 1); this result suggests greater difficulty in this specific component.

Implicit facial emotion recognition task

All participants completed the experimental test without reporting any difficulty in viewing the screen. In Table 2, the mean and standard deviation of the RTs and the level of accuracy for all tested experimental conditions are reported for both groups.

Table 2 Mean (M) and standard deviation (SD) of the reaction time (RT) in milliseconds (ms) and the level of accuracy expressed in percentage (%), for each experimental condition and face gender (female vs male), are reported for the two groups

Reaction time

Regarding the emotion of fear, no main effect of Group emerged (Ob M = 391; SD = 110; He M = 389; SD = 107) [F(1,38) = 0.005; p = 0.94; ηp² < 0.001]. Instead, a significant main effect of Condition was found [F(3,114) = 3.88; p = 0.011; ηp² = 0.093], as well as a significant Condition * Group interaction [F(3,114) = 7.41; p < 0.001; ηp² = 0.163], as shown in Fig. 2 (left side).

Fig. 2
figure 2

Mean and standard deviation (vertical lines) of the RTs expressed in ms (left part) and the level of accuracy expressed in percentage (right part) for each experimental condition, split by group (healthy-weight individuals vs individuals with obesity), are shown for the emotion of fear. *p < 0.05; **p < 0.001

However, no main effect of Gender emerged [F(1,38) = 3.56; p = 0.06; ηp² = 0.08], nor any significant interaction of Gender with the within factor of Condition [F(3,114) = 1.26; p = 0.28; ηp² = 0.03] or the between factor of Group [F(1,38) = 0.1; p = 0.7; ηp² = 0.003]. Finally, the second-level interaction Group * Condition * Gender [F(3,114) = 1.58; p = 0.19; ηp² = 0.04] was not significant. Thus, when fear was the target of the experimental task, the expected redundant target effect emerged for the control group, but not for the group of affected participants.

Regarding anger, no main effect of Group (Ob M = 382; SD = 99; He M = 401; SD = 114) [F(1,38) = 0.33; p = 0.56; ηp² = 0.009] emerged. A significant main effect of Condition (congruent M = 381; SD = 105; emotional incongruent M = 400; SD = 109; neutral incongruent M = 401; SD = 104; single M = 383; SD = 111) was found [F(3,114) = 3.27; p = 0.024; ηp² = 0.07]; however, no significant difference between the experimental conditions emerged at the post hoc test. The Group * Condition interaction was not significant [F(3,114) = 2.27; p = 0.08; ηp² = 0.05]. No main effect of Gender emerged [F(1,38) = 0.77; p = 0.38; ηp² = 0.02], nor any significant interaction of Gender with the within factor of Condition [F(3,114) = 0.52; p = 0.66; ηp² = 0.01] or the between factor of Group [F(1,38) = 0.52; p = 0.66; ηp² = 0.01]. Finally, the second-level interaction Group * Condition * Gender [F(3,114) = 0.1; p = 0.95; ηp² = 0.003] was not significant. Overall, these results suggest that no redundant target effect emerged for either group.

Accuracy

Regarding fear, a significant main effect of Group emerged [F(1,38) = 14.43; p = 0.001; ηp² = 0.28], since the affected group (M = 41.57; SD = 3.5) was less accurate than the control group (M = 60.39; SD = 3.5). A significant main effect of Condition also emerged [F(3,114) = 7.86; p < 0.001; ηp² = 0.17]; as shown in Fig. 3 (left side), a significantly lower level of accuracy emerged in the congruent condition in comparison with the emotional incongruent and single conditions, without any other significant differences.

Fig. 3
figure 3

Mean and standard deviation (vertical lines) of the level of accuracy expressed in percentage (%) are shown for each experimental condition, for the emotions of fear (left part) and anger (right part). *p < 0.05; **p < 0.001

Moreover, a significant Condition * Group interaction [F(3,114) = 28.21; p < 0.001; ηp² = 0.42] emerged: the control group showed the expected redundant target effect; instead, the affected group was less accurate in the congruent condition than in the other experimental conditions (Fig. 2, right side). A significant main effect of Gender (male M = 60.83; SD = 20.02; female M = 41.14; SD = 29.4) [F(1,38) = 88.39; p < 0.001; ηp² = 0.69] as well as a significant interaction with Group [F(1,38) = 29.12; p < 0.001; ηp² = 0.43] were found: while the control group reported similar levels of accuracy for female and male faces [p = 0.1], the affected group was less accurate for female faces than for male faces (Table 2) [p < 0.001]. No significant Gender * Condition interaction emerged [F(3,114) = 1.23; p = 0.3; ηp² = 0.03], nor a significant second-level Group * Gender * Condition interaction [F(3,114) = 1.18; p = 0.31; ηp² = 0.03].

Regarding anger, a significant main effect of Group emerged [F(1,38) = 9.55; p = 0.004; ηp² = 0.2]: the affected group (M = 31.55; SD = 4.42) was less accurate than the control group (M = 50.87; SD = 4.42). A significant main effect of Gender [F(1,38) = 19.78; p < 0.001; ηp² = 0.34] emerged, with a lower level of accuracy for male (M = 35.95; SD = 3.82) than for female (M = 46.47; SD = 2.77) faces. A main effect of Condition also emerged [F(3,114) = 33.03; p < 0.001; ηp² = 0.46]: as shown in Fig. 3 (right side), the expected redundant target effect was present. Indeed, in both incongruent (emotional and neutral) conditions, individuals were less accurate than in the single and congruent conditions. Moreover, the Gender * Condition interaction (Fig. 4) [F(3,114) = 5.71; p = 0.001; ηp² = 0.13] was significant: the redundant target effect emerged clearly for female faces, while for male faces we observed a lower level of accuracy in the neutral incongruent condition with respect to the others, without any other significant difference [p ≥ 0.07]. Moreover, in the single [p < 0.001] and congruent [p = 0.005] conditions, a higher level of accuracy emerged for female than for male faces, without any other significant difference [p ≥ 0.08]. In short, the expected redundant target effect emerged, but overall the level of accuracy was higher for female than for male faces. Finally, no significant Condition * Group [F(3,114) = 1.008; p = 0.39; ηp² = 0.026], Gender * Group [F(1,38) = 0.68; p = 0.41; ηp² = 0.01], or Condition * Gender * Group [F(3,114) = 0.72; p = 0.5; ηp² = 0.01] interaction emerged.

Fig. 4
figure 4

Mean and standard deviation (vertical lines) of the level of accuracy for the emotion of anger are shown for the four experimental conditions, split by the gender (female vs male) of the shown face. *p < 0.05; **p < 0.001

The effect of age and education on the main results

As reported in Table 1, the two groups differed significantly in demographic characteristics. Thus, the main analyses of the experimental task were run again with Age and Education introduced as covariates.
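The covariate-adjusted between-group comparison might be sketched as follows with pingouin's ancova on per-subject means, continuing from the hypothetical tables above (the demographics table and its column names are assumptions). Note that this covers only the between-group effect; the repeated-measures part of the reported analyses, with covariates interacting with Condition, would require a mixed-model ANCOVA.

```python
import pingouin as pg

# Per-subject mean accuracy for one emotion, from the hypothetical per-cell
# table 'fear' built above; 'demographics' is assumed to hold one row per
# subject with 'age' and 'education' columns.
subj = fear.groupby(["subject", "group"], as_index=False).mean(numeric_only=True)
subj = subj.merge(demographics, on="subject")

# Group effect on accuracy adjusted for Age and Education
anc = pg.ancova(data=subj, dv="accuracy", between="group",
                covar=["age", "education"])
print(anc)  # F, p-unc and np2 for group, age and education
```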

Reaction time

For the analyses of the emotion of fear, we confirmed the absence of a significant effect of Group [F(1,36) = 1.01; p = 0.32; ηp² = 0.02]: the corrected mean was 408 (SD = 23.91) for the Ob group and 372 (SD = 23.91) for the He group. Neither Age [F(1,36) = 3.78; p = 0.059; ηp² = 0.09] nor Education [F(1,36) = 0.38; p = 0.57; ηp² = 0.01] was significant. The analyses confirmed the significant main effect of Condition [F(3,108) = 2.87; p = 0.039; ηp² = 0.07] and the significant Group * Condition interaction [F(3,108) = 6.14; p = 0.001; ηp² = 0.14]. Neither Age [F(3,108) = 1.61; p = 0.19; ηp² = 0.04] nor Education [F(3,108) = 1.96; p = 0.12; ηp² = 0.05] interacted significantly with Condition.

For the emotion of anger, we confirmed that there was no significant main effect of Group [F(1,36) = 0.05; p = 0.82; ηp² = 0.001]: the corrected mean was 395 (SD = 23.46) for the Ob group and 387 (SD = 23.46) for the He group. Age had a significant main effect [F(1,36) = 5.97; p = 0.02; ηp² = 0.14], whereas Education [F(1,36) = 0.11; p = 0.7; ηp² = 0.003] did not. With the covariates included, no significant main effect of Condition [F(3,108) = 0.18; p = 0.9; ηp² = 0.005] or Group * Condition interaction [F(3,108) = 1.16; p = 0.32; ηp² = 0.031] emerged. Neither Age [F(3,108) = 0.28; p = 0.83; ηp² = 0.008] nor Education [F(3,108) = 0.72; p = 0.54; ηp² = 0.02] interacted significantly with the factor Condition.

Accuracy

When the results for the emotion of fear were analyzed, we confirmed the significant main effect of Group [F(1,36) = 7.47; p = 0.01; ηp² = 0.17]: the corrected mean was 43.4 (SD = 3.71) for the Ob group and 58.56 (SD = 3.71) for the He group. Neither Age [F(1,36) = 0.6; p = 0.4; ηp² = 0.01] nor Education [F(1,36) = 1.043; p = 0.31; ηp² = 0.02] was significant. With the covariates included, the main effect of Condition was no longer significant [F(3,108) = 0.36; p = 0.77; ηp² = 0.01]; interestingly, the significant Group * Condition interaction was confirmed [F(3,108) = 20.87; p < 0.001; ηp² = 0.36]. Neither Age [F(3,108) = 0.56; p = 0.63; ηp² = 0.01] nor Education [F(3,108) = 1.12; p = 0.34; ηp² = 0.03] interacted significantly with Condition. We did not replicate the significant main effect of Gender [F(1,36) = 2.43; p = 0.12; ηp² = 0.06], but we confirmed its significant interaction with Group [F(1,36) = 17.61; p < 0.001; ηp² = 0.32]. Moreover, no significant Gender * Age [F(1,36) = 0.18; p = 0.67; ηp² = 0.005] or Gender * Education [F(1,36) = 1.54; p = 0.22; ηp² = 0.04] interaction emerged. No other significant interaction was found [p ≥ 0.08].

For the emotion of anger, we confirmed the significant main effect of Group [F(1,36) = 4.28; p = 0.04; ηp² = 0.1]: the corrected mean was 34.04 (SD = 4.64) for the Ob group and 48.38 (SD = 4.64) for the He group. Neither Age [F(1,36) = 2.33; p = 0.13; ηp² = 0.06] nor Education [F(1,36) = 0.19; p = 0.66; ηp² = 0.005] was significant. We did not replicate the significant main effect of Gender [F(1,36) = 0.45; p = 0.5; ηp² = 0.01] or its significant interaction with Group [F(1,36) < 0.0005; p = 0.98; ηp² < 0.001]. However, no significant Gender * Age [F(1,36) = 0.6; p = 0.44; ηp² = 0.01] or Gender * Education [F(1,36) = 1.67; p = 0.2; ηp² = 0.04] interaction emerged: thus, this different pattern of results was not related to the demographic factors. Moreover, we did not confirm the main effect of Condition [F(3,108) = 2.12; p = 0.1; ηp² = 0.05] or the Gender * Condition interaction [F(3,108) = 0.16; p = 0.9; ηp² = 0.004]. However, neither the Condition * Age [F(3,108) = 0.97; p = 0.4; ηp² = 0.02] nor the Condition * Education [F(3,108) = 0.61; p = 0.6; ηp² = 0.01] interaction was significant. Thus, again, the different pattern of results was not related to the demographic factors. No other significant interaction was found [p ≥ 0.5].

Discussion

In this study, we compared individuals with obesity and healthy-weight individuals in their capability to recognize facial expressions of fear and anger. We adopted an implicit task grounded in the cognitive phenomenon of the redundant target effect [18]. Regarding fear, controls showed the expected effect in terms of both speed and accuracy [18]. Interestingly, this effect was not observed in the affected individuals' performance: this result might indicate difficulties in recognizing fearful expressions in obesity. It agrees with the hypothesis that affected individuals have difficulties in attending to negative facial emotions [10, 11]. However, we might hypothesize that this effect is emotion-dependent: the difficulty might pertain specifically to fearful expressions. Indeed, when we considered the emotion of anger, affected individuals reported a lower level of accuracy, in line with some previous studies [10, 11]; however, overall, they seemed to be sensitive to the implicit effect, in line with the controls' performance. The different behavior observed when fear or anger was tested might support methodological approaches that investigate basic emotions [1] independently.

We observed an effect of face gender on the level of accuracy for both the emotions of fear and anger, suggesting that, at least for female observers, the detection of facial emotional expressions (as indicated by the level of accuracy) might be influenced by the gender of the observed face. Such an effect was not specifically reported in the seminal article on the implicit facial emotion recognition task [18], although when the redundant target effect was used in the context of facial emotion recognition [e.g., 28, 29], performance was not assessed in terms of accuracy, limiting the interpretability of our results. Indeed, behavioral performance is traditionally scored in terms of reaction time, which refers to the ability to detect the target correctly. Our results (i.e., no effect of gender on reaction time) seem to agree with previous evidence that stimulus detection is enhanced only by the emotional congruency between targets, in the absence of any effect related to the face's gender [28, 29]. Nevertheless, females and males differ in emotional experience and expression [30]; moreover, some studies seem to suggest an advantage for women in decoding emotions compared with men [31]. Thus, the role of gender should be carefully considered when investigating FER capability and, more generally, emotional processing. Here, only female participants were assessed; thus, it would be interesting to replicate this experiment focusing on males' behavior.

Finally, we note that the behavioral results were not in agreement with the subjective evaluation of emotional capability [24]: indeed, the two groups reported similar scores in terms of the global level of alexithymia. Thus, a phenomenological approach in which emotional processing is investigated through implicit cognitive mechanisms (as done in this paper as well as in Cserjesi et al. [10]), together with the traditional self-report assessment, might disentangle subjective judgments from objective measurements of emotional processing.

Efficient and satisfying social interactions are mediated by an appropriate capability to recognize others' emotional states. In light of the evidence about interpersonal difficulties, and the emergence of complex social phenomena such as bullying and social stigma, we encourage future research on FER capability in obesity.

What is already known on this subject?

Previous research on FER capability in individuals affected by obesity is scarce. We investigated it in a sample of affected individuals through an implicit cognitive task.

What does this study add?

Affected individuals had difficulties in processing fear when expressed by others' faces. This impairment might affect social relationships. Thus, this emotion should be carefully taken into account in psychological interventions.