Abstract
Background
Previous evidence about facial emotion recognition capability in obesity is scarce and inconclusive.
Objective
We investigated the capability of women affected by obesity to recognize the emotions of fear and anger through a facial emotion recognition task based on the implicit redundant target effect.
Methods
Twenty women affected by obesity and 20 healthy-weight women were enrolled. We administered an implicit facial emotion recognition task, computing both reaction time and level of accuracy. Moreover, the level of alexithymia was measured through a standard questionnaire.
Results
Selective difficulties in recognizing the emotion of fear were observed in participants with obesity when their performance was contrasted with that of healthy-weight controls. Instead, they showed the implicit redundant target effect when anger was the target. However, the two groups reported globally similar scores on the standard questionnaire measuring the level of alexithymia.
Conclusions
Our result might agree with the hypothesis that affected individuals have difficulties in attending to negative facial emotions, specifically fearful expressions. This study might encourage future research in which emotional processing is investigated through both subjective judgments and implicit/objective measurements.
Level of evidence
Level I, experimental study.
Introduction
Social interaction is grounded in the human capability to recognize others’ emotions: facial expressions are a primary means [1], and their discrimination is an automatic, unintentional and unconscious process [2]. It promotes communication, empathy and social cognition [1, 2]. Facial emotion recognition (FER) capability has been extensively investigated not only in clinical conditions, such as autism spectrum disorders [3] and schizophrenia [4, 5], but also in the case of clinical symptoms of anxiety [6, 7] and depression [7, 8]. However, only a few previous studies have investigated the FER capability of adults suffering from obesity, reporting heterogeneous results. Bergmann et al. [9] observed no difference between women affected by obesity and healthy-weight women. Instead, Cserjesi et al. [10] reported a reduced capacity in recognizing negative facial expressions, and Giel et al. [11] suggested that individuals affected by obesity perceived negative facial emotions as less intense than healthy-weight individuals did. Individuals affected by obesity seem to report scarce emotional awareness, difficulties in adopting emotion regulation strategies, and suppression of emotional expressions [12], with effects not only on social interactions and quality of life, but also on body weight and the long-term outcomes of weight-loss rehabilitation [13, 14]. Interestingly, the basic emotions of fear and anger seem to play a crucial role in dysfunctional eating behaviors [7]. However, these emotions are very different: the recognition of fear has been described as crucial for survival, since it mobilizes individuals to cope with potential external danger [15,16,17].
On the other hand, the emotion of anger has a critical social meaning when it is recognized on another’s face, signaling that the observer is confronting someone who is expressing negative and potentially threatening feelings toward them [18].
FER capability is traditionally investigated by requiring individuals to explicitly report which emotion is expressed by another’s face. However, in this experimental procedure, participants might be aware of the experimental question, increasing not only the risk of controlled responses but also the role of personal motivation and attitudes. To partly solve this issue, implicit tasks, in which performance is rated from an automatic and unaware behavior of the participant, might be preferable. For example, Cserjesi et al. [10] used a task based on the implicit priming effect, which rests on the fact that the processing of an emotional target is generally influenced by an emotional stimulus presented beforehand (i.e., the prime). However, it should be noticed that in Cserjesi and colleagues’ work, schematic (and not real human) facial expressions were used as primes, while emotional words were used as targets; thus, it could be argued that in that study, FER capability was not directly investigated.
Recently, we presented a FER task based on the implicit “redundant target effect” [19]: individuals generally respond faster when two identical targets are presented simultaneously than when one is presented alone; moreover, the competitive presence of a distractor (another emotion or a neutral expression) affects the correct recognition of the target. Here, we took advantage of this attentional mechanism to investigate the capability of individuals affected by obesity to recognize the facial emotions of fear and anger in comparison with a sample of healthy-weight individuals. If the capability to recognize fear and anger were preserved in individuals affected by obesity, they would show the redundant target effect, as healthy-weight participants do.
Methods
Participants
This study was approved by the Ethics Committee of the I.R.C.C.S. Istituto Auxologico Italiano (Italy). Subjects participated voluntarily; they gave written informed consent, were free to withdraw at will and were naïve to the rationale of the experiment. Twenty female individuals affected by obesity (Ob) took part in this experiment. They were inpatients consecutively recruited at admission to the involved institution. All participants were right-handed. All individuals were nonsmokers and free from gastrointestinal, cardiovascular, psychiatric, or metabolic disorders or any concurrent medical condition not related to obesity. Weight and height were measured to the nearest 0.1 kg and 0.1 cm, respectively, using standard methods. BMI was expressed as body mass (kg)/height (m)². Obesity was defined as any BMI over 30 kg/m².
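The BMI formula and the group cut-offs described above can be sketched as follows (a minimal illustration; the function names are ours, not from the article):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: body mass (kg) divided by squared height (m)."""
    return weight_kg / height_m ** 2

def classify(bmi_value: float) -> str:
    # Cut-offs used in this study: obesity BMI > 30 kg/m2,
    # healthy weight 18.5-24.9 kg/m2; anything else was excluded.
    if bmi_value > 30.0:
        return "Ob"
    if 18.5 <= bmi_value <= 24.9:
        return "He"
    return "excluded"

print(classify(bmi(95.0, 1.65)))  # 34.9 kg/m2 -> "Ob"
```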
Twenty female normal-weight individuals (He) were recruited as the control group. The same inclusion/exclusion criteria were used, except that the BMI had to be in the range of 18.5–24.9 kg/m². Demographic and clinical data are reported in Table 1.
Psychological questionnaire
In this study, self-report questionnaires were used to describe the psychological components of depression, anxiety and alexithymia in our sample. Specifically, the level of depressive symptoms was measured through the Beck Depression Inventory [20; Italian version: 21]. It consists of 21 items; each item has a list of four or more statements arranged in increasing severity regarding a particular symptom of depression. The seminal article reported acceptable internal consistency (α = 0.86) as well as acceptable test–retest reliability (r = 0.93) [20]. Moreover, participants were asked to fill out the State–Trait Anxiety Inventory (STAI) [22; Italian version: 23]. This questionnaire assesses the current state of anxiety, asking how respondents feel “right now,” using items that measure subjective feelings of apprehension, tension, nervousness, worry, and activation/arousal of the autonomic nervous system (i.e., state scale). Moreover, the questionnaire evaluates relatively stable aspects of “anxiety proneness,” including general states of calmness, confidence, and security (i.e., trait scale). Overall, the questionnaire consists of 40 items, 20 for each scale. In terms of reliability, α = 0.90 was reported for the trait scale and α = 0.93 for the state scale; moreover, test–retest reliability ranged from 0.73 to 0.86 and from 0.16 to 0.62 for scores on the trait and state scales, respectively [22]. Finally, the level of alexithymia was measured through the Toronto Alexithymia Scale-20 (TAS-20) [24; Italian version: 25], which consists of 20 items. Individuals indicated the extent to which they agreed or disagreed with each statement on a five-point Likert scale. It provides a total score and three subscale scores: difficulty in identifying feelings; difficulty in describing feelings; and externally oriented thinking. The TAS-20 demonstrated acceptable internal consistency (α = 0.81); its test–retest reliability was 0.77 [24].
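The TAS-20 total score described above can be sketched as follows (an illustrative computation only: it assumes the standard reverse-keyed items 4, 5, 10, 18 and 19, the helper name is ours, and subscale scoring is omitted):

```python
# Standard TAS-20 reverse-scored items (an assumption of this sketch).
REVERSE_KEYED = {4, 5, 10, 18, 19}

def tas20_total(answers: dict[int, int]) -> int:
    """Total TAS-20 score from item responses on a 1-5 Likert scale.

    `answers` maps item number (1-20) to the endorsed point (1-5);
    reverse-keyed items are rescored as 6 - response before summing.
    """
    assert len(answers) == 20 and all(1 <= v <= 5 for v in answers.values())
    return sum(6 - v if item in REVERSE_KEYED else v
               for item, v in answers.items())

# Endorsing the midpoint (3) everywhere yields a total of 60.
print(tas20_total({i: 3 for i in range(1, 21)}))  # 60
```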
Implicit facial emotion recognition task
The emotions of fear and anger were studied independently. We adopted the same task described in Scarpina et al. [19]. It was a go/no-go recognition task of emotional visual stimuli based on the redundant target effect [26]. Stimuli consisted of photographs of the faces of one male and one female [27] with either an angry, a fearful, or a neutral expression, shown in four different conditions: (1) single: the target (i.e., the face expressing the target emotion) was presented on the right OR left of a fixation cross; (2) congruent: the target was presented simultaneously on the right AND left of the fixation cross; (3) emotional incongruent: the target was presented on the right OR left of the fixation cross along with a different emotion; (4) neutral incongruent: the target was presented on the right OR left of the fixation cross along with a neutral expression (Fig. 1).
Moreover, catch trials were randomly presented. In these trials, a distractor (a neutral face in half of the trials and a contrasting emotion in the other half) was presented unilaterally, bilaterally, or paired with the other type of non-target stimulus (a neutral or another emotional face); thus, in these trials, the target was never shown. During the task, participants were seated at a distance of ~60 cm from a computer screen whose vertical midline lay on the sagittal midplane of their trunk and head. Participants were required to respond as soon as they noticed the target. The stimuli stayed on screen until the participants responded or for a maximum duration of 1500 ms. The inter-stimulus interval varied randomly between 650 and 950 ms. The emotions of fear and anger were studied in separate blocks. For each experimental condition (single, congruent, neutral incongruent, emotional incongruent), 32 valid trials and 16 catch trials were presented in each of 4 blocks (ABBA: anger, fear, fear, anger). Overall, 768 trials were administered. There was a 2- to 3-min break between blocks. We computed the level of Accuracy (number of hits − number of false alarms), expressed as a percentage, and the reaction time (RT) in milliseconds (ms) from stimulus onset for the valid trials (i.e., when the target emotion was correctly detected). Valid trials in which the answer was provided below 50 ms were not considered in the statistical analyses.
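The trial arithmetic and the two performance measures can be sketched as follows (a minimal illustration; the article does not specify the denominator of the accuracy percentage, so the number of valid trials is assumed here, and the helper names are ours):

```python
# Trial arithmetic: 32 valid + 16 catch trials per condition,
# 4 conditions, 4 blocks (ABBA) -> 768 trials overall.
VALID_PER_COND, CATCH_PER_COND, CONDITIONS, BLOCKS = 32, 16, 4, 4
total_trials = (VALID_PER_COND + CATCH_PER_COND) * CONDITIONS * BLOCKS
print(total_trials)  # 768

def accuracy(hits: int, false_alarms: int, n_valid: int) -> float:
    """Accuracy = (hits - false alarms), here expressed as a
    percentage of the valid trials (assumed denominator)."""
    return 100.0 * (hits - false_alarms) / n_valid

def mean_rt(rts_ms: list[float]) -> float:
    """Mean RT over correctly detected targets; anticipatory
    responses below 50 ms are discarded, as in the article."""
    kept = [rt for rt in rts_ms if rt >= 50.0]
    return sum(kept) / len(kept)

print(accuracy(30, 2, 32))            # 87.5 (% of valid trials)
print(mean_rt([40.0, 400.0, 420.0]))  # 410.0 (the 40 ms trial is dropped)
```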
Analyses
Independent-sample t tests were run to verify any difference between groups in demographic characteristics as well as in the psychological questionnaires’ scores. Regarding the FER task, we analyzed the data relative to the emotion of fear and the emotion of anger independently. For both RTs and Accuracy, we ran a mixed ANOVA with the within-subject factors of condition (single, congruent, emotional incongruent, neutral incongruent) and gender of the stimulus face (male vs female) and the between-subject factor of group (obesity vs control). Estimated marginal mean Bonferroni-corrected comparisons were used as post hoc analyses in the case of a significant interaction.
Results
Demographical aspects
Participants affected by obesity were significantly older and less educated than controls. Moreover, as expected, they reported a higher BMI than controls (Table 1).
Psychological questionnaires
The two groups reported similar scores in the psychological questionnaires, except in the “Externally oriented thinking” subscale of the TAS-20 [24], on which affected individuals reported a higher score than controls (Table 1): this result suggested greater difficulties in this specific component.
Implicit facial emotion recognition task
All participants completed the experimental test without any complaints about difficulties in looking at the screen. In Table 2, the mean and standard deviation of the RT and of the level of accuracy in all tested experimental conditions are reported for both groups.
Reaction time
Regarding the emotion of fear, no main effect of Group emerged (Ob M = 391; SD = 110; He M = 389; SD = 107) [F(1,38) = 0.005; p = 0.94; ηp² < 0.001]. Instead, a significant main effect of Condition was found [F(3,114) = 3.88; p = 0.011; ηp² = 0.093], as well as a significant interaction Condition * Group [F(3,114) = 7.41; p < 0.001; ηp² = 0.163], as shown in Fig. 2, left side.
However, no main effect of Gender emerged [F(1,38) = 3.56; p = 0.06; ηp² = 0.08], nor a significant interaction with the within factor of Condition [F(3,114) = 1.26; p = 0.28; ηp² = 0.03] or the between factor of Group [F(1,38) = 0.1; p = 0.7; ηp² = 0.003]. Finally, the second-level interaction Group * Condition * Gender [F(3,114) = 1.58; p = 0.19; ηp² = 0.04] was not significant. Thus, when fear was the target of the experimental task, the expected redundant target effect emerged for the control group, but not for the group of affected participants.
Regarding anger, no main effect of Group (Ob M = 382; SD = 99; He M = 401; SD = 114) [F(1,38) = 0.33; p = 0.56; ηp² = 0.009] emerged. A significant main effect of Condition (Co M = 381; SD = 105; InE M = 400; SD = 109; InN M = 401; SD = 104; S M = 383; SD = 111) was found [F(3,114) = 3.27; p = 0.024; ηp² = 0.07]; however, no significant difference emerged between the experimental conditions at the post hoc test. The interaction Group * Condition was not significant [F(3,114) = 2.27; p = 0.08; ηp² = 0.05]. No main effect of Gender emerged [F(1,38) = 0.77; p = 0.38; ηp² = 0.02], nor a significant interaction with the within factor of Condition [F(3,114) = 0.52; p = 0.66; ηp² = 0.01] or the between factor of Group [F(1,38) = 0.52; p = 0.66; ηp² = 0.01]. Finally, the second-level interaction Group * Condition * Gender [F(3,114) = 0.1; p = 0.95; ηp² = 0.003] was not significant. Overall, these results suggested that no redundant target effect on reaction times emerged for either group.
Accuracy
Regarding fear, a significant main effect of Group [F(1,38) = 14.43; p = 0.001; ηp² = 0.28] emerged, since the affected group (M = 41.57; SD = 3.5) was less accurate than the control group (M = 60.39; SD = 3.5). A significant main effect of Condition emerged [F(3,114) = 7.86; p < 0.001; ηp² = 0.17]; as shown in Fig. 3, left side, a significantly lower level of accuracy emerged in the congruent condition in comparison with the emotional incongruent and the single conditions, without any other significant differences.
Moreover, a significant interaction Condition * Group [F(3,114) = 28.21; p < 0.001; ηp² = 0.42] emerged: the control group showed the expected redundant target effect; instead, the affected group was less accurate in the congruent condition than in the other experimental conditions (Fig. 2, right side). A significant main effect of Gender (male M = 60.83; SD = 20.02; female M = 41.14; SD = 29.4) [F(1,38) = 88.39; p < 0.001; ηp² = 0.69] as well as a significant interaction with Group [F(1,38) = 29.12; p < 0.001; ηp² = 0.43] were found: while the control group reported a similar level of accuracy for female and male faces [p = 0.1], the affected group was less accurate for female faces than for male faces (Table 2) [p < 0.001]. No significant interaction Gender * Condition emerged [F(3,114) = 1.23; p = 0.3; ηp² = 0.03], nor a significant second-level interaction Group * Gender * Condition [F(3,114) = 1.18; p = 0.31; ηp² = 0.03].
Regarding anger, a significant main effect of Group emerged [F(1,38) = 9.55; p = 0.004; ηp² = 0.2]: the affected group (M = 31.55; SD = 4.42) was less accurate than the control group (M = 50.87; SD = 4.42). A significant main effect of Gender [F(1,38) = 19.78; p < 0.001; ηp² = 0.34] emerged, with a lower level of accuracy for male (M = 35.95; SD = 3.82) than for female (M = 46.47; SD = 2.77) faces. A main effect of Condition [F(3,114) = 33.03; p < 0.001; ηp² = 0.46] also emerged: as shown in Fig. 3, right side, the expected redundant target effect was observed. Indeed, in both incongruent (emotional and neutral) conditions, individuals were less accurate than in the single and the congruent conditions. Moreover, the interaction Gender * Condition (Fig. 4) [F(3,114) = 5.71; p = 0.001; ηp² = 0.13] was significant: the redundant target effect emerged clearly for female faces, while for male faces we observed a lower level of accuracy in the neutral incongruent condition with respect to the others, without any other significant difference [p ≥ 0.07]. Moreover, in the single [p < 0.001] and in the congruent [p = 0.005] conditions, a higher level of accuracy emerged for female faces with respect to male faces, without any other significant difference [p ≥ 0.08]. Thus, the expected redundant target effect emerged, but overall the level of accuracy was higher for female than for male faces. Finally, no significant interaction Condition * Group [F(3,114) = 1.008; p = 0.39; ηp² = 0.026], Gender * Group [F(1,38) = 0.68; p = 0.41; ηp² = 0.01] or Condition * Gender * Group [F(3,114) = 0.72; p = 0.5; ηp² = 0.01] emerged.
The effect of age and education on the main results
As reported in Table 1, the two groups differed significantly in demographic characteristics. Thus, the main analyses concerning the experimental task were rerun introducing both Age and Education as covariates.
Reaction time
Regarding the analyses relative to the emotion of fear, we confirmed the absence of a significant effect of Group [F(1,36) = 1.01; p = 0.32; ηp² = 0.02]: the corrected mean was 408 (SD = 23.91) for the Ob group and 372 (SD = 23.91) for the He group. Neither Age [F(1,36) = 3.78; p = 0.059; ηp² = 0.09] nor Education [F(1,36) = 0.38; p = 0.57; ηp² = 0.01] was significant. The analyses confirmed the significant main effect of Condition [F(3,108) = 2.87; p = 0.039; ηp² = 0.07] and the significant interaction Group * Condition [F(3,108) = 6.14; p = 0.001; ηp² = 0.14]. Neither Age [F(3,108) = 1.61; p = 0.19; ηp² = 0.04] nor Education [F(3,108) = 1.96; p = 0.12; ηp² = 0.05] interacted significantly with Condition.
Regarding the emotion of anger, we confirmed that there was no significant main effect of Group [F(1,36) = 0.05; p = 0.82; ηp² = 0.001]: the corrected mean was 395 (SD = 23.46) for the Ob group and 387 (SD = 23.46) for the He group. Age [F(1,36) = 5.97; p = 0.02; ηp² = 0.14] had a significant main effect, whereas Education [F(1,36) = 0.11; p = 0.7; ηp² = 0.003] did not. The analyses confirmed the absence of any main effect of Condition [F(3,108) = 0.18; p = 0.9; ηp² = 0.005] or significant interaction Group * Condition [F(3,108) = 1.16; p = 0.32; ηp² = 0.031]. Neither Age [F(3,108) = 0.28; p = 0.83; ηp² = 0.008] nor Education [F(3,108) = 0.72; p = 0.54; ηp² = 0.02] interacted significantly with the factor Condition.
Accuracy
When the results relative to the emotion of fear were analyzed, we confirmed the significant main effect of Group [F(1,36) = 7.47; p = 0.01; ηp² = 0.17]: the corrected mean was 43.4 (SD = 3.71) for the Ob group and 58.56 (SD = 3.71) for the He group. Neither Age [F(1,36) = 0.6; p = 0.4; ηp² = 0.01] nor Education [F(1,36) = 1.043; p = 0.31; ηp² = 0.02] was significant. The analyses showed no significant main effect of Condition [F(3,108) = 0.36; p = 0.77; ηp² = 0.01]; interestingly, the significant interaction Group * Condition was confirmed [F(3,108) = 20.87; p < 0.001; ηp² = 0.36]. Neither Age [F(3,108) = 0.56; p = 0.63; ηp² = 0.01] nor Education [F(3,108) = 1.12; p = 0.34; ηp² = 0.03] interacted significantly with Condition. We did not replicate the significant main effect of Gender [F(1,36) = 2.43; p = 0.12; ηp² = 0.06], but we confirmed its significant interaction with Group [F(1,36) = 17.61; p < 0.001; ηp² = 0.32]. However, no significant interaction Gender * Age [F(1,36) = 0.18; p = 0.67; ηp² = 0.005] or Gender * Education [F(1,36) = 1.54; p = 0.22; ηp² = 0.04] emerged. No other significant interaction was found [p ≥ 0.08].
Regarding the emotion of anger, we confirmed the significant main effect of Group [F(1,36) = 4.28; p = 0.04; ηp² = 0.1]: the corrected mean was 34.04 (SD = 4.64) for the Ob group and 48.38 (SD = 4.64) for the He group. Neither Age [F(1,36) = 2.33; p = 0.13; ηp² = 0.06] nor Education [F(1,36) = 0.19; p = 0.66; ηp² = 0.005] was significant. We did not replicate the significant main effect of Gender [F(1,36) = 0.45; p = 0.5; ηp² = 0.01] or its significant interaction with Group [F(1,36) < 0.0005; p = 0.98; ηp² < 0.001]. However, no significant interaction Gender * Age [F(1,36) = 0.6; p = 0.44; ηp² = 0.01] or Gender * Education [F(1,36) = 1.67; p = 0.2; ηp² = 0.04] emerged: thus, the different pattern of results was not related to the demographic factors. Moreover, we did not confirm the main effect of Condition [F(3,108) = 2.12; p = 0.1; ηp² = 0.05] or the interaction Gender * Condition [F(3,108) = 0.16; p = 0.9; ηp² = 0.004]. Neither the interaction Condition * Age [F(3,108) = 0.97; p = 0.4; ηp² = 0.02] nor Condition * Education [F(3,108) = 0.61; p = 0.6; ηp² = 0.01] was significant. Thus, again, the different pattern of results was not related to the demographic factors. No other significant interaction was found [p ≥ 0.5].
Discussion
In this study, we compared individuals with obesity and healthy-weight individuals in their capability to recognize facial expressions of fear and anger. We adopted an implicit task based on the cognitive phenomenon of the redundant target effect [19]. Regarding fear, we observed the effect in terms of both speed and accuracy for controls, as expected [19]. Interestingly, this effect was not observed in the affected individuals’ performance: this result might be suggestive of difficulties in recognizing fearful expressions in obesity. It seems to agree with the hypothesis according to which affected individuals have difficulties in attending to negative facial emotions [10, 11]. However, we might hypothesize that this effect is emotion dependent: the difficulty might pertain specifically to fearful expressions. Indeed, when we considered the emotion of anger, affected individuals reported a lower level of accuracy, in line with some previous studies [10, 11]; however, overall, they seemed to be sensitive to the implicit effect, in line with the controls’ performance. The different behavior observed when fear or anger was tested might support methodological approaches that investigate basic emotions [1] independently.
We observed a gender effect on the level of accuracy for both the emotion of fear and anger, suggesting that, at least for female observers, the detection of facial emotional expressions (as indicated by the level of accuracy) might be influenced by the gender of the face. Such an effect was not specifically reported in the seminal article on the implicit facial emotion recognition task [19]; moreover, when the redundant target effect was previously used in the context of facial emotion recognition [e.g., 28, 29], performance was not assessed in terms of the level of accuracy, limiting the interpretability of our results. Indeed, behavioral performance is traditionally scored in terms of reaction time, measured when the target is correctly detected. Our results (i.e., no effect of gender on reaction time) seem to agree with previous evidence according to which stimulus detection is enhanced only by the emotional congruency between targets, in the absence of any effect related to the face’s gender [28, 29]. Nevertheless, females and males differ in emotional experience and expression [30]; moreover, some studies seem to suggest an advantage of women in decoding emotions in comparison with men [31]. Thus, the role of gender should be carefully considered when investigating FER capability and, more generally, emotional processing. Here, only female individuals were assessed; thus, it would be interesting to replicate this experiment focusing on males’ behavior.
Finally, we might observe that the behavioral result was not in agreement with the subjective evaluation of emotional capability [24]: indeed, the two groups reported similar scores in terms of the global level of alexithymia. Thus, a phenomenological approach in which emotional processing is investigated through implicit cognitive mechanisms (as done in this paper as well as in Cserjesi et al. [10]), together with the traditional self-report assessment, might disentangle subjective judgments from objective measurements of emotional processing.
Efficient and satisfying social interactions are mediated by an appropriate capability to recognize others’ emotional states. In light of the evidence about interpersonal difficulties, and the emergence of complex social phenomena such as bullying and social stigma, we encourage future research about FER capability in obesity.
What is already known on this subject?
Previous research about FER capability in individuals affected by obesity is scarce. We investigated it in a sample of affected individuals, through a cognitive task.
What does this study add?
Affected individuals had difficulties in processing fear expressed by others’ faces. This impairment might impact social relationships. Thus, this emotion should be carefully taken into account in psychological interventions.
Change history
08 October 2020
A Correction to this paper has been published: https://doi.org/10.1007/s40519-020-01042-y
References
Ekman P (1992) Are there basic emotions? Psychol Rev 99:550–553. https://doi.org/10.1037/0033-295X.99.3.550
Vuilleumier P, Armony JL, Clarke K, Husain M, Driver J, Dolan RJ (2002) Neural response to emotional faces with and without awareness: event-related fMRI in a parietal patient with visual extinction and spatial neglect. Neuropsychologia 40:2156–2166. https://doi.org/10.1016/s0028-3932(02)00045-3
Harms MB, Martin A, Wallace GL (2010) Facial emotion recognition in autism spectrum disorders: a review of behavioral and neuroimaging studies. Neuropsychol Rev 20:290–322. https://doi.org/10.1007/s11065-010-9138-6
Kohler CG, Walker JB, Martin EA, Healey KM, Moberg PJ (2010) Facial emotion perception in schizophrenia: a meta-analytic review. Schizophr Bull 36:1009–1019. https://doi.org/10.1093/schbul/sbn192
Edwards J, Jackson HJ, Pattison PE (2002) Emotion recognition via facial expression and affective prosody in schizophrenia: a methodological review. Clin Psychol Rev 22:789–832. https://doi.org/10.1016/S0272-7358(02)00130-7
Surcinelli P, Codispoti M, Montebarocci O, Rossi N, Baldaro B (2006) Facial emotion recognition in trait anxiety. J Anxiety Disord 20:110–117. https://doi.org/10.1016/j.janxdis.2004.11.010
Demenescu LR, Kortekaas R, den Boer JA, Aleman A (2010) Impaired attribution of emotion to facial expressions in anxiety and major depression. PLoS One 5:e15058. https://doi.org/10.1371/journal.pone.0015058
Bourke C, Douglas K, Porter R (2010) Processing of facial emotion expression in major depression: a review. Aust N Z J Psychiatry 44:681–696. https://doi.org/10.3109/00048674.2010.496359
Bergmann S, von Klitzing K, Keitel-Korndörfer A, Wendt V, Grube M, Herpertz S, Schütz A, Klein AM (2016) Emotional availability, understanding emotions, and recognition of facial emotions in obese mothers with young children. J Psychosom Res 80:44–52. https://doi.org/10.1016/j.jpsychores.2015.11.005
Cserjési R, Vermeulen N, Lénárd L, Luminet O (2011) Reduced capacity in automatic processing of facial expression in restrictive anorexia nervosa and obesity. Psychiatry Res 188:253–257. https://doi.org/10.1016/j.psychres.2010.12.008
Giel KE, Hartmann A, Zeeck A, Jux A, Vuck A, Gierthmuehlen PC, Wetzler-Burmeister E, Sandholz A, Marjanovic G, Joos A (2016) Decreased emotional perception in obesity. Eur Eat Disord Rev 24:341–346. https://doi.org/10.1002/erv.2444
Fernandes J, Ferreira-Santos F, Miller K, Torres S (2018) Emotional processing in obesity: a systematic review and exploratory meta-analysis. Obes Rev 19:111–120. https://doi.org/10.1111/obr.12607
Pink E, Lee M, Price M, Williams C (2019) A serial mediation model of the relationship between alexithymia and BMI: the role of negative affect, negative urgency and emotional eating. Appetite 133:270–278. https://doi.org/10.1016/j.appet.2018.11.014
Altamura M, Porcelli P, Fairfiled B, Malerba S, Carnevale R, Balzotti A, Rossi G, Vendemiale G, Bellomo A (2018) Alexithymia predicts attrition and outcome in weight-loss obesity treatment. Front Psychol 9:2432. https://doi.org/10.3389/fpsyg.2018.02432
Pinaquy S, Chabrol H, Simon C, Louvet JP, Barbe P (2003) Emotional eating, alexithymia, and binge‐eating disorder in obese women. Obes Res 11:195–201. https://doi.org/10.1038/oby.2003.31
MacFarland DDJ (1981) Oxford companion to animal behavior. Oxford University Press, Oxford
Blair RJR (2012) Considering anger from a cognitive neuroscience perspective. Wiley Interdiscip Rev Cogn Sci 3:65–74. https://doi.org/10.1002/wcs.154
Scarpina F, Melzi L, Castelnuovo G, Mauro A, Marzoli SB, Molinari E (2018) Explicit and implicit components of the emotional processing in non-organic vision loss: Behavioral evidence about the role of fear in functional blindness. Front Psychol 9:494. https://doi.org/10.3389/fpsyg.2018.00494
Leehr EJ, Krohmer K, Schag K, Dresler T, Zipfel S, Giel KE (2015) Emotion regulation model in binge eating disorder and obesity-a systematic review. Neurosci Biobehav Rev 49:125–134. https://doi.org/10.1016/j.neubiorev.2014.12.008
Beck AT, Ward CH, Mendelson M, Mock J, Erbaugh J (1961) An inventory for measuring depression. Arch Gen Psychiatry 4:561–571. https://doi.org/10.1001/archpsyc.1961.01710120031004
Sica C, Ghisi M (2007) The Italian versions of the Beck Anxiety Inventory and the Beck Depression Inventory-II: psychometric properties and discriminant power. In: Lange MA (ed) Leading-edge psychological tests and testing research. Nova Science Publishers, Hauppauge, pp 27–50
Spielberger CD, Gorsuch RL, Lushene R, Vagg PR, Jacobs GA (1983) Manual for the State-Trait Anxiety Inventory. Consulting Psychologists Press, Palo Alto
Macor A, Pedrabissi L, Santinello M (1990) Ansia di stato e di tratto: ulteriore contributo alla verifica della validità psicometrica e teorica dello S.T.A.I. forma Y di Spielberger. Psicol Soc 15:67–74
Bagby RM, Parker JD, Taylor GJ (1994) The twenty-item Toronto Alexithymia Scale–I. Item selection and cross-validation of the factor structure. J Psychosom Res 38:23–32. https://doi.org/10.1016/0022-3999(94)90005-1
Todarello O, Pace V (2010) Le scale di valutazione dell’alessitimia. Stato dell’arte dell’assessment. NOOs 3:171–187
Miniussi C, Girelli M, Marzi CA (1998) Neural site of the redundant target effect: electrophysiological evidence. J Cogn Neurosci 10:216–230. https://doi.org/10.1162/089892998562663
Ekman P, Friesen WV (1976) Measuring facial movement. J Nonverbal Behav 1:56–75. https://doi.org/10.1007/BF01115465
Tamietto M, Adenzato M, Geminiani G, de Gelder B (2007) Fast recognition of social emotions takes the whole brain: interhemispheric cooperation in the absence of cerebral asymmetry. Neuropsychologia 45:836–843. https://doi.org/10.1016/j.neuropsychologia.2006.08.012
Tamietto M, Corazzini LL, de Gelder B, Geminiani G (2006) Functional asymmetry and interhemispheric cooperation in the perception of emotions from facial expressions. Exp Brain Res 171:389–404. https://doi.org/10.1007/s00221-005-0279-4
Brody LR, Hall JA (1993) Gender and emotion. In: Lewis M, Haviland JM (eds) Handbook of emotions. Guilford Press, New York, pp 447–460
Abbruzzese L, Magnani N, Robertson IH, Mancuso M (2019) Age and gender differences in emotion recognition. Front Psychol 10:2371. https://doi.org/10.3389/fpsyg.2019.02371
Funding
No funding was received in support of this study.
Author information
Contributions
FS designed the study, managed the literature searches, performed the experiment and the statistical analyses, and wrote the first draft of the manuscript. GV recruited participants and provided the psychological assessment. PC and AM supervised the recruitment. EM and GC supervised the psychological assessment. AM supervised the entire project. All authors contributed to and have approved the final manuscript.
Ethics declarations
Conflict of interest
The authors declare no conflict of interest for this study.
Ethical approval
This study was approved by Ethics Committee of the I.R.C.C.S Istituto Auxologico Italiano (Italy) and it was conducted in accordance with the Declaration of Helsinki.
Informed consent
Subjects participated voluntarily and they gave informed written consent.
Cite this article
Scarpina, F., Varallo, G., Castelnuovo, G. et al. Implicit facial emotion recognition of fear and anger in obesity. Eat Weight Disord 26, 1243–1251 (2021). https://doi.org/10.1007/s40519-020-01010-6