Angry facial expressions bias gender categorization in children and adults: behavioral and computational evidence.

Bayet L, Pascalis O, Quinn PC, Lee K, Gentaz É, Tanaka JW - Front Psychol (2015)

Bottom Line: Angry faces are perceived as more masculine by adults. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling.
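As an illustration of what a computational simulation of gender categorization over a Principal Component Analysis representation can look like, the sketch below fits a PCA-plus-linear-classifier pipeline to pixel vectors and reads out the estimated probability that each angry face is male. It is a minimal, generic sketch: the random arrays stand in for actual face images, and the scikit-learn pipeline (PCA followed by logistic regression) is an assumption chosen for illustration, not the simulation procedure used in Experiment 3.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Stand-in data: flattened grayscale face images (one row per face) with
# gender labels. Real inputs would be pixel vectors of the study's stimuli.
X_train = rng.normal(size=(80, 64 * 64))   # training faces as pixel vectors
y_train = rng.integers(0, 2, size=80)      # 0 = female, 1 = male
X_angry = rng.normal(size=(20, 64 * 64))   # angry faces to probe

# PCA representation followed by a linear gender classifier.
model = make_pipeline(PCA(n_components=20), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# Estimated P(male) for each angry face: values above 0.5 mean that, under
# this particular representation, the face falls on the male side of the
# decision boundary.
p_male = model.predict_proba(X_angry)[:, 1]
print(p_male.round(2))
```

The caveat stated in the Bottom Line applies directly to such a sketch: re-representing the very same faces (e.g., as measured features or their second-order relations rather than pixel-based principal components) can change where the angry faces fall, so no single representation settles the basis of the angry-male bias.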

View Article: PubMed Central - PubMed

Affiliation: Laboratoire de Psychologie et Neurocognition, University of Grenoble-Alps, Grenoble, France; Laboratoire de Psychologie et Neurocognition, Centre National de la Recherche Scientifique, Grenoble, France.

ABSTRACT
Angry faces are perceived as more masculine by adults. However, the developmental course and underlying mechanism (bottom-up stimulus-driven or top-down belief-driven) associated with the angry-male bias remain unclear. Here we report that anger biases face gender categorization toward "male" responding in children as young as 5-6 years. The bias is observed for both own- and other-race faces, and is remarkably unchanged across development (into adulthood) as revealed by signal detection analyses (Experiments 1-2). The developmental course of the angry-male bias, along with its extension to other-race faces, combine to suggest that it is not rooted in extensive experience, e.g., observing males engaging in aggressive acts during the school years. Based on several computational simulations of gender categorization (Experiment 3), we further conclude that (1) the angry-male bias results, at least partially, from a strategy of attending to facial features or their second-order relations when categorizing face gender, and (2) any single choice of computational representation (e.g., Principal Component Analysis) is insufficient to assess resemblances between face categories, as different representations of the very same faces suggest different bases for the angry-male bias. Our findings are thus consistent with stimulus- and stereotyped-belief-driven accounts of the angry-male bias. Taken together, the evidence suggests considerable stability in the interaction between some facial dimensions in social categorization that is present prior to the onset of formal schooling.
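The signal detection analyses referenced in the abstract separate sensitivity to face gender from a response bias toward answering "male". The sketch below shows a standard equal-variance computation of d′ (sensitivity) and the criterion c (bias), treating "male" responses to male faces as hits and "male" responses to female faces as false alarms. The log-linear correction and the example trial counts are illustrative assumptions, not values or code from Experiments 1-2.

```python
from scipy.stats import norm

def gender_sdt(male_resp_to_male, n_male, male_resp_to_female, n_female):
    """Equal-variance signal detection: d' and criterion c for a male/female
    categorization task, with 'male' responses treated as the signal response."""
    # Log-linear correction to avoid infinite z-scores at rates of 0 or 1.
    hit_rate = (male_resp_to_male + 0.5) / (n_male + 1)
    fa_rate = (male_resp_to_female + 0.5) / (n_female + 1)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)              # sensitivity
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))   # response bias
    return d_prime, criterion

# Hypothetical trial counts (not data from the paper), comparing angry and
# neutral faces; a lower (more liberal) criterion for angry faces is the
# signature of the angry-male bias in this framework.
print(gender_sdt(27, 30, 9, 30))   # angry faces
print(gender_sdt(25, 30, 4, 30))   # neutral faces
```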


Figure 1: Example stimuli used in Experiments 1–3 (A) and in the control study (B). The identities of the faces used in Experiments 1–3 and in the control study were identical, but in the control study all faces had a neutral expression, whereas faces in Experiments 1–3 had angry, smiling, or neutral expressions. Sixteen of the 120 faces from Experiments 1–3 had no neutral pose in the database.

Mentions: One hundred twenty face stimuli depicting unique identities were selected, in their frontal-view versions, from the Karolinska Directed Emotional Faces database (Lundqvist et al., 1998; Calvo and Lundqvist, 2008), the NimStim database (Tottenham et al., 2002, 2009), and the Chinese Affective Picture System database (Lu et al., 2005). Faces were of different races (Caucasian, Chinese), genders (female, male), and expressions (angry, neutral, smiling). Faces were converted to grayscale and placed against a white background; external features were cropped using GIMP. Luminance, contrast, and placement of the eyes were matched using SHINE (Willenbockel et al., 2010) and the Psychomorph software (Tiddeman, 2005, 2011). Emotion intensity and recognition accuracy were matched across races and genders and are summarized in Supplementary Table 1. See Figure 1A for examples of the stimuli used. Selecting 120 emotional faces depicting unique identities on the basis of the high validity of their emotional expressions could introduce a selection bias, e.g., the female faces that display anger most reliably might also be the most masculine female faces. To address this issue, a control study (Supplementary Material) was conducted in which gender typicality ratings were obtained for the neutral poses of the same 120 faces. See Figure 1B for examples of the stimuli used in the control study.
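SHINE is a MATLAB toolbox; as a rough analogue of the luminance and contrast matching it performs, the sketch below rescales a set of grayscale images to the set's average mean luminance and average standard deviation (RMS contrast). The file paths and output directory are placeholders, and the actual stimulus preparation also involved aligning eye placement in Psychomorph, which this sketch does not attempt.

```python
import numpy as np
from pathlib import Path
from PIL import Image  # pip install pillow

def lum_match(in_paths, out_dir):
    """Rescale each grayscale image to the set's average mean luminance and
    average standard deviation (a crude stand-in for SHINE-style matching)."""
    imgs = [np.asarray(Image.open(p).convert("L"), dtype=float) for p in in_paths]
    target_mean = np.mean([im.mean() for im in imgs])
    target_std = np.mean([im.std() for im in imgs])
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    for p, im in zip(in_paths, imgs):
        matched = (im - im.mean()) / im.std() * target_std + target_mean
        matched = np.clip(matched, 0, 255).astype(np.uint8)  # back to 8-bit range
        Image.fromarray(matched).save(Path(out_dir) / Path(p).name)

# Hypothetical usage; the directory and file pattern are placeholders.
# lum_match(sorted(Path("stimuli").glob("*.png")), "stimuli_matched")
```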

