Gender difference in N170 elicited under oddball task
© Choi et al.; licensee BioMed Central. 2015
Received: 6 October 2014
Accepted: 22 January 2015
Published: 4 March 2015
Some studies have reported gender differences in N170, a face-selective event-related potential (ERP) component. This study investigated gender differences in N170 elicited under an oddball paradigm in order to clarify the effect of task demand on gender differences in early facial processing.
Twelve males and 10 females discriminated targets (emotional faces) from non-targets (emotionally neutral faces) under an oddball paradigm, pressing a button as quickly as possible in response to the target. A clear N170 was elicited in response to both target and non-target stimuli in males and females. However, females showed a more negative N170 amplitude in response to targets than to non-targets, while males showed no difference in N170 between targets and non-targets.
The present results suggest that females allocate more attention at an early stage of processing when responding to faces actively (target) than when viewing faces passively (non-target). This supports previous findings suggesting that task demand is an important factor in gender differences in N170.
Keywords: Gender difference; N170; Event-related potential; Attention; Face; Oddball task
Many psychological [1-3] and physiological [4-16] studies have revealed gender differences in facial processing. Wood and Eagly [17] argued that gender differences in behaviour might arise from biological specialization, such as male physical attributes (size, strength and speed) and female reproductive capacity. From an anthropological perspective, gender differences in face processing are thus thought to be related to the differing social roles of males and females.
Some event-related potential (ERP) studies [4-9] have also investigated gender differences in facial processing, using N170 as the index of attention. N170 is an ERP component showing a negative peak around 170 ms after face onset over the posterior temporal area [18-22] and is thus considered a face-selective ERP component. Given that N170 is more negative when faces are attended than when they are presented outside the attentional focus [21], a more negative N170 appears to reflect increased attention to faces. Sun et al. [4] revealed that females showed a more negative N170 amplitude when discriminating the orientation (right or left) of faces than when discriminating the gender of faces, while males did not. From this result, the authors suggested that the effect of task demands on N170 is more obvious in females than in males [4]. However, to the best of our knowledge, very few studies have investigated the effect of task demand on gender differences in N170 [4,9]. This issue therefore remains unclear.
To clarify how task demand affects gender differences in N170, it seems appropriate to examine whether males and females show differences in N170 between reacting to faces actively (for example, pressing a button) and viewing faces passively. The present study thus aimed to investigate gender differences in N170 elicited under an oddball task by reanalyzing data from our previous study [23]. The oddball task is a well-studied paradigm in which two types of stimuli are presented and the participant is usually instructed to press a button in response to one type of stimulus (the target). In the present study, target stimuli were emotional faces (happy, angry, surprised, afraid or sad), while non-target stimuli were emotionally neutral faces. We hypothesized that females, compared to males, would show a greater difference in N170 when responding to target and non-target faces, given that females show greater N170 modulation by task demand [4].
Twenty-two healthy, right-handed undergraduate and graduate students (12 males: age range, 21 to 25 years; 10 females: age range, 22 to 28 years) participated in this study. All participants provided written informed consent. The study was approved by the ethics committee in the Department of Design at Kyushu University, Japan.
Stimuli and procedures
We selected images of 12 adults (six men, six women, 20 to 30 years of age, Caucasian) showing six types of facial expression (neutral, happy, angry, surprised, afraid or sad) from the Karolinska Directed Emotional Faces [24]. All images were cropped to 300 × 400 pixels and presented in the centre of a black screen (17-inch monitor, 1,024 × 768 resolution). The distance between the participants and the monitor was 70 cm, and the images subtended approximately 6° × 6° of visual angle.
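The ~6° visual angle follows from the standard viewing-geometry formula 2·atan(s / 2d). The physical on-screen stimulus size is not reported in the text; a width of about 7.3 cm is assumed here purely for illustration, since it reproduces the reported angle at the 70 cm viewing distance:

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (degrees) subtended by a stimulus of a given
    physical size at a given viewing distance: 2 * atan(s / 2d)."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# Assumed on-screen stimulus width of ~7.3 cm at the reported 70 cm distance
angle = visual_angle_deg(7.3, 70.0)
print(f"{angle:.2f} degrees")  # close to the ~6 degrees reported
```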
Electroencephalography (EEG) was recorded during five blocks of oddball tasks. Each block consisted of 96 trials. Non-target stimuli (presented in 75% of trials) were emotionally neutral faces in all blocks, whereas target stimuli (presented in 25% of trials) were happy, angry, surprised, afraid or sad faces in each block. Participants responded to target stimuli by pressing a button using the right hand. In each trial, a cross shape was presented (500 ms), followed by a target or non-target image (800 ms). The interstimulus interval was 1,000 ms, and targets were never presented consecutively.
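The constraint that targets are never presented consecutively can be satisfied constructively by inserting targets into distinct gaps between non-targets. The 24/72 split follows from 25% of 96 trials; the construction itself is our illustrative sketch, not the authors' described randomization method:

```python
import random

def make_block(n_targets: int = 24, n_nontargets: int = 72) -> list:
    """Build one oddball block: targets in 25% of trials,
    with no two targets presented consecutively."""
    # 72 non-targets create 73 slots (before, between, after); placing
    # at most one target per slot guarantees targets are never adjacent.
    slots = set(random.sample(range(n_nontargets + 1), n_targets))
    block = []
    for i in range(n_nontargets + 1):
        if i in slots:
            block.append("target")
        if i < n_nontargets:
            block.append("non-target")
    return block

block = make_block()
```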
ERP measurements and analysis
We recorded EEG with averaged ears as the reference using a Polymate AP1532 system (TEAC, Tokyo, Japan) from the following five sites: Fz (medial frontal), Cz (medial central), Pz (medial parietal), T5 (left posterior temporal) and T6 (right posterior temporal). We also recorded electrooculography (EOG) to detect blinking with electrodes above and below the right eye. The impedance of each electrode was kept below 10 kΩ.
EEG signals were digitized at a sampling rate of 500 Hz, and a band-pass filter of 1 to 30 Hz was applied (EMSE Suite; Source Signal Imaging, San Diego, CA, USA). Epochs from −200 to 800 ms relative to stimulus onset were averaged separately for target and non-target stimuli across blocks (baseline: −200 to 0 ms). Trials containing artefacts exceeding 50 μV and trials in which the participant did not respond were excluded from the averages. For target stimuli, the mean number of trials was 106.4 (standard deviation (SD) = 8.8) and 97.1 (SD = 8.8) in males and females, respectively. For non-target stimuli, the mean number of trials was 322.3 (SD = 29.6) and 288.6 (SD = 54.5) in males and females, respectively.
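The averaging pipeline above (500 Hz sampling, −200 to 800 ms epochs, −200 to 0 ms baseline, rejection of epochs exceeding 50 μV) can be sketched in a few lines; the single-channel array layout and sample-index onsets are our assumptions, not the EMSE Suite implementation:

```python
import numpy as np

FS = 500                      # sampling rate (Hz) -> 2 ms per sample
PRE, POST = 0.2, 0.8          # epoch window: -200 to 800 ms around onset

def epoch_and_average(eeg_uv: np.ndarray, onsets: list) -> np.ndarray:
    """Cut baseline-corrected epochs around stimulus onsets (given as
    sample indices), reject epochs with artefacts > 50 uV, and average."""
    n_pre, n_post = int(PRE * FS), int(POST * FS)
    kept = []
    for onset in onsets:
        ep = eeg_uv[onset - n_pre : onset + n_post].copy()
        ep -= ep[:n_pre].mean()             # baseline: -200 to 0 ms
        if np.abs(ep).max() <= 50.0:        # artefact rejection at 50 uV
            kept.append(ep)
    return np.mean(kept, axis=0)            # ERP: average over kept trials
```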
We calculated N170 as the most negative potential within 140 to 200 ms at the T5 and T6 sites, where N170 amplitude has been reported to be most negative [18-22] and to show gender differences in asymmetry [5-8].
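Given an averaged waveform, the N170 measure described (the most negative potential within 140 to 200 ms) reduces to a window minimum. A minimal sketch, assuming the epoch starts at −200 ms and is sampled at 500 Hz:

```python
def n170_peak(erp_uv, fs=500, epoch_start_ms=-200, win_ms=(140, 200)):
    """Return (amplitude, latency_ms) of the most negative point
    in the 140-200 ms window of an averaged ERP waveform."""
    ms_per_sample = 1000 / fs
    i0 = int((win_ms[0] - epoch_start_ms) / ms_per_sample)
    i1 = int((win_ms[1] - epoch_start_ms) / ms_per_sample)
    window = erp_uv[i0:i1 + 1]
    k = min(range(len(window)), key=lambda i: window[i])
    return window[k], win_ms[0] + k * ms_per_sample
```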
For ERP responses, we conducted repeated-measures analysis of variance (ANOVA) with gender as a between-subject factor and task (target and non-target) and site (T5 and T6) as within-subject factors. For behavioural data (response accuracies and reaction times), the independent t-test was used for comparisons between males and females.
Statistical significance was accepted at the 5% level (P < 0.05) (SPSS, Chicago, IL, USA). The Greenhouse-Geisser correction was applied where sphericity was violated. When the main effect or an interaction was significant, pairwise comparisons were performed with the Bonferroni correction.
No significant gender differences in response accuracies or reaction times were seen (response accuracies: t = −1.85; reaction times: t = 0.81; all df = 20, all P > 0.05).
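The df = 20 above corresponds to a pooled-variance independent t-test with df = n1 + n2 − 2 = 12 + 10 − 2. A minimal standard-library sketch of that statistic (not the SPSS implementation itself):

```python
from statistics import mean, variance

def pooled_t(a, b):
    """Independent two-sample t statistic with pooled variance;
    df = len(a) + len(b) - 2 (here 20, for 12 males and 10 females)."""
    na, nb = len(a), len(b)
    df = na + nb - 2
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / df
    t = (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t, df
```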
The present study investigated gender differences in N170 elicited under the oddball task, in order to identify effects of task demand on gender differences in early facial processing. We found that females showed a more negative N170 in response to targets than to non-targets, whereas males showed no difference in N170 between targets and non-targets (Figure 2). This suggests that females tend to show increased early attention when responding to faces actively (target) compared to viewing faces passively (non-target). This finding supports a previous study [4] suggesting that females, compared to males, are more sensitive to N170 modulation by task demand.
One possible explanation of the present result is biological specialization between males and females. As mentioned in the introduction, females generally have lesser physical attributes (size, strength and speed) than males and have traditionally played a social role in raising children, gauging the emotional states of infants from their faces. Thus, in terms of survival for themselves and their children, females might have needed to be especially sensitive to facial expressions requiring an active response, compared to males.
On the other hand, the current results indicated that both males and females showed a more negative N170 in the right posterior temporal area (T6) than in the left posterior temporal area (T5) (Figure 1). This is inconsistent with previous studies that have reported gender differences in the hemispheric asymmetry of N170 [5-8]. In those studies [5,6], males showed right-hemispheric dominance of N170, whereas females showed N170 over both hemispheres. Explaining this discrepancy between the previous and present asymmetry results is difficult at this point, and future research is needed to address this question.
Several limitations must be considered when discussing the present results. First, the number of participants (12 males and 10 females) was smaller than in previous studies of gender differences in N170 (14 males and 14 females; 20 males and 20 females; 25 males and 25 females). Second, target faces were emotional and non-target faces were emotionally neutral in the present study, so the possibility must be considered that the increased attention to targets in females reflects not only target status but also emotional content. Further research is warranted to clarify this issue.
In conclusion, we found a more negative N170 elicited by targets than by non-targets in females but not in males. This suggests that only females might show increased early-stage attention when actively responding to faces compared with viewing faces passively. Task demand thus seems to be an important factor in gender differences in N170, as suggested by previous studies.
Abbreviations: ANOVA, analysis of variance
The authors sincerely thank the participants of the study.
1. McClure EB. A meta-analytic review of sex differences in facial expression processing and their development in infants, children, and adolescents. Psychol Bull. 2000;126:424–53.
2. Hall JA. Gender effects in decoding nonverbal cues. Psychol Bull. 1978;85:845–57.
3. Montagne B, Kessels RP, Frigerio E, de Haan EH, Perrett DI. Sex differences in the perception of affective facial expressions: do men really lack emotional sensitivity? Cogn Process. 2005;6:136–41.
4. Sun Y, Gao X, Han S. Sex differences in face gender recognition: an event-related potential study. Brain Res. 2010;1327:69–76.
5. Proverbio AM, Brignone V, Matarazzo S, Del Zotto M, Zani A. Gender differences in hemispheric asymmetry for face processing. BMC Neurosci. 2006;7:44.
6. Proverbio AM, Riva F, Martin E, Zani A. Face coding is bilateral in the female brain. PLoS One. 2010;5:e11242.
7. Proverbio AM, Brignone V, Matarazzo S, Del Zotto M, Zani A. Gender and parental status affect the visual cortical response to infant facial expression. Neuropsychologia. 2006;44:2987–99.
8. Proverbio AM, Mazzara R, Riva F, Manfredi M. Sex differences in callosal transfer and hemispheric specialization for face coding. Neuropsychologia. 2012;50:2325–32.
9. Wang J, Kitayama S, Han S. Sex difference in the processing of task-relevant and task-irrelevant social information: an event-related potential study of familiar face recognition. Brain Res. 2011;1408:41–51.
10. Campanella S, Rossignol M, Mejias S, Joassin F, Maurage P, Debatisse D, et al. Human gender differences in an emotional visual oddball task: an event-related potentials study. Neurosci Lett. 2004;367:14–8.
11. Guillem F, Mograss M. Gender differences in memory processing: evidence from event-related potentials to faces. Brain Cogn. 2005;57:84–92.
12. Orozco S, Ehlers CL. Gender differences in electrophysiological responses to facial stimuli. Biol Psychiatry. 1998;44:281–9.
13. Schulte-Rüther M, Markowitsch HJ, Shah NJ, Fink GR, Piefke M. Gender differences in brain networks supporting empathy. Neuroimage. 2008;42:393–403.
14. Lee TM, Liu HL, Hoosain R, Liao WT, Wu CT, Yuen KS, et al. Gender differences in neural correlates of recognition of happy and sad faces in humans assessed by functional magnetic resonance imaging. Neurosci Lett. 2002;333:13–6.
15. Schneider F, Habel U, Kessler C, Salloum JB, Posse S. Gender differences in regional cerebral activity during sadness. Hum Brain Mapp. 2000;9:226–38.
16. Killgore WD, Yurgelun-Todd DA. Sex differences in amygdala activation during the perception of facial affect. Neuroreport. 2001;12:2543–7.
17. Wood W, Eagly AH. A cross-cultural analysis of the behavior of women and men: implications for the origins of sex differences. Psychol Bull. 2002;128:699–727.
18. Bentin S, Allison T, Puce A, Perez E, McCarthy G. Electrophysiological studies of face perception in humans. J Cogn Neurosci. 1996;8:551–65.
19. Campanella S, Hanoteau C, Dépy D, Rossion B, Bruyer R, Crommelinck M, et al. Right N170 modulation in a face discrimination task: an account for categorical perception of familiar faces. Psychophysiology. 2000;37:796–806.
20. Eimer M, Holmes A. An ERP study on the time course of emotional face processing. Neuroreport. 2002;13:427–31.
21. Holmes A, Vuilleumier P, Eimer M. The processing of emotional facial expression is gated by spatial attention: evidence from event-related brain potentials. Brain Res Cogn Brain Res. 2003;16:174–84.
22. Taylor MJ, McCarthy G, Saliba E, Degiovanni E. ERP evidence of developmental changes in processing of faces. Clin Neurophysiol. 1999;110:910–5.
23. Choi D, Nishimura T, Motoi M, Egashira Y, Matsumoto R, Watanuki S. Effect of empathy trait on attention to various facial expressions: evidence from N170 and late positive potential (LPP). J Physiol Anthropol. 2014;33:18.
24. Lundqvist D, Flykt A, Ohman A. The Karolinska Directed Emotional Faces – KDEF (CD ROM). Stockholm: Department of Clinical Neuroscience, Psychology Section, Karolinska Institutet; 2011.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.