Method Article
An EEG experimental protocol is designed to clarify the interplay between conscious and non-conscious representations of emotional faces in patients with Asperger's syndrome. The results suggest that patients with Asperger's syndrome have deficits in the non-conscious representation of emotional faces, but perform comparably to healthy controls in the conscious representation.
Several neuroimaging studies have suggested that the low spatial frequency content in an emotional face mainly activates the amygdala, pulvinar, and superior colliculus, especially with fearful faces [1-3]. These regions constitute the limbic structure involved in non-conscious perception of emotions and modulate cortical activity either directly or indirectly [2]. In contrast, the conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex, which direct voluntary attention to details in faces [3,4]. Asperger's syndrome (AS) [5,6] is an atypical mental disturbance that affects sensory, affective, and communicative abilities without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain social communication failure in patients with AS [7-9]. In order to clarify the interplay between conscious and non-conscious representations of emotional faces in AS, an EEG experimental protocol is designed with two tasks involving emotionality evaluation of either photograph or line-drawing faces. A pilot study is introduced for selecting face stimuli that minimize the differences in reaction times and scores assigned to facial emotions between pretested patients with AS and IQ/gender-matched healthy controls. Information from the pretested patients was used to develop the scoring system used for the emotionality evaluation. Research into facial emotions and visual stimuli with different spatial frequency contents has reached discrepant findings depending on the demographic characteristics of participants and task demands [2]. The experimental protocol is intended to clarify deficits in patients with AS in processing emotional faces, compared with healthy controls, by controlling for factors unrelated to the recognition of facial emotions, such as task difficulty, IQ, and gender.
Facial emotion recognition is one of the most important brain processes engaged in social communication. A variety of mental disorders are related to problems with explicit detection of facial emotions [4-6]. A photograph of a face contains a spectrum of spatial information that can be filtered for either its high spatial frequency (HSF) or low spatial frequency (LSF) content. HSF corresponds to highly detailed parts of an image, such as the edges of a face, while LSF corresponds to coarser or less well-defined parts, such as the holistic configuration of a face [7]. Any face recognition task simultaneously induces conscious and non-conscious processes [8-12], and the non-conscious process participates in the 150-250 msec post-onset interval or even earlier [13]. In healthy controls, the non-conscious process is generally faster than the conscious process [14,15]. Several neuroimaging studies have suggested that the LSF content in a facial stimulus (or other motivationally significant stimulus) mainly activates the amygdala, pulvinar, and superior colliculus, especially with fearful faces [3,16]. These regions constitute the limbic structure involved in non-conscious perception of emotions and modulate cortical activity either directly or indirectly [1]. In contrast, conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex, which direct voluntary attention to details in the face [9,17,18].
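The HSF/LSF distinction above can be made concrete with a small sketch. This is not the authors' stimulus pipeline: real face stimuli are filtered in 2D, typically with cutoffs in cycles per face, whereas this illustration splits a single pixel row with a moving-average low-pass filter whose width is arbitrary.

```python
def lsf_hsf_split(pixels, width=3):
    """Return (lsf, hsf): lsf is a moving average of the pixel row (the
    coarse, low-frequency part) and hsf = pixels - lsf (edges and fine
    detail), so the two parts sum back to the original row exactly."""
    n = len(pixels)
    lsf = []
    for i in range(n):
        lo, hi = max(0, i - width), min(n, i + width + 1)
        window = pixels[lo:hi]
        lsf.append(sum(window) / len(window))
    hsf = [p - l for p, l in zip(pixels, lsf)]
    return lsf, hsf

row = [10, 12, 200, 210, 15, 11, 9, 8]   # one pixel row with a sharp edge
lsf, hsf = lsf_hsf_split(row)
# lsf varies smoothly; hsf concentrates its energy around the edge.
```

The same additive decomposition holds for 2D Gaussian filtering of whole face images, which is the standard way HSF and LSF stimuli are produced.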
Asperger's syndrome (AS) [19,20] is an atypical mental disturbance that affects sensory, affective, and communicative abilities without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain the social communication failure in AS [21-25]. Behavioral disorders in children with AS can be diagnosed in the first three years of life [26], a period during which voluntary (or conscious) control over behavior is not fully developed [27]. In adults with AS, these behavioral disorders can be compensated for through attention regulation [28]. Difficulty in processing details within a certain spatial frequency range may indicate a disruption at different information processing stages. So far, no study has directly addressed evoked potentials and oscillatory activity in patients with AS during facial emotion recognition involving face stimuli in specific spatial frequency ranges. It is therefore important to examine the functional trajectory in patients with AS, compared with healthy controls, during the processing of facial stimuli with different spatial frequency contents, while controlling for task demands and demographic effects such as gender and IQ.
In order to clarify the interplay between conscious and non-conscious representations of emotional faces, an EEG experimental protocol is designed for comparing brain evoked potentials and oscillatory activity between patients with AS and IQ/gender-matched healthy controls. A cohort of pilot participants was recruited prior to the EEG experiment to assist with the selection of experimental stimuli and the development of a scoring system, in order to facilitate the evaluation of performance in patients with AS. The protocol consists of two tasks involving emotionality evaluation of either photograph or line-drawing faces. The differences between the two groups can be assessed by computing event-related potentials (ERPs) and event-related spectral perturbations (ERSPs). In the next section, the details of the experimental protocol are elaborated, including the pilot study and the EEG data processing/analysis methods, followed by the main analysis results. Finally, the critical steps in the protocol and its significance with respect to existing methods are discussed, and the limitations of the protocol and its possible extension to patients with other emotional disorders are pointed out.
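The two measures compared between groups can be sketched minimally: an ERP is the time-domain average over single-trial epochs, and an ERSP expresses post-stimulus spectral power change relative to a pre-stimulus baseline, in dB. The toy numbers and the single broadband power value below are illustrative only; real ERSPs are computed per frequency band with a time-frequency decomposition (e.g., wavelets).

```python
import math

def erp(trials):
    """Average the single-trial epochs sample by sample (the ERP)."""
    n = len(trials)
    return [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]

def power_change_db(epoch, baseline_len):
    """10*log10 of mean post-stimulus power over mean baseline power,
    a crude single-number analogue of an ERSP value."""
    base, post = epoch[:baseline_len], epoch[baseline_len:]
    p_base = sum(x * x for x in base) / len(base)
    p_post = sum(x * x for x in post) / len(post)
    return 10.0 * math.log10(p_post / p_base)

# Two toy epochs: 3 baseline samples, then a stimulus-locked deflection.
trials = [[0.1, -0.1, 0.1, 2.0, 1.0],
          [0.1,  0.1, -0.1, 2.2, 0.8]]
avg = erp(trials)
db = power_change_db(avg, baseline_len=3)   # positive: power increase
```

Group differences are then tested on such measures averaged within channel regions and time windows.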
Ethics Statement: Procedures involving human participants have been approved by the human participant research ethics committee/Institutional Review Board at the Academia Sinica, Taiwan.
1. Stimuli and Experimental Program Preparation
Figure 1. Examples of emotional face stimuli. (A) Photograph faces in which the hair and ears have been masked out with the black background color, and (B) line-drawing faces derived from (A) using graphics software. The faces show neutral, happy, and angry emotions, respectively, from the top to the bottom row.
Figure 2. A screenshot of a face stimulus in the program. The size of the face is configured to fit the height of the screen, and the remaining area is filled with black.
Figure 3. A screenshot of the scoring system for emotionality evaluation. The scoring bar is designed without tick marks. The participant drags the mouse to select the score assigned to a face and presses the GO button to complete the evaluation.
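The tick-free scoring bar in Figure 3 amounts to a linear mapping from the horizontal mouse position onto the emotionality scale. The caption does not state the score range, so the [-100, 100] interval and the pixel coordinates below are hypothetical, for illustration only.

```python
def position_to_score(x_px, bar_left, bar_width, lo=-100.0, hi=100.0):
    """Clamp the mouse x-coordinate to the bar and map it linearly
    onto the [lo, hi] emotionality scale (assumed range)."""
    frac = (x_px - bar_left) / bar_width
    frac = min(1.0, max(0.0, frac))       # clicks outside the bar clamp
    return lo + frac * (hi - lo)

score = position_to_score(450, bar_left=200, bar_width=500)  # mid-bar click
```

Removing tick marks discourages participants from anchoring on round numbers, which keeps the continuous scores comparable across groups.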
2. EEG Recording Procedure
3. Processing EEG Data
Note: The software commands provided in this section are specific to EEGLAB.
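The core epoching steps (what EEGLAB's pop_epoch, pop_rmbase, and pop_eegthresh perform) can be sketched in plain Python so the logic is explicit. The window lengths and the rejection threshold below are illustrative, not the values used in the protocol.

```python
def epoch(signal, event_idx, pre, post):
    """Cut an epoch spanning pre samples before and post samples after
    an event marker."""
    return signal[event_idx - pre : event_idx + post]

def baseline_correct(ep, pre):
    """Subtract the mean of the pre-stimulus interval from the epoch."""
    base = sum(ep[:pre]) / pre
    return [x - base for x in ep]

def reject(epochs, threshold):
    """Drop epochs whose absolute amplitude exceeds the threshold,
    a simple artifact-rejection criterion."""
    return [ep for ep in epochs if max(abs(x) for x in ep) <= threshold]

# Toy single-channel recording with events at samples 3 and 7; the
# second epoch contains a 120-unit artifact spike.
signal = [5.0, 5.0, 5.0, 9.0, 7.0, 5.0, 5.0, 120.0, 5.0, 5.0]
eps = [baseline_correct(epoch(signal, i, pre=2, post=3), pre=2)
       for i in (3, 7)]
clean = reject(eps, threshold=100.0)   # the artifact epoch is dropped
```

In EEGLAB the same steps run over all channels at once and are followed by ICA-based artifact correction before ERP/ERSP computation.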
4. Statistical Analysis
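As a generic sketch of a group comparison on a region-averaged EEG measure (e.g., mean ERSP within a channel region), a two-sample permutation test is shown below. This is a common nonparametric choice for small-group ERP/ERSP data; it is not presented as the article's exact statistical procedure, and the toy values are fabricated for illustration.

```python
import random

def permutation_test(group_a, group_b, n_perm=2000, seed=0):
    """Two-tailed permutation p-value for the difference in group means:
    shuffle group labels and count how often the shuffled mean
    difference is at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        if abs(sum(a) / len(a) - sum(b) / len(b)) >= observed:
            count += 1
    return count / n_perm

# Hypothetical region-averaged values for the two groups.
controls = [1.2, 0.9, 1.1, 1.3, 1.0, 1.2]
as_group = [0.4, 0.5, 0.3, 0.6, 0.5, 0.4]
p = permutation_test(controls, as_group)   # well-separated toy groups
```

Permutation tests avoid normality assumptions, which matters with the small sample sizes typical of clinical EEG studies; multiple comparisons across regions and time windows still require correction.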
Figure 4. The channel partition. The channels are divided into eleven regions. LF: left-frontal (10 channels), MF: midline-frontal (14), RF: right-frontal (10), LT: left-temporal (13), RT: right-temporal (13), LC: left-central (9), MC: midline-central (14), RC: right-central (9), LP: left-occipital parietal (9), MP: midline-occipital parietal (12), RP: right-occipital parietal (9).
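The region bookkeeping implied by Figure 4 can be encoded directly. Individual channel labels are cap-specific and not listed in the caption, so only the stated counts appear here; note they sum to 122, so presumably the remaining channels of the 128-channel montage (e.g., ocular or reference sites) fall outside the eleven analysis regions.

```python
# Channel counts per scalp region, as stated in the Figure 4 caption.
region_sizes = {
    "LF": 10, "MF": 14, "RF": 10,   # frontal
    "LT": 13, "RT": 13,             # temporal
    "LC": 9,  "MC": 14, "RC": 9,    # central
    "LP": 9,  "MP": 12, "RP": 9,    # occipito-parietal
}

total = sum(region_sizes.values())   # channels assigned to a region: 122
```

Region-level analysis then averages each measure over the channels in a region, reducing 128 channel-wise comparisons to eleven.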
The average verbal and performance IQ scores are listed in Table 1 for the control and AS groups, along with the two groups' average reaction times and average scores assigned to the emotionality of faces. In the table, none of the group differences reaches statistical significance except for the neutral faces in the line-drawing task, where the AS group has an average score near zero (p < 0.001) [13]. Interestingly, the AS group still has slightly lo...
The literature features studies on the recognition of facial emotions in patients with autism through analysis of EEG reactions [44], and on the recognition of high- and low-spatial-frequency content in visual stimuli [43]. To the best of our knowledge, however, no existing work on brain oscillatory activity combines emotion recognition with distinct spatial frequency contents. Our protocol is a first step towards estimating the influence of emotionality (positive, neutral and negative fac...
The authors have nothing to disclose.
This research was supported by grants MOST102-2410-H-001-044 and MOST103-2410-H-001-058-MY2 to M. Liou, and RSF-14-15-00202 to A.N. Savostyanov. The support of the Russian Science Foundation (RSF) was used for the elaboration of the experimental paradigm of face recognition.
| Name | Company | Catalog Number | Comments |
| --- | --- | --- | --- |
| Synamps 2/RT 128-channel EEG/EP/ERP | Neuroscan | | |
| Quik-CapEEG 128 electrodes | Neuroscan | | |
| Gel | Quik-Gel | | |
| FASTRAK 3D digitizer | Polhemus | | |