
Summary

An EEG experimental protocol is designed to clarify the interplay between conscious and non-conscious representations of emotional faces in patients with Asperger's syndrome. The technique suggests that patients with Asperger's syndrome have deficits in the non-conscious representation of emotional faces, but show performance comparable to healthy controls in the conscious representation.

Abstract

Several neuroimaging studies have suggested that the low spatial frequency content in an emotional face mainly activates the amygdala, pulvinar, and superior colliculus especially with fearful faces1-3. These regions constitute the limbic structure in non-conscious perception of emotions and modulate cortical activity either directly or indirectly2. In contrast, the conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in faces3,4. Asperger's syndrome (AS)5,6 represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain social communication failure in patients with AS7-9. In order to clarify the interplay between conscious and non-conscious representations of emotional faces in AS, an EEG experimental protocol is designed with two tasks involving emotionality evaluation of either photograph or line-drawing faces. A pilot study is introduced for selecting face stimuli that minimize the differences in reaction times and scores assigned to facial emotions between the pretested patients with AS and IQ/gender-matched healthy controls. Information from the pretested patients was used to develop the scoring system used for the emotionality evaluation. Research into facial emotions and visual stimuli with different spatial frequency contents has reached discrepant findings depending on the demographic characteristics of participants and task demands2. The experimental protocol is intended to clarify deficits in patients with AS in processing emotional faces when compared with healthy controls by controlling for factors unrelated to recognition of facial emotions, such as task difficulty, IQ and gender.

Introduction

Facial emotion recognition is one of the most important brain processes engaged in social communication. A variety of mental disorders are related to problems with explicit detection of facial emotions4-6. A photograph of a face contains a spectrum of spatial information that can be filtered for either the high spatial frequency (HSF) or low spatial frequency (LSF) content. HSF is related to highly detailed parts of an image, such as the edges of a face, while LSF is related to coarser or less well-defined parts, such as the holistic configuration of a face7. Any face recognition task simultaneously induces conscious and non-conscious processes8-12, and the non-conscious process participates in the 150-250 msec post-stimulus interval or even earlier13. In healthy controls, the non-conscious process is generally faster than the conscious process14,15. Several neuroimaging studies have suggested that the LSF content in a facial stimulus (or motivationally significant stimulus) mainly activates the amygdala, pulvinar, and superior colliculus, especially with fearful faces3,16. These regions constitute the limbic structure in non-conscious perception of emotions and modulate cortical activity either directly or indirectly1. In contrast, conscious representation of emotions is more pronounced in the anterior cingulate, prefrontal cortex, and somatosensory cortex for directing voluntary attention to details in the face9,17,18.

Asperger's syndrome (AS)19,20 represents an atypical mental disturbance that affects sensory, affective and communicative abilities, without interfering with normal linguistic skills and intellectual ability. Several studies have found that functional deficits in the neural circuitry important for facial emotion recognition can partly explain the social communication failure in AS21-25. Behavioral disorders observed in children with AS can be diagnosed in the first three years of life26, a period during which their voluntary (or conscious) control over behaviors is not fully developed27. In adults with AS, the behavioral disorders can be compensated for through attention regulation28. Difficulty in processing details within a certain spatial frequency range may indicate a disruption in different information processing stages. So far, no study has directly addressed evoked potentials and oscillatory activity in patients with AS during facial emotion recognition involving face stimuli in specific spatial frequency ranges. It is therefore important to examine the functional trajectory in patients with AS, compared with healthy controls, while they process facial stimuli with different spatial frequency contents, controlling for task demands and demographic effects such as gender and IQ.

In order to clarify the interplay between conscious and non-conscious representations of emotional faces, an EEG experimental protocol is designed for comparing brain evoked potentials and oscillatory activity between patients with AS and IQ/gender-matched healthy controls. A cohort of pilot participants was recruited prior to the EEG experiment to assist with the selection of the experimental stimuli and the development of a scoring system, in order to facilitate an evaluation of performance in patients with AS. The protocol consists of two tasks involving emotionality evaluation of either photograph or line-drawing faces. The differences between the two groups can be assessed by computing ERPs and event-related spectral perturbations (ERSPs). In the next section, the details of the experimental protocol are elaborated, including the pilot study and the EEG data processing/analysis methods, followed by the main analysis results. Finally, the critical steps in the protocol and its significance with respect to existing methods are discussed. The limitations of the protocol and its possible extension to patients with other emotional disorders are also pointed out.


Protocol

Ethics Statement: Procedures involving human participants have been approved by the human participant research ethics committee/Institutional Review Board at the Academia Sinica, Taiwan.

1. Stimuli and Experimental Program Preparation

  1. Prepare a pool of more than 60 emotional face photographs29 categorized into three facial expressions (angry, happy, and neutral). Use graphics software to mask out the hair and ears in the photographs with a black background, as shown in Figure 1A, so that participants can concentrate on the facial features in the photographs.
    1. Open a photograph in the graphics software. Use the selection toolbox to draw an elliptical region and adjust the region size so that the ears and most hair do not fall in the ellipse.
    2. Invert the selected region. Click "delete" to remove the unwanted region of the photograph and replace it with the black background color.

Figure 1. Examples of emotional face stimuli. (A) photograph faces where the hair and ears have been masked out in the black background color, and (B) line-drawing faces that are edited from (A) by graphics software. The faces show neutral, happy, and angry emotions respectively from the top to bottom rows.

  2. Conduct a pilot study. Recruit pilot participants to select suitable stimuli from the photograph pool.
    Note: The pilot participants should not participate in the EEG experiment.
    1. Configure the stimulus presentation program beginning with the first computer screen presenting the task instruction, followed by 5 familiarization trials. Begin each trial with a fixation cross, followed by a face stimulus and an emotionality evaluation task. See Supplemental Code File for an example program.
      Note: The actual pilot trials immediately follow the familiarization trials and present face photographs selected from the pool in a random order.
      1. Create an experimental program, including the instruction screens and a central eye-fixation screen. Create the face stimulus screen as illustrated in Figure 2 by configuring the photograph size to be 18.3 x 24.4 cm (width x height) with a black background color, given a computer screen of 41 x 25.6 cm with a resolution of 1,680 x 1,050 pixels. See Supplemental Code File for an example program.
      2. Create a scoring system for emotionality evaluation in the program as illustrated in Figure 3. Place a horizontal line ranging from -100 to +100 on a continuous scale in the center of the screen, without any tick-marks except at the center and the endpoints. Prepare the program so that participants can freely evaluate the emotionality of a photograph face by dragging the scoring cursor to the left for very angry (-100) or to the right for very happy (+100), and then pressing the GO button (a minimal sketch of this mapping is given after this list).
        Note: The scoring line is designed without any tick-marks because patients with AS can easily get stuck placing the cursor between ticks during emotionality evaluation. Therefore, a continuous scale is preferred for patients.
      3. Make sure the program records a participant's behavioral results (e.g. reaction time and emotionality scores), which are used as criteria for choosing photographs from the pool (see step 1.3.1).
    2. Recruit pilot participants (5 control and 5 AS pilot participants). Diagnose clinical patients according to Gillberg30 and DSM-IV criteria26 and administer the clinically derived short form of the Wechsler Adult Intelligence Scale (WAIS-III)31. Match the controls to their AS counterparts as closely as possible on gender and on verbal/performance IQ scores.
    3. Run the experimental procedure in the pilot study for each individual participant. After the emotional face recognition task is completed, interview each AS pilot participant about a reasonable duration for the central eye-fixation and stimulus presentation periods, the difficulty of the task, the ease of using the scoring system, and the maximum number of trials over which he/she can stay concentrated; reconfigure the program for the EEG experiment accordingly (see step 1.3.2).
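To make the continuous scoring concrete, the following minimal MATLAB sketch maps the horizontal cursor position on the scoring line of step 1.2.1.2 to an emotionality score; the pixel coordinates and variable names are hypothetical and depend on the screen layout of the actual presentation program (see the Supplemental Code File for the program itself).

  % Map the horizontal cursor position on the scoring line to a continuous
  % emotionality score in [-100, +100]; no tick marks are involved.
  lineLeftPx  = 340;                    % hypothetical pixel of the left end (-100)
  lineRightPx = 1340;                   % hypothetical pixel of the right end (+100)
  mouseXPx    = 980;                    % current cursor position, e.g., from GetMouse
  frac  = (mouseXPx - lineLeftPx) / (lineRightPx - lineLeftPx);
  frac  = min(max(frac, 0), 1);         % clamp the cursor to the scoring line
  score = -100 + 200 * frac;            % continuous emotionality score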

Figure 2. A screenshot of a face stimulus in the program. The size of the face is configured to fit the height of the screen. The empty area is filled in with the black color.

Figure 3. A screenshot of the scoring system for emotionality evaluation. The scoring bar is designed to have no tick mark. The participant needs to drag the mouse to select the score assigned to a face and press the GO button to finish the task.

  3. Program for Task 1: Photograph Session.
    1. Select from the pool 30 photographs, comprising 10 each for happy, angry, and neutral facial expressions (5 male and 5 female faces for each type of expressions), that give the most comparable mean reaction times and mean emotionality scores between the 5 AS and 5 control pilot participants.
    2. Update the experimental program configurations by incorporating feedback from the pilot patients, such as the optimal central eye-fixation period (i.e., 1,000 msec), duration of stimulus presentation (i.e., 1,000 msec), inter-stimulus interval (i.e., randomly assigned between 4 and 7 sec), and scale of the scoring system (i.e., -100 to 100). Add five familiarization trials prior to the 30 experimental trials in the program.
      1. Change the number of stimuli and time intervals in an external configuration text file associated with the experimental program.
        Note: The text file can be modified to fit different experimental conditions without the intervention of software engineers; a minimal sketch of such a configuration file and how a program might read it is given after this list.
      2. Do not count the five photographs used in the familiarization trials toward the 30 selected photographs. Do not use the EEGs or the behavioral data recorded in familiarization trials in the data analysis.
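As an illustration of steps 1.3.2 and 1.3.2.1, the sketch below shows one possible layout for the external configuration text file and how a MATLAB-based program might read it and draw the randomized inter-stimulus intervals; the file name, keywords, and reading code are assumptions rather than the actual format of the Supplemental Code File.

  % config.txt (hypothetical keyword-value format):
  %   nTrials     30
  %   fixationMs  1000
  %   stimulusMs  1000
  %   isiMinSec   4
  %   isiMaxSec   7
  cfg = struct();
  fid = fopen('config.txt', 'r');
  line = fgetl(fid);
  while ischar(line)
      parts = strsplit(strtrim(line));
      if numel(parts) == 2
          cfg.(parts{1}) = str2double(parts{2});   % store each keyword-value pair
      end
      line = fgetl(fid);
  end
  fclose(fid);

  % Inter-stimulus intervals drawn uniformly at random between 4 and 7 sec
  isiSec = cfg.isiMinSec + (cfg.isiMaxSec - cfg.isiMinSec) * rand(cfg.nTrials, 1);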
  4. Program for Task 2: Line-drawing Session.
    1. Create line-drawing pictures of the 35 photographs (5 for familiarization trials, 30 for experimental trials) used in Task 1 by tracing the edges of each face. Use graphics software to modify the grey scale photographs into black-and-white line-drawings as shown in Figure 1B.
      Note: The steps below describe one possible way of making line-drawings by photograph editing; a programmatic approximation is sketched after this list.
      1. In the graphics software, adjust the brightness/contrast of the photograph so that the original grey-scale intensity of the majority of pixels falls to either black or white.
      2. Apply the "sketch effect" in the "effect" or "filter" menu of the software to the grey-scale photograph so that only the contours of the high spatial frequency content are preserved, and apply the "distress effect" to increase the dilation of the contour lines.
      3. Use any brush tool to enhance the contours and use an eraser tool to clean up unwanted parts. Make sure to keep important facial features by checking back and forth between the original photograph and its line-drawing counterpart.
    2. Make a copy of the program of Task 1 in step 1.3 to create a program for Task 2 and replace the 35 photographs in Task 1 with the corresponding line-drawings.
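As an alternative to the manual editing described in step 1.4.1, the following MATLAB sketch (assuming the Image Processing Toolbox and hypothetical file names) approximates the "sketch" and "distress" effects with Canny edge detection and dilation; the result should still be checked against the original photograph as in step 1.4.1.3.

  img = imread('face_photo.png');               % hypothetical file name
  if size(img, 3) == 3
      img = rgb2gray(img);                      % work on the grey-scale photograph
  end
  edges = edge(img, 'Canny');                   % keep the high spatial frequency contours
  thick = imdilate(edges, strel('disk', 1));    % dilate the contour lines
  lineDrawing = uint8(255 * ~thick);            % black contours on a white background
  imwrite(lineDrawing, 'face_line_drawing.png');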

2. EEG Recording Procedure

  1. Preparations
    1. Recruit 10 healthy controls and 10 patients with AS for EEG experiments based on the guidelines of the local human participant research ethics committee/Institutional Review Board.
    2. Administer the short-form of WAIS-III31 to the patients with AS individually prior to the experiments, and find the controls who match the patients as closely as possible on gender and on the verbal/performance IQ scores.
  2. EEG Recording
    1. Seat the participant in a comfortable chair in a sound insulated (dimly lit) chamber and adjust the chair position so that the computer screen is 60 cm in front of the participant. After a tutorial on the experimental procedure, have the participant fill out the consent forms along with a few questions on his/her handedness.
    2. Use an EEG cap with 132 Ag/AgCl electrodes (including 122 10-10 system EEG, and the bipolar VEOG, HEOG, EKG, EMG electrodes, along with six facial-muscle channels) to record EEGs. Connect the cap to two 64-channel amplifiers with 0.1-100 Hz analog band-pass filter to digitize raw EEGs at 1,000 Hz sampling rate.
    3. Fit the standard 128-channel EEG cap to each participant's head. Adjust the cap so that the electrode labeled "reference" is placed at the "Cz" position, which is located relative to the anterior/posterior midline landmarks (i.e., middle of the nasion to inion distance) and to the left/right landmarks (i.e., middle of the left/right tragi), according to the EEG international 10/10 system.
    4. Gently use a blunt needle to inject conductive gel into all the electrodes. Stir with the needle slowly inside the electrode to ensure good gel contact between the scalp and the electrode (i.e., to keep the impedance below 5 kΩ). Constantly check the condition of gel contact at the electrodes labeled "reference" and "ground" on the EEG cap to make sure the impedance measurement is correct.
      1. Observe the electrode impedance by viewing the electrode impedance screen supported by the EEG recording software (e.g. SCAN 4.5 in this study) that usually goes with the EEG system. On the screen, the electrodes are shown in colors, and different colors indicate the levels of impedance.
    5. Place one HEOG electrode at the canthus of one eye (positive site) and the second electrode at the canthus of the other eye (negative site); place one VEOG electrode above and the other below the left eye; place the bipolar EKG electrodes on the backs of the left and right hands; place the bipolar EMG electrodes in the area between the thumb and index finger of the right hand; and place the six facial electrodes around the eyebrows and cheeks.
    6. Record in a notebook those bad channels in which the impedance is higher than 5 kΩ, or directly save the screen showing impedance at all electrodes. Use this as future reference for discarding bad channels at the stage of EEG data processing.
    7. Record resting-state EEGs after instructing the participant to keep his/her eyes closed for 12 min. During this time, double-check the quality of the instant EEG stream shown on the screen supported by the EEG recording software.
      Note: There should be clear alpha waves distributed in the occipital channels during the eyes-closed condition compared with the eyes-open condition. If the alpha waves are too noisy (ignoring the bad channels) or distorted, return to step 2.2.4 and adjust the gel contact.
    8. Start the two experimental tasks in a counter-balanced order across participants. Record EEGs by clicking the record icon on the screen supported by the recording software.
      1. After reading the task instruction shown on the screen, have each participant perform the 5 familiarization trials, followed by the 30 task trials. Use the same procedure for both photograph and line-drawing tasks. In the task instruction, encourage participants to assign a score to emotionality of a face stimulus as quickly as possible.
      2. IMPORTANT: Check programs prepared in steps 1.3.2 and 1.4.2 for correctly sending events time-locked to the onset of central eye-fixation, face stimulus presentation, and pressing of the GO button to the recording software during emotionality evaluation. Those onset times are coded as numeric and can be checked on the screen supported by the recording software.
        Note: The participant can take a break in between the two tasks. There is no EEG recording during the break.
    9. Use a digitizer (e.g. the Polhemus FASTRAK 3D digitizer in this study) to record the 3D positions of electrodes and save it in a file (e.g. .3dd or .dat file) for co-registering EEG caps across participants in data analysis.
    10. After the EEG experiment, have the participant fill out a 35-question inventory on his/her behaviors and feelings during the EEG experiment (e.g., having negative emotions, almost falling asleep), and provide payment for participating in the experiment.
    11. Bring the participant to the washroom to clean/dry his/her hair.
    12. Clean and sterilize the EEG cap according to clinical instructions.

3. Processing EEG Data

Note: The software commands provided in this section are specific to EEGLAB. A minimal end-to-end sketch of steps 3.1-3.8 follows this section.

  1. Filter the EEG signals using a high-pass filter of 1 Hz and a low-pass filter of 50 Hz by calling the pop_eegfilt.m function32.
    Note: In countries with a 50 Hz electrical grid frequency, use a low-pass filter of 40 Hz to avoid line-noise contamination.
  2. Discard bad channels with impedance higher than 5 kΩ after checking the electrode impedance recorded in step 2.2.6. Also discard bad channels whose power spectra differ markedly from those of neighboring channels, based on visual inspection of the power spectrum features (e.g., the maximum value and the curvature) in each channel.
    1. Calculate and plot the power spectrum of the EEG signal by calling the pop_spectopo.m function32.
  3. Re-reference the EEG signals with the average of brain channels without the bad channels by calling the pop_reref.m function.
  4. Segment EEGs into stimulus-locked epochs, each of which ranges from -2.0 sec pre- to 1.5 sec post-stimulus onset. Correct for baseline (-2.0 to -1.2 sec relative to the stimulus onset) by removing the average of the baseline values from each epoch.
    1. Call the pop_epoch.m and pop_rmbase.m functions, respectively. Choose a baseline interval that precedes both the central eye-fixation period and the onset of the face stimulus.
  5. Mark bad epochs that appear to contain artifacts. Discard the bad epochs while retaining the epochs contaminated by eye blinks. The epochs with artifacts usually look noisy or have extremely high peak values (e.g., higher than 100 µV) compared with typical epochs.
    1. Call the pop_rejmenu.m function to launch a semi-automatic procedure. An interactive window will pop up so that the user can confirm the auto-selected bad epochs by visual inspection. Though a majority of epochs are contaminated by eye blinks, these epochs can be tentatively retained for later removal of the blink artifacts by independent component analysis (ICA)33 in step 3.8.
  6. After discarding bad channels and bad epochs, run ICA on the pruned EEG data using the pop_runica.m function.
  7. Among the estimated independent components (ICs), identify artifacts resulting from eye movement/blink, muscle activity, heartbeat, and line noise32.
    Note: A significantly high correlation (R² > 0.9) between IC scores of a component and those of all reference channels (VEOG, HEOG, EKG, and facial channels) indicates that this component is mainly contributed by artifacts. The estimated IC scores explained by the artifacts can be cleaned up using multiple regression analysis.
  8. Remove artifact ICs and estimate the clean EEGs which are derived by the product of the ICA mixing matrix and artifact-cleaned IC score matrix. Save the clean EEGs for further analysis.
    1. Keep the residuals of predicting artifact ICs (R² > 0.9) from the reference VEOG, HEOG, EKG and facial channels in the IC score matrix. Remove other artifact ICs by the pop_subcomp.m function. The function returns the artifact-cleaned EEGs.
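The commands below give a minimal end-to-end sketch of steps 3.1-3.8 in EEGLAB. The file names, event code, bad-channel indices, and reference-channel indices are hypothetical and must be adapted to the actual recording; unlike step 3.8.1, this simplified version removes all flagged artifact ICs outright instead of keeping their regression residuals.

  % Assumes EEGLAB is on the MATLAB path and the raw recording has been imported
  % into an EEGLAB dataset; 'face' is a hypothetical stimulus-onset event code.
  EEG = pop_loadset('filename', 'subject01_raw.set');     % hypothetical file name

  % 3.1 Band-pass filter 1-50 Hz (use a 40 Hz low-pass in 50 Hz mains countries)
  EEG = pop_eegfilt(EEG, 1, 0);                            % high-pass at 1 Hz
  EEG = pop_eegfilt(EEG, 0, 50);                           % low-pass at 50 Hz

  % 3.2 Inspect power spectra, then drop the bad channels noted in step 2.2.6
  figure; pop_spectopo(EEG, 1, [0 EEG.xmax*1000], 'EEG', 'freqrange', [1 50]);
  badChans = [17 63];                                      % hypothetical indices
  EEG = pop_select(EEG, 'nochannel', badChans);

  % 3.3 Re-reference to the average of the brain channels, excluding the
  %     reference channels (VEOG, HEOG, EKG, facial; indices hypothetical)
  refIdx = 119:126;
  EEG = pop_reref(EEG, [], 'exclude', refIdx);

  % 3.4 Stimulus-locked epochs from -2.0 to 1.5 sec, baseline -2.0 to -1.2 sec
  EEG = pop_epoch(EEG, {'face'}, [-2.0 1.5]);
  EEG = pop_rmbase(EEG, [-2000 -1200]);

  % 3.5 Semi-automatic rejection of artifact epochs (eye-blink epochs are kept);
  %     the window is interactive, so apply the rejections before continuing
  pop_rejmenu(EEG, 1);

  % 3.6 ICA on the pruned data
  EEG = pop_runica(EEG, 'icatype', 'runica', 'extended', 1);

  % 3.7 Flag ICs whose scores are strongly explained by a reference channel (R^2 > 0.9)
  nIC    = size(EEG.icaweights, 1);
  icaact = reshape(eeg_getdatact(EEG, 'component', 1:nIC), nIC, []);
  refsig = reshape(EEG.data(refIdx, :, :), numel(refIdx), []);
  r2     = corr(double(icaact)', double(refsig)').^2;      % ICs x reference channels
  artIC  = find(any(r2 > 0.9, 2));

  % 3.8 Remove the flagged artifact ICs and save the cleaned data
  EEG = pop_subcomp(EEG, artIC, 0);
  EEG = pop_saveset(EEG, 'filename', 'subject01_clean.set');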

4. Statistical Analysis

  1. Partition EEG channels into eleven homogeneous regions to reduce the number of statistical comparisons in ERP and ERSP analyses, that is, left- (10 channels), midline- (14), and right-frontal (10); left- (13) and right-temporal (13); left- (9), midline- (14) and right-central (9); left- (9), midline- (12) and right-occipital parietal (9) as shown in Figure 4. These regions are defined according to the functional anatomy of cortex34. Functional homogeneity of EEG signals in these regions has been validated in different experiments13,35,36.

Figure 4. The channel partition. The channels are divided into eleven regions. LF: left-frontal (10 channels), MF: midline-frontal (14), RF: right-frontal (10), LT: left-temporal (13), RT: right-temporal (13), LC: left-central (9), MC: midline-central (14), RC: right-central (9), LP: left-occipital parietal (9), MP: midline-occipital parietal (12), RP: right-occipital parietal (9).

  2. Load the clean EEGs in step 3.8. Compute the channel ERP by averaging signals across epochs in each channel, and regional ERP by averaging ERPs within the same region.
    Note: When EEGs are loaded using the pop_loadset.m function in EEGLAB, the signals are stored in the structure variable "EEG.data" in a channel-by-time-by-epoch array.
    1. In the Matlab command window, compute the channel ERP by averaging EEG.data across epochs for every channel (e.g., channelERP = mean(EEG.data,3)). Compute the regional ERP by averaging the channel ERPs within each region according to the partition in 4.1 (e.g., regionalERP = mean(channelERP(index,:),1), where "index" stands for the channel indices in a given region).
  3. Compute the channel ERSPs by applying a time-frequency transform (e.g. Wavelet transform) to epoch signals in each channel, and regional ERSPs by averaging channel ERSPs in the same region.
    1. Perform the time-frequency transform by calling the pop_newtimef.m function.
      Note: In this study, the "wavelet cycles" entry is set to [1, 0.5] and "baseline" is set to [-2,000, -1,200] msec. The resulting channel ERSPs will be stored in a frequency-by-time-by-channel array.
    2. In the Matlab command window, compute the regional ERSP by averaging ERSPs across channels within each region according to the partition in 4.1 (e.g., regionalERSP = mean(channelERSP(:,:,index),3), where "channelERSP" is the output from the pop_newtimef.m function, and "index" stands for the channel indices in a given region).
  4. Calculate mean values in different time intervals (e.g. 50-150, 150-250, 250-350, 350-450 msec) for regional ERPs. Calculate mean values in different time-frequency intervals (e.g. 50-150, 150-250, 250-350, 350-450, 450-800 msec in 1-7 Hz, and 200-800 msec in 8-30 Hz) for regional ERSPs (see the sketch after this list).
  5. Apply MANOVA in statistical software (e.g., IBM SPSS) to the mean values of regional ERPs and ERSPs to evaluate the main effects of task (photograph vs. line-drawing), region (eleven scalp regions), and group (AS vs. control), as well as the interaction effects among task, region, and group.
    1. In the statistical analysis, consider gender (male vs. female) as a covariate, and estimate the main and interaction effects by holding the gender effect constant.
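To make steps 4.2-4.4 concrete, the MATLAB sketch below computes regional ERPs and ERSPs and the window means that enter the MANOVA; the channel groupings in the "regions" variable are hypothetical placeholders for the eleven-region partition of step 4.1, which depends on the actual cap montage.

  EEG = pop_loadset('filename', 'subject01_clean.set');   % clean EEGs from step 3.8
  regions = {1:10, 11:24, 25:34};   % hypothetical indices; extend to all eleven regions

  % 4.2 Channel ERPs (average over epochs) and regional ERPs (average over channels)
  channelERP  = mean(EEG.data, 3);                         % channels x time
  regionalERP = cellfun(@(idx) mean(channelERP(idx, :), 1), regions, ...
                        'UniformOutput', false);

  % 4.3 Channel ERSPs via the time-frequency transform, then regional ERSPs
  nChan = EEG.nbchan;
  for ch = 1:nChan
      [ersp, ~, ~, times, freqs] = pop_newtimef(EEG, 1, ch, ...
          [EEG.xmin EEG.xmax]*1000, [1 0.5], 'baseline', [-2000 -1200], ...
          'plotersp', 'off', 'plotitc', 'off');
      if ch == 1
          channelERSP = zeros(numel(freqs), numel(times), nChan);
      end
      channelERSP(:, :, ch) = ersp;                        % frequency x time x channel
  end
  regionalERSP = cellfun(@(idx) mean(channelERSP(:, :, idx), 3), regions, ...
                         'UniformOutput', false);

  % 4.4 Mean values in the analysis windows (example: 150-250 msec ERP, 1-7 Hz ERSP)
  erpWin   = EEG.times >= 150 & EEG.times <= 250;
  erpMean  = cellfun(@(r) mean(r(erpWin)), regionalERP);
  tWin     = times >= 150 & times <= 250;
  fWin     = freqs >= 1 & freqs <= 7;
  erspMean = cellfun(@(r) mean(mean(r(fWin, tWin))), regionalERSP);
  % erpMean and erspMean (one value per region) enter the MANOVA in step 4.5.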


Results

The average verbal and performance IQ scores are listed in Table 1 for the control and AS groups along with the average reaction times and average scores assigned to emotionality of faces of the two groups. In the table, none of the group differences achieves statistical significance except for the neutral faces in the line-drawing task, where the AS group has an average score near zero (p <0.001)13. Interestingly, the AS group still has slightly lo...


Discussion

The literature features studies on recognition of facial emotions in patients with autism by analysis of EEG reactions44, and on recognition of high- and low-spatial frequency contents using visual stimuli43. To the best of our knowledge, however, there is a lack of existing work on the brain oscillatory activity that combines emotion recognition with distinct spatial frequency contents. Our protocol is a first step towards estimating the influence of emotionality (positive, neutral and negative fac...


Disclosures

The authors have nothing to disclose.

Acknowledgements

This research was supported by grants MOST102-2410-H-001-044 and MOST103-2410-H-001-058-MY2 to M. Liou, and RSF-14-15-00202 to A.N. Savostyanov. The support of the Russian Science Foundation (RSF) was used for the elaboration of the experimental paradigm of face recognition.


Materials

Name                                   Company     Catalog Number    Comments
Synamps 2/RT 128-channel EEG/EP/ERP    Neuroscan
Quik-Cap EEG 128 electrodes            Neuroscan
Gel                                    Quik-Gel
FASTRAK 3D digitizer                   Polhemus

References

  1. Tamietto, M., De Gelder, B. Neural bases of the non-conscious perception of emotional signals. Nat Rev Neurosci. 11, 697-709 (2010).
  2. Harms, M. B., Martin, A., Wallace, G. L. Facial Emotion Recognition in Autism Spectrum Disorders: A Review of Behavioral and Neuroimaging Studies. Neuropsychol Rev. 20, 290-322 (2010).
  3. Vuilleumier, P., Armony, J. L., Driver, J., Dolan, R. J. Distinct spatial frequency sensitivities for processing faces and emotional expressions. Nat Neurosci. 6, 624-631 (2003).
  4. Phan, K. L., Wager, T., Taylor, S. F., Liberzon, I. Functional neuroanatomy of emotion: A meta-analysis of emotion activation studies in PET and fMRI. Neuroimage. 16, 331-348 (2002).
  5. Kano, M., et al. Specific brain processing of facial expressions in people with alexithymia: an (H2O)-O-15-PET study. Brain. 126, 1474-1484 (2003).
  6. Williams, L. M., et al. Fronto-limbic and autonomic disjunctions to negative emotion distinguish schizophrenia subtypes. Psychiat Res-Neuroim. 155, 29-44 (2007).
  7. Goffaux, V., et al. From coarse to fine? Spatial and temporal dynamics of cortical face processing. Cereb Cortex. (2010).
  8. Balconi, M., Lucchiari, C. EEG correlates (event-related desynchronization) of emotional face elaboration: A temporal analysis. Neurosci Lett. 392, 118-123 (2006).
  9. Balconi, M., Lucchiari, C. Consciousness and emotional facial expression recognition - Subliminal/Supraliminal stimulation effect on n200 and p300 ERPs. J Psychophysiol. 21, 100-108 (2007).
  10. Balconi, M., Pozzoli, U. Face-selective processing and the effect of pleasant and unpleasant emotional expressions on ERP correlates. Int J Psychophysiol. 49, 67-74 (2003).
  11. Balconi, M., Pozzoli, U. Event-related oscillations (EROs) and event-related potentials (ERPs) comparison in facial expression recognition. J Neuropsychol. 1, 283-294 (2007).
  12. Balconi, M., Pozzoli, U. Arousal effect on emotional face comprehension Frequency band changes in different time intervals. Physiol Behav. 97, 455-462 (2009).
  13. Tseng, Y. L., Yang, H. H., Savostyanov, A. N., Chien, V. S., Liou, M. Voluntary attention in Asperger's syndrome: Brain electrical oscillation and phase-synchronization during facial emotion recognition. Res Autism Spectr Disord. 13, 32-51 (2015).
  14. Goffaux, V., Rossion, B. Faces are "spatial" - holistic face perception is supported by low spatial frequencies. J Exp Psychol Hum Percept Perform. 32, 1023 (2006).
  15. Knyazev, G. G., Bocharov, A. V., Levin, E. A., Savostyanov, A. N., Slobodskoj-Plusnin, J. Y. Anxiety and oscillatory responses to emotional facial expressions. Brain Res. 1227, 174-188 (2008).
  16. Adolphs, R. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behav Cogn Neurosci Rev. 1, 21-62 (2002).
  17. Acar, Z. A., Makeig, S. Neuroelectromagnetic Forward Head Modeling Toolbox. J Neurosci Methods. 190, 258-270 (2010).
  18. Balconi, M. Neuropsychology of facial expressions. The role of consciousness in processing emotional faces. Neuropsychol Trends. 11, 19-40 (2012).
  19. Gross, T. F. The perception of four basic emotions in human and nonhuman faces by children with autism and other developmental disabilities. J Abnorm Child Psychol. 32, 469-480 (2004).
  20. Behrmann, M., Thomas, C., Humphreys, K. Seeing it differently: visual processing in autism. Trends in cognitive sciences. 10, 258-264 (2006).
  21. Holroyd, S., Baron-Cohen, S. Brief report: How far can people with autism go in developing a theory of mind? J Autism Dev Disord. 23, 379-385 (1993).
  22. Duverger, H., Da Fonseca, D., Bailly, D., Deruelle, C. Theory of mind in Asperger syndrome. Encephale. 33, 592-597 (2007).
  23. Wallace, S., Sebastian, C., Pellicano, E., Parr, J., Bailey, A. Face processing abilities in relatives of individuals with ASD. Autism Res. 3, 345-349 (2010).
  24. Weigelt, S., Koldewyn, K., Kanwisher, N. Face identity recognition in autism spectrum disorders: a review of behavioral studies. Neurosci Biobehav Rev. 36, 1060-1084 (2012).
  25. Wilson, C., Brock, J., Palermo, R. Attention to social stimuli and facial identity recognition skills in autism spectrum disorder. J Intellect Disabil Res. 54, 1104-1115 (2010).
  26. American Psychiatric Association. The Diagnostic and Statistical Manual of Mental Disorders: DSM-5. American Psychiatric Publishing. (2013).
  27. Dahlgren, S. O., Gillberg, C. Symptoms in the first two years of life: A preliminary population study of infantile autism. European Archives of Psychiatry and Neurological Sciences. (1989).
  28. Basar-Eroglu, C., Kolev, V., Ritter, B., Aksu, F., Basar, E. EEG, auditory evoked potentials and evoked rhythmicities in three-year-old children. Int J Neurosci. 75, 239-255 (1994).
  29. Ekman, P., Friesen, W. V. Pictures of Facial Affect. Consulting Psychologists Press. (1976).
  30. Gillberg, C. Autism and Asperger's Syndrome. Cambridge University Press. 122-146 (1991).
  31. Chiang, S. K., Tam, W. C., Pan, N. C., Chang, C. C., Chen, Y. C., Pyng, L. Y., Lin, C. Y. The appropriateness of Blyler's and four subtests of the short form of the Wechsler Adult Intelligence Scale-III for chronic schizophrenia. Taiwanese J Psychiatr. 21, 26-36 (2007).
  32. Delorme, A., Makeig, S. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J Neurosci Methods. 134, 9-21 (2004).
  33. Makeig, S., Bell, A. J., Jung, T. P., Sejnowski, T. J. Independent component analysis of electroencephalographic data. Adv Neural Inf Process Syst. 8, 145-151 (1996).
  34. Başar, E. Brain Function and Oscillations: Volume I: Brain Oscillations. Principles and Approaches. Springer Science & Business Media. (2012).
  35. Tsai, A. C., et al. Recognizing syntactic errors in Chinese and English sentences: Brain electrical activity in Asperger's syndrome. Res Autism Spectr Disord. 7, 889-905 (2013).
  36. Savostyanov, A. N., et al. EEG-correlates of trait anxiety in the stop-signal paradigm. Neurosci Lett. 449, 112-116 (2009).
  37. Ashwin, C., Baron-Cohen, S., Wheelwright, S., O'Riordan, M., Bullmore, E. T. Differential activation of the amygdala and the 'social brain' during fearful face-processing in Asperger Syndrome. Neuropsychologia. 45, 2-14 (2007).
  38. Kevin, K. Y., Cheung, C., Chua, S. E., McAlonan, G. M. Can Asperger syndrome be distinguished from autism? An anatomic likelihood meta-analysis of MRI studies. J Psychiatry Neurosci. 36, 412 (2011).
  39. Piggot, J., et al. Emotional attribution in high-functioning individuals with autistic spectrum disorder: A functional imaging study. J Am Acad Child Adolesc Psychiatry. 43, 473-480 (2004).
  40. Ilyutchenok, R. Y. Emotions and conditioning mechanisms. Integr Physiol Behav Sci. 16, 194-203 (1981).
  41. Kleinhans, N. M., et al. fMRI evidence of neural abnormalities in the subcortical face processing system in ASD. Neuroimage. 54, 697-704 (2011).
  42. Toivonen, M., Rama, P. N400 during recognition of voice identity and vocal affect. Neuroreport. 20, 1245-1249 (2009).
  43. Deruelle, C., Rondan, C., Gepner, B., Tardif, C. Spatial frequency and face processing in children with autism and Asperger syndrome. J Autism Dev Disord. 34, 199-210 (2004).
  44. Bentin, S., Deouell, L. Y. Structural encoding and identification in face processing: ERP evidence for separate mechanisms. Cogn Neuropsychol. 17, 35-55 (2000).
  45. Vuilleumier, P., Pourtois, G. Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging. Neuropsychologia. 45, 174-194 (2007).
  46. Basar, E., Guntekin, B., Oniz, A. Principles of oscillatory brain dynamics and a treatise of recognition of faces and facial expressions. Prog Brain Res. 159, 43-62 (2006).
  47. Basar, E., Schmiedt-Fehr, C., Oniz, A., Basar-Eroglu, C. Brain oscillations evoked by the face of a loved person. Brain Res. 1214, 105-115 (2008).
  48. Başar, E. Brain Function and Oscillations: Volume II: Integrative Brain Function. Neurophysiology and Cognitive Processes. Springer Science & Business Media. (2012).
  49. Anokhin, A., Vogel, F. EEG alpha rhythm frequency and intelligence in normal adults. Intelligence. 23, 1-14 (1996).
  50. Klimesch, W. EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis. Brain Res Rev. 29, 169-195 (1999).
  51. Knyazev, G. G., Slobodskoj-Plusnin, J. Y., Bocharov, A. V. Event-Related Delta and Theta Synchronization during Explicit and Implicit Emotion Processing. Neuroscience. 164, 1588-1600 (2009).
  52. Klimesch, W., Sauseng, P., Hanslmayr, S. EEG alpha oscillations: The inhibition-timing hypothesis. Brain Res Rev. 53, 63-88 (2007).
  53. Knyazev, G. G., Slobodskoj-Plusnin, J. Y. Behavioural approach system as a moderator of emotional arousal elicited by reward and punishment cues. Pers Individ Dif. 42, 49-59 (2007).
  54. Balconi, M., Brambilla, E., Falbo, L. Appetitive vs. defensive responses to emotional cues. Autonomic measures and brain oscillation modulation. Brain Res. 1296, 72-74 (2009).
  55. Dakin, S., Frith, U. Vagaries of visual perception in autism. Neuron. 48, 497-507 (2005).
  56. Curby, K. M., Schyns, P. G., Gosselin, F., Gauthier, I. Face-selective fusiform activation in Asperger's Syndrome: A matter of tuning to the right (spatial) frequency. Poster presented at Cogn Neurosci, New York. (2003).
  57. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (4th ed.). (1994).
  58. Dougherty, D. M., Bjork, J. M., Moeller, F. G., Swann, A. C. The influence of menstrual-cycle phase on the relationship between testosterone and aggression. Physiol Behav. 62, 431-435 (1997).
  59. Van Goozen, S. H., Wiegant, V. M., Endert, E., Helmond, F. A., Van de Poll, N. E. Psychoendocrinological assessment of the menstrual cycle: the relationship between hormones, sexuality, and mood. Arch Sex Behav. 26, 359-382 (1997).
  60. Winward, J. L., Bekman, N. M., Hanson, K. L., Lejuez, C. W., Brown, S. A. Changes in emotional reactivity and distress tolerance among heavy drinking adolescents during sustained abstinence. Alcohol Clin Exp Res. 38, 1761-1769 (2014).
