

Eye movements during emotion recognition in faces

M. W. Schurgin, Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA

J. Nelson, Loyola University Chicago, Chicago, IL, USA

S. Iida, Nagoya University, Chikusa-ku, Nagoya, Japan

H. Ohira, Nagoya University, Chikusa-ku, Nagoya, Japan

J. Y. Chiao, Northwestern University, Evanston, IL, USA

S. L. Franconeri, Northwestern University, Evanston, IL, USA

When distinguishing whether a face displays a certain emotion, some regions of the face may contain more useful information than others. Here we ask whether people differentially attend to distinct regions of a face when judging different emotions. Experiment 1 measured eye movements while participants discriminated between emotional (joy, anger, fear, sadness, shame, and disgust) and neutral facial expressions. Participant eye movements primarily fell in five distinct regions (eyes, upper nose, lower nose, upper lip, nasion). Distinct fixation patterns emerged for each emotion, such as a focus on the lips for joyful faces and a focus on the eyes for sad faces. These patterns were strongest for emotional faces but were still present when viewers sought evidence of emotion within neutral faces, indicating a goal-driven influence on eye-gaze patterns. Experiment 2 verified that these fixation patterns tended to reflect attention to the most diagnostic regions of the face for each emotion. Eye movements appear to follow both stimulus-driven and goal-driven perceptual strategies when decoding emotional information from a face.

Introduction

Humans are remarkable in their ability to rapidly and efficiently decode emotional expressions, reflecting their importance for successful social interaction (Schyns, Petro, & Smith, 2009; Smith, Cottrell, Gosselin, & Schyns, 2005). Deficits in the ability to accurately discern and respond to the emotional state of others are associated with a range of socioemotional disorders, from autism to psychopathy (Baron-Cohen & Wheelwright, 2004; Blair, 2005; Marsh, Kozak, & Ambady, 2007; Sasson et al., 2007). Across cultures, people recognize at least six primary expressions of emotion from the face, including joy, sadness, fear, anger, disgust, and surprise (Ekman & Friesen, 1975; but see Jack, Blais, Scheepers, Schyns, & Caldara, 2009, for evidence of diversity in these abilities), as well as facial expressions of self-conscious emotions, such as shame or embarrassment (Hejmadi, Davidson, & Rozin, 2000; Keltner & Buswell, 1997) and pride (Tracy & Robins, 2004).

Facial expressions of basic emotion are produced with characteristic configurations of facial muscle movements that provide the perceptual basis for discriminating between distinct types of emotional expressions (Ekman & Friesen, 1978). For instance, the facial expression of fear consists of a widening of the eyes and flexing of mouth muscles, whereas the facial expression of joy consists of a restriction of the eyes and an alternative flexing of mouth muscles (Ekman & Friesen, 1978). Emotional expressions may have originated as functional adaptations that benefit the expresser and only became communicative as a secondary function through heredity and continued practice (Darwin, 1872). For example, when subjects posed fear expressions, patterns of eye movements and

Citation: Schurgin, M. W., Nelson, J., Iida, S., Ohira, H., Chiao, J. Y., & Franconeri, S. L. (2014). Eye movements during emotion recognition in faces. Journal of Vision, 14(13):14, 1–16, http://www.journalofvision.org/content/14/13/14, doi:10.1167/14.13.14.

Journal of Vision (2014) 14(13):14, 1–16. http://www.journalofvision.org/content/14/13/14

doi: 10.1167/14.13.14. ISSN 1534-7362. © 2014 ARVO. Received August 2, 2013; published November 18, 2014.


nasal volume suggest a perceptual enhancement of one's environment, whereas the opposite pattern was observed for disgust (Susskind et al., 2008).

Different regions of a face contain more or less information required for categorization of facial emotion (Smith et al., 2005; Spezio, Adolphs, Hurley, & Piven, 2007) as well as facial identity (Gosselin & Schyns, 2001; Zhao, Chellappa, Phillips, & Rosenfeld, 2003). Numerous studies show evidence for strategic deployment of attention to more diagnostic regions, typically indexed by eye movements, an overt reflection of attentional deployment (Kowler, Anderson, Dosher, & Blaser, 1995). For example, rhesus monkeys spend significantly more time fixating the eyes while viewing threatening faces than other faces, such as a lip smack or yawn, for which they more strongly fixate the mouth (Nahm, Perrett, Amaral, & Albright, 1997). Humans display individual differences in fixation patterns when viewing faces, and these patterns also differ within a single person across different tasks, yet they are surprisingly reliable across examples for a given participant and task (Walker-Smith, Gale, & Findlay, 1977).

While it is not always clear whether such attentional strategies actually improve performance, there are some cases in which selecting specific regions of the face appears necessary for successful emotion recognition. For example, patients with bilateral amygdala damage are relatively inaccurate at recognizing fear from the face compared to healthy controls (Adolphs, Tranel, Damasio, & Damasio, 1994), and a large contributor to this deficit may be a lack of attention to the eye region of the face. Remarkably, when given explicit instruction to look or move their eyes toward the eye region of a facial expression, patients with amygdala damage are able to recognize fear (Adolphs et al., 2005), suggesting that the patients' core deficit is not in fear recognition per se but rather in selectively looking at the regions of the face most diagnostic for successful fear recognition. As a potentially similar example, autistic individuals typically look longer at nondiagnostic relative to diagnostic (e.g., eyes, nose, mouth) regions of the face during emotion recognition, likely contributing to emotion recognition deficits (Pelphrey et al., 2002). Other work on a typical population showed that when eye-gaze patterns were restricted, memory encoding of face identity was impaired relative to a condition in which gaze was allowed to range freely (Henderson, Williams, & Falk, 2005).

Strategic deployment of attention within a face is driven not just by low-level visual properties but also by the traits and goals of the observer. Spider phobics display slower eye-movement patterns when viewing fear-relevant stimuli relative to controls (Pflugshaupt et al., 2007). Pessimists spend more time looking at negative scenes, such as images of skin cancer, relative to optimists (Segerstrom, 2001). People high in neuroticism look longer at the eye region of fear faces relative to those low in neuroticism (Perlman et al., 2009). Easterners are more likely to fixate the eye region of the face, whereas Westerners are more likely to sample more evenly across facial regions (Jack et al., 2009). The affective context in which identical faces are embedded has also been shown to dramatically alter emotional perception (Aviezer et al., 2008). These findings reveal individual and group variation in eye movements during emotion recognition, driven by goal-driven biases in cognitive processing.

The processing demands of emotion recognition appear to trigger specific patterns of attention across a face. Here we explore how attentional deployment, as reflected by eye-movement patterns, differs during detection of six classic emotional expressions. We also test whether these patterns arise in a stimulus-driven manner (from properties of the emotional face stimulus itself) or a goal-driven manner (from perceptual strategies that would occur even on a neutral face).

Experiment 1: Eye tracking of emotional judgments

We recorded eye movements as participants discriminated neutral from emotional expressions of particular emotions. We blocked these decisions by emotion type so that we could examine fixation patterns on emotional faces (reflecting both stimulus-driven and goal-driven patterns) and neutral faces (reflecting only goal-driven patterns).

Methods

Participants

Fifty-one college-aged participants (23 male, 28 female) completed Experiment 1. All participants gave consent and received course credit for their participation. Results from an additional 16 participants were not analyzed due to unreliable eye recordings, typically caused by interference from eyeglasses, contact lenses, or mascara.

Stimuli

Stimuli for the current experiment consisted of 228 grayscale photographs taken from the Montreal Set of Facial Displays of Emotion (Beaupré & Hess, 2005). This facial photo set comprised 12 unique facial identities, each posing in one of six emotional expressions as well as neutral. These identities consisted


of an equal number of African Americans, Asians, and Caucasians, distributed evenly across gender (six male, six female), to maximize the generalizability of our findings across these variables, which can strongly affect face recognition performance (Elfenbein & Ambady, 2002; Jack et al., 2009; O'Toole, Deffenbacher, Valentin, & Abdi, 1994; Wright & Sladden, 2003).

For each of these identities, we created six linear interpolation morphs (one per emotion) between the neutral face and each emotional expression (Figure 1A) at four levels: 0% emotional (only the neutral face, with no contribution from the emotional face) and 20%, 40%, and 60% emotional intensity (contribution from the emotional face). This manipulation ensured that the judgment task would be difficult, maximizing the importance of selecting the most diagnostic regions of the face. All photographs were standardized for size (800 × 600 pixels) and background color. The majority of the face portion of the image subtended an average of 8° in width and 10° in height. A more conservative measure of the boundaries of a "face" (i.e., a vertical measure that includes the entire forehead up to the hairline) results in an image 9.8° in width and 12.2° in height. Both values roughly conform to the visual angle of a face in normal human interactions (see also Henderson et al., 2005; Hsiao & Cottrell, 2008).
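
The morphing procedure described above is a standard pixel-wise linear interpolation. As a minimal sketch (not the authors' stimulus-generation code, and assuming pre-aligned grayscale images stored as float arrays):

```python
import numpy as np

def morph_faces(neutral, emotional, intensity):
    """Linearly interpolate between a neutral and an emotional face image.

    intensity: 0.0 (fully neutral) to 1.0 (fully emotional), e.g. 0.4
    for a 40%-intensity stimulus. Both inputs must be grayscale float
    arrays of identical shape (aligned faces).
    """
    return (1.0 - intensity) * neutral + intensity * emotional

# Toy 2x2 "images": at 40% intensity each pixel is a 60/40 weighted
# average of the neutral and emotional pixel values.
neutral = np.array([[100.0, 100.0], [100.0, 100.0]])
emotional = np.array([[200.0, 200.0], [200.0, 200.0]])
print(morph_faces(neutral, emotional, 0.4))  # every pixel -> 140.0
```

In practice, stimulus morphing software also warps facial geometry between the endpoint images before blending pixel values; the sketch shows only the intensity-blending step.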

Apparatus

The experiment took place in a dimly lit room. Stimuli were presented on a 17-in. CRT monitor (800 × 600 pixel resolution, 85-Hz refresh rate, 25.5 pixels/° of visual angle) located approximately 63 cm from participants' eyes and were generated using SR Research Experiment Builder on Windows XP. Viewing distance was maintained throughout the experiment. Eye movements were recorded at a sampling rate of 1000 Hz (pupil-only mode) with an EyeLink 1000 eye tracker (SR Research; 0.15° resolution) using a desktop camera-mount illuminator. Normal sensitivity settings were used, setting the saccade threshold at either 30°/s for velocity or 8000°/s² for acceleration. A fixation was defined as any period that is not a blink or saccade. Responses were collected using a USB Sidewinder gamepad.
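
The velocity/acceleration thresholding used for saccade detection can be illustrated with a simplified sketch. This is not the EyeLink's proprietary parser; it is a minimal sample-wise classifier assuming gaze positions already converted to degrees of visual angle:

```python
import numpy as np

def label_saccades(x, y, fs=1000.0, vel_thresh=30.0, acc_thresh=8000.0):
    """Label each gaze sample as saccade (True) or non-saccade (False).

    x, y: gaze position in degrees of visual angle, sampled at fs Hz.
    A sample counts as a saccade if its velocity exceeds vel_thresh
    (deg/s) OR its acceleration exceeds acc_thresh (deg/s^2),
    mirroring the either/or threshold scheme described in the text.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    vx = np.gradient(x) * fs                 # horizontal velocity, deg/s
    vy = np.gradient(y) * fs                 # vertical velocity, deg/s
    speed = np.hypot(vx, vy)                 # combined speed, deg/s
    accel = np.abs(np.gradient(speed)) * fs  # |acceleration|, deg/s^2
    return (speed > vel_thresh) | (accel > acc_thresh)

# A stationary eye (constant position) yields no saccade samples.
still = np.zeros(100)
print(label_saccades(still, still).any())  # -> False
```

A real parser would additionally merge adjacent saccade samples into events and exclude blink periods; fixations are then the remaining intervals, matching the paper's definition of a fixation as any period that is not a blink or saccade.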

Procedure

This experiment consisted of a total of 144 trials (Figure 1B), presented as six blocks (one block per emotion type) in one of six possible pseudorandom, counterbalanced orders. Each block consisted of 24 trials: 12 neutral and 12 emotional face trials (equally divided into 20%, 40%, or 60% intensity), presented in randomized order. The 12 facial identities could not be fully crossed within each block or participant but were counterbalanced such that they were equally frequently associated with each emotional intensity level across participants. The experiment began with an eye-tracking calibration screen consisting of nine dots spread around the screen. This calibration was repeated as needed throughout the experiment.

At the beginning of each trial, participants fixated a central cross on a gray screen. Participants started the

Figure 1. Illustration of face stimuli (A) and experimental paradigm (B). Participants were presented with blocks of trials containing half neutral faces (0% intensity) and half emotional faces (varying from 20%–60% intensity) and were asked to judge whether each face had any amount of a particular emotion present (e.g., fear).


trial by pressing a button on the gamepad while fixating within a required 2° radius of screen center. A photo would then appear in the center of the screen for 3 s, followed by a blank gray screen. Participants then judged whether the photo depicted a neutral face or a face with even a slight amount of emotion. They indicated their responses by pressing one of two buttons on the gamepad.

Eye-tracking analysis

For each facial stimulus, 21 face regions were defined according to the template shown in Figure 2. All analyses used percentage of total fixation time over these regions, and Figure 2 also shows overall fixation rates for each region across all conditions. Substituting the number of fixations for total fixation duration did not change the patterns of results reported below; the number of fixations and total fixation duration were highly correlated. All percentages omit fixations that began before the presentation of the image and fixations that fell beyond the bounds of the face image. For analyses examining the time course of fixation rates, the dependent measure is the percentage of all fixations within a given region within a given time window. This measure was not normalized relative to the size of each region because there appears to be little relationship between region size and fixation rate (see Figure 2). Fixation rates are not directly compared across regions because these rates are not independent

of each other. Instead, for each region, fixation rates are compared across different conditions and time windows. The distribution of fixation time across regions yielded five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) that together accounted for 88.03% of all fixations. These high fixation frequencies were not a result of these areas being physically larger. Some of the largest defined areas within the face showed extremely low fixation rates, including the right and left cheek (2.6%, 2.2%), the forehead (0.9%), the hair (0.2%), and the largest region, the background (0.23%). Hence, we restricted our subsequent statistical analyses on eye-movement data to these five main facial regions.
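
The dependent measure above (percentage of total fixation time per region) reduces to a simple aggregation. A minimal sketch, assuming fixations have already been assigned to named regions and invalid fixations (pre-onset or off-face) excluded upstream:

```python
from collections import defaultdict

def fixation_time_percentages(fixations, valid_regions):
    """Percentage of total fixation time spent in each face region.

    fixations: iterable of (region_name, duration_ms) pairs for one
    condition; valid_regions: the region names to include. Returns a
    dict mapping region -> percentage of the summed fixation time.
    """
    totals = defaultdict(float)
    for region, duration in fixations:
        if region in valid_regions:
            totals[region] += duration
    grand_total = sum(totals.values())
    return {r: 100.0 * t / grand_total for r, t in totals.items()}

# Hypothetical trial: 400 ms total on the eyes, 100 ms on the upper nose.
fixes = [("eyes", 300), ("upper_nose", 100), ("eyes", 100)]
print(fixation_time_percentages(fixes, {"eyes", "upper_nose"}))
# -> {'eyes': 80.0, 'upper_nose': 20.0}
```

Swapping duration for a constant 1 per fixation gives the fixation-count variant, which the authors report produced the same pattern of results.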

Results

Behavioral results: Emotional intensity ratings

We first examined the percentage of trials that participants rated as "emotional" as a function of emotion type and intensity (see Figure 3). A one-way ANOVA with the six emotions as a factor revealed a significant effect of emotion type on the emotional judgment task, F(5, 245) = 10, p < 0.001. This effect was driven by higher overall emotional ratings during blocks containing sad faces (M = 66.2%) relative to all other emotion blocks, all ts(49) > 4.2, all ps < 0.001. Across all trials, joy (M = 62.2%) and disgust (M = 62.3%) were rated slightly lower in emotional intensity

Figure 2. Illustration of 21 facial regions of interest (ROIs). We identified five main facial ROIs: eyes (green), upper nose (blue), lower nose (orange), upper lip (red), and nasion (purple). Together these accounted for more than 88% of all fixations.


compared to sadness, both ts(49) > 4.6, ps < 0.001, and higher compared to fear and shame, both ts(49) > 3.8, both ps < 0.001. Shame (M = 58.5%) and anger (M = 58.9%) were slightly lower but still higher than fear (M = 54.8%), both ts(49) > 3.2, both ps < 0.002.

As a measure of the effect of the emotional intensity manipulation on ratings, we compared the slope of the function relating emotion intensity in the face image (0%, 20%, 40%, 60%) to ratings. We omitted 60% emotion from this slope because ratings overall were above 90% and produced little slope variance. Among emotion blocks, there were differences in the slope from zero to 40% emotion, F(5, 245) = 17.7, p < 0.001. Joy (M = 77% difference in rating value) and disgust (M = 74% difference) were higher than fear (M = 62% difference), ts(49) > 2.4, ps < 0.021. Anger (M = 47% difference), sadness (M = 45%), and shame (M = 43% difference) were all lower than fear, all ts(49) > 2.47, ps < 0.01.

Behavioral results: Emotional judgment accuracy

We then evaluated whether these emotional judgments were "correct" according to our definition that any face containing 0% intensity of the emotional face was "neutral" and any face containing 20%–60% intensity of the emotional face should be judged as "emotional." This definition of "correct" is as arbitrary as our threshold for emotional content. The goal of the task was to engage the participant in recognizing a particular emotion, and our analysis of performance is only intended as confirmation that participants performed the task and that performance was at neither floor nor ceiling level.

Our analysis confirmed these assumptions. Participants' overall accuracy ranged from 66.0% to 86.8%, with a mean of 75.5%. Emotion judgment accuracy differed across the six emotion block conditions, F(5, 245) = 9.6, p < 0.001. Participants recognized joy (M = 82.7% correct) and disgust (M = 79.5%) with higher accuracy than all others, ts(49) > 2.3, all ps < 0.03. Participants showed lower accuracy when recognizing fear (M = 76.2%) and anger (M = 73.5%). However, participants were more accurate in recognizing fear compared to sadness (M = 70.7%) and shame (M = 70.4%), both ts(49) > 2.6, both ps < 0.013. These accuracy rates suggest that although there were differences among emotion blocks, overall performance was roughly similar and far from floor or ceiling levels, confirming that participants performed the judgment task as designed.

Gaze patterns across emotions

The following analyses included only the first four fixations of each trial, with the "first" fixation defined as the first new landing of the eye after the appearance of the image. As will be detailed later, participants made an average of 8.3 fixations per trial, and we found the first few fixations to be the most diagnostic in terms of capturing differences across emotions. Although some of our later analyses suggest that only the first two fixations are most critical for emotion recognition, here we conservatively analyze the first four and examine the changes in pattern across this initial range in a later section. For the following analyses, isolating the data to the first four fixations produces a pattern of results that is qualitatively similar to using more (e.g., eight) fixations. All analyses were performed irrespective of whether the response on a trial was correct or incorrect.

Figure 3. Percentage of trials rated as "emotional" as a function of emotion type and intensity. These data demonstrate the varying difficulty of the task, which was meant to encourage selection of the most diagnostic regions of the face. Across all emotions, performance did not rise above chance until 40% intensity and was near ceiling for 60% intensity images.


Emotional faces

For each region, fixation rates on emotional faces (those with 20%, 40%, or 60% emotional intensity) were submitted to a one-way ANOVA with emotion block as a factor. In these trials, eye movements can be affected both by the emotional content of the face images (stimulus-driven information) and by the fact that the observer is currently seeking that emotion type (goal-driven information). Figure 4A shows fixation rates for each face region across the six emotions. We summarize the major findings at the end of this section, with additional information and statistical analyses detailed in the supplemental materials.

For the eyes, there was a main effect of face emotion, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the eye region in facial expressions of anger (M = 35.2%), fear (M = 30.8%), sadness (M = 34.2%), and shame (M = 34.3%) and less within the eye region for disgust (M = 19.7%) and joy (M =

Figure 4. (A) For each emotional face, fixation time spent within each main ROI. (B) For each neutral face, fixation time spent within each main ROI. *Designates t test p < 0.05 relative to the mean for each emotion.


19.5%) faces relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy (M = 20.9%) faces and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.001.

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, which is required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white and areas of lower difference with black.
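
The image-subtraction visualization described above amounts to an absolute pixel-wise difference, rescaled so the largest change maps to white. A minimal sketch (the authors' exact rendering pipeline is not specified; this assumes grayscale arrays for the same, well-aligned face identity):

```python
import numpy as np

def diagnostic_difference(emotional_img, neutral_img):
    """Absolute pixel-wise difference between an emotional image and its
    neutral counterpart, rescaled to 0-255 so that white (255) marks the
    regions that change most with the expression and black (0) marks
    regions that do not change."""
    diff = np.abs(emotional_img.astype(float) - neutral_img.astype(float))
    if diff.max() > 0:
        diff = 255.0 * diff / diff.max()
    return diff

# Toy example: only the lower-right "mouth" pixel changes between the
# two images, so it maps to white (255) while the rest stays black (0).
neutral = np.array([[10.0, 10.0], [10.0, 10.0]])
emotional = np.array([[10.0, 10.0], [10.0, 90.0]])
print(diagnostic_difference(emotional, neutral))
```

This also makes clear why low head movement across the two photographs matters: any misalignment would produce spurious bright edges unrelated to the expression.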

The middle row of the figure graphically illustrates fixation rates for each of the top five regions across our

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging the presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).


six emotion conditions, relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis likewise revealed no effect of emotion condition across neutral faces, F < 1. For the upper lip, there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.001, due to relatively high fixation rates for the joy (M = 14%) condition (disgust was no longer significant) and lower fixation rates for the sadness (M = 7.9%) condition (anger was no longer significant) relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to differ from the mean. Fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found in the emotional faces were still present in the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces; this effect may be due to the changing baseline of the mean rate of nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations to the eye region increased, except for anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease and instead the eyes are fixated more frequently.

Journal of Vision (2014) 14(13):14, 1–16. Schurgin et al.

Downloaded from http://jov.arvojournals.org on 07/20/2015

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or whether fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval as well as for each fixation number. Although the patterns were highly similar, the fixation number analysis appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between one and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as "for all of the Nth fixations on an image in the current condition, the percentage that are within a given region." This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
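The per-fixation-number normalization described above can be sketched as follows; the input record format here is hypothetical, chosen only to illustrate why normalizing within each fixation number removes the drop-off in available fixations.

```python
from collections import Counter, defaultdict

def fixation_rates(fixations):
    """Percentage of Nth fixations landing in each region.

    `fixations` is an iterable of (fixation_number, region) pairs pooled
    over all trials in a condition (a hypothetical input format). Each
    Nth fixation is normalized by the count of Nth fixations available,
    not by the total number of fixations, so later fixation numbers are
    not penalized for occurring on fewer trials.
    """
    counts = defaultdict(Counter)
    for n, region in fixations:
        counts[n][region] += 1
    return {
        n: {r: 100.0 * c / sum(regs.values()) for r, c in regs.items()}
        for n, regs in counts.items()
    }

data = [(1, "upper nose"), (1, "eyes"), (1, "eyes"), (2, "eyes")]
print(fixation_rates(data))
# → {1: {'upper nose': 33.33…, 'eyes': 66.66…}, 2: {'eyes': 100.0}}
```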

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first through fourth fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations made to a particular emotional (20%–60%) face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions, and at increasing rates across time in the fear and shame conditions. The upper nose was a

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at that region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.


consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Paré, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). An ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.0001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%), in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
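The left/right coding step can be sketched as follows; the record format and region labels are hypothetical, and the point is only that midline regions are dropped before computing the leftward proportion per fixation number.

```python
def leftward_bias(fixations, center_regions=("UN", "LN", "UL", "LL")):
    """Percentage of left-side fixations per fixation number.

    `fixations` is a list of (fixation_number, region, side) triples,
    where side is "L" or "R" (a hypothetical format); fixations on the
    midline regions (upper nose, lower nose, upper lip, lower lip) are
    excluded, as in the analysis described above.
    """
    tallies = {}
    for n, region, side in fixations:
        if region in center_regions:
            continue  # midline regions carry no left/right information
        left, total = tallies.get(n, (0, 0))
        tallies[n] = (left + (side == "L"), total + 1)
    return {n: 100.0 * left / total for n, (left, total) in tallies.items()}

data = [(1, "eyes", "L"), (1, "eyes", "R"), (1, "UN", "L"), (2, "cheek", "R")]
print(leftward_bias(data))
# → {1: 50.0, 2: 0.0}
```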

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of MATLAB), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit–test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional image–only trials, and (c) neutral trials.
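This analysis can be approximated in Python as a sketch (the paper used MATLAB's NaiveBayes class; here sklearn's CategoricalNB stands in, and the trial data are random placeholders, so accuracy will hover near chance rather than reproduce the reported values):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB

rng = np.random.default_rng(0)
N_REGIONS, N_EMOTIONS, N_FIX = 5, 6, 2  # face regions, emotion labels, fixations used

# Placeholder data: each trial is the sequence of its first N_FIX fixated
# regions (coded 0..4); the label is the emotion sought on that trial.
X = rng.integers(0, N_REGIONS, size=(600, N_FIX))
y = rng.integers(0, N_EMOTIONS, size=600)

accuracies = []
for _ in range(100):  # 100 independent fit-test splits, as in the text
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1 / 3)
    model = CategoricalNB(min_categories=N_REGIONS).fit(X_tr, y_tr)
    accuracies.append(model.score(X_te, y_te))

print(f"mean accuracy {np.mean(accuracies):.3f} vs. chance {1 / N_EMOTIONS:.3f}")
```

With real fixation sequences in `X`, above-chance mean accuracy would indicate that fixation patterns carry information about the sought emotion.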

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest for the emotional image–only trial set (with a peak of around 25% accuracy), consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.


these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
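The "folding" used to summarize mutual confusability (see Figure 8) amounts to adding the confusion matrix to its transpose, so that a joy-predicted-as-disgust trial and a disgust-predicted-as-joy trial land in the same cell. A sketch with illustrative labels and counts, not the paper's data:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

emotions = ["joy", "sadness", "fear", "anger", "disgust", "shame"]

# Illustrative (made-up) classifier outputs: indices into `emotions`.
y_true = [0, 0, 4, 4, 3, 3, 1, 1, 2, 5]
y_pred = [0, 4, 4, 0, 1, 3, 3, 1, 2, 5]

cm = confusion_matrix(y_true, y_pred, labels=range(len(emotions)))

# Fold across the identity diagonal: off-diagonal cells now count
# confusions between a pair regardless of direction.
folded = cm + cm.T
np.fill_diagonal(folded, np.diag(cm))  # keep correct classifications as-is

# joy/disgust confusions pooled in either direction:
print(folded[0, 4])
```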

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.

Procedure

This experiment consisted of 72 trials, presented as six blocks (one block per emotion type) shown in random order and repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials, presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a face presented briefly for 100 ms. For half of the trials, a black box covered the eyes; for the other half, it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).
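The comparisons above are paired t-tests over the ten participants (df = 9). A sketch with made-up per-participant accuracies, not the actual data:

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant accuracy (%) on joy trials, one value per
# participant in each occlusion condition (illustrative numbers only).
mouth_visible = np.array([70, 60, 65, 55, 75, 60, 65, 58, 62, 63])
mouth_covered = np.array([55, 50, 52, 48, 60, 50, 55, 49, 48, 50])

# Paired comparison across the same ten participants.
t, p = stats.ttest_rel(mouth_visible, mouth_covered)
print(f"t(9) = {t:.2f}, p = {p:.4f}")
```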

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1: The eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Paré, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition in faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotion recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent as the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but it was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and was fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotion judgment task. Consistent with the predictions of Experiment 1, emotion judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust) as well as for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies, in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotion recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, Hurley, & Piven, 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotion recognition. These abnormal eye-movement patterns may contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., & Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Paré, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of the emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Paré, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Lüthi, M., … Müri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.


Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS One, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles." Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



nasal volume suggest a perceptual enhancement of one's environment, whereas the opposite pattern was observed for disgust (Susskind et al., 2008).

Different regions of a face contain more or less information required for categorization of facial emotion (Smith et al., 2005; Spezio, Adolphs, Hurley, & Piven, 2007) as well as facial identity (Gosselin & Schyns, 2001; Zhao, Chellappa, Phillips, & Rosenfeld, 2003). Numerous studies show evidence for strategic deployment of attention to more diagnostic regions, typically indexed by eye movements, an overt reflection of attentional deployment (Kowler, Anderson, Dosher, & Blaser, 1995). For example, rhesus monkeys spend significantly more time fixating the eyes while viewing threatening faces, in comparison to other expressions, such as a lip smack or yawn, for which they more strongly fixate the mouth (Nahm, Perret, Amaral, & Albright, 1997). Humans display individual differences in fixation patterns when viewing faces, and these patterns also differ within a single person across different tasks, yet they are surprisingly reliable across examples for a given participant and task (Walker-Smith, Gale, & Findlay, 1977).

While it is not always clear whether such attentional strategies actually improve performance, there are some cases in which selecting specific regions of the face appears necessary for successful emotion recognition. For example, patients with bilateral amygdala damage are relatively inaccurate at recognizing fear from the face compared to healthy controls (Adolphs, Tranel, Damasio, & Damasio, 1994), and a large contributor to this deficit may be a lack of attention to the eye region of the face. Remarkably, when given explicit instruction to look or move their eyes toward the eye region of a facial expression, patients with amygdala damage are able to recognize fear (Adolphs et al., 2005), suggesting that these patients' core deficit is not in fear recognition per se, but rather in selectively looking at the regions of the face most diagnostic for successful fear recognition. As a potentially similar example, autistic individuals typically look longer at nondiagnostic relative to diagnostic (e.g., eyes, nose, mouth) regions of the face during emotion recognition, likely contributing to emotion recognition deficits (Pelphrey et al., 2002). Other work on a typical population showed that when eye-gaze patterns were restricted, memory encoding of face identity was impaired relative to a condition in which gaze was allowed to range freely (Henderson, Williams, & Falk, 2005).

Strategic deployment of attention within a face is driven not just by low-level visual properties but also by the traits and goals of the observer. Spider phobics display slower eye-movement patterns when viewing fear-relevant stimuli, relative to controls (Pflugshaupt et al., 2007). Pessimists spend more time looking at negative scenes, such as images of skin cancer, relative to optimists (Segerstrom, 2001). People high in neuroticism look longer at the eye region of fear faces than those low in neuroticism (Perlman et al., 2009). Easterners are more likely to fixate the eye region of the face than Westerners, who sample more evenly across facial regions (Jack et al., 2009). The affective context in which identical faces are embedded has also been shown to dramatically alter emotional perception (Aviezer et al., 2008). These findings reveal individual and group variation in eye movements during emotion recognition, driven by goal-driven biases in cognitive processing.

The processing demands of emotion recognition appear to trigger specific patterns of attention across a face. Here we explore how attentional deployment, as reflected by eye-movement patterns, differs during detection of six classic emotional expressions. We also test whether these patterns operate in a stimulus-driven manner (driven by properties of the emotional face stimulus itself) versus a goal-driven manner (perceptual strategies that would occur even on a neutral face).

Experiment 1: Eye tracking of emotional judgments

We recorded eye movements as participants discriminated neutral from emotional expressions of particular emotions. We blocked these decisions by emotion type so that we could examine fixation patterns on emotional faces (reflecting both stimulus-driven and goal-driven patterns) and neutral faces (reflecting only goal-driven patterns).

Methods

Participants

Fifty-one college-aged participants (23 male, 28 female) completed Experiment 1. All participants gave consent and received course credit for their participation. Results from an additional 16 participants were not analyzed due to unreliable eye recordings, typically caused by interference from eyeglasses, contact lenses, or mascara.

Stimuli

Stimuli for the current experiment consisted of 228 grayscale photographs taken from the Montreal Set of Facial Displays of Emotion (Beaupre & Hess, 2005). This facial photo set comprised 12 unique facial identities, each posing six emotional expressions as well as a neutral expression. These identities consisted

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 2

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

of an equal number of African Americans, Asians, and Caucasians, distributed evenly across gender (six male, six female), to maximize the generalizability of our findings across these variables, which can strongly affect face recognition performance (Elfenbein & Ambady, 2002; Jack et al., 2009; O'Toole, Deffenbacher, Valentin, & Abdi, 1994; Wright & Sladden, 2003).

For each of these identities, we created linear interpolation morphs between the neutral face and each of the six emotions (Figure 1A) at four levels: 0% emotional (only the neutral face, with no contribution from the emotional face) and 20%, 40%, and 60% emotional intensity (contribution from the emotional face). This manipulation ensured that the judgment task would be difficult, maximizing the importance of selecting the most diagnostic regions of the face. All photographs were standardized for size (800 × 600 pixels) and background color. The majority of the face portion of the image subtended an average of 8° in width and 10° in height. A more conservative measure of the boundaries of a "face" (i.e., a vertical measure that includes the entire forehead up to the hairline) results in an image 9.8° in width and 12.2° in height. Both values roughly conform to the visual angle of a face in normal human interactions (see also Henderson et al., 2005; Hsiao & Cottrell, 2008).
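The morphing procedure amounts to a pixel-wise linear blend between two aligned grayscale images. A minimal sketch, assuming the photographs are already spatially aligned and represented as arrays (the uniform arrays below are hypothetical stand-ins for real photographs):

```python
import numpy as np

def morph(neutral, emotional, intensity):
    """Linear interpolation morph: `intensity` is the proportion of the
    emotional face blended into the aligned neutral face (0.0 to 1.0)."""
    return (1.0 - intensity) * neutral + intensity * emotional

# Hypothetical stand-ins for two aligned 800 x 600 grayscale photographs.
neutral = np.full((600, 800), 120.0)    # uniform mid-gray "neutral" image
emotional = np.full((600, 800), 160.0)  # brighter "emotional" image

# The four intensity levels used in the experiment: 0%, 20%, 40%, 60%.
levels = [0.0, 0.2, 0.4, 0.6]
morphs = {p: morph(neutral, emotional, p) for p in levels}
```

At 0% the output is the neutral photograph itself; each step toward 60% moves every pixel proportionally toward the emotional photograph.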

Apparatus

The experiment took place in a dimly lit room. Stimuli were presented on a 17-in. CRT monitor (resolution of 800 × 600 pixels, 85-Hz refresh rate, 25.5 pixels/° of visual angle) located approximately 63 cm from participants' eyes and were generated using SR Research Experiment Builder on Windows XP. Viewing distance was maintained throughout the experiment. Eye movements were recorded at a sampling rate of 1000 Hz (pupil-only mode) with an EyeLink 1000 eye tracker (SR Research; 0.15° resolution) using a desktop camera-mount illuminator. Normal sensitivity settings were used, setting the saccade threshold at either 30°/s for velocity or 8000°/s² for acceleration. A fixation was defined as any period that is not a blink or saccade. Responses were collected using a USB Sidewinder gamepad.
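The reported display geometry can be sanity-checked with a short conversion between screen units and visual angle. A sketch, assuming a hypothetical physical display width of about 34.5 cm (not stated in the paper): that is roughly what 25.5 pixels/° at 63 cm implies for an 800-pixel-wide display.

```python
import math

def pixels_per_degree(screen_width_cm, screen_width_px, distance_cm):
    """Approximate pixels subtended by one degree of visual angle near
    the screen center (small-angle regime)."""
    cm_per_degree = distance_cm * math.tan(math.radians(1.0))
    return screen_width_px * cm_per_degree / screen_width_cm

# Assumed physical width of 34.5 cm is a back-calculated illustration,
# not a value reported by the authors.
ppd = pixels_per_degree(screen_width_cm=34.5, screen_width_px=800,
                        distance_cm=63)
```

With these assumptions the function returns approximately the 25.5 pixels/° cited in the apparatus description.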

Procedure

This experiment consisted of a total of 144 trials (Figure 1B), presented as six blocks of trials (one block per emotion type) shown in one of six possible pseudorandom, counterbalanced orders. Each block consisted of 24 trials: 12 neutral and 12 emotional face trials (equally divided among 20%, 40%, and 60% intensity), presented in randomized order. The 12 facial identities could not be fully crossed within each block or participant but were counterbalanced such that they were equally frequently associated with each emotional intensity level across participants. The experiment began with an eye-tracking calibration screen consisting of nine dots spread around the screen. This calibration was repeated as needed throughout the experiment.
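The block structure described above can be sketched as follows. The identity-to-intensity rotation used here is a hypothetical stand-in for the actual counterbalancing scheme, which the paper describes only at the level of constraints (12 identities cannot be fully crossed with 3 intensity levels in one 24-trial block):

```python
import random

EMOTIONS = ["joy", "anger", "fear", "sadness", "shame", "disgust"]
INTENSITIES = [0.2, 0.4, 0.6]

def build_block(emotion, n_identities=12, seed=None):
    """One 24-trial block: 12 neutral trials plus 12 emotional trials
    (4 at each of the three intensity levels), shuffled together.
    Each trial is an (identity, emotion, intensity) tuple."""
    rng = random.Random(seed)
    identities = list(range(n_identities))
    neutral = [(i, emotion, 0.0) for i in identities]
    # Rotate intensity assignment across identities; across participants
    # this rotation would itself be counterbalanced.
    emotional = [(i, emotion, INTENSITIES[i % 3]) for i in identities]
    trials = neutral + emotional
    rng.shuffle(trials)
    return trials

block = build_block("fear", seed=1)
```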

At the beginning of each trial, participants fixated a central cross on a gray screen. Participants started the

Figure 1. Illustration of face stimuli (A) and experimental paradigm (B). Participants were presented with blocks of trials containing half neutral faces (0% intensity) and half emotional faces (varying from 20% to 60% intensity) and were asked to judge whether each face had any amount of a particular emotion present (e.g., fear).


trial by pressing a button on the gamepad while fixating within a required 2° radius of screen center. A photo then appeared in the center of the screen for 3 s, followed by a blank gray screen. Participants then judged whether the photo depicted a neutral face or a face with even a slight amount of emotion. They indicated their responses by pressing one of two buttons on the gamepad.

Eye-tracking analysis

For each facial stimulus, 21 face regions were defined according to the template shown in Figure 2. All analyses used percentage of total fixation time over these regions, and Figure 2 also shows overall fixation rates for each region across all conditions. Substituting the number of fixations for total fixation duration did not change the patterns of results reported below; the number of fixations and total fixation duration were highly correlated. All percentages omit fixations that began before the presentation of the image and fixations that were beyond the bounds of the face image. For analyses examining the time course of fixation rates, the dependent measure is the percentage of all fixations within a given region within a given time window. This measure was not normalized relative to the size of each region because there appears to be little relationship between region size and fixation rate (see Figure 2). Fixation rates are not directly compared across regions because these rates are not independent of each other. Instead, for each region, fixation rates are compared across different conditions and time windows. The distribution of fixation time across regions yielded five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) that together accounted for 88.03% of all fixations. These high fixation frequencies were not a result of these areas being physically larger. Some of the largest defined areas within the face showed extremely low fixation rates, including the right and left cheek (2.6%, 2.2%), the forehead (0.9%), the hair (0.2%), and the largest region, the background (0.23%). Hence we restricted our subsequent statistical analyses on eye-movement data to these five main facial regions.
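The dependent measure, percentage of total fixation time per ROI, can be sketched as a simple aggregation over (ROI, duration) pairs. The ROI labels and the single-trial data below are hypothetical:

```python
from collections import defaultdict

def fixation_time_percentages(fixations):
    """Percentage of total fixation duration spent in each ROI.
    `fixations` is a list of (roi_label, duration_ms) pairs; fixations
    starting before image onset or landing off the face are assumed
    to have been excluded already."""
    totals = defaultdict(float)
    for roi, duration_ms in fixations:
        totals[roi] += duration_ms
    grand_total = sum(totals.values())
    return {roi: 100.0 * t / grand_total for roi, t in totals.items()}

# Hypothetical single-trial fixation record.
trial = [("eyes", 300), ("upper_nose", 200), ("eyes", 300), ("upper_lip", 200)]
pct = fixation_time_percentages(trial)
```

In the actual analysis these percentages would be pooled across trials within each emotion block and compared across conditions, not across regions.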

Results

Behavioral results: Emotional intensity ratings

We first examined the percentage of trials that participants rated as "emotional" as a function of emotion type and intensity (see Figure 3). A one-way ANOVA with the six emotions as a factor revealed a significant effect of emotion type on the emotional judgment task, F(5, 245) = 10, p < 0.001. This effect was driven by higher overall emotional ratings during blocks containing sad faces (M = 66.2%) relative to all other emotion blocks, all ts(49) > 4.2, all ps < 0.001. Across all trials, joy (M = 62.2%) and disgust (M = 62.3%) were rated slightly lower in emotional intensity

Figure 2. Illustration of 21 facial regions of interest (ROIs). We identified five main facial ROIs: eyes (green), upper nose (blue), lower nose (orange), upper lip (red), and nasion (purple), which together accounted for more than 88% of all fixations.


compared to sadness, both ts(49) > 4.6, ps < 0.001, and higher compared to fear and shame, both ts(49) > 3.8, both ps < 0.001. Shame (M = 58.5%) and anger (M = 58.9%) were slightly lower but still higher than fear (M = 54.8%), both ts(49) > 3.2, both ps < 0.002.

As a measure of the effect of the emotional intensity manipulation on ratings, we compared the slope of the function relating emotion intensity in the face image (0%, 20%, 40%, 60%) to ratings. We omitted 60% emotion from this slope because ratings overall were above 90% and produced little slope variance. Among emotion blocks, there were differences in the slope from zero to 40% emotion, F(5, 245) = 17.7, p < 0.001. Joy (M = 77% difference in rating value) and disgust (M = 74% difference) were higher than fear (M = 62% difference), ts(49) > 2.4, ps < 0.021. Anger (M = 47% difference), sadness (M = 45% difference), and shame (M = 43% difference) were all lower than fear, all ts(49) > 2.47, ps < 0.01.

Behavioral results: Emotional judgment accuracy

We then evaluated whether these emotional judgments were "correct" according to our definition that any face containing 0% intensity of the emotional face was "neutral" and any face containing 20%–60% intensity of the emotional face should be judged as "emotional." This definition of "correct" is as arbitrary as our threshold for emotional content. The goal of the task was to engage the participant in recognizing a particular emotion, and our analysis of performance is only intended as confirmation that participants performed the task and that performance was at neither floor nor ceiling level.

Our analysis confirmed these assumptions. Participants' overall accuracy ranged from 66.0% to 86.8%, with a mean of 75.5%. Emotion judgment accuracy differed across the six emotion block conditions, F(5, 245) = 9.6, p < 0.001. Participants recognized joy (M = 82.7% correct) and disgust (M = 79.5%) with higher accuracy than all others, ts(49) > 2.3, all ps < 0.03. Participants showed lower accuracy when recognizing fear (M = 76.2%) and anger (M = 73.5%). However, participants were more accurate in recognizing fear compared to sadness (M = 70.7%) and shame (M = 70.4%), both ts(49) > 2.6, both ps < 0.013. These accuracy rates suggest that although there were differences among emotion blocks, overall performance was roughly similar and far from floor or ceiling levels, confirming that participants performed the judgment task as designed.

Gaze patterns across emotions

The following analyses included only the first four fixations of each trial, with the "first" fixation defined as the first new landing of the eye after the appearance of the image. As will be detailed later, participants made an average of 8.3 fixations per trial, and we found the first few fixations to be the most diagnostic in terms of capturing differences across emotions. Although some of our later analyses suggest that only the first two fixations are most critical for emotion recognition, here we conservatively analyze the first four and examine the changes in pattern across this initial range in a later section. For the following analyses, isolating the data to the first four fixations produces a pattern of results that is qualitatively similar to using more (e.g., eight) fixations. All analyses were performed irrespective of whether the response on a trial was correct or incorrect.

Figure 3. Percentage of trials rated as "emotional" as a function of emotion type and intensity. These data demonstrate the varying difficulty of the task, which was meant to encourage selection of the most diagnostic regions of the face. Across all emotions, performance did not reach above chance until 40% intensity and was near ceiling for 60% intensity images.


Emotional faces

For each region, fixation rates on emotional faces (those with 20%, 40%, or 60% emotional intensity) were submitted to a one-way ANOVA with emotion block as a factor. In these trials, eye movements can be affected by both the emotional content of the face images (stimulus-driven information) and the fact that the observer is currently seeking that emotion type (goal-driven information). Figure 4A shows fixation rates for each face region across the six emotions. We summarize the major findings at the end of this section, with additional information and statistical analyses detailed in the supplemental materials.

For the eyes, there was a main effect of face emotion, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the eye region in facial expressions of anger (M = 35.2%), fear (M = 30.8%), sadness (M = 34.2%), and shame (M = 34.3%), and looked less within the eye region for disgust (M = 19.7%) and joy (M =

Figure 4. (A) For each emotional face, fixation time spent within each main ROI. (B) For each neutral face, fixation time spent within each main ROI. An asterisk designates a t test at p < 0.05 relative to the mean for each emotion.


19.5%) faces, relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy (M = 20.9%) faces, and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces, relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.001.

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, which is required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white and lower difference with black, respectively.
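The top-row subtraction images can be approximated by a normalized absolute pixel-wise difference between the aligned emotional and neutral photographs. A toy sketch with synthetic arrays standing in for real images:

```python
import numpy as np

def diagnostic_difference(emotional, neutral):
    """Absolute pixel-wise difference between an aligned emotional image
    and its neutral counterpart, normalized to [0, 1]; larger values
    (rendered white) mark regions that change most with the emotion."""
    diff = np.abs(emotional.astype(float) - neutral.astype(float))
    peak = diff.max()
    return diff / peak if peak > 0 else diff

# Toy 4 x 4 "images": only two pixels differ between the two versions.
neutral = np.zeros((4, 4))
emotional = np.zeros((4, 4))
emotional[1, 1] = 50.0   # hypothetical change, e.g., a raised lip corner
emotional[2, 2] = 100.0  # hypothetical change, e.g., a nose wrinkle
dmap = diagnostic_difference(emotional, neutral)
```

The technique only works when the head pose is nearly identical across the two photographs, which is why the authors note that a low-movement identity was required.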

The middle row of the figure graphically illustratesfixation rates for each of the top five regions across our

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging the presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).


six emotion conditions, relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "Bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of the emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36.0%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces, relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis likewise revealed no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.001, due to relatively high fixation rates for the joy condition (M = 14.0%; disgust was no longer significant) and lower fixation rates for the sadness condition (M = 7.9%; anger was no longer significant), relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to differ from the mean. Fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found for the emotional faces were still present for the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces, but this effect may be due to the changing baseline of the mean rate of nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations to the eye region increased, except for anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. For shameful expressions at 60% intensity, there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that


appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease and the eyes are instead fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or whether fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval and the same probability for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, in the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between 1 and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as: of all of the Nth fixations on an image in the current condition, the percentage that fall within a given region. This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
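This definition of fixation rate can be sketched directly: pool the Nth fixation of every trial that has at least N fixations, then compute the share landing in the region of interest. The trial sequences below are hypothetical:

```python
from collections import Counter

def fixation_rate(trials, n, region):
    """Of all Nth fixations across trials (1-indexed), the percentage
    that landed in `region`. Trials with fewer than n fixations are
    skipped, which is what removes the drop-off in available
    fixations at higher n."""
    nth = [t[n - 1] for t in trials if len(t) >= n]
    counts = Counter(nth)
    return 100.0 * counts[region] / len(nth)

# Hypothetical trials: each is the ordered ROI sequence of one trial.
trials = [
    ["upper_nose", "eyes", "upper_lip"],
    ["upper_nose", "eyes"],
    ["eyes", "upper_lip", "eyes", "nasion"],
    ["upper_nose", "upper_lip"],
]
rate = fixation_rate(trials, 2, "eyes")  # 2 of the 4 second fixations
```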

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first one through four fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral images revealed similar but weaker patterns, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations on emotional (20%–60%) trials made to a particular face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions, and at increasing rates across time in the fear and shame conditions. The upper nose was a

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.


consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward its fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a strong preference for it at the first fixation for shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information, or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
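The left/right coding described above can be sketched as follows. The per-trial data layout (ordinal fixation index, region label, screen side) and the region abbreviations are illustrative assumptions, not the authors' actual pipeline.

```python
# Sketch of the left/right asymmetry coding (hypothetical data layout).
CENTER_REGIONS = {"UN", "LN", "UL", "LL"}  # midline regions, excluded

def leftward_bias(trials):
    """Percentage of leftward fixations at each ordinal fixation (1-8),
    ignoring fixations that land on midline regions."""
    left, total = [0] * 8, [0] * 8
    for trial in trials:
        for index, region, side in trial:
            if region in CENTER_REGIONS or not 1 <= index <= 8:
                continue
            total[index - 1] += 1
            left[index - 1] += side == "left"
    return [100.0 * l / t if t else None for l, t in zip(left, total)]
```

On this coding, the reported effect corresponds to a first-fixation value near 58% that falls to roughly 46% for later fixations.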

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional image–only trials, and (c) neutral trials.
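A minimal stand-in for this analysis (the original used Matlab's NaiveBayes class) is a categorical naive Bayes with Laplace smoothing over fixation-region sequences. The data layout below is an assumption for illustration only.

```python
import math
from collections import defaultdict

def train_naive_bayes(X, y, alpha=1.0):
    """Categorical naive Bayes: feature i is the region of the i-th fixation.
    Returns a predict(sequence) -> emotion-label function."""
    classes = sorted(set(y))
    n_feat = len(X[0])
    prior = {c: y.count(c) for c in classes}
    counts = {c: [defaultdict(int) for _ in range(n_feat)] for c in classes}
    for seq, c in zip(X, y):
        for i, region in enumerate(seq):
            counts[c][i][region] += 1
    vocab_sizes = [len({seq[i] for seq in X}) for i in range(n_feat)]

    def predict(seq):
        def score(c):
            s = math.log(prior[c] / len(y))
            for i, region in enumerate(seq):
                # Laplace-smoothed conditional probability of this region
                s += math.log((counts[c][i][region] + alpha)
                              / (prior[c] + alpha * vocab_sizes[i]))
            return s
        return max(classes, key=score)

    return predict
```

Classification accuracy would then be estimated by repeatedly splitting trials two thirds/one third into train and test sets and averaging over iterations, as described above.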

Figure 6 depicts classification accuracy for each condition, as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest (with a peak of around 25% accuracy) for the emotional image–only trial set, consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.


these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
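The diagonal fold used in Figure 8 amounts to summing the two off-diagonal cells for each pair of emotions. A hypothetical helper (the row/column ordering is an assumption) could look like:

```python
def fold_confusions(matrix, labels):
    """Fold a confusion matrix along its diagonal: for each emotion pair,
    sum confusions in both directions (actual/predicted and vice versa)."""
    folded = {}
    for i in range(len(labels)):
        for j in range(i + 1, len(labels)):
            pair = frozenset((labels[i], labels[j]))
            folded[pair] = matrix[i][j] + matrix[j][i]
    return folded
```

The most and least confusable pairs are then simply the maximum and minimum entries of the resulting dictionary.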

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly consider more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks (one block per emotion type) shown in random order and repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a face presented briefly for 100 ms. On half of the trials, a black box covered the eyes; on the other half, it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).
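The comparisons above are paired t tests over the ten participants' accuracies in the two occlusion conditions. A hand-rolled version of the statistic (with invented example data; this is a sketch, not the authors' analysis code) is:

```python
import math

def paired_t(a, b):
    """Paired-samples t statistic for two matched condition vectors
    (df = len(a) - 1)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)
```

The resulting t would then be compared against the t distribution with n - 1 degrees of freedom to obtain the reported p values.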

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1: the eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotional recognition within faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example, in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent at the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but the region was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations


were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions of Experiment 1, emotional judgment performance was impaired when the more diagnostic region was occluded: the mouth for joy and disgust, and the eyes for anger and shame. Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotional recognition has numerous practical implications, particularly for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns

that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotional recognition. These abnormal eye-movement patterns may contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 1573–3432.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.


Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles." Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



of an equal number of African Americans, Asians, and Caucasians, distributed evenly across gender (six male, six female), to maximize the generalizability of our findings across these variables, which can strongly affect face recognition performance (Elfenbein & Ambady, 2002; Jack et al., 2009; O'Toole, Deffenbacher, Valentin, & Abdi, 1994; Wright & Sladden, 2003).

For each of these identities, we created six linear interpolation morphs between the neutral face and each emotion (Figure 1A) at four levels: 0% emotional (only the neutral face, with no contribution from the emotional face) and 20%, 40%, and 60% emotional intensity (contribution from the emotional face). This manipulation ensured that the judgment task would be difficult, maximizing the importance of selecting the most diagnostic regions of the face. All photographs were standardized for size (800 × 600 pixels) and background color. The majority of the face portion of the image subtended an average of 8° in width and 10° in height. A more conservative measure of the boundaries of a "face" (i.e., a vertical measure that includes the entire forehead up to the hairline) results in an image 9.8° in width and 12.2° in height. Both values roughly conform to the visual angle of a face in normal human interactions (see also Henderson et al., 2005; Hsiao & Cottrell, 2008).
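The morphing described above amounts to pixel-wise linear interpolation between aligned images. A schematic version (arrays standing in for the actual photographs; real morphing software also warps feature positions, so this is only a sketch) is:

```python
import numpy as np

def morph(neutral, emotional, intensity):
    """Blend two aligned face images; intensity 0.0 is fully neutral,
    0.6 is a 60% emotional-intensity morph."""
    return (1.0 - intensity) * neutral + intensity * emotional

# The four stimulus levels used in Experiment 1:
levels = [0.0, 0.2, 0.4, 0.6]
```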

Apparatus

The experiment took place in a dimly lit room. Stimuli were presented on a 17-in. CRT monitor (resolution of 800 × 600 pixels, 85-Hz refresh rate, 25.5 pixels/° of visual angle) located approximately 63 cm from participants' eyes and were generated using SR Research Experiment Builder on Windows XP. Viewing distance was maintained throughout the experiment. Eye movements were recorded at a sampling rate of 1000 Hz (pupil-only mode) with an EyeLink 1000 eye tracker (SR Research; 0.15° resolution) using a desktop camera-mount illuminator. Normal sensitivity settings were used, setting the saccade threshold at either 30°/s for velocity or 8000°/s² for acceleration. A fixation was defined as any period that is not a blink or saccade. Responses were collected using a USB Sidewinder gamepad.

Procedure

This experiment consisted of a total of 144 trials (Figure 1B), presented as six blocks of trials (one block per emotion type) shown in one of six possible pseudorandom, counterbalanced orders. Each block consisted of 24 trials: 12 neutral and 12 emotional face trials (equally divided into 20%, 40%, or 60% intensity) presented in randomized order. The 12 facial identities could not be fully crossed within each block or participant but were counterbalanced such that they were equally frequently associated with each emotional intensity level across participants. The experiment began with an eye-tracking calibration screen consisting of nine dots spread around the screen. This calibration was repeated as needed throughout the experiment.

At the beginning of each trial, participants fixated at a central cross on a gray screen. Participants started the

Figure 1. Illustration of face stimuli (A) and experimental paradigm (B). Participants were presented with blocks of trials containing half neutral faces (0% intensity) and half emotional faces (varying from 20% to 60% intensity) and were asked to judge whether each face had any amount of a particular emotion present (e.g., fear).


trial by pressing a button on the gamepad while fixating within a required 2° radius of screen center. A photo would then appear in the center of the screen for 3 s, followed by a blank gray screen. Participants then judged whether the photo depicted a neutral face or a face with even a slight amount of emotion. They indicated their responses by pressing one of two buttons on the gamepad.

Eye-tracking analysis

For each facial stimulus, 21 face regions were defined according to the template shown in Figure 2. All analyses used percentage of total fixation time over these regions, and Figure 2 also shows overall fixation rates for each region across all conditions. Substituting the number of fixations for total fixation duration did not change the patterns of results reported below; the number of fixations and total fixation duration were highly correlated. All percentages omit fixations that began before the presentation of the image and fixations that were beyond the bounds of the face image. For analyses examining the time course of fixation rates, the dependent measure is the percentage of all fixations within a given region within a given time window. This measure was not normalized relative to the size of each region because there appears to be little relationship between region size and fixation rate (see Figure 2). Fixation rates are not directly compared across regions because these rates are not independent of each other. Instead, for each region, fixation rates are compared across different conditions and time windows. The distribution of fixation time across regions yielded five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) that together accounted for 88.03% of all fixations. These high fixation frequencies were not a result of these areas being physically larger: some of the largest defined areas within the face showed extremely low fixation rates, including the right and left cheeks (2.6%, 2.2%), the forehead (0.9%), the hair (0.2%), and the largest region, the background (0.23%). Hence we restricted our subsequent statistical analyses of eye-movement data to these five main facial regions.
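As a concrete sketch, the percentage-of-fixation-time measure described above might be computed as follows (a minimal illustration with hypothetical region labels, durations, and validity flags; not the authors' analysis code):

```python
from collections import defaultdict

def fixation_time_percentages(fixations):
    """Percentage of total fixation time spent in each region.

    `fixations` is a list of (region, duration_ms, valid) tuples; fixations
    that began before image onset or landed outside the face image are
    flagged invalid and omitted, mirroring the exclusions in the text.
    """
    totals = defaultdict(float)
    for region, duration, valid in fixations:
        if valid:
            totals[region] += duration
    grand_total = sum(totals.values())
    return {region: 100.0 * t / grand_total for region, t in totals.items()}

# Hypothetical trial using the five main ROI labels
trial = [("eyes", 320, True), ("upper_nose", 280, True),
         ("upper_lip", 150, True), ("background", 90, False)]
print(fixation_time_percentages(trial))
# eyes ~42.7, upper_nose ~37.3, upper_lip 20.0; the invalid fixation is excluded
```

The same tally run over fixation counts instead of durations gives the alternative measure that the authors report as highly correlated with fixation time.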

Results

Behavioral results: Emotional intensity ratings

Figure 2. Illustration of the 21 facial regions of interest (ROIs). We identified five main facial ROIs: eyes (green), upper nose (blue), lower nose (orange), upper lip (red), and nasion (purple). Together these accounted for more than 88% of all fixations.

We first examined the percentage of trials that participants rated as "emotional" as a function of emotion type and intensity (see Figure 3). A one-way ANOVA with the six emotions as a factor revealed a significant effect of emotion type on the emotional judgment task, F(5, 245) = 10, p < 0.001. This effect was driven by higher overall emotional ratings during blocks containing sad faces (M = 66.2%) relative to all other emotion blocks, all ts(49) > 4.2, all ps < 0.001. Across all trials, joy (M = 62.2%) and disgust (M = 62.3%) were rated slightly lower in emotional intensity compared to sadness, both ts(49) > 4.6, ps < 0.001, and higher compared to fear and shame, both ts(49) > 3.8, both ps < 0.001. Shame (M = 58.5%) and anger (M = 58.9%) were slightly lower but still higher than fear (M = 54.8%), both ts(49) > 3.2, both ps < 0.002.

As a measure of the effect of the emotional intensity manipulation on ratings, we compared the slope of the function relating emotion intensity in the face image (0%, 20%, 40%, 60%) to ratings. We omitted 60% emotion from this slope because ratings overall were above 90% and produced little slope variance. Among emotion blocks, there were differences in the slope from zero to 40% emotion, F(5, 245) = 17.7, p < 0.001. Joy (M = 77% difference in rating value) and disgust (M = 74% difference) were higher than fear (M = 62% difference), t(49) > 2.4, p < 0.021. Anger (M = 47% difference), sadness (M = 45%), and shame (M = 43% difference) were all lower than fear, all ts(49) > 2.47, p < 0.01.

Behavioral results: Emotional judgment accuracy

We then evaluated whether these emotional judgments were "correct," according to our definition that any face containing 0% intensity of the emotional face was "neutral" and any face containing 20%–60% intensity of the emotional face should be judged as "emotional." This definition of "correct" is as arbitrary as our threshold for emotional content. The goal of the task was to engage the participant in recognizing a particular emotion, and our analysis of performance is only intended as confirmation that participants performed the task and that performance was at neither floor nor ceiling level.

Our analysis confirmed these assumptions. Participants' overall accuracy ranged from 66.0% to 86.8%, with a mean of 75.5%. Emotion judgment accuracy differed across the six emotion block conditions, F(5, 245) = 9.6, p < 0.001. Participants recognized joy (M = 82.7% correct) and disgust (M = 79.5%) with higher accuracy than all others, ts(49) > 2.3, all ps < 0.03. Participants showed lower accuracy when recognizing fear (M = 76.2%) and anger (M = 73.5%). However, participants were more accurate in recognizing fear compared to sadness (M = 70.7%) and shame (M = 70.4%), both ts(49) > 2.6, both ps < 0.013. These accuracy rates suggest that although there were differences among emotion blocks, overall performance was roughly similar and far from floor or ceiling levels, confirming that participants performed the judgment task as designed.

Gaze patterns across emotions

The following analyses only included the first four fixations of each trial, with the "first" fixation defined as the first new landing of the eye after the appearance of the image. As will be detailed later, participants made an average of 8.3 fixations per trial, and we found the first few fixations to be the most diagnostic in terms of capturing differences across emotions. Although some of our later analyses suggest that only the first two fixations are most critical for emotion recognition, here we conservatively analyze the first four and examine the changes in pattern across this initial range in a later section. For the following analyses, isolating the data to the first four fixations produces a pattern of results that is qualitatively similar to using more (e.g., eight) fixations. All analyses were performed irrespective of whether a response on a trial was correct or incorrect.

Figure 3. Percentage of trials rated as "emotional" as a function of emotion type and intensity. These data demonstrate the varying difficulty of the task, which was meant to encourage selection of the most diagnostic regions of the face. Across all emotions, performance did not reach above chance until 40% intensity and was near ceiling for 60% intensity images.


Emotional faces

For each region, fixation rates on emotional faces (those with 20%, 40%, or 60% emotional intensity) were submitted to a one-way ANOVA with emotion block as a factor. In these trials, eye movements can be affected by both the emotional content of the face images (stimulus-driven information) and the fact that the observer is currently seeking that emotion type (goal-driven information). Figure 4A shows fixation rates for each face region across the six emotions. We summarize the major findings at the end of this section, with additional information and statistical analyses detailed in the supplemental materials.

Figure 4. (A) For each emotional face, fixation time spent within each main ROI. (B) For each neutral face, fixation time spent within each main ROI. *Designates t test p < 0.05 relative to the mean for each emotion.

For the eyes, there was a main effect of face emotion, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the eye region in facial expressions of anger (M = 35.2%), fear (M = 30.8%), sadness (M = 34.2%), and shame (M = 34.3%) and looked less within the eye region for disgust (M = 19.7%) and joy (M = 19.5%) faces relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy (M = 20.9%) faces and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.001.

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white and lower difference with black, respectively.
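The subtraction technique can be roughly sketched as a pixel-wise absolute difference, here with synthetic arrays standing in for the aligned neutral and 60% emotional photographs (the modified region's location is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for two aligned grayscale photographs of one identity;
# the technique assumes minimal head movement between the two photos.
neutral = rng.integers(0, 256, size=(256, 256)).astype(float)
emotional = neutral.copy()
emotional[180:220, 100:156] += 60.0  # hypothetical change around the mouth

# Pixel-wise absolute difference, normalized so that white = maximal difference
diff = np.abs(emotional - neutral)
diff_img = (255 * diff / diff.max()).astype(np.uint8)

# The brightest pixels fall exactly where the expression changed
ys, xs = np.nonzero(diff_img == 255)
print(ys.min(), ys.max())  # → 180 219
```

With real photographs the difference image is noisier, which is why low head movement between the two photos matters for this contrast to be informative.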

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging the presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).

The middle row of the figure graphically illustrates fixation rates for each of the top five regions across our six emotion conditions relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of the emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36.0%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.001. For the upper nose, the emotion block analysis revealed no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.001, due to relatively high fixation rates for the joy condition (M = 14.0%; disgust was no longer significant) and lower fixation rates for the sadness condition (M = 7.9%; anger was no longer significant) relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to be different from the mean: fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found for the emotional faces were still present for the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces; this effect may be due to the changing baseline of the mean nasion fixation rate overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations to the eye region increased, except for the anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position: when faces were more neutral, participants fixated the eyes more frequently, but as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease and the eyes are instead fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or whether fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval as well as for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between 1 and 15 fixations. Although participants made up to 15 fixations, there were few trials with that many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as, for all of the Nth fixations on an image in the current condition, the percentage that are within a given region. This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
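This fixation-rate definition can be sketched as follows (a hypothetical data layout in which each trial is an ordered list of region labels; not the authors' code):

```python
from collections import Counter

def fixation_rate(trials, n, region):
    """Of all Nth fixations (1-indexed) across trials in a condition,
    the percentage that fall within `region`.

    Trials with fewer than n fixations contribute nothing, so the general
    drop in available fixations at higher n does not bias the rate.
    """
    nth = [t[n - 1] for t in trials if len(t) >= n]
    if not nth:
        return 0.0
    return 100.0 * Counter(nth)[region] / len(nth)

# Hypothetical trials using the five main ROI labels
trials = [["upper_nose", "eyes", "upper_lip"],
          ["upper_nose", "upper_lip"],
          ["eyes", "eyes", "lower_nose", "eyes"]]
print(fixation_rate(trials, 1, "upper_nose"))  # two of three first fixations
```

Normalizing within each fixation index, rather than across all fixations, is what makes rates at the seventh or eighth fixation comparable to rates at the first.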

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first one through four fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations on emotional (20%–60%) trials for a particular face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions and at increasing rates across time in the fear and shame conditions. The upper nose was a consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at that region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
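The left/right coding used in this analysis can be sketched as follows (a hypothetical coordinate scheme with each fixation stored as (x, region), the face centered at x = 0, and the four center regions excluded as in the text):

```python
# UN, LN, UL, LL: the center regions excluded from the laterality analysis
CENTER_REGIONS = {"upper_nose", "lower_nose", "upper_lip", "lower_lip"}

def leftward_bias(trials, n):
    """Percentage of Nth fixations (1-indexed) landing left of center,
    among fixations outside the excluded center regions."""
    lateral = []
    for t in trials:
        if len(t) >= n:
            x, region = t[n - 1]
            if region not in CENTER_REGIONS:
                lateral.append(x < 0)
    return 100.0 * sum(lateral) / len(lateral) if lateral else 0.0

# Hypothetical trials: three of four first fixations land on the left half
trials = [[(-40, "eyes")], [(-25, "eyes")], [(10, "eyes")], [(-5, "cheek")]]
print(leftward_bias(trials, 1))  # → 75.0
```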

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence on any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional-image-only trials, and (c) neutral trials.
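A comparable analysis can be sketched with scikit-learn's categorical naive Bayes, a Python stand-in for Matlab's NaiveBayes class. The data below are synthetic: the region codes, the per-emotion fixation bias, and its strength are all invented for illustration.

```python
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

N_REGIONS = 5    # eyes, upper nose, lower nose, upper lip, nasion
N_FIXATIONS = 8  # first eight fixation regions per trial
N_EMOTIONS = 6   # joy, sadness, fear, anger, disgust, shame

# Synthetic stand-in for the real data: each trial is a sequence of region
# codes (0..4), and each emotion pulls some fixations toward one region.
n_trials = 1200
y = rng.integers(0, N_EMOTIONS, size=n_trials)
X = rng.integers(0, N_REGIONS, size=(n_trials, N_FIXATIONS))
for i, emo in enumerate(y):
    bias = emo % N_REGIONS
    mask = rng.random(N_FIXATIONS) < 0.4  # ~40% of fixations pulled to the biased region
    X[i, mask] = bias

# Two thirds training, one third testing, as in the paper
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=1/3, random_state=0)
clf = CategoricalNB().fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"classification accuracy: {acc:.2f} (chance = {1/N_EMOTIONS:.2f})")
```

In practice one would repeat the split-fit-score loop (the paper uses 100 iterations) and average the accuracies, and refit with only the first N columns of X to trace accuracy as a function of fixation number.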

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy on test trials was highest for the emotional-image-only trial set (with a peak of around 25% accuracy), consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest-emotion (60%) images and the neutral-image trials. For the highest-emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveal strong similarities between the fixation patterns produced by these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
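The diagonal "fold" used to assess mutual confusability (described in the Figure 8 caption) can be sketched as follows; the confusion counts below are invented for illustration, not the paper's data:

```python
import numpy as np

emotions = ["joy", "sadness", "fear", "anger", "disgust", "shame"]

# Hypothetical 6x6 confusion matrix (rows = actual, columns = predicted)
conf = np.array([
    [30,  2,  5,  3, 24,  6],   # joy
    [ 2, 28,  8, 18,  4, 10],   # sadness
    [ 5,  9, 26,  9,  8, 13],   # fear
    [ 3, 19,  8, 27,  6,  7],   # anger
    [23,  4,  9,  5, 29,  6],   # disgust
    [ 6, 11, 12,  8,  7, 26],   # shame
])

# Fold along the identity diagonal: a pair's mutual confusability,
# regardless of which member was the actual vs. the predicted label.
folded = conf + conf.T
np.fill_diagonal(folded, np.diag(conf))

# Most mutually confusable off-diagonal pair
off = folded.astype(float)
np.fill_diagonal(off, -np.inf)
i, j = np.unravel_index(np.argmax(off), off.shape)
print(emotions[i], emotions[j])  # → joy disgust
```

Folding makes the matrix symmetric, so each unordered pair of emotions gets a single confusability score, matching the bottom row of Figure 8.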

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained, although weaker, for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly consider more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition: anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame of 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/°). The viewing distance was approximately 56 cm.

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks (one block per emotion type) shown in random order and repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials, a black box covered the eyes; for the other half, it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65.0% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the eye fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or the mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition in faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lips. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent as the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types but was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and was fixated most during the second fixation. There was no general fixation pattern at the nasion, demonstrating wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations


were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.
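This excerpt does not describe the classifier's internals, so the sketch below is a generic naive-Bayes stand-in, not the authors' implementation: it treats the ROI label at each fixation position as a feature and guesses which emotion the observer was seeking. The class names, region labels, and Laplace smoothing are assumptions.

```python
from collections import Counter, defaultdict
import math

REGIONS = ["eyes", "upper_nose", "lower_nose", "upper_lip", "nasion"]

class FixationNaiveBayes:
    """Guess the sought emotion from the ROI labels of the first few
    fixations (a naive-Bayes sketch; the paper's classifier may differ)."""

    def fit(self, sequences, labels):
        self.classes = sorted(set(labels))
        self.priors = Counter(labels)
        # (emotion, fixation position) -> counts of ROI labels
        self.counts = defaultdict(Counter)
        for seq, lab in zip(sequences, labels):
            for pos, region in enumerate(seq):
                self.counts[(lab, pos)][region] += 1
        self.n = len(labels)
        return self

    def predict(self, seq):
        def log_posterior(c):
            lp = math.log(self.priors[c] / self.n)
            for pos, region in enumerate(seq):
                cnt = self.counts[(c, pos)]
                # Laplace smoothing over the five main ROIs
                lp += math.log((cnt[region] + 1) / (sum(cnt.values()) + len(REGIONS)))
            return lp
        return max(self.classes, key=log_posterior)
```

Trained on toy data where joy seekers fixate the upper lip and anger seekers the eyes on the second fixation, the model recovers the sought emotion from the first two fixation locations alone.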

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions of Experiment 1, emotional judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust), as well as for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies, in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotional recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns

that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotional recognition. These abnormal eye-movement patterns contribute to social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 1573–3432.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.


Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS One, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles." Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



trial by pressing a button on the gamepad while fixating within a required 2° radius of screen center. A photo would then appear in the center of the screen for 3 s, followed by a blank gray screen. Participants then judged whether the photo depicted a neutral face or a face with even a slight amount of emotion. They indicated their responses by pressing one of two buttons on the gamepad.

Eye-tracking analysis

For each facial stimulus, 21 face regions were defined according to the template shown in Figure 2. All analyses used percentage of total fixation time over these regions, and Figure 2 also shows overall fixation rates for each region across all conditions. Substituting the number of fixations for total fixation duration did not change the patterns of results reported below; the number of fixations and total fixation duration were highly correlated. All percentages omit fixations that began before the presentation of the image and fixations that were beyond the bounds of the face image. For analyses examining the time course of fixation rates, the dependent measure is the percentage of all fixations within a given region within a given time window (i.e., 3%). This measure was not normalized relative to the size of each region because there appears to be little relationship between region size and fixation rate (see Figure 2). Fixation rates are not directly compared across regions because these rates are not independent of each other. Instead, for each region, fixation rates are compared across different conditions and time windows. The distribution of fixation time across regions yielded five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) that together accounted for 88.03% of all fixations. These high fixation frequencies were not a result of these areas being physically larger. Some of the largest defined areas within the face showed extremely low fixation rates, including the right and left cheek (2.6%, 2.2%), the forehead (0.9%), the hair (0.2%), and the largest region, the background (0.23%). Hence we restricted our subsequent statistical analyses on eye-movement data to these five main facial regions.
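The ROI bookkeeping described above can be sketched as follows. The rectangles, coordinates, and field names here are hypothetical stand-ins for the study's 21-region template, and the exclusion rules mirror the ones stated in the text (drop fixations that began before image onset or landed outside any listed region).

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # gaze position in image coordinates
    y: float
    duration_ms: float
    onset_ms: float   # relative to image onset; negative = before onset

# Hypothetical, non-overlapping rectangles (x0, y0, x1, y1); the actual
# study used a 21-region template traced on each face (Figure 2).
ROIS = {
    "eyes":       (60, 40, 180, 70),
    "nasion":     (105, 70, 135, 85),
    "upper_nose": (95, 85, 145, 110),
    "lower_nose": (95, 110, 145, 140),
    "upper_lip":  (85, 140, 155, 165),
}

def region_of(fix):
    for name, (x0, y0, x1, y1) in ROIS.items():
        if x0 <= fix.x < x1 and y0 <= fix.y < y1:
            return name
    return None  # off-face or unlisted region

def fixation_time_percentages(fixations):
    """Percentage of total fixation duration per ROI, omitting fixations
    that began before image onset or landed outside the listed regions."""
    valid = [(f, region_of(f)) for f in fixations if f.onset_ms >= 0]
    valid = [(f, r) for f, r in valid if r is not None]
    total = sum(f.duration_ms for f, _ in valid)
    pct = {name: 0.0 for name in ROIS}
    for f, r in valid:
        pct[r] += 100.0 * f.duration_ms / total
    return pct
```

Swapping `duration_ms` for a constant per-fixation weight turns this into a fixation-count measure, which, per the text, yields the same pattern of results.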

Results

Behavioral results: Emotional intensity ratings

We first examined the percentage of trials that participants rated as "emotional" as a function of emotion type and intensity (see Figure 3). A one-way ANOVA with the six emotions as a factor revealed a significant effect of emotion type on the emotional judgment task, F(5, 245) = 10, p < 0.001. This effect was driven by higher overall emotional ratings during blocks containing sad faces (M = 66.2%) relative to all other emotion blocks, all ts(49) > 4.2, all ps < 0.001. Across all trials, joy (M = 62.2%) and disgust (M = 62.3%) were rated slightly lower in emotional intensity

Figure 2. Illustration of 21 facial regions of interest (ROIs). We identified five main facial ROIs (eyes, green; upper nose, blue; lower nose, orange; upper lip, red; nasion, purple) that accounted for more than 88% of all fixations.


compared to sadness, both ts(49) > 4.6, ps < 0.001, and higher compared to fear and shame, both ts(49) > 3.8, both ps < 0.001. Shame (M = 58.5%) and anger (M = 58.9%) were slightly lower, but still higher than fear (M = 54.8%), both ts(49) > 3.2, both ps < 0.002.
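The F(5, 245) statistics reported here are consistent with a one-way repeated-measures ANOVA over 50 participants and six within-subject emotion blocks. A minimal sketch of that computation, assuming a subjects × conditions table of per-participant means (this is not the authors' actual analysis code):

```python
import numpy as np

def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA.
    data: (n_subjects, k_conditions) array of per-subject condition means.
    Returns (F, df_condition, df_error)."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    # Partition total variability into condition, subject, and error terms.
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()
    ss_total = ((data - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj
    df_cond = k - 1
    df_err = (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_err / df_err)
    return f, df_cond, df_err
```

With 50 subjects and 6 conditions the degrees of freedom come out as (5, 245), matching the reported F(5, 245).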

As a measure of the effect of the emotional intensity manipulation on ratings, we compared the slope of the function relating emotion intensity (0%, 20%, 40%, 60%) in the face image to ratings. We omitted 60% emotion from this slope because ratings overall were above 90% and produced little slope variance. Among emotion blocks, there were differences in the slope from zero to 40% emotion, F(5, 245) = 17.7, p < 0.001. Joy (M = 77% difference in rating value) and disgust (M = 74% difference) were higher than fear (M = 62% difference), t(49) > 2.4, p < 0.021. Anger (M = 47% difference), sadness (M = 45%), and shame (M = 43% difference) were all lower than fear, all ts(49) > 2.47, p < 0.01.

Behavioral results: Emotional judgment accuracy

We then evaluated whether these emotional judgments were "correct" according to our definition that any face containing 0% intensity of the emotional face was "neutral" and any face containing 20%–60% intensity of the emotional face should be judged as "emotional." This definition of "correct" is as arbitrary as our threshold for emotional content. The goal of the task was to engage the participant in recognizing a particular emotion, and our analysis of performance is only intended as confirmation that participants performed the task and that performance was at neither floor nor ceiling level.

Our analysis confirmed these assumptions. Participants' overall accuracy ranged from 66.0% to 86.8%, with a mean of 75.5%. Emotion judgment accuracy

differed across the six emotion block conditions, F(5, 245) = 9.6, p < 0.001. Participants recognized joy (M = 82.7% correct) and disgust (M = 79.5%) with higher accuracy than all others, ts(49) > 2.3, all ps < 0.03. Participants showed lower accuracy when recognizing fear (M = 76.2%) and anger (M = 73.5%). However, participants were more accurate in recognizing fear compared to sadness (M = 70.7%) and shame (M = 70.4%), both ts(49) > 2.6, both ps < 0.013. These accuracy rates suggest that although there were differences among emotion blocks, overall performance was roughly similar and far from floor or ceiling levels, confirming that participants performed the judgment task as designed.

Gaze patterns across emotions

The following analyses only included the first four fixations of each trial, with the "first" fixation defined as the first new landing of the eye after the appearance of the image. As will be detailed later, participants made an average of 8.3 fixations per trial, and we found the first few fixations to be the most diagnostic in terms of capturing differences across emotions. Although some of our later analyses suggest that only the first two fixations are most critical for emotion recognition, here we conservatively analyze the first four and examine the changes in pattern across this initial range in a later section. For the following analyses, isolating the data to the first four fixations produces a pattern of results that is qualitatively similar to using more (e.g., eight) fixations. All analyses were performed irrespective of whether a response on a trial was correct or incorrect.

Figure 3. Percentage of trials rated as "emotional" as a function of emotion type and intensity. These data demonstrate the varying difficulty of the task, which was meant to encourage selection of the most diagnostic regions of the face. Across all emotions, performance did not reach above chance until 40% intensity and was near ceiling for 60% intensity images.


Emotional faces

For each region, fixation rates on emotional faces (those with 20%, 40%, or 60% emotional intensity) were submitted to a one-way ANOVA with emotion block as a factor. In these trials, eye movements can be affected by both the emotional content of the face images (stimulus-driven information) and the fact that the observer is currently seeking that emotion type (goal-driven information). Figure 4A shows fixation rates for

each face region across the six emotions. We summarize the major findings at the end of this section, with additional information and statistical analyses detailed in the supplemental materials.

For the eyes, there was a main effect of face emotion, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the eye region in facial expressions of anger (M = 35.2%), fear (M = 30.8%), sadness (M = 34.2%), and shame (M = 34.3%), and looked less within the eye region for disgust (M = 19.7%) and joy (M =

Figure 4. (A) For each emotional face, fixation time spent within each main ROI. (B) For each neutral face, fixation time spent within each main ROI. Asterisks designate t test p < 0.05 relative to the mean for each emotion.


19.5%) faces, relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy faces (M = 20.9%), and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces, relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.001.
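The "relative to the mean" t tests reported throughout this section can be read as paired comparisons between each emotion's per-participant fixation rate and that participant's mean rate across all emotions. That reading is an assumption (the exact contrast is not specified in this excerpt); a hand-rolled sketch of it:

```python
import numpy as np

def region_vs_mean_ttests(rates):
    """Paired t statistics comparing each emotion's fixation rate for one
    ROI against each participant's mean rate across all emotions.
    rates: (n_subjects, k_emotions) array of fixation-rate percentages.
    Returns (list of t values, degrees of freedom)."""
    rates = np.asarray(rates, dtype=float)
    n, k = rates.shape
    subj_mean = rates.mean(axis=1)     # each participant's mean over emotions
    ts = []
    for j in range(k):
        d = rates[:, j] - subj_mean    # paired differences
        ts.append(d.mean() / (d.std(ddof=1) / np.sqrt(n)))
    return ts, n - 1
```

With the 51 participants contributing eye-movement data, the degrees of freedom come out as 50, matching the reported ts(50).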

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white and lower difference with black, respectively.
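A difference map like the one in the top row of Figure 5 can be approximated with a simple pixel-wise subtraction. The sketch below assumes aligned grayscale arrays of equal size, which is why low head movement across photographs mattered for the technique:

```python
import numpy as np

def diagnostic_difference_map(emotional, neutral):
    """Absolute grayscale difference between an emotional and a neutral
    photograph of the same face (same size, pixel-aligned). High values
    (white) mark regions an ideal observer could use to tell the two
    images apart; low values (black) carry little information."""
    emotional = np.asarray(emotional, dtype=float)
    neutral = np.asarray(neutral, dtype=float)
    diff = np.abs(emotional - neutral)
    # Stretch to the 0..255 range for display as an 8-bit image.
    if diff.max() > 0:
        diff = diff * (255.0 / diff.max())
    return diff.astype(np.uint8)
```

For a joyful face, such a map would be brightest around the mouth, mirroring the high upper-lip fixation rates in the middle row of the figure.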

The middle row of the figure graphically illustrates fixation rates for each of the top five regions across our

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).


six emotion conditions, relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces, relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis likewise revealed no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.001, due to relatively high fixation rates for the joy (M = 14%) condition (disgust was no longer significant) and lower fixation rates for the sadness (M = 7.9%) condition (anger was no longer significant), relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to be different from the mean. Fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found in the emotional faces were still present in the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces, but this effect may be due to the changing baseline of the mean number of fixations for nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations increased to the eye region, except for the anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated at the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease and instead the eyes are fixated more frequently.

Journal of Vision (2014), 14(13):14, 1–16 · Schurgin et al.

Downloaded from http://jov.arvojournals.org/pdfaccess.ashx?url=/data/Journals/JOV/933685 on 07/20/2015

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or whether fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval as well as for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between one and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as "for all of the Nth fixations on an image in the current condition, the percentage that are within a given region." This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
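The "fixation rate" measure defined above can be sketched as follows — a minimal illustration, assuming each trial is stored as an ordered list of fixated region labels (the function and region names are ours, not the authors'):

```python
from collections import Counter

def fixation_rates(trials, region, n_max=8):
    """For each fixation index n (0-based), compute: of all nth fixations
    that exist across trials, the percentage landing in `region`.
    Normalizing within each index removes the influence of trials
    having fewer late fixations."""
    rates = []
    for n in range(n_max):
        nth = [t[n] for t in trials if len(t) > n]  # every available nth fixation
        rates.append(100.0 * Counter(nth)[region] / len(nth) if nth else None)
    return rates

# Toy example: three trials' fixated-region sequences
trials = [
    ["upper_nose", "eyes", "upper_lip"],
    ["upper_nose", "upper_lip"],
    ["eyes", "eyes", "lower_nose", "eyes"],
]
rates = fixation_rates(trials, "eyes", n_max=4)  # ~[33.3, 66.7, 0.0, 100.0]
```

Note that each index is normalized by its own denominator, so a region's rate at the fourth fixation is comparable to its rate at the first even though fewer trials contribute.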

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first through fourth fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations made to a particular face on emotional (20%–60%) trials. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions and at increasing rates across time in the fear and shame conditions. The upper nose was a consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.0001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
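The left/right coding step described above can be sketched as follows — a toy illustration assuming fixations have already been labeled 'L' or 'R' with midline-region fixations excluded (labels and data are ours, not the authors'):

```python
def leftward_rate(trials, n):
    """Of all nth (0-indexed) fixations coded 'L' or 'R' across trials,
    the percentage falling on the left side of the screen. Fixations on
    midline regions are assumed to have been dropped already."""
    nth = [t[n] for t in trials if len(t) > n and t[n] in ("L", "R")]
    return 100.0 * nth.count("L") / len(nth) if nth else None

# Toy lateralized fixation sequences for three trials
trials = [["L", "R", "R"], ["L", "R"], ["R", "L"]]
first_bias = leftward_rate(trials, 0)   # leftward bias at the first fixation
later_bias = leftward_rate(trials, 1)   # weaker / reversed at later fixations
```

A value above 50% indicates a leftward bias at that fixation index, mirroring the M = 58.1% first-fixation bias reported above.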

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional image–only trials, and (c) neutral trials.
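The original analysis used Matlab's NaiveBayes class; an equivalent categorical naive Bayes can be sketched in Python — a hedged re-implementation, not the authors' code, where feature i is the region of the ith fixation and all data below are toy examples:

```python
import math
from collections import defaultdict

REGIONS = ["eyes", "upper_nose", "lower_nose", "upper_lip", "nasion"]

class CategoricalNB:
    """Laplace-smoothed naive Bayes over categorical fixation-region features."""

    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.prior = {c: y.count(c) / len(y) for c in self.classes}
        n_feats = len(X[0])
        # counts[c][i][region] = occurrences of `region` at fixation i in class c
        self.counts = {c: [defaultdict(int) for _ in range(n_feats)]
                       for c in self.classes}
        self.totals = {c: [0] * n_feats for c in self.classes}
        for xs, c in zip(X, y):
            for i, region in enumerate(xs):
                self.counts[c][i][region] += 1
                self.totals[c][i] += 1
        return self

    def predict(self, xs):
        def log_post(c):  # log prior + sum of smoothed log likelihoods
            s = math.log(self.prior[c])
            for i, region in enumerate(xs):
                s += math.log((self.counts[c][i][region] + 1) /
                              (self.totals[c][i] + len(REGIONS)))
            return s
        return max(self.classes, key=log_post)

# Toy trials: first two fixation regions per trial, labeled by emotion block
X = [["upper_lip", "upper_lip"], ["upper_lip", "eyes"],
     ["eyes", "eyes"], ["eyes", "nasion"]]
y = ["joy", "joy", "anger", "anger"]
model = CategoricalNB().fit(X, y)
```

On the real data one would repeat a two-thirds/one-third train/test split 100 times and average test accuracy, as the authors describe.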

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest (with a peak of around 25% accuracy) for the emotional image–only trial set, consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.
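The "folding" of the confusion matrix along its identity diagonal (Figure 8), which pools joy-predicted-as-disgust with disgust-predicted-as-joy into one mutual-confusability cell, can be sketched as — a small illustration with a made-up matrix:

```python
def fold_confusions(M):
    """Fold a square confusion matrix along its diagonal: the returned
    lower-triangular table pools confusions in both directions, so
    folded[i][j] (i > j) = M[i][j] + M[j][i]; the diagonal is kept as-is."""
    n = len(M)
    return [[M[i][j] + M[j][i] if i != j else M[i][i]
             for j in range(i + 1)] for i in range(n)]

# Toy 3x3 confusion matrix (rows = actual class, columns = predicted class)
M = [[5, 2, 0],
     [1, 6, 1],
     [0, 3, 4]]
folded = fold_confusions(M)  # [[5], [3, 6], [0, 4, 4]]
```

The folded table makes it easy to rank emotion pairs by mutual confusability regardless of direction, which is how the most and least confusable pairs above were identified.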

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (~70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
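The 28.97 pixels/° figure follows from the standard visual-angle conversion at the 56-cm viewing distance. A small sketch; only the 56-cm distance and the pixels-per-degree value come from the text, while the implied pixel density is our inference, not a reported number:

```python
import math

def pixels_per_degree(viewing_cm, px_per_cm):
    """Pixels subtended by one degree of visual angle at screen center:
    the chord 2 * d * tan(0.5 deg), converted from cm to pixels."""
    return 2 * viewing_cm * math.tan(math.radians(0.5)) * px_per_cm

# At 56 cm, the reported 28.97 px/deg implies a pixel density of roughly
# 29.6 px/cm on this CRT (an inferred, illustrative value)
implied_px_per_cm = 28.97 / (2 * 56 * math.tan(math.radians(0.5)))
```

The same function converts stimulus sizes in degrees (e.g., the occluder box) into on-screen pixels for a given setup.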

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks (one block per emotion type) shown in random order, repeated twice for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials, a black box covered either the eyes or the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).
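The comparisons above are paired t-tests across the ten participants (hence df = 9). A from-scratch sketch with illustrative per-participant accuracies, not the study's data:

```python
import math

def paired_t(xs, ys):
    """Paired-samples t statistic and degrees of freedom (n - 1),
    computed from the per-participant difference scores."""
    n = len(xs)
    d = [x - y for x, y in zip(xs, ys)]
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance of diffs
    return mean / math.sqrt(var / n), n - 1

# Hypothetical accuracies for ten participants (mouth visible vs. covered)
visible = [0.70, 0.60, 0.65, 0.70, 0.55, 0.60, 0.75, 0.60, 0.60, 0.58]
covered = [0.50, 0.55, 0.50, 0.60, 0.50, 0.45, 0.60, 0.55, 0.50, 0.52]
t, df = paired_t(visible, covered)  # df = 9 for ten participants
```

The resulting t would be compared against the t distribution with df = 9 to obtain the p values reported above.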

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important to identify fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition within faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example, in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors in eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent at the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but it was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotion judgment task. Consistent with the predictions of Experiment 1, emotion judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust) when the mouth was covered, as well as for emotions for which the eyes were more diagnostic (anger, shame) when the eyes were covered. Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotional recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotional recognition. These abnormal eye-movement patterns may contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs R Gosselin F Buchanan T W TranelD Schyns P amp Damasio A R (2005) Amechanism for impaired fear recognition afteramygdala damage Nature 433(7021) 68ndash72

Adolphs R Tranel D Damasio H amp Damasio A(1994) Impaired recognition of emotion in facialexpressions following bilateral damage to thehuman amygdala Nature 372 669ndash672

Aviezer H Hassin R R Ryan J Grady CSusskind J Anderson A Bentin S (2008)Angry disgusted or afraid Studies on the mal-leability of emotion perception PsychologicalScience 19(7) 724ndash732

Baron-Cohen S amp Wheelwright S (2004) Theempathy quotient An investigation of adults withAsperger syndrome or high functioning autism andnormal sex differences Journal of Autism andDevelopmental Disorders 34(2) 163ndash175

Beaupre M G amp Hess U (2005) Cross-culturalemotion recognition among Canadian ethnic

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 14

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

groups Journal of Cross-Cultural Psychology36(3) 355ndash370

Blair R J R (2005) Responding to the emotions ofothers Dissociating forms of empathy through thestudy of typical and psychiatric populationsConsciousness and Cognition 14(4) 698ndash718

Brainard D H (1997) The Psychophysics ToolboxSpatial Vision 10 443ndash446

Buchan J N Pare M amp Munhall K G (2007)Spatial statistics of gaze fixations during dynamicface processing Social Neuroscience 2(1) 1ndash13

Darwin C (1872) The expression of emotions in manand animals New York Philosophical Library

Ekman P amp Friesen W V (1975) Unmasking theface Englewood Cliffs NJ Prentice Hall

Ekman P amp Friesen W V (1978) The Facial ActionCoding System (FACS) A technique for themeasurement of facial action Palo Alto CAConsulting Psychologists Press

Elfenbein H A amp Ambady N (2002) Is there an in-group advantage in emotion recognition

Everdell I T Marsh H Yurick M D Munhall KG amp Pare M (2007) Gaze behaviour inaudiovisual speech perception Asymmetrical dis-tribution of face-directed fixations Perception 361535ndash1545

Gosselin F amp Schyns P G (2001) Bubbles Atechnique to reveal the use of information inrecognition tasks Vision Research 41(17) 2261ndash2271

Hargrave R Maddock R J amp Stone V (2002)Impaired recognition of facial expressions ofemotion in Alzheimerrsquos disease Journal of Neuro-psychiatry and Clinical Neurosciences 14 64ndash71

Hejmadi A Davidson R J amp Rozin P (2000)Exploring Hindu Indian emotion expressionsEvidence for accurate recognition by Americansand Indians Psychological Science 11(3) 183ndash187

Henderson J M Falk R Minut S Dyer F C ampMahadevan S (2000) Gaze control for facelearning and recognition by humans and machinesMichigan State University Eye Movement Labora-tory Technical Report 4 1ndash14

Henderson J M Williams C C amp Falk R J (2005)Eye movements are functional during face learningMemory amp Cognition 33(1) 98ndash106

Horley K Williams L M Gonsalvez C amp GordonE (2004) Face to face Visual scanpath evidencefor abnormal processing of facial expressions insocial phobia Psychiatry Research 127 43ndash53

Hsiao J H W amp Cottrell G (2008) Two fixations

suffice in face recognition Psychological Science19(10) 998ndash1006

Jack R E Blais C Scheepers C Schyns P G ampCaldara R (2009) Cultural confusions show thatfacial expressions are not universal Current Biol-ogy 19 1ndash6

Keltner D amp Buswell B N (1997) EmbarrassmentIts distinct form and appeasement functionsPsychological Bulletin 122(3) 250ndash270

Kowler E Anderson E Dosher B amp Blaser E(1995) The role of attention in the programming ofsaccades Vision Research 35(13) 1897ndash1916

Loughland C M Williams L M amp Gordon E(2002) Schizophrenia and affective disorder showdifferent visual scanning behaviour for faces Atrait versus state-based distinction BiologicalPsychiatry 52 338ndash348

Marsh A A Kozak M N amp Ambady N (2007)Accurate identification of fear facial expressionspredicts prosocial behavior Emotion 7(2) 239

Nahm F K D Perret A Amaral D G amp AlbrightT D (1997) How do monkeys look at facesJournal of Cognitive Neuroscience 9(5) 611ndash623

Ogrocki P K Hills A C amp Strauss M E (2000)Visual exploration of facial emotion by healthyolder adults and patients with Alzheimer diseaseNeuropsychiatry Neuropsychology and BehavioralNeurology 13 271ndash278

Oosterhof N N amp Todorov A (2009) Sharedperceptual basis of emotional expressions andtrustworthiness impressions from faces Emotion9(1) 128

OrsquoToole A J Deffenbacher K A Valentin D ampAbdi H (1994) Structural aspects of face recog-nition and the other-race effect Memory ampCognition 22(2) 208ndash224

Pelli D G (1997) The VideoToolbox software forvisual psychophysics Transforming numbers intomovies Spatial Vision 10 437ndash442

Pelphrey K A Sasson N J Reznick J Paul GGoldman B D amp Piven J (2002) Visualscanning of faces in autism Journal of Autism andDevelopmental Disorders 32(4) 1573ndash3432

Perlman S B Morris J P Vander Wyk B CGreen S R Doyle J L amp Pelphrey K A(2009) Individual differences in personality shapehow people look at faces PLoS ONE 4(6) e5952

Pflugshaupt T Mosimann U P Schmitt W J vonWartburg R Wurtz P Luthi M Muri RM (2007) To look or not to look at threatScanpath differences within a group of spiderphobics Journal of Anxiety Disorders 21(3) 353ndash366

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 15

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

Sasson N Tsuchiya N Hurley R Couture S MPenn D L Adolphs R amp Piven J (2007)Orienting to social stimuli differentiates socialcognitive impairment in autism and schizophreniaNeuropsychologia 45 2580ndash2588

Schyns P G Petro L S amp Smith M L (2009)Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brainBehavioral and brain evidence PLoS One 4(5)e5625

Segerstrom S C (2001) Optimism and attentionalbias for negative and positive stimuli Personalityand Social Psychology Bulletin 27 1334ndash1343

Smith M L Cottrell G W Gosselin F amp SchynsP G (2005) Transmitting and decoding facialexpressions Psychological Science 16(3) 184ndash189

Spezio M L Adolphs R Hurley R S E amp PivenJ (2007) Analysis of face gaze in autism usinglsquolsquoBubblesrsquorsquo Neuropsychologia 45 144ndash151

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



compared to sadness, both ts(49) > 4.6, ps < 0.0001, and higher compared to fear and sadness, both ts(49) > 3.8, both ps < 0.0001. Shame (M = 58.5%) and anger (M = 58.9%) were slightly lower, but still higher than fear (M = 54.8%), both ts(49) > 3.2, both ps < 0.002.

As a measure of the effect of the emotional intensity manipulation on ratings, we compared the slope of the function relating emotion intensity (0%, 20%, 40%, 60%) in the face image to ratings. We omitted 60% emotion from this slope because ratings overall were above 90% and produced little slope variance. Among emotion blocks, there were differences in the slope from zero to 40% emotion, F(5, 245) = 17.7, p < 0.0001. Joy (M = 77% difference in rating value) and disgust (M = 74% difference) were higher than fear (M = 62% difference), ts(49) > 2.4, ps < 0.021. Anger (M = 47% difference), sadness (M = 45% difference), and shame (M = 43% difference) were all lower than fear, all ts(49) > 2.47, ps < 0.01.
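The slope measure above can be made concrete with a small sketch. This is not the authors' code (their analyses were run in MATLAB); it is a minimal Python illustration of fitting a least-squares slope to ratings as a function of morph intensity, with the 60% level omitted as in the text. The rating values below are invented for illustration.

```python
# Hypothetical sketch: slope of the function relating morph intensity (in %)
# to the percentage of trials rated "emotional", per emotion block.
def rating_slope(intensities, ratings):
    """Ordinary least-squares slope of ratings against intensities."""
    n = len(intensities)
    mx = sum(intensities) / n
    my = sum(ratings) / n
    num = sum((x - mx) * (y - my) for x, y in zip(intensities, ratings))
    den = sum((x - mx) ** 2 for x in intensities)
    return num / den

# Example: made-up joy ratings at 0%, 20%, and 40% intensity; the 60% level
# is omitted because ratings there were near ceiling.
slope = rating_slope([0, 20, 40], [10.0, 48.0, 87.0])
```

A steeper slope corresponds to an emotion whose intensity manipulation moved ratings more sharply, matching the joy-versus-shame contrast reported above.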

Behavioral results: Emotional judgment accuracy

We then evaluated whether these emotional judgments were "correct" according to our definition: any face containing 0% intensity of the emotional face was "neutral," and any face containing 20%–60% intensity of the emotional face should be judged as "emotional." This definition of "correct" is as arbitrary as our threshold for emotional content. The goal of the task was to engage the participant in recognizing a particular emotion, and our analysis of performance is only intended as confirmation that participants performed the task and that performance was at neither floor nor ceiling level.

Our analysis confirmed these assumptions. Participants' overall accuracy ranged from 66.0% to 86.8%, with a mean of 75.5%. Emotion judgment accuracy differed across the six emotion block conditions, F(5, 245) = 9.6, p < 0.0001. Participants recognized joy (M = 82.7% correct) and disgust (M = 79.5%) with higher accuracy than all others, ts(49) > 2.3, all ps < 0.03. Participants showed lower accuracy when recognizing fear (M = 76.2%) and anger (M = 73.5%). However, participants were more accurate in recognizing fear compared to sadness (M = 70.7%) and shame (M = 70.4%), both ts(49) > 2.6, both ps < 0.013. These accuracy rates suggest that although there were differences among emotion blocks, overall performance was roughly similar and far from floor or ceiling levels, confirming that participants performed the judgment task as designed.

Gaze patterns across emotions

The following analyses only included the first four fixations of each trial, with the "first" fixation defined as the first new landing of the eye after the appearance of the image. As will be detailed later, participants made an average of 8.3 fixations per trial, and we found the first few fixations to be the most diagnostic in terms of capturing differences across emotions. Although some of our later analyses suggest that only the first two fixations are most critical for emotion recognition, here we conservatively analyze the first four and examine the changes in pattern across this initial range in a later section. For the following analyses, isolating the data to the first four fixations produces a pattern of results that is qualitatively similar to using more (e.g., eight) fixations. All analyses were performed irrespective of whether a response on a trial was correct or incorrect.

Figure 3. Percentage of trials rated as "emotional" as a function of emotion type and intensity. These data demonstrate the varying difficulty of the task, which was meant to encourage selection of the most diagnostic regions of the face. Across all emotions, performance did not reach above chance until 40% intensity and was near ceiling for 60% intensity images.


Emotional faces

For each region, fixation rates on emotional faces (those with 20%, 40%, or 60% emotional intensity) were submitted to a one-way ANOVA with emotion block as a factor. In these trials, eye movements can be affected both by the emotional content of the face images (stimulus-driven information) and by the fact that the observer is currently seeking that emotion type (goal-driven information). Figure 4A shows fixation rates for each face region across the six emotions. We summarize the major findings at the end of this section, with additional information and statistical analyses detailed in the supplemental materials.

For the eyes, there was a main effect of face emotion, F(5, 250) = 27.2, p < 0.0001, whereby participants looked longer at the eye region in facial expressions of anger (M = 35.2%), fear (M = 30.8%), sadness (M = 34.2%), and shame (M = 34.3%), and looked less within the eye region for disgust (M = 19.7%) and joy (M = 19.5%) faces, relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.0001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.0001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.0001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy (M = 20.9%) faces, and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces, relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.0001.

Figure 4. (A) For each emotional face, fixation time spent within each main ROI. (B) For each neutral face, fixation time spent within each main ROI. * designates t test p < 0.05 relative to the mean for each emotion.

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, which is required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white and areas of lower difference with black.
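The image-subtraction visualization described above is straightforward to sketch. This is a hypothetical Python/NumPy illustration, not the authors' code: given aligned grayscale arrays for the 60%-emotion photograph and its neutral counterpart, larger absolute differences are mapped toward white.

```python
import numpy as np

# Hedged sketch of the diagnostic-difference image: normalize the absolute
# pixel difference between an emotional and a neutral photograph of the
# same (aligned) face so that 0 = black (no change) and 1 = white (maximal).
def diagnostic_map(emotional, neutral):
    diff = np.abs(emotional.astype(float) - neutral.astype(float))
    rng = diff.max() - diff.min()
    if rng == 0:                        # identical images -> all black
        return np.zeros_like(diff)
    return (diff - diff.min()) / rng

# Toy 2x2 "images"; real use would load the face photographs themselves.
emo = np.array([[100, 100], [200, 120]])
neu = np.array([[100, 100], [100, 100]])
m = diagnostic_map(emo, neu)            # the most-changed pixel maps to 1.0
```

Regions that change most between the neutral and emotional expression (e.g., the mouth for joy) come out brightest, which is the sense in which the top row of Figure 5 approximates an ideal observer's diagnostic regions.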

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging the presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).

The middle row of the figure graphically illustrates fixation rates for each of the top five regions across our six emotion conditions, relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), who used the "Bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.0001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36.0%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.0001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, there was likewise still no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.0001, due to relatively high fixation rates for the joy condition (M = 14.0%; disgust was no longer significant) and lower fixation rates for the sadness condition (M = 7.9%; anger was no longer significant), relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to be different from the mean: fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found in the emotional faces were still present in the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces, but this effect may be due to the changing baseline of the mean number of nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations increased to the eye region, except for the anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces became more neutral, these fixation rates decreased and the eyes were instead fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time, or whether fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval, as well as for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between one and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as "for all of the Nth fixations on an image in the current condition, the percentage that are within a given region." This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
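The "fixation rate" definition above can be sketched directly. This is a hypothetical Python illustration (region labels and trial data invented): for a given fixation index N, it counts only trials that actually contain an Nth fixation, which is what removes the drop-off in available fixations at higher N.

```python
from collections import Counter

# Hedged sketch of the paper's "fixation rate": among all Nth fixations in a
# condition, the percentage that land in a given region. Each trial is a list
# of region labels in fixation order (made-up data below).
def fixation_rate(trials, n, region):
    nth = [t[n - 1] for t in trials if len(t) >= n]  # trials with an Nth fixation
    if not nth:
        return 0.0
    return 100.0 * Counter(nth)[region] / len(nth)

trials = [["upper_nose", "eyes", "eyes"],
          ["upper_nose", "upper_lip"],
          ["eyes", "eyes"]]
rate = fixation_rate(trials, 2, "eyes")   # 2 of the 3 second fixations
```

Note that the third-fixation denominator here is 1 (only one trial reaches a third fixation), so the rate is conditional on the fixation existing rather than on the full trial count.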

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first through fourth fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations on emotional (20%–60%) trials for a particular face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions, and at increasing rates across time in the fear and shame conditions. The upper nose was a consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.0001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%), in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information, or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional-image-only trials, and (c) neutral trials.
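The classification scheme above can be sketched with a tiny hand-rolled categorical naive Bayes model. The paper used Matlab's NaiveBayes class; this Python analogue (with invented region labels and a toy two-emotion training set) is only meant to show the idea: each fixation position is treated as a categorical feature, class-conditional region frequencies are estimated with Laplace smoothing, and the most probable emotion label is returned.

```python
from collections import defaultdict
import math

# Hedged sketch of a categorical naive Bayes over fixation-region sequences.
def train_nb(sequences, labels, regions):
    counts = defaultdict(lambda: defaultdict(lambda: 1))  # Laplace prior of 1
    priors = defaultdict(int)
    for seq, lab in zip(sequences, labels):
        priors[lab] += 1
        for pos, reg in enumerate(seq):
            counts[(lab, pos)][reg] += 1      # count region at this position
    return counts, priors, regions

def predict_nb(model, seq):
    counts, priors, regions = model
    best, best_lp = None, -math.inf
    total = sum(priors.values())
    for lab in priors:
        lp = math.log(priors[lab] / total)    # log prior
        for pos, reg in enumerate(seq):       # independent per-position terms
            c = counts[(lab, pos)]
            lp += math.log(c[reg] / sum(c[r] for r in regions))
        if lp > best_lp:
            best, best_lp = lab, lp
    return best

# Toy data: joy trials tend toward the upper lip, anger toward the eyes.
regions = ["eyes", "upper_nose", "upper_lip"]
X = [["upper_nose", "upper_lip"], ["upper_nose", "eyes"],
     ["upper_nose", "upper_lip"], ["upper_nose", "eyes"]]
y = ["joy", "anger", "joy", "anger"]
model = train_nb(X, y, regions)
pred = predict_nb(model, ["upper_nose", "upper_lip"])
```

In the real analysis the training/test split (two thirds/one third) would be redrawn on each of the 100 iterations and accuracy averaged over the test sets.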

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest for the emotional-image-only trial set (with a peak of around 25% accuracy), consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations are most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of the predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest-emotion (60%) images and the neutral-image trials. For the highest-emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.
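The "folding" of the confusion matrix described for Figure 8 amounts to adding the matrix to its transpose, so each off-diagonal cell gives the mutual confusability of an emotion pair regardless of which member was the actual versus predicted label. A minimal sketch, with invented counts:

```python
# Hedged sketch: fold a (rows = actual, cols = predicted) confusion matrix
# along its identity diagonal to measure symmetric, mutual confusability.
def fold(matrix):
    n = len(matrix)
    return [[matrix[i][j] + matrix[j][i] for j in range(n)] for i in range(n)]

# Toy 3-emotion confusion counts (not the paper's data).
conf = [[40, 10, 5],
        [ 2, 30, 8],
        [ 1,  4, 50]]
mutual = fold(conf)   # mutual[i][j] == mutual[j][i] for every pair
```

Pairs with large folded off-diagonal values (e.g., joy/disgust in the paper) are the ones the classifier confuses in either direction.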

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly consider more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual-and-predicted or predicted-and-actual. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition: anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame of 60%. We determined the specific emotional intensity per emotion during piloting, in order to ensure participants could perform above chance but not at ceiling (~70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
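Pixels-per-degree figures like the one above follow from the monitor's resolution, its physical width, and the viewing distance. A hedged sketch of that conversion (the screen width below is an illustrative guess for a 17-in. CRT's visible area, not a value stated in the paper):

```python
import math

# Hedged sketch: convert screen geometry to pixels per degree of visual angle.
# At distance d, one degree subtends 2 * d * tan(0.5 deg) centimeters.
def pixels_per_degree(h_resolution_px, screen_width_cm, distance_cm):
    cm_per_degree = 2 * distance_cm * math.tan(math.radians(0.5))
    return h_resolution_px / screen_width_cm * cm_per_degree

# Illustrative values: 1024 px across an assumed ~32.5 cm visible width,
# viewed from 56 cm.
ppd = pixels_per_degree(1024, 32.5, 56.0)
```

With these assumed numbers the result is on the order of 30 px/°, the same ballpark as the 28.97 px/° reported for the actual setup.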

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks of trials (one block per emotion type) shown in random order and repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials (six neutral and six emotional face trials) presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials a black box covered the eyes, and for the other half, the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether overall performance differed according to whether the mouth (i.e., eyes covered by a black box) or the eyes (i.e., mouth covered by a black box) were visible. For joy, we found that performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found that performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65.0% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1: the eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition in faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lips. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example, in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent as the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but the region was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations


were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions of Experiment 1, emotional judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust) as well as for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotional recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns

that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotional recognition. These abnormal eye-movement patterns contribute to social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 1573–3432.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.

Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS One, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles." Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



Emotional faces

For each region, fixation rates on emotional faces (those with 20%, 40%, or 60% emotional intensity) were submitted to a one-way ANOVA with emotion block as a factor. In these trials, eye movements can be affected by both the emotional content of the face images (stimulus-driven information) and the fact that the observer is currently seeking that emotion type (goal-driven information). Figure 4A shows fixation rates for each face region across the six emotions. We summarize the major findings at the end of this section, with additional information and statistical analyses detailed in the supplemental materials.

For the eyes, there was a main effect of face emotion, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the eye region in facial expressions of anger (M = 35.2%), fear (M = 30.8%), sadness (M = 34.2%), and shame (M = 34.3%), and looked less within the eye region for disgust (M = 19.7%) and joy (M =

Figure 4. (A) For each emotional face, fixation time spent within each main ROI. (B) For each neutral face, fixation time spent within each main ROI. * designates t test p < 0.05 relative to the mean for each emotion.


19.5%) faces relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy (M = 20.9%) faces and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.001.

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, which is required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white and lower difference with black, respectively.

The middle row of the figure graphically illustrates fixation rates for each of the top five regions across our

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).


six emotion conditions, relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "Bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.
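The image-subtraction correlate described above can be sketched in a few lines. This is our own illustrative reconstruction, not the authors' code; it assumes two pixel-aligned grayscale photographs (neutral and emotional) of the same identity, supplied as NumPy arrays:

```python
import numpy as np

def diagnostic_difference(neutral: np.ndarray, emotional: np.ndarray) -> np.ndarray:
    """Absolute grayscale difference between a neutral face and its emotional
    version, rescaled so white (255) marks the regions that change most."""
    diff = np.abs(emotional.astype(float) - neutral.astype(float))
    return (255 * diff / diff.max()).astype(np.uint8)
```

Because the technique compares raw pixel values, any head movement between the two photographs would dominate the difference image, which is why a face identity with minimal movement across photographs is needed.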

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.001, due to relatively high fixation rates for the joy (M = 14%) condition (disgust was no longer significant) and lower fixation rates for the sadness (M = 7.9%) condition (anger was no longer significant) relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to be different from the mean. Fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found in the emotional faces were still present in the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces, but this effect may be due to the changing baseline of the mean number of fixations for nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations increased to the eye region, except for the anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated at the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that


appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease, and instead the eyes are fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time, or if fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval, as well as for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between one and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as "for all of the Nth fixations on an image in the current condition, the percentage that are within a given region." This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
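Under that definition, rates are computed per fixation index rather than per trial. The following minimal sketch is our own illustration (not the authors' code), with each trial represented as an ordered list of region labels; conditioning on the Nth fixation is what removes the drop-off in available fixations:

```python
def fixation_rate(trials, region, n):
    """Of all Nth fixations (1-indexed) made on images in a condition,
    the percentage that fall within `region`. Trials with fewer than n
    fixations contribute nothing, so the shrinking pool of long trials
    does not dilute the rate."""
    nth = [t[n - 1] for t in trials if len(t) >= n]
    return 100.0 * nth.count(region) / len(nth) if nth else 0.0
```

For example, with trials `[["UN", "eyes"], ["UN", "UL"], ["eyes"]]`, the first-fixation rate for the upper nose is based on three first fixations, while the second-fixation rate for the upper lip is based on only the two trials that had a second fixation.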

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first one through four fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations made to the emotional (20%–60%) versions of a particular face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions, and at increasing rates across time in the fear and shame conditions. The upper nose was a

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.


consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust, and to a weaker degree fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
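The coding step above can be illustrated as follows. This is a sketch of our own, not the authors' analysis code; the `_L`/`_R` suffixes on lateral region labels are hypothetical stand-ins, since the paper names only the excluded central regions (UN, LN, UL, LL):

```python
CENTER_REGIONS = {"UN", "LN", "UL", "LL"}  # upper/lower nose, upper/lower lip

def left_bias(trials, fixation_number=1):
    """Percentage of Nth fixations landing on the left side of the face,
    after excluding fixations to the central (midline) regions."""
    lateral = [t[fixation_number - 1] for t in trials
               if len(t) >= fixation_number
               and t[fixation_number - 1] not in CENTER_REGIONS]
    lefts = sum(1 for r in lateral if r.endswith("_L"))  # hypothetical label scheme
    return 100.0 * lefts / len(lateral) if lateral else 0.0
```

A value above 50% for the first fixation and near or below 50% for later fixations would reproduce the leftward-then-rightward pattern reported above.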

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional image-only trials, and (c) neutral trials.
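The paper used Matlab's NaiveBayes class; a comparable pipeline can be sketched with scikit-learn's `CategoricalNB`. Everything here is an illustrative assumption (region codes, simulated trials, effect sizes), not the actual data; only the structure (categorical fixation sequences, 2/3–1/3 splits, 100 iterations) follows the text.

```python
# Sketch of the classification analysis: naive Bayes over categorical
# fixation-region sequences, averaged over 100 independent train/test splits.
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
REGIONS = 5    # eyes, upper nose, lower nose, upper lip, nasion
EMOTIONS = 6   # joy, sadness, fear, anger, disgust, shame
N_FIX = 8      # first N fixation regions per trial

def simulate(n_trials=600):
    """Simulated trials whose region distribution depends weakly on emotion."""
    y = rng.integers(0, EMOTIONS, n_trials)
    probs = rng.dirichlet(np.ones(REGIONS), EMOTIONS)  # per-emotion region bias
    X = np.array([rng.choice(REGIONS, N_FIX, p=probs[lab]) for lab in y])
    return X, y

X, y = simulate()
accs = []
for _ in range(100):  # 100 fit/test iterations, 2/3 train / 1/3 test
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=1/3)
    clf = CategoricalNB(min_categories=REGIONS).fit(Xtr, ytr)
    accs.append(clf.score(Xte, yte))
print(np.mean(accs))  # chance for six emotions would be 1/6
```

Any emotion-dependent bias in the region distribution lets the classifier beat the 1/6 chance level, which is the logic behind the accuracy figures reported below.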

Figure 7 depicts classification accuracy for each condition, as well as chance prediction (for six emotions, 16.6%). Emotional-image trial classification accuracy of test trials was highest (with a peak of around 25% accuracy), consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower, but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower, but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.


these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the pattern for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
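The "folding" of the confusion matrix described in the Figure 8 caption (summing each cell with its mirror across the identity diagonal to measure mutual confusability, regardless of which member of a pair is actual and which is predicted) can be sketched as below. The small matrix is made up for illustration; it is not the paper's data.

```python
# Sketch of folding a confusion matrix along its identity diagonal so that
# cm[i, j] and cm[j, i] are aggregated into a single mutual-confusability cell.
import numpy as np

def fold(cm):
    """Sum cm[i, j] and cm[j, i] into the lower triangle; zero the upper
    triangle; keep the correct-classification counts on the diagonal."""
    folded = np.tril(cm + cm.T)
    np.fill_diagonal(folded, np.diag(cm))
    return folded

# toy 3-class confusion matrix (rows = actual, columns = predicted)
cm = np.array([[8, 1, 3],
               [2, 7, 1],
               [4, 0, 6]])
f = fold(cm)
print(f)  # f[2, 0] now holds cm[2, 0] + cm[0, 2] = 7
```

The most mutually confusable pair is then simply the off-diagonal maximum of the folded table, which is how pairs like joy/disgust stand out in the analysis above.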

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information either from the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20%, and shame 60%. We determined the specific emotional intensity per emotion during piloting, in order to ensure participants could perform above chance but not at ceiling (~70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
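As a worked example of the apparatus geometry, the pixels-per-degree figure follows from the viewing distance and the screen's pixel density. The distance (56 cm) and horizontal resolution (1024 px) come from the text; the visible screen width (~34.55 cm) is our assumption, back-of-the-envelope for a 17-in. CRT, chosen to illustrate the calculation.

```python
# Sketch: converting screen pixels to degrees of visual angle (small-angle
# geometry). screen_width_cm is an assumed value, not from the paper.
import math

def pixels_per_degree(distance_cm, screen_width_cm, h_resolution):
    cm_per_degree = distance_cm * math.tan(math.radians(1))  # cm spanned by 1 deg
    return (h_resolution / screen_width_cm) * cm_per_degree  # px/cm * cm/deg

ppd = pixels_per_degree(distance_cm=56, screen_width_cm=34.55, h_resolution=1024)
print(round(ppd, 2))  # close to the reported 28.97 pixels/degree
```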

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks of trials (one block of trials per emotion type) shown in random order, repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials, a black box covered either the eyes or the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) and sadness (M = 71.7% vs. M = 71.7%).
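The comparisons above are within-subject, so each is a paired t-test over the ten participants' accuracies. A minimal sketch with `scipy.stats.ttest_rel` follows; the per-participant accuracies are fabricated for illustration (roughly matching the joy means reported above), not the actual data.

```python
# Sketch of one occlusion comparison: paired t-test on per-participant accuracy
# with the mouth visible vs. covered. Values below are illustrative assumptions.
import numpy as np
from scipy.stats import ttest_rel

# proportion correct on joy trials for n = 10 participants (fabricated)
mouth_visible = np.array([0.68, 0.62, 0.60, 0.71, 0.58, 0.66, 0.63, 0.59, 0.65, 0.61])
mouth_covered = np.array([0.55, 0.50, 0.49, 0.60, 0.47, 0.54, 0.52, 0.48, 0.53, 0.49])

t, p = ttest_rel(mouth_visible, mouth_covered)
print(t, p)  # a positive t indicates better performance with the mouth visible
```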

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the eye fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important to identify fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotional recognition within faces.
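The coverage statistic above is simply the fraction of all fixation labels that fall in the top five regions. A minimal sketch, with simulated fixation labels whose weights are assumptions chosen only to mimic the reported ~88% coverage:

```python
# Sketch of the region-coverage computation: share of fixations landing in the
# five main regions. Labels and weights are simulated, not the actual data.
import numpy as np

rng = np.random.default_rng(2)
REGIONS = ["eyes", "upper nose", "lower nose", "upper lip", "nasion",
           "left cheek", "right cheek", "chin", "forehead"]
TOP5 = REGIONS[:5]

# simulate 10,000 fixation labels, heavily weighted toward the main regions
weights = np.array([30, 20, 20, 12, 6, 3, 3, 3, 3], dtype=float)
labels = rng.choice(REGIONS, size=10_000, p=weights / weights.sum())

coverage = np.isin(labels, TOP5).mean()
print(coverage)  # the paper reports 88.03% for its 21-region coding scheme
```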

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lips. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent as the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, demonstrating wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important, by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions of Experiment 1, emotional judgment performance was impaired when the mouth was covered for emotions for which the mouth was more diagnostic (joy, disgust), as well as when the eyes were covered for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions, but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing, and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotional recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotional recognition. These abnormal eye-movement patterns contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition? Psychological Bulletin, 128(2), 243–249.

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.

Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS One, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles". Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



19.5%) faces relative to the mean (M = 28.9%), all ts(50) > 8.2, ps < 0.0001. For the upper nose, there was an effect of emotion condition, F(5, 250) = 4.7, p < 0.0001, whereby participants looked marginally less at joyful faces (M = 18.7%) relative to the mean (M = 21.8%), all ts(50) > 1.58, all ps < 0.1. For the lower nose, there was no effect of emotion condition, F < 1. For the upper lip, there was an effect of emotion condition, F(5, 250) = 27.2, p < 0.0001, whereby participants looked longer at the upper lip for disgust (M = 15.5%) and joy faces (M = 20.9%), and less at the upper lip for anger (M = 8.2%) and sad (M = 6.8%) faces, relative to the mean (M = 12.1%), all ts(50) > 2.4, all ps < 0.02. For the nasion, there was an effect of emotion condition, F(5, 250) = 3.1, p < 0.01, whereby participants looked less within the nasion for fear faces (M = 3.8%) relative to the mean (M = 6.1%), t(50) = 3.9, p < 0.0001.

To better visualize the differences among emotion conditions, Figure 5 graphically illustrates fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotions. The top row of the figure depicts a correlate of the most diagnostic information in the image that an ideal observer might use to distinguish between emotional and neutral faces: an image subtraction of the 60% emotional image from the neutral image for a single face identity (this face was chosen because it had uniquely low head movement across photographs, required for this technique to reveal a useful contrast). Areas of greater difference are depicted with white, and lower difference with black, respectively.
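The image-subtraction step described above can be sketched as follows. The toy arrays stand in for the actual grayscale photographs; the single changed row plays the role of a mouth region that differs between the emotional and neutral version of the face.

```python
# Sketch of the ideal-observer image subtraction: absolute pixel difference
# between emotional and neutral photographs of the same face, rescaled so that
# white (1.0) marks the most diagnostic regions. Arrays are toy stand-ins.
import numpy as np

def diagnostic_map(emotional, neutral):
    """Absolute difference rescaled to [0, 1] (white = most diagnostic)."""
    diff = np.abs(emotional.astype(float) - neutral.astype(float))
    return diff / diff.max() if diff.max() > 0 else diff

# toy 4x4 "images": only the bottom (mouth) row changes with emotion
neutral = np.full((4, 4), 100, dtype=np.uint8)
emotional = neutral.copy()
emotional[3, :] = 180  # the smiling mouth brightens the bottom row

dmap = diagnostic_map(emotional, neutral)
print(dmap[3, 0], dmap[0, 0])  # mouth row -> 1.0, unchanged region -> 0.0
```

Casting to float before subtracting avoids unsigned-integer wraparound, which would otherwise corrupt the difference image for real 8-bit photographs.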

The middle row of the figure graphically illustrates fixation rates for each of the top five regions across our

Figure 5. Top row: Image subtractions between the emotional and neutral versions of a particular face from the image set used in Experiment 1, revealing areas of maximal diagnostic information (white) for judging presence of emotion in the face. Middle row: Emotional fixation rates for each of the top five regions across the six emotion conditions, relative to the average rate of region fixation collapsed across all emotional face trials. Bottom row: Diagnostic information for judgment of each emotion type using the "Bubbles" technique (adapted with permission from Smith et al., 2005).


six emotion conditions, relative to the average rate of region fixation collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any differences in fixation rates represent solely goal-driven strategies resulting from seeking a given emotion, and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types, but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.0001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36.0%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces, relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.0001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.0001, due to relatively high fixation rates for the joy (M = 14.0%) condition (disgust was no longer significant) and lower fixation rates for the sadness (M = 7.9%) condition (anger was no longer significant), relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to be different from the mean: Fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found in the emotional faces were still present in the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces, but this effect may be due to the changing baseline of the mean number of nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations increased to the eye region, except for the anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated at the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that

Journal of Vision (2014) 14(13):14, 1–16. Schurgin et al.

Downloaded from http://jov.arvojournals.org/pdfaccess.ashx?url=/data/Journals/JOV/933685 on 07/20/2015.

appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease and instead the eyes are fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or if fixation patterns are more

variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval as well as for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between one and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as "for all of the Nth fixations on an image in the current condition, the percentage that are within a given region." This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
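Under this definition, computing a fixation rate is straightforward; the following minimal sketch (with a hypothetical data layout and toy region labels, not the authors' code) makes the denominator explicit: only trials that actually contain an Nth fixation contribute.

```python
from collections import Counter

def fixation_rate(trials, region, n):
    """Percentage of all Nth fixations (1-indexed) that land in `region`.

    `trials` is a list of fixation sequences, each a list of region labels
    in temporal order (an assumed data format, not the authors' own).
    """
    # Keep only trials that have an Nth fixation, then take that fixation.
    nth = [t[n - 1] for t in trials if len(t) >= n]
    if not nth:
        return 0.0
    return 100.0 * Counter(nth)[region] / len(nth)

# Toy example: three trials in one emotion condition.
trials = [
    ["upper nose", "eyes", "upper lip"],
    ["upper nose", "eyes"],
    ["eyes", "lower nose", "eyes"],
]
print(fixation_rate(trials, "eyes", 2))  # 2 of 3 second fixations are on the eyes
```

Normalizing within each fixation index in this way is what removes the bias from later fixations being rarer, as described above.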

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first through fourth fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations on emotional (20%–60%) trials for a particular face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions and at increasing rates across time in the fear and shame conditions. The upper nose was a

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation, the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.


consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.0001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
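The left/right coding described above can be sketched as follows; this is a minimal illustration with made-up coordinates and region names, since the authors' exact region definitions and screen geometry are not reproduced here.

```python
# Midline regions excluded from the lateral-bias analysis
# (upper nose, lower nose, upper lip, lower lip).
MIDLINE = {"upper nose", "lower nose", "upper lip", "lower lip"}

def percent_leftward(trials, n):
    """Percentage of Nth fixations (1-indexed) that land left of the face
    midline, excluding midline regions. Each fixation is an (x, region)
    pair with x measured relative to the midline (assumed convention:
    negative x = viewer's left)."""
    lateral = [fix for t in trials if len(t) >= n
               for fix in [t[n - 1]] if fix[1] not in MIDLINE]
    if not lateral:
        return None
    return 100.0 * sum(x < 0 for x, _ in lateral) / len(lateral)

# Toy data: x offsets in degrees of visual angle (hypothetical).
trials = [
    [(-1.2, "left eye"), (0.0, "upper nose"), (0.8, "right cheek")],
    [(-0.9, "left eye"), (0.4, "right eye")],
    [(0.0, "lower nose"), (-0.5, "left eye")],
]
print(percent_leftward(trials, 1))  # both lateral first fixations are leftward
```

Computing this percentage separately for each fixation index is what lets the analysis isolate the first fixation as the source of the leftward bias.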

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of MATLAB), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional-image-only trials, and (c) neutral trials.
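The classifier is a standard construction; as a rough sketch (in Python rather than the authors' MATLAB, with toy region sequences and labels that are purely illustrative), a naive Bayes model over the first N fixation regions, treating each fixation position as an independent categorical feature, might look like this:

```python
import math
from collections import Counter

def train_nb(X, y, alpha=1.0):
    """Fit a categorical naive Bayes model.

    X: list of fixed-length fixation-region sequences (toy stand-in for
    the paper's per-trial fixation data); y: emotion labels.
    Returns a predict(seq) -> label function."""
    classes = sorted(set(y))
    n_pos = len(X[0])                       # number of fixation positions
    vocab = sorted({r for seq in X for r in seq})
    # counts[c][i] tallies regions seen at fixation position i for class c.
    counts = {c: [Counter() for _ in range(n_pos)] for c in classes}
    for seq, label in zip(X, y):
        for i, r in enumerate(seq):
            counts[label][i][r] += 1

    def predict(seq):
        def score(c):
            n_c = y.count(c)
            s = math.log(n_c / len(y))      # class prior
            for i, r in enumerate(seq):     # per-position likelihoods,
                s += math.log((counts[c][i][r] + alpha)          # Laplace-
                              / (n_c + alpha * len(vocab)))      # smoothed
            return s
        return max(classes, key=score)
    return predict

# Toy training set: two-fixation sequences from two emotion conditions.
X = [["eyes", "eyes"], ["eyes", "upper lip"],
     ["upper lip", "upper lip"], ["upper lip", "eyes"]]
y = ["anger", "anger", "joy", "joy"]
predict = train_nb(X, y)
print(predict(["eyes", "eyes"]))  # -> anger
```

Laplace smoothing (alpha) keeps a region never seen at some position for some class from zeroing out that class, which matters here because some regions are rarely fixated at a given fixation index.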

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest (with a peak of around 25% accuracy) for the emotional-image-only trial set, consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8): When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.


these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
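Mutual confusability of an emotion pair, regardless of which member was the actual label and which the prediction, can be read off a confusion matrix by summing each off-diagonal cell with its transpose partner (the folding used in Figure 8). A minimal sketch, with made-up counts rather than the paper's data:

```python
def fold_confusions(M, labels):
    """Collapse an n-by-n confusion matrix into mutual-confusability counts
    for unordered label pairs, ignoring actual-vs-predicted direction."""
    pairs = {}
    for i, a in enumerate(labels):
        for j, b in enumerate(labels):
            if i < j:
                # M[i][j]: a confused as b; M[j][i]: b confused as a.
                pairs[(a, b)] = M[i][j] + M[j][i]
    return pairs

# Toy confusion matrix (rows = actual, columns = predicted).
labels = ["joy", "disgust", "sadness"]
M = [[30, 8, 2],
     [9, 25, 4],
     [1, 5, 28]]
print(fold_confusions(M, labels))
# joy/disgust are mutually confused 17 times; joy/sadness only 3
```

Folding discards the direction of each error, which is appropriate when the question is simply which pairs of emotions produce similar fixation patterns.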

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (approximately 70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks of trials (one block of trials per emotion type) shown in random order, repeated twice for a total of 144 trials per participant. Each block consisted of 12 trials, comprising six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials,

a black box covered either the eyes or the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) and sadness (M = 71.7% vs. M = 71.7%).

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition within faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example, in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotion recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. At the same time, the values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, suggesting an important role for stimulus-driven factors in eye movements as well.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent at the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but it was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations


were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotion judgment task. Consistent with the predictions of Experiment 1, emotion judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust) as well as for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotion recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns

that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotion recognition. These abnormal eye-movement patterns may contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic


groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 1573–3432.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.


Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles." Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.



six emotion conditions relative to the average rate of region fixation, collapsed across all emotions.

The bottom row of the figure contains images from Smith and colleagues (2005), using the "bubbles" technique to find the maximally diagnostic regions of faces for judging each emotion. The similarities within columns of this figure are striking, despite the independence of the data sources for each. For example, in the middle row, note the relatively high fixation rates on the lips for joy and disgust, in contrast to the relatively high fixation rates on the eyes for anger, sadness, and shame. Both the top and bottom rows confirm the relatively high diagnostic value of these areas for those particular emotion judgments.

Neutral faces

In each emotion block, half of the faces were nonemotional (neutral). Because these faces were identical across blocks, any difference in fixation rates represents solely goal-driven strategies resulting from seeking a given emotion and cannot be due to stimulus-level differences in the images. The results will show that weaker versions of the emotion-specific strategies endure across these neutral faces. Figure 4B depicts fixation rates within each block across these neutral faces. For each region of the face, we report the results of a one-way ANOVA seeking differences among the six emotion block types but focusing only on the neutral faces.

For the eyes, the emotion block analysis revealed an effect of emotion condition across neutral faces, F(5, 250) = 7.7, p < 0.001, whereby participants looked longer for anger (M = 33.4%), fear (M = 34.3%), sad (M = 36%), and shame (M = 36.1%) faces. By contrast, participants looked less for disgust (M = 28.6%) and joy (M = 29.1%) faces relative to the mean (M = 32.9%), all ts(50) > 10.3, all ps < 0.001. For the upper nose, the emotion block analysis revealed that there was still no effect of emotion condition across neutral faces, F < 1. For the lower nose, the emotion block analysis likewise revealed no effect of emotion condition across neutral faces, F < 1. For the upper lip, the emotion block analysis revealed that there was still an effect of emotion condition across neutral faces, F(5, 250) = 8.6, p < 0.001, due to relatively high fixation rates for the joy condition (M = 14%; disgust was no longer significant) and lower fixation rates for the sadness condition (M = 7.9%; anger was no longer significant) relative to the mean (M = 9.9%), all ts(50) > 1.97, all ps < 0.055. For the nasion, the emotion block analysis revealed that there was still a marginal effect of emotion condition, F(5, 250) = 2.3, p = 0.04. Although the pattern was similar, the change in the mean value caused a new set of emotion conditions to be different from the mean: Fear was no longer significantly different, but anger (M = 8.7%) had relatively high fixation rates, t(50) = 1.84, p = 0.07.

In summary, even when the neutral faces were identical across emotion judgment blocks (e.g., anger, joy), there were still effects of emotion judgment type on patterns of fixation. In fact, the majority of effects found in the emotional faces were still present in the neutral faces, although the effects were not as robust. One exception was fixations to the nasion, where neutral faces showed a pattern of fixation not present for emotional faces, but this effect may be due to the changing baseline of the mean number of nasion fixations overall. The fact that most fixation trends remained for neutral faces that were identical across blocks suggests that a substantial portion of the effect of seeking a specific emotion is due to goal-driven influences on fixation patterns.

Summary of intensity level analyses

In addition to dividing face trials into emotional (20%, 40%, and 60% emotional) and neutral (0% emotional) sets, we also examined differences among the emotional face trials as they transitioned from 20% to 60%. We present a summary of this analysis here and provide more detailed information and statistical analyses in the supplemental materials. As the intensity of angry expressions increased, fixations increased to the eye region, except for the anger faces shown at 60% intensity, which showed a reduction in fixations to the eye region. It is possible that the eyes were most diagnostic when the emotion was more ambiguous. A similar pattern was observed at the nasion, where more neutral faces were fixated to a greater extent than more emotional faces. For disgust judgment blocks, there was a particularly strong tradeoff in fixation position. When faces were more neutral, participants fixated at the eyes more frequently. However, as the intensity of the disgust expression increased, participants looked less at the eye region and more toward the upper nose and upper lip. As the intensity of fearful expressions increased, participants looked more toward the upper nose and upper lip and less at the nasion. As the intensity of joyful expressions increased, there was another particularly strong effect, as fixations increased to the upper lip and decreased to the eyes and lower nose. As the intensity of sad expressions increased, the distribution of fixations over the five face regions did not significantly change. As the intensity of shameful expressions increased, at 60% intensity there were more fixations over the upper lip, trading off with fewer fixations over the lower nose.

Two of the largest effects of emotional intensity were for disgust and joy expressions. For disgust, as the emotional intensity conveyed in the face increased, people showed greater fixation particularly to the upper nose (close to the particularly salient wrinkle that appears between the eyes during an expression of disgust), and for joy, people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease, and instead the eyes are fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or whether fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval as well as for each fixation number. Although the patterns were highly similar, the fixation number analysis appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between 1 and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as "for all of the Nth fixations on an image in the current condition, the percentage that are within a given region." This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
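The "fixation rate" measure defined above (for each fixation number N, the percentage of all Nth fixations landing in a given region) can be sketched in a few lines. This is an illustration under assumed data structures, not the authors' analysis code; the region labels are invented:

```python
from collections import Counter

def fixation_rates(trials, region, n_max=8):
    """For each fixation number n (0-indexed), return the percentage of all
    nth fixations that landed in `region`. Each trial is a sequence of
    region labels, one per fixation; trials too short to contain an nth
    fixation simply drop out of that denominator."""
    rates = []
    for n in range(n_max):
        nth = [t[n] for t in trials if len(t) > n]
        rates.append(100.0 * Counter(nth)[region] / len(nth) if nth else None)
    return rates

# Hypothetical trials: sequences of fixated regions
trials = [["UN", "eyes", "UL"], ["UN", "UL"], ["eyes", "eyes", "UL", "UN"]]
fixation_rates(trials, "UN", n_max=3)
```

Because each fixation number gets its own denominator, the general decline in the number of available later fixations does not depress later rates, which is the point of the normalization.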

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first through fourth fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns, but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations made to one emotional (20%–60%) face identity presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions and at increasing rates across time in the fear and shame conditions. The upper nose was a consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Figure 6. Fixation distributions for the first three fixations on emotional (20%–60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Paré, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding the central regions of the face (UN, LN, UL, LL). An ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation, across block types, had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
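The left/right coding could be implemented as follows. This is a sketch with invented data structures: each fixation is assumed to carry a horizontal offset from the face midline (negative values to the viewer's left), and the central regions (UN, LN, UL, LL) are excluded as in the analysis:

```python
CENTER = {"UN", "LN", "UL", "LL"}

def leftward_rate_by_fixation(trials, n_max=8):
    """Percentage of nth fixations falling left of the midline, counting
    only lateral fixations (central regions excluded)."""
    rates = []
    for n in range(n_max):
        lateral = [t[n] for t in trials if len(t) > n and t[n][0] not in CENTER]
        if lateral:
            rates.append(100.0 * sum(x < 0 for _, x in lateral) / len(lateral))
        else:
            rates.append(None)
    return rates

# Each fixation is (region, x offset); negative x = viewer's left
trials = [[("eyes", -40), ("UN", 0), ("eyes", 25)],
          [("eyes", -15), ("eyes", 10)]]
leftward_rate_by_fixation(trials, n_max=2)
```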

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional-image-only trials, and (c) neutral trials.
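The study used Matlab's NaiveBayes class; purely as an illustration of the same idea, a minimal naive Bayes over fixed-length fixation-region sequences with add-one (Laplace) smoothing might look like the sketch below. All data, labels, and class names here are hypothetical:

```python
import math
from collections import Counter, defaultdict

class SeqNaiveBayes:
    """Naive Bayes for fixed-length categorical sequences, e.g., the face
    region of the 1st..Nth fixation on a trial."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.log_prior = {c: math.log(y.count(c) / len(y)) for c in self.classes}
        self.vocab = [sorted({x[i] for x in X}) for i in range(len(X[0]))]
        self.counts = defaultdict(Counter)   # (class, position) -> region counts
        for x, c in zip(X, y):
            for i, r in enumerate(x):
                self.counts[(c, i)][r] += 1
        return self

    def predict(self, x):
        def score(c):   # log P(class) + sum_i log P(region_i | class), smoothed
            s = self.log_prior[c]
            for i, r in enumerate(x):
                cnt = self.counts[(c, i)]
                s += math.log((cnt[r] + 1) / (sum(cnt.values()) + len(self.vocab[i])))
            return s
        return max(self.classes, key=score)

# Hypothetical training data: regions of the first two fixations per trial
X = [("UL", "UL"), ("UL", "eyes"), ("eyes", "eyes"), ("eyes", "UN")]
y = ["joy", "joy", "anger", "anger"]
clf = SeqNaiveBayes().fit(X, y)
clf.predict(("UL", "UL"))
```

In the study's pipeline, such a model would be refit on a random two-thirds training split and scored on the held-out third, averaging accuracy over repeated splits.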

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy was highest for the emotional-image-only trial set (with a peak of around 25% accuracy), consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest-emotion (60%) images and the neutral image trials. For the highest-emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the pattern for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
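Folding a confusion matrix along its identity diagonal, as in the bottom row of Figure 8, amounts to summing each off-diagonal cell with its transpose partner, so a pair's score no longer depends on which member was the actual and which the predicted label. A minimal sketch with hypothetical counts:

```python
def fold_confusions(conf, labels):
    """Sum conf[i][j] with conf[j][i] for every unordered label pair,
    yielding mutual confusability scores."""
    folded = {}
    for i, a in enumerate(labels):
        for j in range(i + 1, len(labels)):
            folded[(a, labels[j])] = conf[i][j] + conf[j][i]
    return folded

labels = ["joy", "disgust", "anger"]
conf = [[30, 12, 2],   # rows = actual, columns = predicted (hypothetical)
        [10, 28, 4],
        [1, 3, 35]]
fold_confusions(conf, labels)[("joy", "disgust")]  # 12 + 10 = 22
```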

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions (the eyes and the mouth) do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition: anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20%, and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (~70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
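As a rough check on the reported 28.97 pixels/° of visual angle, pixels-per-degree follows from the viewing distance and the monitor's pixel pitch. In this sketch the ~34.5 cm visible display width is an assumption (it is not reported in the text):

```python
import math

def pixels_per_degree(viewing_cm, screen_width_cm, h_resolution):
    """Pixels subtended by 1 degree of visual angle at the screen center."""
    cm_per_deg = 2 * viewing_cm * math.tan(math.radians(0.5))
    return cm_per_deg * h_resolution / screen_width_cm

round(pixels_per_degree(56, 34.5, 1024), 1)  # ~29, close to the reported 28.97
```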

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks of trials (one block per emotion type) shown in random order and repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials (six neutral and six emotional face trials) presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials, a black box covered the eyes; for the other half, it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by the black box) or the eyes (i.e., mouth covered by the black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important to identify fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Paré, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotional recognition within faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example, in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent at the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but the region was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation, regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions from Experiment 1, emotional judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust) as well as for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotion recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotion recognition. These abnormal eye-movement patterns contribute to social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68-72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669-672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724-732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163-175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355-370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698-718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443-446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1-13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535-1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261-2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64-71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183-187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1-14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98-106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43-53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998-1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1-6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250-270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897-1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338-348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611-623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271-278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208-224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437-442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 1573-3432.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353-366.

Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580-2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334-1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184-189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles". Neuropsychologia, 45, 144-151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994-3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843-850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194-197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313-326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101-114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399-458.


appears between the eyes during an expression of disgust), and for joy people showed greater fixation to the upper lip (the location of the smile). In both cases, when faces become more neutral, these fixation rates decrease and instead the eyes are fixated more frequently.

Gaze patterns across time

The analyses above collapse the first four fixations across viewing time. Here we examine how these patterns evolve over that time. Understanding the time course of fixation patterns can establish whether the patterns observed above are stable and persistent across viewing time or if fixation patterns are more variable and dynamic. We examined both the probability of fixating on a given region of each type of emotional face within each 250-ms time interval as well as for each fixation number. Although the patterns were highly similar, fixation number appeared to reveal more differences, especially in the earlier viewing period. In contrast, for the time interval analysis, differences in fixation patterns among conditions appeared to be more "smeared" across time, suggesting that the fixation number analysis might reveal more about participants' fixation patterns.

The average duration of each fixation was 301 ms, and participants made between one and 15 fixations. Although participants made up to 15 fixations, there were few trials with this many fixations. Because the average number of fixations was 8.3 per image, we examined only the first through the eighth fixation. For each of the following analyses, we use the term "fixation rate," defined as, for all of the Nth fixations on an image in the current condition, the percentage that are within a given region. This manner of calculating fixation rates removed the influence of the general drop in the number of total fixations available as fixation number increases (i.e., there are always more third fixations than fourth fixations).
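The fixation-rate definition can be made concrete with a short sketch. This is an illustrative Python re-implementation, not the authors' analysis code; the region labels and the list-of-trials data layout are assumptions.

```python
from collections import Counter, defaultdict

def fixation_rates(trials):
    """For all of the Nth fixations across trials, compute the percentage
    that fall within each face region. `trials` is a list of per-trial
    region sequences, e.g. ["UN", "LN", "UL"] (labels are illustrative)."""
    counts = defaultdict(Counter)  # fixation index -> Counter of regions
    for seq in trials:
        for n, region in enumerate(seq, start=1):
            counts[n][region] += 1
    # Normalizing within each fixation index removes the influence of the
    # general drop in available fixations as the index increases.
    return {n: {region: 100.0 * k / sum(c.values()) for region, k in c.items()}
            for n, c in counts.items()}
```

Because each index is normalized by the number of trials that actually reached an Nth fixation, a trial that ends after two fixations never dilutes the third-fixation rates.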

Across previous data sets using similar images and tasks, as well as the present data set, we observed that fixation patterns differed primarily among the first one through four fixations. The first few fixations were typically more variable across time and condition, and the remaining fixation rates tended to stabilize. These later fixations reveal differences among conditions but vary far less over time. Therefore, we decided to focus our analyses on the first four fixations (a detailed analysis across all eight fixations is included in the supplemental section).

In all of the following analyses, we compare fixation rates for each region across the six emotion recognition conditions. To maximize the differences among conditions for these analyses involving time, we use only the emotional face images and omit the neutral images. A separate analysis of the neutral emotion images revealed similar patterns but to a weaker extent, similar to the neutral image analysis reported above.

Patterns of gaze across time by emotion

Figure 6 provides fixation distributions for the first three fixations on emotional (20%-60%) face image trials for a particular face presented in the study. Overall, the eyes were fixated less across all fixations in the joy and disgust conditions, especially during the first three fixations. In contrast, the eyes were fixated at uniformly high rates across time in the anger and sadness conditions, and at increasing rates across time in the fear and shame conditions. The upper nose was a consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust and, to a weaker degree, fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Figure 6. Fixation distributions for the first three fixations on emotional (20%-60%) face image trials for one face identity. As participants made more fixations, distinct patterns emerged across emotions in which regions were fixated most. Specifically, at the first fixation the upper nose was a particularly popular fixation location, although there was still considerable variability at the region across emotions. As participants made additional fixations, a general pattern emerged such that there was increased fixation at the upper lip for joy and disgust, whereas there was increased fixation at the eyes for anger, sadness, and shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%) in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
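The left/right coding step can be sketched as follows. This is a minimal Python sketch under illustrative assumptions: each fixation is stored as a (region, side) tuple, and the midline region labels match those used in the text.

```python
from collections import defaultdict

CENTER = {"UN", "LN", "UL", "LL"}  # midline regions excluded from the analysis

def left_bias_by_fixation(trials):
    """Percentage of left-side fixations at each fixation index, excluding
    fixations that land on midline regions. Each fixation is a
    (region, side) tuple with side "L" or "R" (illustrative layout)."""
    tallies = defaultdict(lambda: [0, 0])  # fixation index -> [left, total]
    for seq in trials:
        for n, (region, side) in enumerate(seq, start=1):
            if region in CENTER:
                continue  # midline fixations are neither left nor right
            tallies[n][0] += side == "L"
            tallies[n][1] += 1
    return {n: 100.0 * left / total for n, (left, total) in tallies.items()}
```

A value above 50% at index 1 and near or below 50% at later indices would reproduce the reported initial leftward bias.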

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1-N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of MATLAB), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional image-only trials, and (c) neutral trials.
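The classification procedure can be sketched as follows. This is a from-scratch Python re-implementation of a naive Bayes classifier for categorical data with a two-thirds/one-third train/test split, not the authors' MATLAB code; the region labels, the Laplace smoothing parameter `alpha`, and the data layout are illustrative assumptions.

```python
import math
import random
from collections import Counter, defaultdict

def train_naive_bayes(X, y, alpha=1.0):
    """Fit a naive Bayes classifier for categorical data. X[i] is the
    sequence of the first N fixation regions on trial i; y[i] is the
    emotion sought on that trial. Laplace smoothing (alpha) is an added
    assumption to avoid zero probabilities on unseen test combinations."""
    class_counts = Counter(y)
    # cond[emotion][fixation position] -> Counter of region labels
    cond = defaultdict(lambda: defaultdict(Counter))
    for regions, emotion in zip(X, y):
        for pos, region in enumerate(regions):
            cond[emotion][pos][region] += 1
    vocab = {r for regions in X for r in regions}

    def predict(regions):
        best, best_score = None, -math.inf
        for emotion, n in class_counts.items():
            score = math.log(n / len(y))  # log class prior
            for pos, region in enumerate(regions):
                c = cond[emotion][pos]
                score += math.log((c[region] + alpha) /
                                  (sum(c.values()) + alpha * len(vocab)))
            if score > best_score:
                best, best_score = emotion, score
        return best
    return predict

def split_fit_accuracy(X, y, seed=0):
    """One iteration: train on two thirds of trials, test on one third."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    cut = 2 * len(idx) // 3
    train, test = idx[:cut], idx[cut:]
    predict = train_naive_bayes([X[i] for i in train], [y[i] for i in train])
    return sum(predict(X[i]) == y[i] for i in test) / len(test)
```

Averaging `split_fit_accuracy` over 100 different seeds mirrors the paper's 100 independently resampled fit/test iterations.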

Figure 7 depicts classification accuracy for each condition as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest (with a peak of around 25% accuracy) for the emotional image-only trial set, consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance performance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations were most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the pattern for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
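The confusion analysis, including the diagonal "folding" described for Figure 8, can be expressed compactly. A sketch under illustrative assumptions (row-normalized matrices keyed by emotion label; not the authors' analysis code):

```python
from collections import Counter

def confusion_matrix(actual, predicted, labels):
    """Row-normalized confusion matrix: the fraction of trials with each
    actual (sought) emotion that the classifier assigned to each label."""
    pair_counts = Counter(zip(actual, predicted))
    row_totals = Counter(actual)
    return {a: {p: pair_counts[(a, p)] / row_totals[a] for p in labels}
            for a in labels}

def fold(matrix, labels):
    """Mutual confusability: average cell (a, b) with cell (b, a) so each
    unordered pair gets one score, regardless of which member of the pair
    was the actual emotion and which was the predicted one."""
    return {frozenset((a, b)): (matrix[a][b] + matrix[b][a]) / 2
            for i, a in enumerate(labels) for b in labels[i + 1:]}
```

Sorting the folded scores then surfaces the most confusable pairs (e.g., joy/disgust) and the least confusable ones (e.g., joy/sadness).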

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information either from the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (~70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° x 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 x 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks of trials (one block of trials per emotion type) shown in random order, repeated twice for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials, the black box covered the eyes; for the other half, it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65.0% vs. M = 67.5%) and sadness (M = 71.7% vs. M = 71.7%).
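The pairwise comparisons above are standard paired-samples t tests over per-participant accuracies. A minimal sketch (the accuracy vectors are hypothetical placeholders, not the study's data):

```python
import math

def paired_t(cond_a, cond_b):
    """Paired-samples t statistic with df = n - 1, for per-participant
    accuracies in two conditions (e.g., region visible vs. covered)."""
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1
```

With n = 10 participants the degrees of freedom are 9, matching the t(9) values reported above.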

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important to identify fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition within faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lips. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example, in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotion recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors in eye movements.

Gaze patterns were particularly variable acrossemotions in the first few fixations and became lessvariable for the remainder of the viewing time Theupper nose was fixated to the greatest extent as the firstfixation regardless of emotion type and then fixationrates remained relatively variable depending on theemotion type for the remainder of the trial In contrastfor the lower nose the first fixation demonstratedvariability across different emotion types but was areliable target at the second fixation regardless ofemotion For all emotions the upper lip was consis-tently a weaker target at the first fixation and fixatedmost during the second fixation There was no generalfixation pattern at the nasion demonstrating widevariability across emotions over time We also observeda significant left dominance for the initial fixationregardless of stimulus type

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important, by covering either the mouth or the eyes during an emotion judgment task. Consistent with the predictions from Experiment 1, occluding the mouth impaired judgment performance for emotions for which the mouth was more diagnostic (joy, disgust), and occluding the eyes impaired performance for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotion recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobia (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotion recognition. These abnormal eye-movement patterns contribute to social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.

Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles". Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.


consistently high target across all emotions during the first fixation. However, despite this popularity, there was still considerable variability in fixation patterns across emotions. The lower nose showed wide variability across all emotions at the first fixation, but afterward fixation rates remained much more consistent across emotions. The upper lip was fixated at particularly high rates during the second fixation for joy and disgust, and to a weaker degree fear. The nasion showed wide variability in fixation patterns across emotions, although there was a high fixation preference at the first fixation for shame.

Asymmetry analysis

Previous research has shown that initial fixations toward faces are biased toward the left side of the face, followed by a rightward fixation (Everdell, Marsh, Yurick, Munhall, & Pare, 2007). We tested for this effect in the present data by coding each fixation as being toward the left or right, excluding center regions of the face (UN, LN, UL, LL). A one-way ANOVA with fixation number (one through eight) and block type (emotional or neutral) as factors showed a main effect of fixation number, F(1, 7) = 42.08, p < 0.0001. This effect was driven by the first fixation, and removing that fixation from the analysis eliminated the main effect, F(1, 6) = 0.579, p = 0.747. The first fixation during trials across block type had a strong bias toward the left side of the screen (M = 58.1%), in comparison to fixations two through eight (M = 46.2%), which demonstrated a slight bias toward the right side of the screen. These effects confirm the initial leftward bias in face fixation patterns. Possible roots of this effect, including a learned strategy that the right side of the face contains the most information or a general perceptual bias toward the left visual hemifield, are discussed in detail in Everdell et al. (2007).
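The left/right coding step above can be sketched as follows. This is a minimal illustration with invented fixation sequences, not the authors' analysis code: each trial is assumed to already be reduced to a sequence of 'L'/'R' labels, with midline regions (UN, LN, UL, LL) excluded.

```python
# Hypothetical data: each trial is a sequence of lateral fixations coded
# 'L' (left side of the screen) or 'R' (right side), midline fixations removed.

def leftward_rate_by_position(trials, n_positions=8):
    """Percent of fixations coded 'L' at each ordinal position (1..n)."""
    rates = []
    for pos in range(n_positions):
        labels = [t[pos] for t in trials if len(t) > pos]
        rates.append(100.0 * sum(1 for x in labels if x == "L") / len(labels))
    return rates

trials = [list("LRRLRRLR"), list("LLRRLRRL"), list("LRLRRRLR"), list("RLRRLRLL")]
rates = leftward_rate_by_position(trials)
# In the paper's data the first fixation is left-biased relative to
# fixations two through eight; here we just report the per-position rates.
print(rates[0], rates[1:])
```

A per-position rate above 50% indicates a leftward bias at that ordinal fixation, which is the quantity the ANOVA compares across fixation numbers.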

Fixation patterns predict facial emotion

If the type of facial emotion displayed (or sought by the observer) systematically affects the observer's pattern of eye movements, it should be possible to predict the type of face displayed (or sought) by statistical analysis of the eye movement sequence for any given trial. We separately submitted the first 1–N fixation regions (where N = 1 to 8) from every trial in the data set to a naive Bayesian classifier for categorical data (the NaiveBayes class of Matlab), using two thirds of the data for training and one third for testing. We calculated the average classification accuracy across 100 fit/test iterations, selecting train/test sets independently for each iteration, for (a) all trials, (b) emotional-image-only trials, and (c) neutral trials.
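The classification procedure can be sketched as below. This is a hedged re-implementation in Python rather than the Matlab NaiveBayes class the authors used, and the fixation sequences, region names, and class biases are synthetic, invented only to show the train/test mechanics.

```python
import math
import random
from collections import Counter

REGIONS = ["eyes", "upper_nose", "lower_nose", "upper_lip", "nasion"]

def fit_categorical_nb(X, y, alpha=1.0):
    """Naive Bayes over fixed-length sequences of categorical fixation regions."""
    classes = sorted(set(y))
    counts = {c: y.count(c) for c in classes}
    n_pos = len(X[0])
    tables = {c: [Counter() for _ in range(n_pos)] for c in classes}
    for seq, label in zip(X, y):
        for i, region in enumerate(seq):
            tables[label][i][region] += 1

    def predict(seq):
        scores = {}
        for c in classes:
            logp = math.log(counts[c] / len(y))  # class prior
            for i, region in enumerate(seq):
                # Laplace-smoothed P(region at fixation i | emotion)
                logp += math.log((tables[c][i][region] + alpha)
                                 / (counts[c] + alpha * len(REGIONS)))
            scores[c] = logp
        return max(scores, key=scores.get)

    return predict

def make_trial(bias_region, n_fix=2):
    # Synthetic trial: first fixation on the upper nose, later fixations
    # favor the emotion's diagnostic region (assumed bias, for illustration).
    seq = ["upper_nose"]
    for _ in range(n_fix - 1):
        seq.append(bias_region if random.random() < 0.8 else random.choice(REGIONS))
    return seq

random.seed(1)
X = [make_trial("upper_lip") for _ in range(90)] + [make_trial("eyes") for _ in range(90)]
y = ["joy"] * 90 + ["anger"] * 90

# Two-thirds train, one-third test, as in the paper's procedure.
idx = list(range(len(X)))
random.shuffle(idx)
cut = 2 * len(idx) // 3
predict = fit_categorical_nb([X[i] for i in idx[:cut]], [y[i] for i in idx[:cut]])
acc = sum(predict(X[i]) == y[i] for i in idx[cut:]) / len(idx[cut:])
print(round(acc, 2))  # well above the 50% two-class chance level for this toy data
```

In the actual analysis there are six emotion classes (chance near 16.7%) and the split/fit/test cycle is repeated 100 times with accuracy averaged across iterations.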

Figure 6 depicts classification accuracy for each condition, as well as chance prediction (for six emotions, 16.6%). Classification accuracy of test trials was highest for the emotional-image-only trial set (with a peak of around 25% accuracy), consistent with the stronger fixation patterns described in our other analyses. Neutral-image trial classification accuracy was lower but still well above chance (with a peak of around 20% accuracy). Including all trials led to performance in between these two (with a peak of around 23% accuracy). Consistent with Hsiao and Cottrell (2008), who showed that the first one to two fixations are most critical for face processing, the diagnostic value of our fixation data was primarily in the first two fixations. It is clear from Figure 7 that the bulk of predictive power is reached by the second fixation.

Finally, we examined the confusion matrix produced by the classifier (Figure 8). When the wrong emotion is predicted, which emotions are systematically confused because they are most similar? We focused here on two subsets of trials: the highest emotion (60%) images and the neutral image trials. For the highest emotion images, two pairs of emotions stood out as having the most confusable fixation patterns: joy and disgust, and anger and sadness. This confusability is consistent with the results depicted in Figure 5, which reveals strong similarities between the fixation patterns produced by

Figure 7. Classification accuracy for each condition (emotional face trials, neutral face trials, both trial types, and chance prediction). Classification accuracy for emotional trials was highest, consistent with the stronger fixation patterns observed in our other analyses. Neutral trial classification accuracy was lower but still well above chance. Consistent with the observed fixation patterns of humans, the diagnostic value of fixation data was primarily reached in the first two fixations.

these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.
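The mutual-confusability measure used here, folding the confusion matrix along its identity diagonal so that actual-vs-predicted and predicted-vs-actual confusions are pooled, can be shown in a few lines. The 3×3 matrix below is invented for illustration; the real analysis uses a 6×6 matrix over all emotions.

```python
labels = ["joy", "anger", "sadness"]
conf = [
    [50, 10,  5],   # actual joy:    predicted joy / anger / sadness
    [12, 40, 20],   # actual anger
    [ 4, 25, 45],   # actual sadness
]

def fold(matrix):
    """Sum each off-diagonal cell with its mirror across the identity diagonal."""
    n = len(matrix)
    out = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            out[i][j] = matrix[i][j] + matrix[j][i]  # mutual confusions
    return out

folded = fold(conf)
# In this toy matrix, anger/sadness are confused most and joy/sadness least.
print(folded)
```

The folded upper triangle is what the bottom row of Figure 8 displays: a single confusability score per emotion pair, regardless of direction.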

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions, the eyes and the mouth, do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information either from the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual and predicted or predicted and actual, respectively. Boxed values are highlighted and discussed in the text.

Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition: anger, disgust, fear, and sadness consisted of 40% emotional intensity morphs, whereas joy consisted of only 20% and shame 60%. We determined the specific emotional intensity per emotion during piloting, in order to ensure participants could perform above chance but not at ceiling (~70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
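The pixels-per-degree figure follows from the monitor geometry and viewing distance. A quick sanity check, noting that the physical screen width used here (~34.5 cm) is an assumption not stated in the text, chosen to be plausible for a 17-in. CRT:

```python
import math

def pixels_per_degree(res_px, screen_cm, distance_cm):
    # One degree of visual angle subtends 2 * d * tan(0.5 deg) on the screen.
    cm_per_degree = 2 * distance_cm * math.tan(math.radians(0.5))
    return res_px / screen_cm * cm_per_degree

# Assumed horizontal viewable width of ~34.5 cm at the reported 56 cm distance.
ppd = pixels_per_degree(res_px=1024, screen_cm=34.5, distance_cm=56.0)
print(round(ppd, 2))  # close to the reported 28.97 pixels/degree
```

The same relation is what converts the occluder size in pixels into the 8.3° × 2.8° extent quoted in the Stimuli section.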

Procedure

This experiment consisted of a total of 72 trials, presented as six blocks of trials (one block per emotion type) shown in random order, repeated twice for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials, presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. On half of the trials the black box covered the eyes, and on the other half it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by black box) or eyes (i.e., mouth covered by black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65.0% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).
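Each comparison above is a paired t-test across the ten participants. A self-contained sketch with hypothetical per-participant accuracies (not the study's data) shows the computation behind the reported t(9) values:

```python
import math

def paired_t(a, b):
    """Paired-samples t statistic and degrees of freedom for two conditions."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n), n - 1

# Hypothetical proportion-correct scores for ten participants,
# mouth visible vs. mouth covered (invented for illustration).
visible = [0.70, 0.62, 0.58, 0.66, 0.71, 0.60, 0.64, 0.59, 0.68, 0.55]
covered = [0.55, 0.50, 0.52, 0.49, 0.60, 0.51, 0.57, 0.48, 0.54, 0.41]

t, df = paired_t(visible, covered)
print(df, round(t, 2))
```

With ten participants the degrees of freedom are 9, matching the t(9) notation in the results.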

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important to identify fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.

General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition within faces.

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.


groups Journal of Cross-Cultural Psychology36(3) 355ndash370

Blair R J R (2005) Responding to the emotions ofothers Dissociating forms of empathy through thestudy of typical and psychiatric populationsConsciousness and Cognition 14(4) 698ndash718

Brainard D H (1997) The Psychophysics ToolboxSpatial Vision 10 443ndash446

Buchan J N Pare M amp Munhall K G (2007)Spatial statistics of gaze fixations during dynamicface processing Social Neuroscience 2(1) 1ndash13

Darwin C (1872) The expression of emotions in manand animals New York Philosophical Library

Ekman P amp Friesen W V (1975) Unmasking theface Englewood Cliffs NJ Prentice Hall

Ekman P amp Friesen W V (1978) The Facial ActionCoding System (FACS) A technique for themeasurement of facial action Palo Alto CAConsulting Psychologists Press

Elfenbein H A amp Ambady N (2002) Is there an in-group advantage in emotion recognition

Everdell I T Marsh H Yurick M D Munhall KG amp Pare M (2007) Gaze behaviour inaudiovisual speech perception Asymmetrical dis-tribution of face-directed fixations Perception 361535ndash1545

Gosselin F amp Schyns P G (2001) Bubbles Atechnique to reveal the use of information inrecognition tasks Vision Research 41(17) 2261ndash2271

Hargrave R Maddock R J amp Stone V (2002)Impaired recognition of facial expressions ofemotion in Alzheimerrsquos disease Journal of Neuro-psychiatry and Clinical Neurosciences 14 64ndash71

Hejmadi A Davidson R J amp Rozin P (2000)Exploring Hindu Indian emotion expressionsEvidence for accurate recognition by Americansand Indians Psychological Science 11(3) 183ndash187

Henderson J M Falk R Minut S Dyer F C ampMahadevan S (2000) Gaze control for facelearning and recognition by humans and machinesMichigan State University Eye Movement Labora-tory Technical Report 4 1ndash14

Henderson J M Williams C C amp Falk R J (2005)Eye movements are functional during face learningMemory amp Cognition 33(1) 98ndash106

Horley K Williams L M Gonsalvez C amp GordonE (2004) Face to face Visual scanpath evidencefor abnormal processing of facial expressions insocial phobia Psychiatry Research 127 43ndash53

Hsiao J H W amp Cottrell G (2008) Two fixations

suffice in face recognition Psychological Science19(10) 998ndash1006

Jack R E Blais C Scheepers C Schyns P G ampCaldara R (2009) Cultural confusions show thatfacial expressions are not universal Current Biol-ogy 19 1ndash6

Keltner D amp Buswell B N (1997) EmbarrassmentIts distinct form and appeasement functionsPsychological Bulletin 122(3) 250ndash270

Kowler E Anderson E Dosher B amp Blaser E(1995) The role of attention in the programming ofsaccades Vision Research 35(13) 1897ndash1916

Loughland C M Williams L M amp Gordon E(2002) Schizophrenia and affective disorder showdifferent visual scanning behaviour for faces Atrait versus state-based distinction BiologicalPsychiatry 52 338ndash348

Marsh A A Kozak M N amp Ambady N (2007)Accurate identification of fear facial expressionspredicts prosocial behavior Emotion 7(2) 239

Nahm F K D Perret A Amaral D G amp AlbrightT D (1997) How do monkeys look at facesJournal of Cognitive Neuroscience 9(5) 611ndash623

Ogrocki P K Hills A C amp Strauss M E (2000)Visual exploration of facial emotion by healthyolder adults and patients with Alzheimer diseaseNeuropsychiatry Neuropsychology and BehavioralNeurology 13 271ndash278

Oosterhof N N amp Todorov A (2009) Sharedperceptual basis of emotional expressions andtrustworthiness impressions from faces Emotion9(1) 128

OrsquoToole A J Deffenbacher K A Valentin D ampAbdi H (1994) Structural aspects of face recog-nition and the other-race effect Memory ampCognition 22(2) 208ndash224

Pelli D G (1997) The VideoToolbox software forvisual psychophysics Transforming numbers intomovies Spatial Vision 10 437ndash442

Pelphrey K A Sasson N J Reznick J Paul GGoldman B D amp Piven J (2002) Visualscanning of faces in autism Journal of Autism andDevelopmental Disorders 32(4) 1573ndash3432

Perlman S B Morris J P Vander Wyk B CGreen S R Doyle J L amp Pelphrey K A(2009) Individual differences in personality shapehow people look at faces PLoS ONE 4(6) e5952

Pflugshaupt T Mosimann U P Schmitt W J vonWartburg R Wurtz P Luthi M Muri RM (2007) To look or not to look at threatScanpath differences within a group of spiderphobics Journal of Anxiety Disorders 21(3) 353ndash366

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 15

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

Sasson N Tsuchiya N Hurley R Couture S MPenn D L Adolphs R amp Piven J (2007)Orienting to social stimuli differentiates socialcognitive impairment in autism and schizophreniaNeuropsychologia 45 2580ndash2588

Schyns P G Petro L S amp Smith M L (2009)Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brainBehavioral and brain evidence PLoS One 4(5)e5625

Segerstrom S C (2001) Optimism and attentionalbias for negative and positive stimuli Personalityand Social Psychology Bulletin 27 1334ndash1343

Smith M L Cottrell G W Gosselin F amp SchynsP G (2005) Transmitting and decoding facialexpressions Psychological Science 16(3) 184ndash189

Spezio M L Adolphs R Hurley R S E amp PivenJ (2007) Analysis of face gaze in autism usinglsquolsquoBubblesrsquorsquo Neuropsychologia 45 144ndash151

Spezio M L Huang P-Y S Castelli F amp Adolphs

R (2007) Amygdala damage impairs eye contactduring conversations with real people The Journalof Neuroscience 27(15) 3994ndash3997

Susskind J M Lee D H Cusi A Feiman RGrabski W amp Anderson A K (2008) Expressingfear enhances sensory acquisition Nature Neuro-science 11(7) 843ndash850

Tracy J L amp Robins R W (2004) Show your prideEvidence for a discrete emotion expression Psy-chological Science 15 194ndash197

Walker-Smith G J Gale A G amp Findlay J M(1977) Eye movement strategies involved in faceperception Perception 6(3) 313ndash326

Wright D B amp Sladden B (2003) An own genderbias and the importance of hair in face recognitionActa Psychologica 114(1) 101ndash114

Zhao W Chellappa R Phillips P J amp RosenfeldA (2003) Face recognition A literature surveyAcm Computing Surveys (CSUR) 35(4) 399ndash458

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 16

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

  • Introduction
  • Experiment 1 Eye tracking of
  • f01
  • f02
  • f03
  • f04
  • f05
  • f06
  • f07
  • Experiment 2 Varying diagnostic information
  • f08
  • f09
  • General discussion
  • Conclusion
  • Adolphs1
  • Adolphs2
  • Aviezer1
  • BaronCohen1
  • Beaupre1
  • Blair1
  • Brainard1
  • Buchan1
  • Darwin1
  • Ekman1
  • Ekman2
  • Elfenbein1
  • Everdell1
  • Gosselin1
  • Hargrave1
  • Hejmadi1
  • Henderson1
  • Henderson2
  • Horley1
  • Hsiao1
  • Jack1
  • Keltner1
  • Kowler1
  • Loughland1
  • Marsh1
  • Nahm1
  • Ogrocki1
  • Oosterhof1
  • Otoole1
  • Pelli1
  • Pelphrey1
  • Perlman1
  • Pflugshaupt1
  • Sasson1
  • Schyns1
  • Segerstrom1
  • Smith1
  • Spezio1
  • Spezio2
  • Susskind1
  • Tracy1
  • WalkerSmith1
  • Wright1
  • Zhao1

these pairs of images. Two pairs of emotions stood out as the least confusable: joy and sadness, and joy and anger. This is also consistent with the results depicted in Figure 5, in which the fixation pattern for joy contrasts strongly with the patterns for anger and sadness. For the neutral-trial subset, the most confusable image pair was again joy and disgust, and the least confusable was again joy and sadness. Other pairings did not stand out as strong outliers for this subset.

Experiment 1 summary

Participants moved their eyes in distinctive patterns for each type of emotional image. These patterns remained (although weaker) for neutral images in each emotion's block, revealing that these patterns were at least partially goal-directed. The pattern of fixations alone could predict the type of image being sought (even if it was not present) at above-chance levels in a naive Bayesian classifier.
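The classifier described above can be sketched as follows. This is not the authors' code; it is a minimal multinomial naive Bayes model over fixation-region counts, with hypothetical region names and training data, intended only to illustrate how fixation patterns can predict the sought emotion.

```python
# Hypothetical sketch: predict the sought emotion from fixation-region counts
# using a multinomial naive Bayes model with Laplace smoothing.
import math
from collections import defaultdict

def train_nb(trials):
    """trials: list of (emotion_label, {region: fixation_count}) pairs."""
    totals = defaultdict(lambda: defaultdict(int))
    regions = set()
    for label, counts in trials:
        for region, n in counts.items():
            totals[label][region] += n
            regions.add(region)
    model = {}
    for label, counts in totals.items():
        denom = sum(counts.values()) + len(regions)  # add-one smoothing
        model[label] = {r: (counts[r] + 1) / denom for r in regions}
    return model

def predict_nb(model, counts):
    """Return the emotion whose region distribution best explains the counts."""
    def log_lik(label):
        probs = model[label]
        return sum(n * math.log(probs[r]) for r, n in counts.items() if r in probs)
    return max(model, key=log_lik)
```

With uniform priors (equal trials per emotion, as in the blocked design), maximizing the likelihood is equivalent to maximizing the posterior.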

Participants preferentially fixate certain regions that they at least implicitly think are more diagnostic for recognition of different types of emotions. Experiment 2 verifies that the two broadest differentiable regions (the eyes and the mouth) do indeed differ in how well they signal particular emotions.

Experiment 2: Varying diagnostic information of faces alters performance

In order to verify whether different areas of our face stimuli contain more or less diagnostic information for each emotion, we asked a new set of participants to make similar judgments of the emotional content of faces, but we occluded information from either the eye region or the mouth region. We predicted that for emotion judgments for which participants in Experiment 1 fixated the eye region at relatively high rates (i.e., anger and shame), occluding that region would decrease emotion detection accuracy. In contrast, for emotions for which the mouth region was particularly highly fixated (i.e., joy and disgust), occluding that region would decrease emotion detection accuracy.

Methods

Participants

Ten college-aged participants completed Experiment 2. All participants gave consent and received course credit for their participation.

Figure 8. Confusion matrices for the Bayesian classifier, including information from the first eight fixations, from trials with photographs showing the highest emotion level (left column) and trials with neutral emotion photographs (right column). The top row contains full confusion matrices, with bold type indicating significant differences from expectations of chance (16.67%), conservatively Bonferroni corrected for 36 comparisons. To facilitate visual inspection, cell values are redundantly color coded for magnitude within each table. The bottom row removes bold formatting and folds each table along its identity diagonal to depict mutual confusability, regardless of whether a pair of classifications is actual-and-predicted or predicted-and-actual. Boxed values are highlighted and discussed in the text.
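The "folding" step in the caption can be sketched as a simple symmetrization of the confusion matrix: cell (i, j) and cell (j, i) are summed so that each pair of emotions gets one mutual-confusability value. The matrix values in the test are hypothetical, not the study's data.

```python
# Sketch: fold a confusion matrix along its identity diagonal so each
# unordered pair of labels gets a single mutual-confusability score.
def fold_confusions(matrix):
    """matrix[i][j] = rate at which actual emotion i is classified as j.
    Returns {(i, j): matrix[i][j] + matrix[j][i]} for all pairs with i < j."""
    n = len(matrix)
    return {(i, j): matrix[i][j] + matrix[j][i]
            for i in range(n) for j in range(i + 1, n)}
```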

Journal of Vision (2014) 14(13):14, 1–16. Schurgin et al.


Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional-intensity morphs, whereas joy consisted of only 20%, and shame 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.3° × 2.8°) covering either the eyes or the mouth.

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
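The pixels-per-degree figure follows from the screen geometry, and can be checked with a short calculation. The 34.6-cm visible width below is an assumption (typical for a 17-in. CRT); the resolution and viewing distance are from the text.

```python
# Back-of-envelope check: convert screen geometry to pixels per degree
# of visual angle at a given viewing distance.
import math

def pixels_per_degree(h_resolution_px, visible_width_cm, viewing_distance_cm):
    # Width on screen subtended by 1 degree at the given viewing distance.
    cm_per_degree = 2 * viewing_distance_cm * math.tan(math.radians(0.5))
    return h_resolution_px / visible_width_cm * cm_per_degree
```

At 56 cm, a 1024-px-wide display of roughly 34.6 cm visible width gives close to the reported 28.97 pixels/°.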

Procedure

This experiment consisted of a total of 72 trials presented as six blocks of trials (one block per emotion type) shown in random order, repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials the black box covered the eyes; for the other half it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (e.g., angry or not angry).
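The blocked design above can be sketched as a trial-schedule generator. This is an illustrative reconstruction, not the authors' code; in particular, balancing eye- versus mouth-occlusion evenly within each block is an assumption consistent with, but not stated in, the text.

```python
# Sketch: six emotion blocks in random order, repeated twice; each block
# mixes six neutral and six emotional trials, half eye- and half
# mouth-occluded (balancing within block is an assumption).
import random

EMOTIONS = ["joy", "anger", "fear", "sadness", "shame", "disgust"]

def build_schedule(seed=None):
    rng = random.Random(seed)
    schedule = []
    for _ in range(2):  # the whole six-block cycle is run twice
        blocks = EMOTIONS[:]
        rng.shuffle(blocks)
        for emotion in blocks:
            trials = [(emotion, kind, occlusion)
                      for kind in ("emotional", "neutral")
                      for occlusion in ("eyes", "mouth") * 3]
            rng.shuffle(trials)  # randomize order within the block
            schedule.extend(trials)
    return schedule
```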

Results

We examined whether overall performance differed according to whether the mouth (i.e., eyes covered by a black box) or the eyes (i.e., mouth covered by a black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).
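The comparisons above are paired t-tests over the ten participants' accuracies in the two occlusion conditions. A minimal stdlib implementation, with hypothetical accuracy values rather than the study's data, is:

```python
# Sketch: paired t-test on per-participant accuracies in two conditions.
import math

def paired_t(xs, ys):
    """Return (t, df) for paired samples xs and ys."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1
```

The resulting t is compared against the t distribution with n − 1 degrees of freedom (df = 9 for the ten participants here).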

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the eye fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1; the eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Pare, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotional recognition within faces.
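The region-share computation behind figures like these is a straightforward tally of labeled fixations. The sketch below uses illustrative labels, not the study's data:

```python
# Sketch: compute each labeled region's share of all fixations, and the
# combined share of a chosen set of regions.
from collections import Counter

def region_shares(fixation_labels):
    """Map each region label to its proportion of all fixations."""
    counts = Counter(fixation_labels)
    total = sum(counts.values())
    return {region: n / total for region, n in counts.items()}

def share_of(shares, regions):
    """Combined proportion accounted for by the given regions."""
    return sum(shares.get(r, 0.0) for r in regions)
```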

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, and upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example in the much greater percentage of fixations to the eye region for anger relative to joyful expressions. This provides further support for a goal-driven strategy during emotional recognition of faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors on eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent as the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but it was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and fixated most during the second fixation. There was no general fixation pattern at the nasion, demonstrating wide variability across emotions over time. We also observed a significant left dominance for the initial fixation, regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions of Experiment 1, emotional judgment performance was impaired when the mouth was covered for emotions for which the mouth was more diagnostic (joy, disgust), and when the eyes were covered for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotional recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotional recognition. These abnormal eye-movement patterns contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., … Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction. Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., … Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.

Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles." Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 16

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

  • Introduction
  • Experiment 1 Eye tracking of
  • f01
  • f02
  • f03
  • f04
  • f05
  • f06
  • f07
  • Experiment 2 Varying diagnostic information
  • f08
  • f09
  • General discussion
  • Conclusion
  • Adolphs1
  • Adolphs2
  • Aviezer1
  • BaronCohen1
  • Beaupre1
  • Blair1
  • Brainard1
  • Buchan1
  • Darwin1
  • Ekman1
  • Ekman2
  • Elfenbein1
  • Everdell1
  • Gosselin1
  • Hargrave1
  • Hejmadi1
  • Henderson1
  • Henderson2
  • Horley1
  • Hsiao1
  • Jack1
  • Keltner1
  • Kowler1
  • Loughland1
  • Marsh1
  • Nahm1
  • Ogrocki1
  • Oosterhof1
  • Otoole1
  • Pelli1
  • Pelphrey1
  • Perlman1
  • Pflugshaupt1
  • Sasson1
  • Schyns1
  • Segerstrom1
  • Smith1
  • Spezio1
  • Spezio2
  • Susskind1
  • Tracy1
  • WalkerSmith1
  • Wright1
  • Zhao1

Stimuli

Stimuli were identical to those of Experiment 1, with the following exceptions. For emotional faces, only one emotional morph was used per condition. Anger, disgust, fear, and sadness consisted of 40% emotional-intensity morphs, whereas joy consisted of only 20% and shame of 60%. We determined the specific emotional intensity per emotion during piloting in order to ensure participants could perform above chance but not at ceiling (70% correct). The values used are consistent with the performance observed in Experiment 1 (see Figure 3). Additionally, all face stimuli had a black box (8.38° × 2.88°) covering either the eyes or the mouth.
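The intensity values above admit a simple reading: an N% morph is a pixel-wise linear blend of a neutral photograph with the full-intensity emotional photograph. A minimal sketch (toy images; not the authors' actual morphing software):

```python
def intensity_morph(neutral, emotional, intensity):
    """Pixel-wise linear blend of a neutral and a full-intensity emotional
    face image (here, nested lists of gray levels).

    intensity=0.0 returns the neutral image, intensity=1.0 the full
    emotional expression; a 40% morph uses intensity=0.4.
    """
    return [[(1.0 - intensity) * n + intensity * e
             for n, e in zip(row_n, row_e)]
            for row_n, row_e in zip(neutral, emotional)]

# Toy 2x2 "images": gray level 0 everywhere vs. 100 everywhere.
neutral = [[0.0, 0.0], [0.0, 0.0]]
emotional = [[100.0, 100.0], [100.0, 100.0]]
morph40 = intensity_morph(neutral, emotional, 0.4)  # every pixel 40.0
```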

Apparatus

All of the stimuli were created and displayed using MATLAB with the Psychophysics Toolbox (Brainard, 1997; Pelli, 1997) on an Intel Macintosh running OS X 10.6. All stimuli were displayed on a 17-in. ViewSonic E70fB CRT monitor (1024 × 768 resolution, 75 Hz, 28.97 pixels/° of visual angle). The viewing distance was approximately 56 cm.
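The reported 28.97 pixels/° follows from screen geometry. A small check, assuming a visible display width of roughly 34.6 cm (a figure we supply for illustration; the paper does not report it):

```python
import math

def pixels_per_degree(visible_width_cm, horizontal_px, viewing_distance_cm):
    """Pixels subtended by one degree of visual angle at the screen center."""
    cm_per_degree = 2 * viewing_distance_cm * math.tan(math.radians(0.5))
    return (horizontal_px / visible_width_cm) * cm_per_degree

# An assumed ~34.6 cm visible width for the 17-in. CRT reproduces the
# reported figure: 1024 px viewed at 56 cm works out to ~29 px/deg.
ppd = pixels_per_degree(34.6, 1024, 56.0)
```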

Procedure

This experiment consisted of 72 trials, presented as six blocks of trials (one block per emotion type) shown in random order; this sequence was repeated twice, for a total of 144 trials per participant. Each block consisted of 12 trials: six neutral and six emotional face trials, presented in randomized order. At the beginning of each trial, participants saw a fixation cross for 1 s, followed by a briefly presented face for 100 ms. For half of the trials, a black box covered the eyes; for the other half, it covered the mouth (see Figure 9). Participants then judged whether the photo depicted a specific emotion or not (i.e., angry or not angry).
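The block and trial structure can be sketched as follows (a hypothetical reconstruction of the randomization, not the authors' code):

```python
import random

EMOTIONS = ["joy", "anger", "fear", "sadness", "shame", "disgust"]

def build_session(seed=None):
    """One session: six blocks (one per emotion) in random order, with the
    whole sequence repeated twice; each block holds 6 neutral and
    6 emotional trials, shuffled within the block."""
    rng = random.Random(seed)
    session = []
    for _ in range(2):  # whole block sequence repeated twice
        for emotion in rng.sample(EMOTIONS, len(EMOTIONS)):
            trials = ([(emotion, "neutral")] * 6 +
                      [(emotion, "emotional")] * 6)
            rng.shuffle(trials)
            session.extend(trials)
    return session

session = build_session(seed=1)  # 144 (emotion, condition) tuples
```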

Results

We examined whether or not overall performance differed according to whether the mouth (i.e., eyes covered by a black box) or the eyes (i.e., mouth covered by a black box) were visible. For joy, we found performance was significantly better when the mouth was visible (M = 63.3%) relative to when it was covered (M = 51.7%), t(9) = 2.81, p = 0.02. A similar pattern was observed for disgust (M = 75.8% vs. M = 60.1%), t(9) = 3.38, p < 0.01. Conversely, we found performance was significantly better for anger when the eyes were visible (M = 69.2%) compared to when they were covered (M = 60.8%), t(9) = 3.35, p < 0.01. A similar trend was observed for shame (M = 81.7% vs. M = 68.3%), t(9) = 1.81, p = 0.10. We failed to find any significant differences for fear (M = 65.0% vs. M = 67.5%) or sadness (M = 71.7% vs. M = 71.7%).
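Comparisons of this form are paired t-tests over per-participant accuracies. A self-contained sketch with made-up data (n = 10, matching the reported degrees of freedom; these are not the study's values):

```python
import math

def paired_t(xs, ys):
    """Paired t statistic and degrees of freedom for equal-length samples."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                              # SE of the mean diff
    return mean / se, n - 1

# Hypothetical per-participant accuracies for one emotion:
mouth_visible = [0.70, 0.65, 0.60, 0.55, 0.72, 0.68, 0.58, 0.63, 0.66, 0.56]
mouth_covered = [0.55, 0.50, 0.52, 0.48, 0.60, 0.54, 0.50, 0.49, 0.58, 0.41]
t, df = paired_t(mouth_visible, mouth_covered)  # df = 9, as in t(9)
```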

Discussion

Emotion judgment performance depended on which regions of the face were covered, in a manner consistent with the fixation patterns found in Experiment 1. Performance was best for joy and disgust when the eyes were covered and the mouth was visible, consistent with the eye fixation rates we observed in Experiment 1, in which observers tended to fixate more at the upper lip and less at the eyes relative to other emotions. Figure 9 provides a strong illustration of this effect: Joy is easy to detect in the left face but harder to detect in the right face. We failed to find any meaningful differences for fearful faces, which is consistent with the diagnostic regions identified in Experiment 1: The eyes and upper lip are not differentially important for identifying fear. For anger, performance was significantly better when the eyes were visible compared to when they were covered. A similar trend was also observed for shameful faces. Both of these effects are consistent with the diagnostic areas identified in Experiment 1. We did not see any significant differences for detecting sadness when covering the mouth versus the eyes, even though Experiment 1 did suggest different levels of diagnosticity for those regions. Overall, five out of the six emotion types showed results consistent with the predictions given the diagnostic areas of emotional faces identified in Experiment 1.

Figure 9. For all trials, participants were briefly shown a face with either the eyes or mouth occluded. The task was to judge whether the face had a particular emotion or not. (A) An example of a joyful face with the eyes covered; (B) an example of the same face but with the mouth covered.


General discussion

Perceiving emotion expressed in a face is associated with characteristic patterns of attention, both when that emotion is visible and when it is merely anticipated. Although previous research has segmented parts of the face into three or four distinct regions (see Aviezer et al., 2008; Buchan, Paré, & Munhall, 2007; Everdell et al., 2007; Henderson, Falk, Minut, Dyer, & Mahadevan, 2000), the present study offers a more detailed analysis using 21 defined regions. Our results show that, of these regions, the five main facial regions (eyes, upper nose, lower nose, upper lip, nasion) accounted for 88.03% of all fixations; any other facial region accounted for at most 3% of fixations. These patterns suggest that these five facial regions may be the most critical for emotion recognition within faces.
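Tallies of this kind reduce to counting fixation labels per region. A toy sketch (hypothetical fixation stream, not the study's data):

```python
from collections import Counter

MAIN_REGIONS = {"eyes", "upper nose", "lower nose", "upper lip", "nasion"}

def region_proportions(fixations):
    """Fraction of fixations landing in each labeled face region."""
    counts = Counter(fixations)
    total = len(fixations)
    return {region: n / total for region, n in counts.items()}

# Hypothetical stream of 100 region-labeled fixations:
fixations = (["eyes"] * 40 + ["lower nose"] * 25 + ["upper lip"] * 15 +
             ["upper nose"] * 5 + ["nasion"] * 5 + ["cheek"] * 10)
props = region_proportions(fixations)
main_share = sum(p for r, p in props.items() if r in MAIN_REGIONS)  # 0.9
```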

Despite the dominance of these five regions, there were consistent differences in fixation patterns when seeking different emotional cues within a face. Figure 5 summarizes the regions that were highly fixated for each emotion (relative to the other emotions) and offers a comparison of these patterns to past work exploring which regions are most diagnostic for evaluating a given emotion (see Figure 5; Smith et al., 2005).

For joy, participants fixated the least at the eyes and spent the most time fixating at the upper lip. This is likely driven by the importance of the upper lip relative to the smile, the most salient facial feature of joy (see Figure 5). Importantly, these fixation differences were significant across emotional and neutral stimuli, suggesting a particularly strong goal-driven strategy for perceiving joy. For emotional faces, there also appeared to be a marginal difference in which the upper nose was fixated less frequently. These differences were primarily driven by the first few fixations.

For disgust, participants fixated more often at the upper lip and less often at the eyes. These differences are likely due to the importance of the furrowing of the nose and mouth when making the disgust facial expression, thus making the upper lip more salient. These patterns were primarily driven by the first few fixations.

For fear, participants fixated more at the eyes and relatively less at the nasion (see Figure 5). There did not appear to be any significant differences relative to other emotions in fixations at the upper nose, lower nose, or upper lip. These patterns were fairly consistent across time for the nasion, although the eyes became increasingly fixated over time.

For anger, participants fixated most at the eyes and least at the upper lip. When emotion was not present, participants fixated significantly more at the nasion as well. There did not appear to be any significant fixation differences at the upper and lower nose (see Figure 5). These results are consistent with previous findings suggesting the diagnostic value of the eyes and nasion for detecting anger (Smith et al., 2005). These patterns were consistent across time.

For sadness, participants fixated most at the eyes. In contrast, fixation time was significantly lower at the upper lip. These fixation patterns were significant across both emotional and neutral stimuli for both the eyes and upper lip, suggesting this pattern may be driven by a goal-driven strategy. These patterns were consistent across time.

For shame, participants fixated to a greater extent at the eyes. This pattern was most pronounced after the first fixation. For all other regions, there did not appear to be any significant differences (see Figure 5).

Across all blocks, it is interesting that there were no significant differences found in fixation time spent at the lower nose, whether for emotional or neutral faces; it appears to serve as a central fixation point across a face. Another finding in our data is that dynamic differences are apparent even within the first fixation, for example in the much greater percentage of fixations to the eye region for angry relative to joyful expressions. This provides further support for a goal-driven strategy during emotion recognition in faces, because such differences were typically maintained across emotional and neutral stimuli. The values and differences within many regions across emotion type were larger and more pronounced while viewing emotional versus neutral stimuli, still suggesting an important role for stimulus-driven factors in eye movements.

Gaze patterns were particularly variable across emotions in the first few fixations and became less variable for the remainder of the viewing time. The upper nose was fixated to the greatest extent as the first fixation regardless of emotion type, and then fixation rates remained relatively variable depending on the emotion type for the remainder of the trial. In contrast, for the lower nose, the first fixation demonstrated variability across different emotion types, but it was a reliable target at the second fixation regardless of emotion. For all emotions, the upper lip was consistently a weaker target at the first fixation and was fixated most during the second fixation. There was no general fixation pattern at the nasion, which demonstrated wide variability across emotions over time. We also observed a significant left dominance for the initial fixation, regardless of stimulus type.

Although participants made an average of 8.3 fixations per trial, the prediction rates for the Bayesian classifier showed that the first two fixation locations were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.
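The paper does not detail the classifier's implementation; one plausible minimal form is a naive Bayes model over the regions of the first few fixations, with Laplace smoothing across the 21 face regions. A sketch with hypothetical training data:

```python
import math
from collections import Counter, defaultdict

def train_nb(samples):
    """samples: (fixation_region_sequence, emotion_sought) pairs.
    Returns emotion priors and per-(emotion, position) region counts."""
    priors = Counter(emotion for _, emotion in samples)
    counts = defaultdict(Counter)
    for regions, emotion in samples:
        for pos, region in enumerate(regions):
            counts[(emotion, pos)][region] += 1
    return priors, counts

def predict(priors, counts, regions, n_regions=21):
    """Most probable emotion given an observed fixation sequence,
    with Laplace smoothing over the n_regions face regions."""
    total = sum(priors.values())
    best_emotion, best_score = None, float("-inf")
    for emotion, n in priors.items():
        score = math.log(n / total)  # log prior
        for pos, region in enumerate(regions):
            c = counts[(emotion, pos)]
            score += math.log((c[region] + 1) / (sum(c.values()) + n_regions))
        if score > best_score:
            best_emotion, best_score = emotion, score
    return best_emotion

# Hypothetical training data: joy seekers move to the upper lip by the
# second fixation, anger seekers to the eyes.
train = [(("upper nose", "upper lip"), "joy")] * 5 + \
        [(("upper nose", "eyes"), "anger")] * 5
priors, counts = train_nb(train)
```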

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 offered a direct evaluation of whether those regions were important by covering either the mouth or the eyes during an emotion judgment task. Consistent with the predictions of Experiment 1, emotion judgment performance was impaired when the more diagnostic region was covered, both for emotions for which the mouth was more diagnostic (joy, disgust) and for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotion recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically focus longer at nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotion recognition. These abnormal eye-movement patterns contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupré, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Paré, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition? Psychological Bulletin, 128(2), 243–249.

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Paré, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction? Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Lüthi, M., ... Müri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.


Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS One, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles". Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.


Gosselin F amp Schyns P G (2001) Bubbles Atechnique to reveal the use of information inrecognition tasks Vision Research 41(17) 2261ndash2271

Hargrave R Maddock R J amp Stone V (2002)Impaired recognition of facial expressions ofemotion in Alzheimerrsquos disease Journal of Neuro-psychiatry and Clinical Neurosciences 14 64ndash71

Hejmadi A Davidson R J amp Rozin P (2000)Exploring Hindu Indian emotion expressionsEvidence for accurate recognition by Americansand Indians Psychological Science 11(3) 183ndash187

Henderson J M Falk R Minut S Dyer F C ampMahadevan S (2000) Gaze control for facelearning and recognition by humans and machinesMichigan State University Eye Movement Labora-tory Technical Report 4 1ndash14

Henderson J M Williams C C amp Falk R J (2005)Eye movements are functional during face learningMemory amp Cognition 33(1) 98ndash106

Horley K Williams L M Gonsalvez C amp GordonE (2004) Face to face Visual scanpath evidencefor abnormal processing of facial expressions insocial phobia Psychiatry Research 127 43ndash53

Hsiao J H W amp Cottrell G (2008) Two fixations

suffice in face recognition Psychological Science19(10) 998ndash1006

Jack R E Blais C Scheepers C Schyns P G ampCaldara R (2009) Cultural confusions show thatfacial expressions are not universal Current Biol-ogy 19 1ndash6

Keltner D amp Buswell B N (1997) EmbarrassmentIts distinct form and appeasement functionsPsychological Bulletin 122(3) 250ndash270

Kowler E Anderson E Dosher B amp Blaser E(1995) The role of attention in the programming ofsaccades Vision Research 35(13) 1897ndash1916

Loughland C M Williams L M amp Gordon E(2002) Schizophrenia and affective disorder showdifferent visual scanning behaviour for faces Atrait versus state-based distinction BiologicalPsychiatry 52 338ndash348

Marsh A A Kozak M N amp Ambady N (2007)Accurate identification of fear facial expressionspredicts prosocial behavior Emotion 7(2) 239

Nahm F K D Perret A Amaral D G amp AlbrightT D (1997) How do monkeys look at facesJournal of Cognitive Neuroscience 9(5) 611ndash623

Ogrocki P K Hills A C amp Strauss M E (2000)Visual exploration of facial emotion by healthyolder adults and patients with Alzheimer diseaseNeuropsychiatry Neuropsychology and BehavioralNeurology 13 271ndash278

Oosterhof N N amp Todorov A (2009) Sharedperceptual basis of emotional expressions andtrustworthiness impressions from faces Emotion9(1) 128

OrsquoToole A J Deffenbacher K A Valentin D ampAbdi H (1994) Structural aspects of face recog-nition and the other-race effect Memory ampCognition 22(2) 208ndash224

Pelli D G (1997) The VideoToolbox software forvisual psychophysics Transforming numbers intomovies Spatial Vision 10 437ndash442

Pelphrey K A Sasson N J Reznick J Paul GGoldman B D amp Piven J (2002) Visualscanning of faces in autism Journal of Autism andDevelopmental Disorders 32(4) 1573ndash3432

Perlman S B Morris J P Vander Wyk B CGreen S R Doyle J L amp Pelphrey K A(2009) Individual differences in personality shapehow people look at faces PLoS ONE 4(6) e5952

Pflugshaupt T Mosimann U P Schmitt W J vonWartburg R Wurtz P Luthi M Muri RM (2007) To look or not to look at threatScanpath differences within a group of spiderphobics Journal of Anxiety Disorders 21(3) 353ndash366

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 15

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

Sasson N Tsuchiya N Hurley R Couture S MPenn D L Adolphs R amp Piven J (2007)Orienting to social stimuli differentiates socialcognitive impairment in autism and schizophreniaNeuropsychologia 45 2580ndash2588

Schyns P G Petro L S amp Smith M L (2009)Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brainBehavioral and brain evidence PLoS One 4(5)e5625

Segerstrom S C (2001) Optimism and attentionalbias for negative and positive stimuli Personalityand Social Psychology Bulletin 27 1334ndash1343

Smith M L Cottrell G W Gosselin F amp SchynsP G (2005) Transmitting and decoding facialexpressions Psychological Science 16(3) 184ndash189

Spezio M L Adolphs R Hurley R S E amp PivenJ (2007) Analysis of face gaze in autism usinglsquolsquoBubblesrsquorsquo Neuropsychologia 45 144ndash151

Spezio M L Huang P-Y S Castelli F amp Adolphs

R (2007) Amygdala damage impairs eye contactduring conversations with real people The Journalof Neuroscience 27(15) 3994ndash3997

Susskind J M Lee D H Cusi A Feiman RGrabski W amp Anderson A K (2008) Expressingfear enhances sensory acquisition Nature Neuro-science 11(7) 843ndash850

Tracy J L amp Robins R W (2004) Show your prideEvidence for a discrete emotion expression Psy-chological Science 15 194ndash197

Walker-Smith G J Gale A G amp Findlay J M(1977) Eye movement strategies involved in faceperception Perception 6(3) 313ndash326

Wright D B amp Sladden B (2003) An own genderbias and the importance of hair in face recognitionActa Psychologica 114(1) 101ndash114

Zhao W Chellappa R Phillips P J amp RosenfeldA (2003) Face recognition A literature surveyAcm Computing Surveys (CSUR) 35(4) 399ndash458

Journal of Vision (2014) 14(13)14 1ndash16 Schurgin et al 16

Downloaded From httpjovarvojournalsorgpdfaccessashxurl=dataJournalsJOV933685 on 07202015

  • Introduction
  • Experiment 1 Eye tracking of
  • f01
  • f02
  • f03
  • f04
  • f05
  • f06
  • f07
  • Experiment 2 Varying diagnostic information
  • f08
  • f09
  • General discussion
  • Conclusion
  • Adolphs1
  • Adolphs2
  • Aviezer1
  • BaronCohen1
  • Beaupre1
  • Blair1
  • Brainard1
  • Buchan1
  • Darwin1
  • Ekman1
  • Ekman2
  • Elfenbein1
  • Everdell1
  • Gosselin1
  • Hargrave1
  • Hejmadi1
  • Henderson1
  • Henderson2
  • Horley1
  • Hsiao1
  • Jack1
  • Keltner1
  • Kowler1
  • Loughland1
  • Marsh1
  • Nahm1
  • Ogrocki1
  • Oosterhof1
  • Otoole1
  • Pelli1
  • Pelphrey1
  • Perlman1
  • Pflugshaupt1
  • Sasson1
  • Schyns1
  • Segerstrom1
  • Smith1
  • Spezio1
  • Spezio2
  • Susskind1
  • Tracy1
  • WalkerSmith1
  • Wright1
  • Zhao1

were most predictive of the type of emotion sought by the observer. This result is consistent with previous work showing that face recognition can be completed after only one to two eye movements (Hsiao & Cottrell, 2008), although the number of fixations needed to recognize emotional content may be greater.

Building on the diagnostic regions of the emotional faces identified in Experiment 1, Experiment 2 directly evaluated whether those regions were important by covering either the mouth or the eyes during an emotional judgment task. Consistent with the predictions of Experiment 1, emotional judgment performance was impaired for emotions for which the mouth was more diagnostic (joy, disgust) as well as for emotions for which the eyes were more diagnostic (anger, shame). Although the present data cannot directly confirm that eye movements to these more diagnostic regions helped performance, they do show that attentional strategies were aligned with locations that carry diagnostic information.
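The occlusion logic described above can be sketched in a few lines: for each emotion, compare judgment accuracy with and without a region covered, and call the region whose occlusion hurts most the most diagnostic one. All numbers, names, and values below are hypothetical illustrations, not the study's data.

```python
# Illustrative sketch of the Experiment 2 logic (hypothetical data).
ACCURACY = {
    # emotion: {occluded_region: proportion correct}
    "joy":     {"none": 0.95, "mouth": 0.70, "eyes": 0.92},
    "disgust": {"none": 0.90, "mouth": 0.68, "eyes": 0.88},
    "anger":   {"none": 0.88, "mouth": 0.85, "eyes": 0.64},
    "shame":   {"none": 0.86, "mouth": 0.84, "eyes": 0.60},
}

def accuracy_cost(emotion: str, region: str) -> float:
    """Drop in accuracy, relative to the unoccluded face, when `region` is covered."""
    scores = ACCURACY[emotion]
    return scores["none"] - scores[region]

def most_diagnostic_region(emotion: str) -> str:
    """The region whose occlusion impairs recognition the most."""
    regions = [r for r in ACCURACY[emotion] if r != "none"]
    return max(regions, key=lambda r: accuracy_cost(emotion, r))
```

Under these made-up numbers, `most_diagnostic_region("joy")` returns `"mouth"` and `most_diagnostic_region("anger")` returns `"eyes"`, mirroring the pattern the text reports.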

Conclusion

When asked to evaluate the presence of a specific emotion within a face, participants focused on a common set of face regions but also used emotion-specific eye-movement strategies in both emotional and neutral faces. These results are consistent with the idea that focusing attention on certain diagnostic regions is beneficial for emotion processing and that these strategies may be driven by both stimulus-driven and goal-driven factors. The evidence for goal-driven strategies is consistent with previous research suggesting that the goals and characteristics of the perceiver may influence how eye movements are deployed during facial emotion recognition (Jack et al., 2009; Perlman et al., 2009). Face and emotion recognition are even more broadly affected by a face's gender (Wright & Sladden, 2003), race or culture (Jack et al., 2009; O'Toole et al., 1994), and individual facial morphology (Oosterhof & Todorov, 2009). These effects likely interact with the goal-driven factors observed in the current study, and we hope that future research explores such interactions.

The presence of a goal-driven strategy during emotion recognition has numerous practical implications, specifically for individuals with disorders that may result in social deficits. Patients with certain disorders, such as autism (Pelphrey et al., 2002; Spezio, Adolphs, et al., 2007; Spezio, Huang, Castelli, & Adolphs, 2007), schizophrenia (Loughland, Williams, & Gordon, 2002), social phobias (Horley, Williams, Gonsalvez, & Gordon, 2004), and Alzheimer's disease (Hargrave, Maddock, & Stone, 2002; Ogrocki, Hills, & Strauss, 2000), demonstrate eye-movement patterns that are remarkably different from those of normal participants, and these anomalous patterns may impair their ability to correctly identify the emotional valence of faces. These individuals typically fixate longer on nondiagnostic regions (such as the brow or cheeks) relative to diagnostic regions of the face (eyes, mouth, nose) during emotion recognition. These abnormal eye-movement patterns contribute to the social or emotional deficits that accompany abnormal face recognition. By comparing such abnormal eye-movement patterns to our results, it may be possible to identify ways in which altered eye-movement patterns could improve emotion recognition.
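The kind of comparison suggested here, contrasting a viewer's gaze distribution over diagnostic versus nondiagnostic regions, can be sketched as below. The region sets follow the text; the scanpaths and function name are hypothetical.

```python
# Illustrative sketch (not the paper's analysis): share of gaze time spent
# on diagnostic vs. nondiagnostic face regions.
DIAGNOSTIC = {"eyes", "mouth", "nose"}      # diagnostic regions, per the text
NONDIAGNOSTIC = {"brow", "cheeks"}          # nondiagnostic regions, per the text

def diagnostic_gaze_share(fixations):
    """fixations: list of (region, duration_ms) pairs.
    Returns the proportion of on-face fixation time spent on diagnostic regions."""
    diag = sum(d for r, d in fixations if r in DIAGNOSTIC)
    nondiag = sum(d for r, d in fixations if r in NONDIAGNOSTIC)
    total = diag + nondiag
    return diag / total if total else 0.0

# Hypothetical scanpaths: a control viewer vs. a patient who dwells on the brow/cheeks.
control = [("eyes", 400), ("mouth", 300), ("brow", 100)]
patient = [("brow", 400), ("cheeks", 300), ("eyes", 100)]
```

With these made-up scanpaths, the control viewer's diagnostic share (0.875) exceeds the patient's (0.125), the qualitative pattern the clinical studies above describe.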

Keywords: face recognition, emotion, eye movements, attention, fixation

Acknowledgments

We thank Trixie Lipke for assistance with data collection and Sumeeth Jonathan for programming and help with movie creation.

Commercial relationships: none.
Corresponding author: Mark W. Schurgin.
Email: maschurgin@jhu.edu.
Address: Department of Psychological & Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

References

Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68–72.

Adolphs, R., Tranel, D., Damasio, H., & Damasio, A. (1994). Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Aviezer, H., Hassin, R. R., Ryan, J., Grady, C., Susskind, J., Anderson, A., ... Bentin, S. (2008). Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychological Science, 19(7), 724–732.

Baron-Cohen, S., & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Beaupre, M. G., & Hess, U. (2005). Cross-cultural emotion recognition among Canadian ethnic groups. Journal of Cross-Cultural Psychology, 36(3), 355–370.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14(4), 698–718.

Brainard, D. H. (1997). The Psychophysics Toolbox. Spatial Vision, 10, 443–446.

Buchan, J. N., Pare, M., & Munhall, K. G. (2007). Spatial statistics of gaze fixations during dynamic face processing. Social Neuroscience, 2(1), 1–13.

Darwin, C. (1872). The expression of emotions in man and animals. New York: Philosophical Library.

Ekman, P., & Friesen, W. V. (1975). Unmasking the face. Englewood Cliffs, NJ: Prentice Hall.

Ekman, P., & Friesen, W. V. (1978). The Facial Action Coding System (FACS): A technique for the measurement of facial action. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H. A., & Ambady, N. (2002). Is there an in-group advantage in emotion recognition?

Everdell, I. T., Marsh, H., Yurick, M. D., Munhall, K. G., & Pare, M. (2007). Gaze behaviour in audiovisual speech perception: Asymmetrical distribution of face-directed fixations. Perception, 36, 1535–1545.

Gosselin, F., & Schyns, P. G. (2001). Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research, 41(17), 2261–2271.

Hargrave, R., Maddock, R. J., & Stone, V. (2002). Impaired recognition of facial expressions of emotion in Alzheimer's disease. Journal of Neuropsychiatry and Clinical Neurosciences, 14, 64–71.

Hejmadi, A., Davidson, R. J., & Rozin, P. (2000). Exploring Hindu Indian emotion expressions: Evidence for accurate recognition by Americans and Indians. Psychological Science, 11(3), 183–187.

Henderson, J. M., Falk, R., Minut, S., Dyer, F. C., & Mahadevan, S. (2000). Gaze control for face learning and recognition by humans and machines. Michigan State University Eye Movement Laboratory Technical Report, 4, 1–14.

Henderson, J. M., Williams, C. C., & Falk, R. J. (2005). Eye movements are functional during face learning. Memory & Cognition, 33(1), 98–106.

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53.

Hsiao, J. H. W., & Cottrell, G. (2008). Two fixations suffice in face recognition. Psychological Science, 19(10), 998–1006.

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1–6.

Keltner, D., & Buswell, B. N. (1997). Embarrassment: Its distinct form and appeasement functions. Psychological Bulletin, 122(3), 250–270.

Kowler, E., Anderson, E., Dosher, B., & Blaser, E. (1995). The role of attention in the programming of saccades. Vision Research, 35(13), 1897–1916.

Loughland, C. M., Williams, L. M., & Gordon, E. (2002). Schizophrenia and affective disorder show different visual scanning behaviour for faces: A trait versus state-based distinction. Biological Psychiatry, 52, 338–348.

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7(2), 239.

Nahm, F. K. D., Perret, A., Amaral, D. G., & Albright, T. D. (1997). How do monkeys look at faces? Journal of Cognitive Neuroscience, 9(5), 611–623.

Ogrocki, P. K., Hills, A. C., & Strauss, M. E. (2000). Visual exploration of facial emotion by healthy older adults and patients with Alzheimer disease. Neuropsychiatry, Neuropsychology, and Behavioral Neurology, 13, 271–278.

Oosterhof, N. N., & Todorov, A. (2009). Shared perceptual basis of emotional expressions and trustworthiness impressions from faces. Emotion, 9(1), 128.

O'Toole, A. J., Deffenbacher, K. A., Valentin, D., & Abdi, H. (1994). Structural aspects of face recognition and the other-race effect. Memory & Cognition, 22(2), 208–224.

Pelli, D. G. (1997). The VideoToolbox software for visual psychophysics: Transforming numbers into movies. Spatial Vision, 10, 437–442.

Pelphrey, K. A., Sasson, N. J., Reznick, J., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261.

Perlman, S. B., Morris, J. P., Vander Wyk, B. C., Green, S. R., Doyle, J. L., & Pelphrey, K. A. (2009). Individual differences in personality shape how people look at faces. PLoS ONE, 4(6), e5952.

Pflugshaupt, T., Mosimann, U. P., Schmitt, W. J., von Wartburg, R., Wurtz, P., Luthi, M., ... Muri, R. M. (2007). To look or not to look at threat? Scanpath differences within a group of spider phobics. Journal of Anxiety Disorders, 21(3), 353–366.


Sasson, N., Tsuchiya, N., Hurley, R., Couture, S. M., Penn, D. L., Adolphs, R., & Piven, J. (2007). Orienting to social stimuli differentiates social cognitive impairment in autism and schizophrenia. Neuropsychologia, 45, 2580–2588.

Schyns, P. G., Petro, L. S., & Smith, M. L. (2009). Transmission of facial expressions of emotion co-evolved with their efficient decoding in the brain: Behavioral and brain evidence. PLoS ONE, 4(5), e5625.

Segerstrom, S. C. (2001). Optimism and attentional bias for negative and positive stimuli. Personality and Social Psychology Bulletin, 27, 1334–1343.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16(3), 184–189.

Spezio, M. L., Adolphs, R., Hurley, R. S. E., & Piven, J. (2007). Analysis of face gaze in autism using "Bubbles". Neuropsychologia, 45, 144–151.

Spezio, M. L., Huang, P.-Y. S., Castelli, F., & Adolphs, R. (2007). Amygdala damage impairs eye contact during conversations with real people. The Journal of Neuroscience, 27(15), 3994–3997.

Susskind, J. M., Lee, D. H., Cusi, A., Feiman, R., Grabski, W., & Anderson, A. K. (2008). Expressing fear enhances sensory acquisition. Nature Neuroscience, 11(7), 843–850.

Tracy, J. L., & Robins, R. W. (2004). Show your pride: Evidence for a discrete emotion expression. Psychological Science, 15, 194–197.

Walker-Smith, G. J., Gale, A. G., & Findlay, J. M. (1977). Eye movement strategies involved in face perception. Perception, 6(3), 313–326.

Wright, D. B., & Sladden, B. (2003). An own gender bias and the importance of hair in face recognition. Acta Psychologica, 114(1), 101–114.

Zhao, W., Chellappa, R., Phillips, P. J., & Rosenfeld, A. (2003). Face recognition: A literature survey. ACM Computing Surveys (CSUR), 35(4), 399–458.

Journal of Vision (2014), 14(13):14, 1–16. Schurgin et al.
