

Psychologia, 2016, 59, 100–111

HORSES (EQUUS CABALLUS) ADAPTIVELY CHANGE THE MODALITY OF THEIR BEGGING BEHAVIOR AS A FUNCTION OF HUMAN ATTENTIONAL STATES

Ayaka TAKIMOTO 1),2),3), Yusuke HORI 2),4), and Kazuo FUJITA 2)

1) Graduate School of Letters, Hokkaido University, Japan
2) Graduate School of Letters, Kyoto University, Japan
3) Japan Society for the Promotion of Science, Japan
4) Wildlife Research Center, Kyoto University, Japan

We tested whether horses (Equus caballus) are sensitive to human attentional states and modify the modality of begging behaviors as a function of human attentional states in a naturalistic food-requesting situation. In Experiment 1, horses tended to produce more auditory or tactile begging behaviors when the human experimenter (E1)’s eyes were covered by her hand than when they were not covered. However, there was no difference in visual begging behaviors between conditions. In Experiment 2, horses produced significantly more auditory or tactile begging behaviors when E1’s eyes were closed than when they were open. In contrast, horses produced significantly more visual begging behaviors when E1’s eyes were open than when they were closed. These results suggest that horses understand the role of eyes as an indicator of human attentional states and show effective and flexible begging behaviors proactively as a function of human attentional states.

Key words: horses, human attentional states, begging behaviors

This study was supported by the Japan Society for the Promotion of Science (JSPS) Grant-in-Aid for JSPS Research Fellows Grant Numbers JP21264 and JP248353 to Ayaka Takimoto and JP255327 to Yusuke Hori; the Grant-in-Aid for Young Scientists (B) Grant Number JP15K20946 to Ayaka Takimoto; the Grant-in-Aid for Scientific Research Grant Numbers JP20220004, JP25240020, JP25118002, and JP16H06301 to Kazuo Fujita and JP26118004 to Ayaka Takimoto; the Grant-in-Aid for Challenging Exploratory Research Grant Number 15K12047; and the Japan Ministry of Education, Culture, Sports, Science and Technology (MEXT) Global COE Program, D-07, to Kyoto University.

We wish to thank the members of the horseback riding club of Kyoto University, especially Shozo Sasaki, Tomoyuki Arase, Akane Yamasaki, Kaoru Obayashi, Yoko Kamiya, and Shigeaki Hirano, for their cooperation and help in conducting this study. We also gratefully acknowledge Kazuki Tadokoro for his help in conducting this study and Leanne Proops for editing and useful comments on our manuscript.

Correspondence concerning this article should be addressed to Ayaka Takimoto, Department of Behavioral Science, Graduate School of Letters, Hokkaido University, North 10, South 7, Kita-ku, Sapporo 060-0810, Japan (e-mail: [email protected]).

INTRODUCTION

As a building block of communication, the ability to reliably detect the focus of others' attention would seem to be useful for any social species. Such an ability should have obvious adaptive advantages, potentially allowing an animal to more efficiently detect the presence of predators, competitors, or beneficial resources (water, food, and so on), to follow the gaze of others to significant events and locations, and to engage in more effective communication. Nonhuman primates have been shown to follow a human experimenter's gaze (chimpanzees (Pan troglodytes), bonobos (Pan paniscus), gorillas (Gorilla gorilla), and orangutans (Pongo pygmaeus): Bräuer, Call, & Tomasello, 2005; Itakura, 1996; Kano & Call, 2014; Tomasello, Hare, & Agnetta, 1999; Tomasello, Hare, & Fogleman, 2001; rhesus macaques (Macaca mulatta): Emery, Lorincz, Perrett, Oram, & Baker, 1997; Tomasello et al., 2001; stump-tailed macaques (Macaca arctoides): Anderson & Mitchell, 1999; pig-tailed macaques (Macaca nemestrina): Ferrari, Kohler, Fogassi, & Gallese, 2000; long-tailed macaques (Macaca fascicularis): Goossens, Dekleva, Reader, Sterck, & Bolhuis, 2008). They are also sensitive to conspecifics' gaze shown in photographs (chimpanzees, sooty mangabeys (Cercocebus atys), rhesus macaques, stump-tailed macaques, and pig-tailed macaques: Tomasello, Call, & Hare, 1998; common marmosets (Callithrix jacchus): Burkart & Heschl, 2007; cotton-top tamarins (Saguinus oedipus): Neiworth, Burman, Basile, & Lickteig, 2002; spider monkeys (Ateles geoffroyi) and tufted capuchin monkeys (Cebus apella): Amici, Aureli, Visalberghi, & Call, 2009; brown lemurs (Eulemur fulvus) and black lemurs (Eulemur macaco): Ruiz, Gomez, Roeder, & Byrne, 2009). Similar sensitivity has been found in a wide range of nonprimate species (gray wolves (Canis lupus): Range & Virányi, 2011; goats (Capra hircus): Kaminski, Riedel, Call, & Tomasello, 2005; dogs (Canis familiaris): Met, Miklósi, & Lakatos, 2014; red-footed tortoises (Geochelone carbonaria): Wilkinson, Mandl, Bugnyar, & Huber, 2010; ravens (Corvus corax): Bugnyar, Stöwe, & Heinrich, 2004).

Animals may also behave flexibly depending upon others' attentional states: subordinate chimpanzees tend to avoid food that dominant individuals can see in a competitive situation (Hare, Call, Agnetta, & Tomasello, 2000; Hare, Call, & Tomasello, 2001). Chimpanzees can also use human attentional cues shown by the eyes to produce more effective begging gestures in a natural food-requesting situation (Hostetter, Russell, Freeman, & Hopkins, 2007). Some monkeys, too, are able to use human attentional cues in food-begging and food-competition tasks (rhesus macaques: Flombaum & Santos, 2005; olive baboons (Papio anubis): Vick & Anderson, 2003; tufted capuchin monkeys: Hattori, Kuroshima, & Fujita, 2007, 2010). Tufted capuchin monkeys may have a rudimentary understanding of the relationship between seeing and knowing (Kuroshima, Fujita, Fuyuki, & Masuda, 2002).

However, nonhuman primates' sensitivity to human attentional states may be a consequence of intensive training; Bräuer, Kaminski, Riedel, Call, and Tomasello (2006) showed that untrained apes are less sensitive to human gaze cues than dogs. Dogs, in contrast, show remarkable sensitivity to human attentional states without training and appear adept at reading human cues. They can use various cues shown by the body, head, and eyes to determine human attentional states in a variety of tasks such as obeying commands, fetching toys, locating hidden food, and deciding whom to approach for food (e.g., Call, Bräuer, Kaminski, & Tomasello, 2003; Gácsi, Miklósi, Varga, Topál, & Csányi, 2004; Kaminski, Bräuer, Call, & Tomasello, 2009; Miklósi, Polgárdi, Topál, & Csányi, 1998; Schwab & Huber, 2006; Virányi, Topál, Gácsi, Miklósi, & Csányi, 2004). Why dogs show this flexible attention-reading ability, which generalizes across a variety of attention attribution tasks and is not seen in other species including primates, is still under debate. The process of domestication may have selected for an ability to read human cues in dogs (Hare, Brown, Williamson, & Tomasello, 2002; Hare & Tomasello, 2005).

Horses (Equus caballus) have lived with humans for some 5,500 years (Ludwig et al., 2009) and have built up a unique partnership with humans. Horses are also able to utilize subtle changes in conspecifics' communicative signals: the position of the ears, the orientation and widening of the eyes, the dilation of the nostrils, and the tension of the mouth (Waring, 2003). Thus, both the history of domestication and the predisposition for utilizing conspecific visual cues provide a plausible background for horses to be sensitive to human attentional states. Indeed, some studies have shown that horses, without specific training, are able to use human pointing gestures to locate hidden food (Maros, Gácsi, & Miklósi, 2008; McKinley & Sambrook, 2000; Proops, Walton, & McComb, 2010). Horses are sensitive to human attentional states, approaching an attentive human experimenter for food (Proops & McComb, 2010); they are also sensitive to conspecifics' attentional states and understand the role of the eyes and ears as important indicators of attention (Wathan & McComb, 2014). However, it is still unclear whether horses proactively modify the kinds of begging behaviors they use, that is, whether they selectively increase the begging behaviors that are more effective for getting human attention as a function of human attentional states.

In the present study, we examined whether horses are sensitive to human attentional states and whether they are able to modify their begging behaviors flexibly and effectively as a function of those states in a naturalistic food-requesting situation. In particular, we tested whether they understand the role of the eyes as a cue to human attentional states, using a slightly modified version of the paradigm that successfully demonstrated chimpanzees' understanding of human attention (Hostetter et al., 2007). If horses are sensitive to subtle human attentional states, they should discriminate the situation in which a human experimenter can see them from the situation in which she cannot, and selectively produce the begging behaviors that are more effective given each attentional state. In particular, they should produce more auditory or tactile begging behaviors when the human experimenter cannot see their begging behaviors and more visual begging behaviors when she can see them.

EXPERIMENT 1

METHOD

Participants

Participants were sixteen horses from the horseback-riding club of Kyoto University (age: 6–27 yrs, mean: 11.63, S.D.: 5.39; breed: fifteen Thoroughbreds and one Anglo-Arabian; sex: one stallion, thirteen geldings, and two mares1). No horses were food-deprived for this experiment, but their daily feeding schedule was fixed: they were fed in the morning, at noon, in the evening, and at night. They participated in this experiment in the afternoon, between the 2nd and the 3rd feeding times. All horses were tested at the hoof wash place of the horseback-riding club in March and April 2008 and October and November 2009.

1 A stallion is an uncastrated male horse and a gelding is a castrated male horse. A mare is a female horse.

Procedure

We used a slightly modified version of the Hostetter et al. (2007) test paradigm that previously yielded support for chimpanzees' understanding of human attention. A participant horse was tied loosely between two poles at the hoof wash place. Two experimenters were involved in the test of each horse. Experimenter 1 (E1) offered food to the participant, and Experimenter 2 (E2) acted as timekeeper and recorded the participant's behavior with two video cameras. E1 stood approximately 1 m in front of the participant and showed a carrot to the horse. During the trial, E1 engaged in one of the experimental manipulations described below. Once E1 was in position and the trial started, E2 timed 60 seconds with a stopwatch. All of the participant's behaviors during the trial were recorded by digital video cameras (Sony, DCR-TRV27) from the front and the side of the participant. Following each 60-s trial, E1 gave the carrot to the participant regardless of the behaviors displayed during the trial. Horses received no training for this test paradigm.

Experimental conditions

There were two test conditions (Mouth-covered condition and Eyes-covered condition) and one control condition (Out condition). In the Mouth-covered condition, E1 placed her hand over her mouth so that the lower half of her face was not visible. Whether E1 used her right or left hand to show the carrot was counterbalanced between participants. She looked at the area between the horse's eyes throughout the trial. In the Eyes-covered condition, E1 placed her hand over her eyes so that the upper half of her face was not visible, and maintained a forward orientation. In the Out condition (control condition), E1 placed a carrot in front of the horse and left the place. This condition served as a baseline; horses that pawed extensively in it (more than in the test conditions) were removed from the study.

Horses received each of the two test conditions and the control condition only once, in a counterbalanced order. Trials were separated by at least one day, usually several days (one to nine days). We used this standard paradigm because it represents a naturalistic situation that does not require extensive training to perform, and we tested participants on a single trial for each cue to prevent any learning during the study.
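The paper does not publish any scheduling code; purely as an illustration, the following minimal Python sketch shows one way such a counterbalanced single-trial order could be generated, cycling through all six possible orders of the three conditions (the function and variable names are our assumptions, not the authors' procedure).

    from itertools import permutations

    # Conditions of Experiment 1: two test conditions and one control.
    CONDITIONS = ("Mouth-covered", "Eyes-covered", "Out")

    def counterbalanced_orders(n_horses):
        # All 3! = 6 possible orders; cycle through them so that each
        # order is used equally often across participants.
        orders = list(permutations(CONDITIONS))
        return [orders[i % len(orders)] for i in range(n_horses)]

    for horse, order in enumerate(counterbalanced_orders(16), start=1):
        print(f"Horse {horse:2d}: {' -> '.join(order)}")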

Fig. 1. General setting of the test conditions in Experiments 1 and 2.


Analyses

We classified participants' begging behaviors into two categories: auditory or tactile begging behaviors (pawing the ground and nudging [touching] the human experimenter) and visual ones (gaze alternation). The former are effective for getting E1's attention even when she cannot see the participant's behavior, that is, when her eyes are covered by her hand (Eyes-covered condition). The latter are effective for getting E1's attention only when she can see the participant's behavior, that is, when her mouth is covered by her hand (Mouth-covered condition). Pawing is a behavior in which a horse lifts one foreleg slightly off the ground and then extends it quickly forward, followed by a backward movement dragging the toe against the ground in a digging motion (McDonnell, 2003). This behavior is frequently used to obtain or investigate something (Rees, 1985). It can also be a sign of request and/or frustration and is seen in horses prevented from getting at an object that they want. A nudge is a horse's slow, gentle touch of the nose toward friends or handlers (Rees, 1985). The head is not carried upwards, and the ears are not flattened but are half-back or forward; it is an attention-seeking movement (Rees, 1985). We defined gaze alternation as a participant's gaze at E1 followed by a gaze at the carrot in her hand, or vice versa, within 2 seconds (cf. Malavasi & Huber, 2016).
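To make the gaze-alternation criterion concrete, here is a minimal sketch assuming hypothetical time-stamped gaze events coded from video; the event list and function name are illustrative, not part of the study's coding procedure.

    # Hypothetical coded gaze events: (time in seconds, gaze target).
    events = [(0.5, "E1"), (1.8, "carrot"), (5.0, "E1"),
              (9.2, "carrot"), (9.9, "E1")]

    def count_gaze_alternations(events, window=2.0):
        # A gaze alternation is a look at E1 followed by a look at the
        # carrot, or vice versa, within `window` seconds.
        count = 0
        for (t1, a), (t2, b) in zip(events, events[1:]):
            if a != b and (t2 - t1) <= window:
                count += 1
        return count

    print(count_gaze_alternations(events))  # prints 2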

One of the authors (AT) scored all trials from the videotapes. A naive observer who was ignorant of the purpose of the study scored a randomly selected sample of trials (20%) to assess inter-observer reliability for the auditory, tactile, and visual begging behaviors. Reliability between the two coders was satisfactory (auditory or tactile begging behaviors: Spearman r(5) = 1.000, p < 0.001; visual ones: Spearman r(5) = .973, p = 0.005).
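The reliability statistic itself is a standard Spearman rank correlation; a minimal sketch, assuming hypothetical per-trial counts for the 20% sample scored by both coders (the numbers are illustrative, not the study's data):

    from scipy.stats import spearmanr

    # Hypothetical per-trial behavior counts from the two coders.
    coder_at = [3, 0, 5, 2, 1, 4, 0]      # author (AT)
    coder_naive = [3, 0, 5, 2, 1, 4, 0]   # naive observer

    rho, p = spearmanr(coder_at, coder_naive)
    print(f"Spearman rho = {rho:.3f}, p = {p:.4f}")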

To score the auditory begging behavior (pawing), we subtracted its frequency in the control condition (Out condition) from its frequency in each of the two test conditions (Mouth-covered and Eyes-covered conditions). This subtracted score is an appropriate measure of requesting because the same pawing also occurs out of frustration, irrespective of human presence. This subtraction was not necessary for the other non-visual, tactile behavior, nudging, because it cannot occur in a human's absence. We summed the subtracted pawing score and the frequency of nudging to yield a score of non-visual begging behaviors (auditory or tactile) and analyzed this score with Wilcoxon's signed-rank test. Visual begging behavior (gaze alternation), like nudging, could not occur in the control condition (Out condition), so we simply compared its frequency between the two test conditions (Mouth-covered and Eyes-covered conditions) with Wilcoxon's signed-rank test. We excluded four horses' data from the analyses. One horse could not participate in all test conditions because of a schooling problem. The other three horses lacked composure in the test situation: they showed more pawing in the control condition (Out condition), in which E1 was absent, than in the two test conditions (Mouth-covered and Eyes-covered conditions). Our experimental manipulation did not seem to have worked for these horses, whose apparent begging behaviors may have reflected frustration rather than a request directed at humans.
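The scoring pipeline just described (baseline subtraction for pawing, summation with nudges, paired Wilcoxon test) can be expressed compactly; a minimal sketch, assuming hypothetical per-horse frequency counts (the numbers are illustrative, not the study's data):

    import numpy as np
    from scipy.stats import wilcoxon

    # Hypothetical frequencies for the 12 analyzed horses; illustrative only.
    paw_mouth = np.array([1, 0, 2, 0, 1, 0, 3, 1, 0, 2, 1, 0])  # pawing, Mouth-covered
    paw_eyes  = np.array([2, 1, 3, 1, 2, 0, 4, 2, 1, 3, 2, 1])  # pawing, Eyes-covered
    paw_out   = np.array([1, 0, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0])  # pawing, Out (baseline)
    nudge_mouth = np.array([0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0])
    nudge_eyes  = np.array([1, 1, 1, 0, 2, 1, 0, 1, 1, 1, 0, 1])

    # Baseline-corrected non-visual (auditory or tactile) begging score:
    # subtract Out-condition pawing, then add nudges (which need a human).
    nonvisual_mouth = (paw_mouth - paw_out) + nudge_mouth
    nonvisual_eyes = (paw_eyes - paw_out) + nudge_eyes

    # Paired comparison between the two test conditions.
    stat, p = wilcoxon(nonvisual_mouth, nonvisual_eyes)
    print(f"Wilcoxon signed-rank: W = {stat}, p = {p:.3f}")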

RESULTS & DISCUSSION

Fig. 2 shows the frequency scores of horses' auditory or tactile begging behaviors (Fig. 2a) and of visual ones (Fig. 2b) in the Mouth-covered and the Eyes-covered conditions. For scores of auditory or tactile begging behaviors, there was a marginally significant difference between the Mouth-covered and the Eyes-covered conditions (Z = –1.943, n = 12, p = 0.052). In contrast, there was no difference between the two test conditions in visual begging behaviors (Z = –1.494, n = 12, p = 0.135). In other words, horses tended to produce more auditory or tactile begging behaviors when the experimenter's eyes were not visible (Eyes-covered condition) than when her eyes were visible (Mouth-covered condition), but they did not change the frequency of visual begging behaviors between the two test conditions.


The fact that horses tended to increase the more effective begging behaviors for getting the experimenter's attention when her eyes were not visible, rather than all kinds of begging behaviors, may suggest that horses recognize the eyes' functional role and know what kind of behavior is effective for eliciting a response from a human whose attention is averted. However, horses produced comparable numbers of visual begging behaviors whether or not the experimenter's eyes were visible, despite these behaviors being ineffective in the latter situation. These results therefore supported our hypotheses only partially. The reason may be that covering the eyes with a hand is not within horses' behavioral repertoire, and it may be difficult for horses to infer human attentional states when humans do something horses cannot do themselves. A similar correspondence between what agents can do and what they can infer has been shown in human infants (e.g., Kanakogi & Itakura, 2011; Sommerville, Woodward, & Needham, 2005) and capuchin monkeys (Kuroshima, Kaiser, & Fragaszy, 2014).

Fig. 2. (a) The frequency scores of horses' auditory or tactile begging behaviors, and (b) those of visual ones, in Experiment 1. The plots show medians, first and third quartiles, and ranges. (+ p < 0.100)


Another, simpler account of these ambiguous results might be that the experimenter's gestures of covering her mouth or eyes with a hand were simply unfamiliar to the horses. Mere absence from the behavioral repertoire seems insufficient as an explanation, because horses are able to use human pointing gestures to find food even though such gestures are not in their repertoire (McKinley & Sambrook, 2000; Maros et al., 2008; Proops et al., 2010).

These two possibilities led us to test the same subjects using gestures that are familiar to and included in the repertoire of horses.

EXPERIMENT 2

In Experiment 2, we explored whether horses would be more sensitive to human attentional states when the test conditions involved human cues that are familiar to horses and part of their own behavioral repertoire. All other procedures were the same as in Experiment 1.

METHOD

Participants

Participants were the twelve horses that were included in Experiment 1's analyses (age: 6–27 yrs, mean: 12.00, S.D.: 5.82; breed: eleven Thoroughbreds and one Anglo-Arabian; sex: one stallion, ten geldings, and one mare).

Procedure

The procedure was the same as in Experiment 1, except that the human experimenter's attentional states were varied using two new test conditions.

Experimental conditions

We set two new test conditions: the Eyes-open and the Eyes-closed conditions. In the Eyes-open condition, E1 kept her eyes open throughout the trial and, with a neutral expression, looked at the area between the horse's eyes. In the Eyes-closed condition, E1 kept her eyes closed and remained oriented toward the front throughout the trial.

Analyses

The classification of participants' begging behaviors and the statistical methods were the same as in Experiment 1. As in Experiment 1, we assessed inter-observer reliability for the two kinds of begging behaviors during the trial. AT scored all trials from the videotapes, and a naive observer who was ignorant of the purpose of the study scored a randomly selected sample of trials (20%). Reliability between the author and the naive observer was satisfactory (auditory or tactile begging behaviors: Spearman r(5) = 1.000, p < 0.001; visual ones: Spearman r(5) = .975, p = 0.005).

As in Experiment 1, we summed the subtracted pawing score and the frequency of nudging to yield a score of auditory or tactile begging behaviors. We analyzed this score and the frequency of visual begging behavior with Wilcoxon's signed-rank tests.

RESULTS & DISCUSSION

Fig. 3 shows the frequency scores of horses' auditory or tactile begging behaviors (Fig. 3a) and of visual ones (Fig. 3b) in the Eyes-open and the Eyes-closed conditions. The Wilcoxon signed-rank test found a significant difference in the scores of auditory or tactile begging behaviors between the Eyes-open and the Eyes-closed conditions (Z = –2.229, n = 12, p = 0.026). There was also a significant difference in visual begging behaviors between the two conditions (Z = –2.081, n = 12, p = 0.037). In other words, horses produced significantly more auditory or tactile begging behaviors when the human's eyes were closed than when they were open, and significantly more visual begging behaviors when the human's eyes were open than when they were closed.

Horses increased the begging behaviors that were effective for getting the experimenter's attention, not all kinds of begging behaviors, depending upon the experimenter's attentional states. The results suggest that horses recognize the eyes' functional role in attention and can produce the more effective begging behaviors in a naturalistic food-requesting situation. Horses' adjustment of begging behaviors to human attentional states was clearer in Experiment 2 than in Experiment 1. This may result from the use of human cues that are familiar to horses and shared in the behavioral repertoires of humans and horses.

Fig. 3. (a) The frequency scores of horses' auditory or tactile begging behaviors, and (b) those of visual ones, in Experiment 2. The plots show medians, first and third quartiles, and ranges. (* p < 0.050)


GENERAL DISCUSSION

In the present study, we asked whether horses are not only sensitive to human attentional states but also able to modify their begging behaviors flexibly and effectively as a function of those states in a naturalistic food-requesting situation. In two experiments, we focused on whether they understand the role of the eyes as an indicator of human attentional states. In Experiment 1, we manipulated human attentional cues by covering or not covering the eyes with a hand, and we found that horses tended to show the auditory and tactile begging behaviors that were effective when the experimenter could not see them more often when her eyes were covered than when they were not. However, the number of horses' visual begging behaviors did not differ between the conditions. In Experiment 2, we manipulated attentional cues by closing and opening the eyes. Here, horses not only increased the number of auditory and tactile begging behaviors when the human could not see them but also increased visual behaviors when she could. These results show that horses not only discriminate situations in which humans can see them from those in which humans cannot, but also flexibly adapt their attention-getting behaviors to each situation so that their begging is more likely to succeed.

Of special interest, whereas horses failed to show a difference in visual begging behaviors between the two test conditions in which the experimenter either could or could not see them in Experiment 1, they did show such a difference in Experiment 2. This may suggest that it is difficult for horses to infer human attentional states when the behavioral cues provided are not included in horses' own behavioral repertoire or are unfamiliar to them. A similar correspondence between what agents can do and what they can infer has been shown in human infants (e.g., Kanakogi & Itakura, 2011; Sommerville et al., 2005) and capuchin monkeys (Kuroshima et al., 2014).

Previous studies have found that horses are sensitive to conspecific and human attentional states and understand the eyes and ears as important indicators of others' attentional states (Proops & McComb, 2010; Sankey, Henry, André, Richard-Yris, & Hausberger, 2011; Wathan & McComb, 2014). In particular, in Wathan and McComb (2014), horses used a conspecific model's head orientation as a cue to locate hidden food in an object choice task only when both the eyes and the ears of the model were visible; they failed to follow the conspecific's head orientation when either the eyes or the ears were covered by a mask. These results showed that horses require both eye and ear cues for gathering information about a model's attentional states. Proops and McComb (2010) showed that horses are also highly skilled at reading human cues to attention: they approached an attentive person more often than an inattentive one when requesting food. The authors suggested that horses judge human attentional states using cues from the human body, head, and gaze direction. Importantly, in their study, the human experimenters always showed behavioral cues that were included in the horses' behavioral repertoire and familiar to them. In contrast, in Experiment 1 of the present study, the experimenter showed a behavioral cue that was neither included in the behavioral repertoire of horses nor familiar to them: covering the eyes with a hand. We assessed this interpretation of the results of Experiment 1 in Experiment 2, in which we used closure of the eyes as the cue to human attention. The results suggest that it is more difficult for horses to infer human attentional states from behaviors that they cannot perform themselves or that are unfamiliar to them (Experiment 1).

Using a slightly modified version of a paradigm that successfully demonstrated chimpanzees' understanding of human attention (Hostetter et al., 2007), we report the first evidence that horses proactively modify the kinds of begging behaviors they employ, increasing only the more effective begging behaviors as a function of differing human attentional states. This test paradigm resembles captive animals' everyday experience, although interacting with a human who waits 60 seconds before responding is admittedly somewhat unnatural. Hostetter et al. (2007) noted that it is very important to investigate what kinds of human attentional and mental states animals are sensitive to using procedures that are representative of the animals' ecological experiences. Our studies also confirm that such naturalistic experimental paradigms can be very effective in unveiling a wide range of animals' social abilities when they spontaneously interact with humans.

In the present study, we tested horses' sensitivity to human attentional states by varying whether a human experimenter could see the horses' behaviors. The results suggest that horses are sensitive to human attentional states and recognize the eyes as an important indicator of whether a human experimenter will respond to their behavior. Moreover, we found that horses flexibly adapt their behavior to human attentional states. Wathan and McComb (2014) showed that horses understand the importance not only of the eyes but also of the ears for gathering information and determining attentional states. In the future, it may be interesting to assess whether horses can behave effectively and flexibly as a function of whether humans are able to hear them. In addition, horses' sensitivity to human and conspecific social cues such as facial expressions and emotional vocalisations should be compared in order to clarify whether horses have developed their communicative abilities especially for interacting with humans through their domestication history. Further studies of these kinds will help us build a better relationship between horses and humans as unique and precious heterospecific partners.

REFERENCES

Amici, F., Aureli, F., Visalberghi, E., & Call, J. 2009. Spider monkeys (Ateles geoffroyi) and capuchin monkeys (Cebus apella) follow gaze around barriers: Evidence for perspective taking? Journal of Comparative Psychology, 123, 368–374.

Anderson, J. R., & Mitchell, R. W. 1999. Macaques but not lemurs co-orient visually with humans. Folia Primatologica, 70, 17–22.

Bräuer, J., Call, J., & Tomasello, M. 2005. All great ape species follow gaze to distant locations and around barriers. Journal of Comparative Psychology, 119, 145–154.

Bräuer, J., Kaminski, J., Riedel, J., Call, J., & Tomasello, M. 2006. Making inferences about the location of hidden food: Social dog, causal ape. Journal of Comparative Psychology, 120, 38–47.

Bugnyar, T., Stöwe, M., & Heinrich, B. 2004. Ravens, Corvus corax, follow gaze direction of humans around obstacles. Proceedings of the Royal Society B: Biological Sciences, 271, 1331–1336.

Burkart, J. M., & Heschl, A. 2007. Understanding visual access in common marmosets, Callithrix jacchus: Perspective taking or behaviour reading? Animal Behaviour, 73, 457–469.

Call, J., Bräuer, J., Kaminski, J., & Tomasello, M. 2003. Domestic dogs (Canis familiaris) are sensitive to the attentional state of humans. Journal of Comparative Psychology, 117, 257–263.

Emery, N. J., Lorincz, E. N., Perrett, D. I., Oram, M. W., & Baker, C. I. 1997. Gaze following and joint attention in rhesus monkeys (Macaca mulatta). Journal of Comparative Psychology, 111, 286–293.

Ferrari, P. F., Kohler, E., Fogassi, L., & Gallese, V. 2000. The ability to follow eye gaze and its emergence during development in macaque monkeys. Proceedings of the National Academy of Sciences of the United States of America, 97, 13997–14002.

Flombaum, J. I., & Santos, L. R. 2005. Rhesus monkeys attribute perceptions to others. Current Biology, 15, 447–452.

Gácsi, M., Miklósi, Á., Varga, O., Topál, J., & Csányi, V. 2004. Are readers of our face readers of our minds? Dogs (Canis familiaris) show situation-dependent recognition of human's attention. Animal Cognition, 7, 144–153.

Goossens, B. M. A., Dekleva, M., Reader, S. M., Sterck, E. H. M., & Bolhuis, J. J. 2008. Gaze following in monkeys is modulated by observed facial expressions. Animal Behaviour, 75, 1673–1681.

Hare, B., Brown, M., Williamson, C., & Tomasello, M. 2002. The domestication of social cognition in dogs. Science, 298, 1634–1636.

Hare, B., Call, J., Agnetta, B., & Tomasello, M. 2000. Chimpanzees know what conspecifics do and do not see. Animal Behaviour, 59, 771–785.

Hare, B., Call, J., & Tomasello, M. 2001. Do chimpanzees know what conspecifics know? Animal Behaviour, 61, 139–151.

Hare, B., & Tomasello, M. 2005. Human-like social skills in dogs? Trends in Cognitive Sciences, 9, 439–444.

Hattori, Y., Kuroshima, H., & Fujita, K. 2007. I know you are not looking at me: Capuchin monkeys' (Cebus apella) sensitivity to human attentional states. Animal Cognition, 10, 141–148.

Hattori, Y., Kuroshima, H., & Fujita, K. 2010. Tufted capuchin monkeys (Cebus apella) show understanding of human attentional states when requesting food held by a human. Animal Cognition, 13, 87–92.

Hostetter, A. B., Russell, J. L., Freeman, H., & Hopkins, W. D. 2007. Now you see me, now you don't: Evidence that chimpanzees understand the role of the eyes in attention. Animal Cognition, 10, 55–62.

Itakura, S. 1996. An exploratory study of gaze-monitoring in nonhuman primates. Japanese Psychological Research, 38, 174–180.

Kaminski, J., Bräuer, J., Call, J., & Tomasello, M. 2009. Domestic dogs are sensitive to a human's perspective. Behaviour, 146, 979–998.

Kaminski, J., Riedel, J., Call, J., & Tomasello, M. 2005. Domestic goats, Capra hircus, follow gaze direction and use social cues in an object choice task. Animal Behaviour, 69, 11–18.

Kanakogi, Y., & Itakura, S. 2011. Developmental correspondence between action prediction and motor ability in early infancy. Nature Communications, 2, 341.

Kano, F., & Call, J. 2014. Cross-species variation in gaze following and conspecific preference among great apes, human infants and adults. Animal Behaviour, 91, 137–150.

Kuroshima, H., Fujita, K., Fuyuki, A., & Masuda, T. 2002. Understanding of the relationship between seeing and knowing by tufted capuchin monkeys (Cebus apella). Animal Cognition, 5, 41–48.

Kuroshima, H., Kaiser, I., & Fragaszy, D. M. 2014. Does own experience affect perception of others' actions in capuchin monkeys (Cebus apella)? Animal Cognition, 17, 1269–1279.

Ludwig, A., Pruvost, M., Reissmann, M., Benecke, N., Brockmann, G. A., Castaños, P., . . . Hofreiter, M. 2009. Coat color variation at the beginning of horse domestication. Science, 324, 485.

Malavasi, R., & Huber, L. 2016. Evidence of heterospecific referential communication from domestic horses (Equus caballus) to humans. Animal Cognition, 19, 899–909.

Maros, K., Gácsi, M., & Miklósi, Á. 2008. Comprehension of human pointing gestures in horses (Equus caballus). Animal Cognition, 11, 457–466.

McDonnell, S. 2003. The equid ethogram: A practical field guide to horse behavior. Lexington, KY: Eclipse Press.

McKinley, J., & Sambrook, T. D. 2000. Use of human-given cues by domestic dogs (Canis familiaris) and horses (Equus caballus). Animal Cognition, 3, 13–22.

Met, A., Miklósi, Á., & Lakatos, G. 2014. Gaze-following behind barriers in domestic dogs. Animal Cognition, 17, 1401–1405.

Miklósi, Á., Polgárdi, R., Topál, J., & Csányi, V. 1998. Use of experimenter-given cues in dogs. Animal Cognition, 1, 113–121.


Neiworth, J. J., Burman, M. A., Basile, B. M., & Lickteig, M. T. 2002. Use of experimenter-given cues in visual co-orienting and in an object-choice task by a new world monkey species, cotton top tamarins (Saguinus oedipus). Journal of Comparative Psychology, 116, 3–11.

Proops, L., & McComb, K. 2010. Attributing attention: The use of human-given cues by domestic horses (Equus caballus). Animal Cognition, 13, 197–205.

Proops, L., Walton, M., & McComb, K. 2010. The use of human-given cues by domestic horses, Equus caballus, during an object choice task. Animal Behaviour, 79, 1205–1209.

Range, F., & Virányi, Z. 2011. Development of gaze following abilities in wolves (Canis lupus). PLoS ONE, 6, e16888.

Rees, L. 1985. The horse's mind. New York, NY: Prentice Hall Press.

Ruiz, A., Gomez, J. C., Roeder, J. J., & Byrne, R. W. 2009. Gaze following and gaze priming in lemurs. Animal Cognition, 12, 427–434.

Sankey, C., Henry, S., André, N., Richard-Yris, M., & Hausberger, M. 2011. Do horses have a concept of person? PLoS ONE, 6, e18331.

Schwab, C., & Huber, L. 2006. Obey or not obey? Dogs (Canis familiaris) behave differently in response to attentional states of their owners. Journal of Comparative Psychology, 120, 169–175.

Sommerville, J. A., Woodward, A. L., & Needham, A. 2005. Action experience alters 3-month-old infants' perception of others' actions. Cognition, 96, B1–B11.

Tomasello, M., Call, J., & Hare, B. 1998. Five primate species follow the visual gaze of conspecifics. Animal Behaviour, 55, 1063–1069.

Tomasello, M., Hare, B., & Agnetta, B. 1999. Chimpanzees, Pan troglodytes, follow gaze direction geometrically. Animal Behaviour, 58, 769–777.

Tomasello, M., Hare, B., & Fogleman, T. 2001. The ontogeny of gaze following in chimpanzees, Pan troglodytes, and rhesus macaques, Macaca mulatta. Animal Behaviour, 61, 335–343.

Vick, S.-J., & Anderson, J. R. 2003. Use of human visual attention cues by olive baboons (Papio anubis) in a competitive task. Journal of Comparative Psychology, 117, 209–216.

Virányi, Z., Topál, J., Gácsi, M., Miklósi, Á., & Csányi, V. 2004. Dogs respond appropriately to cues of humans' attentional focus. Behavioural Processes, 66, 161–172.

Waring, G. H. 2003. Horse behavior (2nd ed.). Norwich, NY: Noyes.

Wathan, J., & McComb, K. 2014. The eyes and ears are visual indicators of attention in domestic horses. Current Biology, 24, R677–R679.

Wilkinson, A., Mandl, I., Bugnyar, T., & Huber, L. 2010. Gaze following in the red-footed tortoise (Geochelone carbonaria). Animal Cognition, 13, 765–769.

(Manuscript received 4 September, 2016; Revision accepted 29 October, 2016)