Introduction

Facial communication plays a crucial role in the social cognition of humans and several species of non-human animals. Although faces are more or less symmetrical, human face perception (i.e. judgement of gender, age, expression, likeness and attractiveness) mostly relies on facial information contained in the right side of the owner's face (the left side of the viewed face from the viewer's perspective) (e.g. Gilbert and Bakan 1973; Grega et al. 1988; Burt and Perrett 1997). For instance, when asked to judge the facial expression of a briefly presented chimeric face image, in which the left and right sides of the viewed face differ in facial expression, human viewers tend to base their decision more frequently on the expression contained within the right side of the owner's face, i.e. the left hemiface for the viewer. This left perceptual bias in face perception is often accompanied and enhanced by a left gaze bias (LGB), defined as a higher probability of first gaze and a higher proportion of viewing time directed at the left hemiface when actively exploring face images (Mertens et al. 1993; Phillips and David 1997; Butler et al. 2005; Butler and Harvey 2006). In other words, the left hemiface is often inspected first and/or for longer periods.

Although a human visuospatial attention bias towards the left visual field (Rhodes 1986; Vaid and Singh 1989; Nicholls and Roberts 2002; Niemeier et al. 2007) and, in some cultures, a long-practised left-to-right directional scanning bias (most notably in reading) (Gilbert and Bakan 1973; Vaid and Singh 1989; Heath et al. 2005) may contribute to this gaze asymmetry, it is often argued that a right hemisphere advantage in face processing (receiving visual input from the left visual field) is the likely cause of the LGB (Burt and Perrett 1997; Butler et al. 2005; Leonards and Scott-Samuel 2005). Consequently, if a face is initially presented within a viewer's central visual field, the left hemiface is projected to the face-sensitive right hemisphere, where its saliency is more readily evaluated and can capture the viewer's attention. A recent study of judging the gender of chimeric faces showed that on trials where participants based their decision on the gender cues contained in the left side of the chimeric face, they fixated more often and for longer on the left hemiface (Butler et al. 2005), further suggesting that the LGB is closely associated with the perceptual processing of facial information and could be part of the eye scanning patterns associated with face exploration.

However, it remains unclear how this face-related LGB develops in humans and whether it is restricted to humans or could evolve (homologously or analogously) in other species living in complex social environments. To address these questions, we systematically investigated gaze asymmetries in two comparative studies, examining gaze direction in human infants and adults, and in rhesus monkeys and domestic dogs, while they explored various images of faces and objects.

Experiment 1: LGB in human infants and adults: developmental study

It has been suggested that human face processing involves a face-specific cognitive and neural mechanism (McKone et al. 2006; see also Tarr and Cheng 2003) which is species- and orientation-sensitive. Specifically, human adults differentiate faces of their own species (or even own race) better than faces of other species (or other races). However, this superior recognition performance deteriorates once the face is turned upside down, and this face inversion effect, one hallmark differentiating face processing from object processing, is more evident for one's own species, i.e. stronger for human than for monkey faces (Diamond and Carey 1986; Tanaka et al. 2004; Bukach et al. 2006; McKone et al. 2006; Mondloch et al. 2006). It is likely that this sensitivity towards upright conspecific faces is closely associated with, or even shaped by, our extensive experience of identifying upright conspecific faces, probably through the process of perceptual narrowing (Pascalis et al. 2001, 2002; Grossmann and Johnson 2007). For instance, 6-month-old human infants perform equally well at discriminating individual human or monkey faces, but 9-month-olds start to show better performance for recognizing human faces (Pascalis et al. 2002).

If the LGB is closely associated with the processing of facial information, it could be expected to show not only sensitivity to the orientation and species of the viewed face, but also changes during development. We examined these questions systematically in this study by comparing gaze asymmetries in human infants and adults while they freely viewed various face and object images in normal and inverted orientations. As face inversion alters the global facial configuration but changes neither the image symmetry along the vertical axis nor the local image properties of individual facial features (i.e. local contrast), inverted faces not only serve as ideal control images for upright faces, but also help to address the neural mechanisms underlying the LGB if different patterns of gaze asymmetry are elicited in response to upright and inverted faces.

Method

Nineteen 6-month-old infants (11 boys and 8 girls, 4.9–7.7 months old, mean 6.22 ± 0.22 months (mean ± SEM)) and 19 adults (11 males and 8 females, 19–39 years old, mean 20.84 ± 1.13 years) participated in the study. All children were born full-term and were in good health. Ethical approval had been granted and all procedures complied with British Psychological Society ethical guidance.

Visual stimuli included 20 face images with neutral facial expressions (5 upright and 5 inverted human faces, 5 upright and 5 inverted monkey faces) and 10 symmetrical inanimate object images (see image examples in Fig. 1). The common object images were sampled from our daily environment and were categorically familiar to the infants, as indicated by the parents. The gray scale digitized images were gamma-corrected and back-projected once at eye level on the center of a projection screen with a resolution of 600 × 600 pixels. No two images of the same category were presented consecutively. At a viewing distance of 70 cm, each image subtended a visual angle of 72 × 72°.

Fig. 1

a and c, the probability of initial fixation directed at the left and right sides of the presented images for 6-month-old human infants and adults. b and d, the averaged proportion of viewing time within a trial spent on the left and right sides of the presented images for human infants and adults. Error bars indicate standard error of the mean (*P < 0.05, **P < 0.01)

The intermodal preferential looking paradigm (Meints et al. 1999) was employed to measure gaze preference. During the experiment the infants sat on their parent's lap in a quiet, dimly lit test room and binocularly viewed the display. The parents were asked to close their eyes during the experiment and to listen to instructions played over headphones, which reminded them to sit quietly and to keep the infant seated in a central position. Each trial started with a flashing red fixation point (FP, 8° in diameter) presented at the centre of the screen. The infant's head and eye positions were monitored on-line by the researcher through CCTV. Once the infant's gaze was oriented towards the FP, a single image was presented for 5 s. The onset of the image presentation was accompanied by a female auditory instruction, "look", delivered through a loudspeaker positioned centrally above the displayed images.

During the experiment, the researcher was not visible. The overall order of trials shown to a given infant was pseudo-randomised. Inter-trial intervals varied with the infant's attention on the task, with a minimum duration of 0.5 s. A new trial was not launched until the infant had centred their gaze, either spontaneously or when attracted by the flashing FP. All of the tested infants successfully completed all 30 trials. The same procedure, but without a parent, was employed for testing the human adults.

The participant's eye position and head movements were recorded by two miniature cameras and digitized with a sampling frequency of 60 Hz. The video record was replayed off-line frame by frame and analysed independently by two researchers. The direction of the participant's gaze in each frame was classified as 'left', 'right' or 'central' looking. The researchers were blind to the test image for each trial when performing the off-line data analysis, and inter-judge reliability measures between the two researchers yielded correlations of 0.96 for the infants' data and 0.95 for the adults' data.
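This frame-by-frame coding and reliability check can be illustrated with the following minimal Python sketch; it is not the original analysis code, and the data structures and function names are hypothetical.

```python
import numpy as np

def viewing_proportions(frame_codes):
    """Proportion of looking frames spent on the left and right hemi-image.

    `frame_codes` is one coder's per-frame label sequence for a single trial
    (60 Hz sampling, 5 s presentation -> up to 300 frames), coded 'L', 'R' or 'C'.
    """
    codes = np.asarray(frame_codes)
    looking = np.isin(codes, ['L', 'R'])        # drop 'central' (and off-screen) frames
    if looking.sum() == 0:
        return 0.0, 0.0
    left = float(np.mean(codes[looking] == 'L'))
    right = float(np.mean(codes[looking] == 'R'))
    return left, right

def inter_judge_reliability(left_props_coder1, left_props_coder2):
    """Pearson correlation between the two coders' per-trial left-viewing proportions."""
    return float(np.corrcoef(left_props_coder1, left_props_coder2)[0, 1])
```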

Results

To address when and how the LGB develops in humans, we compared the gaze preferences of 19 6-month-old human infants and 19 adults while they freely viewed human and monkey faces (both upright and inverted) and symmetrical familiar object images. The different image categories attracted about the same amount of viewing time within the group of infants (Table 1). On average, human infants spent 64–69% (ANOVA, F (4,94) = 0.46, P = 0.76) of the 5 s image presentation time viewing the different classes of face and object images. By contrast, human adults spent 96–98% (ANOVA, F (4,94) = 0.68, P = 0.60) of the time viewing the different image classes.

Table 1 Cumulative image viewing time as percentage of total trial time (%)

We then examined whether the left hemi-image attracted a higher probability of first gaze direction after image presentation and a higher proportion of viewing time during image presentation. Paired one-tailed t tests were used for each image category after an ANOVA had established a significant general main effect of left–right bias across all image categories. We also calculated P rep and effect sizes (d) to estimate the probability of replicating the effect (Killeen 2005). Human infants showed a consistent general LGB while exploring the images (ANOVA, first gaze direction: F (1,189) = 27.15, P = 5.11E-7, Fig. 1a; viewing time: F (1,189) = 35.38, P = 1.38E-8, Fig. 1b). Specifically, the left sides of upright human and monkey faces were inspected first [>63% probability, t(18) = 1.96 and 2.68, P = 0.03 and 0.007, P rep = 0.9 and 0.99, d = 0.83 and 1.05] and for longer [>59% of image viewing time, t(18) = 1.74 and 2.89, P = 0.05 and 0.005, P rep = 0.94 and 0.99, d = 0.79 and 1.33], as were the left sides of object images [first gaze direction: t(18) = 1.97, P = 0.03, P rep = 0.96, d = 0.91; viewing time: t(18) = 1.75, P = 0.048, P rep = 0.95, d = 0.81], suggesting a non-face-specific gaze asymmetry. Furthermore, the left side of inverted monkey faces also attracted longer viewing time [t(18) = 2.51, P = 0.01, P rep = 0.94; d = 0.96], suggesting that the gaze asymmetry in 6-month-olds is not sensitive to face orientation in species other than their own.
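The statistics reported here (paired one-tailed t tests, Cohen's d for paired samples, and Killeen's (2005) P rep derived from the one-tailed p value) can be reproduced along the lines of the sketch below. This is an illustrative Python reconstruction, not the authors' analysis code, and it assumes per-subject viewing proportions as input.

```python
import numpy as np
from scipy import stats

def paired_lgb_stats(left, right):
    """Paired one-tailed t test (H1: left > right), Cohen's d for paired samples,
    and Killeen's (2005) P_rep approximated from the one-tailed p value.

    `left` and `right` hold one value per participant, e.g. the proportion of
    viewing time spent on each hemiface.
    """
    diff = np.asarray(left, float) - np.asarray(right, float)
    n = len(diff)
    sd = diff.std(ddof=1)
    t = diff.mean() / (sd / np.sqrt(n))
    p = stats.t.sf(t, df=n - 1)                 # one-tailed p value
    d = diff.mean() / sd                        # Cohen's d (paired)
    z = stats.norm.isf(p)                       # z equivalent of the one-tailed p
    p_rep = stats.norm.cdf(z / np.sqrt(2))      # Killeen's P_rep approximation
    return t, p, d, p_rep
```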

Human adults also demonstrated a general main effect of LGB for image exploration (ANOVA, first gaze direction: F (1,189) = 88.04, P = 2.8E-17, Fig. 1c; viewing time: F (1,189) = 11.82, P = 7.27E-4, Fig. 1d). However, when individual image categories were taken into account, a more restricted pattern of LGB was revealed: the adults showed a clear LGB only towards faces, not objects. Although the left sides of both upright and inverted human and monkey faces were inspected first [t(18) = 2.35–5.71, P = 0.00001–0.02, P rep = 0.94–0.99], only the left side of upright human faces was inspected for a longer period [t(18) = 2.28, P = 0.02, P rep = 0.93; d = 0.93], suggesting that in adults the LGB is face-specific and also sensitive to face orientation and species. Overall, the infant data showed larger variance than the adult data. This is not uncommon in infant studies and reflects developmental variability and the lesser refinement of the process in this population compared with an identically sized adult sample.

Discussion

Our differing observations in human infants and adults suggest that the specific LGB towards faces is an acquired behaviour, possibly developed through the process of "perceptual narrowing". It has been proposed that the development of face perception is a modality-specific and experience-dependent process of gradual specialisation (de Haan et al. 2002; Grossmann and Johnson 2007). For instance, 6-month-old infants are equally good at recognising individual monkey and human faces, but 9-month-olds show a marked advantage in recognizing human faces (Pascalis et al. 2002), indicating a narrowing or specialising of perceptual ability in face perception. Similarly, young infants show a general, inherent LGB for all visual images, which later narrows into a more specific LGB for human faces only. Studies of perceptual asymmetries in face processing have previously shown that by the age of 5 years, children demonstrate a face-specific left perceptual bias (Failla et al. 2003) and that its magnitude increases from 6 to 10 years of age (Chiang et al. 2000).

The different patterns of gaze asymmetry when viewing different image categories in human adults also shed some light on the possible neural mechanisms underlying this LGB phenomenon. The LGB was most evident for upright faces, less evident for inverted faces and totally absent for object images, suggesting that the visuospatial attention bias towards the left visual field (Rhodes 1986; Nicholls and Roberts 2002; Niemeier et al. 2007) and our extensively practised left-to-right directional scanning bias (Gilbert and Bakan 1973; Vaid and Singh 1989; Heath et al. 2005) cannot fully account for the observed face-specific LGB. The well-documented human right-hemispheric advantage for face processing, on the other hand, offers a consistent explanation. Various brain imaging studies have revealed a strong right-hemispheric bias in processing upright faces, a delayed and reduced right-hemisphere response in processing inverted faces, and bilateral responses in processing object images (e.g. Rossion et al. 2003; Yovel and Kanwisher 2005; Bukach et al. 2006; Grossmann and Johnson 2007). Our observed systematic change of the LGB pattern across different image categories is consistent with these reported changes in the distribution of cortical responses, providing further support for cortical lateralisation in human face processing.

The ability to detect and recognize facial cues (e.g. facial expression, gaze direction) and to respond accordingly also plays a crucial role in the social communication of non-human primates and other social species (e.g. Emery 2000; Parr et al. 2000; Hare and Tomasello 2005), but the broader biological context of this phenomenon has been largely ignored until now. It has been suggested that functional brain lateralisation in perception and cognition is not a uniquely human attribute but exists in other social species, at least at the population level, and could have been shaped during evolution by the selection pressures of living in complex social environments and engaging in intensive social communication (e.g. Vallortigara and Rogers 2005). As in humans, a dominant role of the right hemisphere in social cognition, as well as in individual recognition mediated by visual cues, has been observed in other social species such as fish (Sovrano et al. 1999), domestic chicks (Vallortigara 1992; Vallortigara and Andrew 1991, 1994), quails (Zucca and Sovrano 2008), sheep (Kendrick 2006), monkeys (Hamilton and Vermeire 1988; Hauser 1993; Vermeire and Hamilton 1998) and chimpanzees (Morris and Hopkins 1993).

We hypothesise that if the LGB is mediated by a right hemisphere bias in face processing, and if it is of broader adaptive value to social species, then it may also occur among non-human species adapted to living in complex social environments. This possibility is examined in our second study, in which we investigated gaze asymmetries in rhesus monkeys (Macaca mulatta) and domestic dogs (Canis familiaris) while they explored various face and object images. Macaques were chosen because of their relatively close evolutionary relationship to humans and their naturally complex social environment, whereas dogs were chosen because they are more distantly related but, given their close social association with humans and their enculturation, might also benefit from such an LGB capacity if it is indeed associated with social cognition.

Experiment 2: LGB in rhesus monkeys and domestic dogs: phylogenetic study

To address the question of whether a face-specific LGB is restricted to humans or to primate species, or whether it is found more extensively among species living in complex social environments, we examined the responses of rhesus monkeys (Macaca mulatta) and domestic dogs (Canis familiaris). As rhesus monkeys rely heavily on facial cues for social communication (Rosenfeld and van Hoesen 1979; Mendelson et al. 1982; Parr et al. 2000) and possess oculomotor strategies and cortical mechanisms for face perception similar to those of humans (Rossion and Gauthier 2002; Guo et al. 2003; Guo 2007), we hypothesised that laboratory-raised monkeys might be good non-human candidates for demonstrating a LGB while viewing faces of conspecifics and humans. Domestic dogs, on the other hand, have been domesticated for at least 10,000 years and possibly much longer (Vilà et al. 1997). They show greater attachment (Topál et al. 2005) and attention bias (Miklósi et al. 2003; Virányi et al. 2008) towards people than their close relative, the wolf. Their sensitivity to human cues exceeds that of non-human primates in certain tasks, such as following human gaze directional cues, and it has been hypothesised that they may have evolved a special predisposition for communicating with humans (Hare et al. 2002; Miklósi et al. 2003; Hare and Tomasello 2005). Pet dogs are additionally enculturated into the human environment, so such biases may be further adapted in this subpopulation. Consequently, pet dogs were chosen as a non-primate social species that might benefit from any adaptive advantage associated with a LGB towards human faces, and possibly other dog faces, but not necessarily towards faces of other species or objects.

Method

Recording from rhesus monkeys

Three male adult rhesus monkeys (Macaca mulatta, 4.5–6.0 kg) were tested in this study at the Department of Psychology, University of Newcastle upon Tyne. Initially, the monkeys were trained to fixate on a FP on a computer screen for several seconds in a dimming fixation detection task (Guo and Benson 1998). For the purpose of eye movement recordings, a scleral eye coil and head restraint were then implanted under aseptic conditions. Throughout the period of the recordings, the animals' weight and general health were monitored daily. Ethical approval had been granted and all procedures complied with the "Principles of laboratory animal care" (NIH publication no. 86–23, revised 1985) and UK Home Office regulations.

Visual stimuli were generated using a VSG 2/3 W graphics system (Cambridge Research Systems) and displayed on a high-frequency non-interlaced gamma-corrected color monitor (110 Hz, Sony GDM-F500T9) with a resolution of 1024 × 768 pixels. At a viewing distance of 57 cm the monitor subtended a visual angle of 40 × 30°. The mean luminance of the uniform grey background was kept at 6.0 cd/m2.

Twenty monkey and 20 human face images with neutral facial expressions were used as stimuli (see image examples in Fig. 2). The gray scale digitized images were gamma-corrected and each displayed once in a random order at the center of the screen with a resolution of 512 × 512 pixels (20 × 20° visual angle).

Fig. 2

a and c, the probability of initial fixation directed at the left and right sides of the presented images for monkeys and dogs. b and d, the averaged proportion of viewing time within a trial spent on the left and right sides of the presented images for monkeys and dogs. Error bars indicate standard error of the mean (*P < 0.05, **P < 0.01)

During the experiments the monkeys were seated in a purpose-built primate chair with their head restrained, and they viewed the display binocularly. To calibrate the eye movement signals, a small red FP (0.2° diameter, 7.8 cd/m2 luminance) was displayed randomly at one of 25 positions (5 × 5 matrix) across the monitor. The distance between adjacent FP positions was 5°. The monkey was trained to follow the FP and maintain fixation for 1 s. After the calibration procedure, each trial started with a FP displayed at the center of the monitor, and the monkey's eye positions were monitored on-line by custom-made software. If the monkey maintained fixation for 1 s, the FP disappeared and a single face image was presented for 10 s. During the presentation, the three monkeys passively viewed monkey face images, and two of them also viewed human face images. No reinforcement was given during this procedure, nor were the animals trained on any other task with these stimuli, which could have potentially affected their behavior. It was considered that, given this lack of training and the absence of instrumental responding, their behavior would be as natural as possible.

Horizontal and vertical eye positions were measured using an 18-inch cubic scleral search coil assembly with 6 min arc sensitivity (CNC Engineering). Eye movement signals were amplified by a CNC system and sampled at 500 Hz through the analogue inputs of a CED 1401 plus digital interface (Cambridge Electronic Design). The data were then analysed off-line using software developed in Matlab. The software computed horizontal and vertical eye displacement signals as a function of time to determine eye velocity and position. Fixation locations were then extracted from the raw eye tracking data using velocity (less than 0.2° eye displacement at a velocity of less than 20°/s) and duration (greater than 50 ms) criteria (Guo et al. 2003, 2006).
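The fixation extraction step can be illustrated with the minimal Python sketch below. The original analysis used purpose-written Matlab software; the function name, run-detection logic and output format here are illustrative assumptions, and only the velocity, displacement and duration thresholds come from the description above.

```python
import numpy as np

def extract_fixations(x, y, fs=500.0, vel_thresh=20.0, disp_thresh=0.2, min_dur=0.05):
    """Extract fixations from horizontal (x) and vertical (y) eye position in degrees.

    Samples with instantaneous velocity below `vel_thresh` (deg/s) form candidate runs;
    a run is kept as a fixation if its total displacement stays below `disp_thresh` (deg)
    and its duration exceeds `min_dur` (s), mirroring the criteria described above.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    vx, vy = np.gradient(x) * fs, np.gradient(y) * fs
    slow = np.hypot(vx, vy) < vel_thresh

    fixations, start = [], None
    for i, is_slow in enumerate(np.append(slow, False)):   # sentinel closes a trailing run
        if is_slow and start is None:
            start = i
        elif not is_slow and start is not None:
            dur = (i - start) / fs
            disp = np.hypot(np.ptp(x[start:i]), np.ptp(y[start:i]))
            if dur >= min_dur and disp <= disp_thresh:
                fixations.append((start / fs, dur, x[start:i].mean(), y[start:i].mean()))
            start = None
    return fixations   # list of (onset_s, duration_s, mean_x_deg, mean_y_deg)
```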

Recording from pet dogs

Seventeen adult domestic pet dogs (Canis familiaris, 2–7 years old, 3 Labrador, 3 Border Collie, 3 Lurcher, 2 Jack Russell, 1 Border Terrier, 1 Leonberger, 1 Schnauzer, 1 Staffordshire Terrier, 1 Spanish Water Dog, 1 Golden Cocker Spaniel) were recruited from university staff and students for this experiment. University ethical approval had been granted and all procedures complied with ethical guidance of the International Society for Applied Ethology.

Visual stimuli were generated using customized presentation software and back-projected on the center of a projection screen. At a viewing distance of 41 cm the screen subtended a visual angle of 100 × 163°. Thirty face images with neutral facial expressions and 10 symmetrical inanimate object images were used as stimuli (see image examples in Fig. 2). The face images included five upright and five inverted human faces, five upright and five inverted monkey faces, and five upright and five inverted dog faces. The common object images were sampled from the daily environment and were categorically familiar to the dogs, as indicated by the owners. The gray scale digitized images were gamma-corrected and each displayed once in a random order at the center of the screen with a resolution of 600 × 600 pixels (87 × 87° visual angle). No two images of the same category were presented consecutively. To reduce the degree of left–right image asymmetry commonly associated with dog faces (i.e. facial color/hair patterns), we created left–mirror-left composite chimeric images for eight of the dog faces and eight of the object images used in this experiment. This manipulation is widely adopted in studies of the left perceptual bias and left gaze bias in face processing, and for human participants it has generated observations consistent with those obtained using natural face images across different laboratories (e.g. Mertens et al. 1993; Butler et al. 2005; Leonards and Scott-Samuel 2005).
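As a rough illustration of how a left–mirror-left composite can be produced (a minimal sketch only; the file paths and function name are hypothetical, and the original image preparation software is not specified beyond what is described above):

```python
import numpy as np
from PIL import Image

def left_mirror_left_chimera(src_path, dst_path):
    """Build a left-mirror-left composite: the left half of the gray scale image is
    kept and the right half is replaced by a horizontal mirror of that left half,
    producing an image that is symmetric about the vertical midline."""
    img = np.asarray(Image.open(src_path).convert('L'))      # 8-bit gray scale
    left_half = img[:, : img.shape[1] // 2]
    composite = np.hstack([left_half, left_half[:, ::-1]])   # mirror left half onto right
    Image.fromarray(composite).save(dst_path)
```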

The preferential-looking procedure was adapted to test the dogs' gaze preference (Meints et al. 1999). During the experiment the dog was seated in a quiet, dimly lit test room and binocularly viewed the display from 41 cm away. A researcher stood behind the dog and did not interfere with the dog or coerce it to watch the screen. The dog owner was instructed to keep quiet and stay outside the test room. A CCTV camera (SONY SSC-M388CE, resolution: 380 horizontal lines) placed in front of the dog was used to monitor and record the dog's eye and head movements. Each trial started with a flashing red FP (10° in diameter) presented in the centre of the screen at the dog's eye level. The dog's head and eye positions were monitored on-line by the researcher through CCTV. Once the dog's gaze was oriented towards the FP, a single image was presented for 5 s. During the presentation, the dog passively viewed the face and object images. No reinforcement was given during this procedure, nor were the dogs trained on any other task with these stimuli. A short break was provided after every ten trials if necessary. All of the dogs tested successfully completed at least 65% of the trials (mean 82 ± 2%).

The dog's eye and head movements were recorded and digitised with a sampling frequency of 60 Hz. The video record was replayed off-line frame by frame and analysed independently by two researchers, and the direction of the dog's gaze in each frame was classified as 'left', 'right' or 'central' looking. The researchers were blind to the test image for each trial when performing the off-line data analysis, and inter-judge reliability measures between the two researchers yielded a correlation of 0.98.

Results

The monkeys' gaze patterns were recorded very precisely with implanted scleral search coils, but the invasive nature of this recording methodology restricts the number of monkeys that can be ethically used in such studies (e.g. Guo et al. 2003, 2006). The analysis was therefore carried out after pooling the data from the three monkeys (i.e. t tests were performed over trials rather than subjects). As we did not intend to quantitatively compare the magnitude of the LGB across species in this study, this approach can help to establish qualitatively whether the face-related LGB exists in non-human primates.

No statistical difference was observed in the cumulative viewing time across the entire set of human and monkey faces of different orientations (Table 1). The monkeys on average spent 44–52% of the image presentation time viewing the different categories of face images (ANOVA, F (3,132) = 1.52, P = 0.21). ANOVA tests of the main effect of left–right bias across all images revealed a general LGB associated with face exploration (first gaze direction: F (1,261) = 8.47, P = 0.004, Fig. 2a; viewing time: F (1,261) = 12.51, P = 1.0E-6, Fig. 2b). Specifically, the left sides of upright monkey and human faces had a significantly higher probability of being the first saccade destination (77% and 65% for monkey and human faces; paired one-tailed t test, t(59) = 4.84 and 1.96, P = 4.81E-6 and 0.03, P rep = 0.99 and 0.91) than the right hemiface, and attracted more fixations [61 and 60% of total fixations per image for monkey and human faces; t(59) = 4.37 and 4.01, P = 2.52E-5 and 1.3E-4, P rep = 0.99 and 0.98]. Once the faces were inverted, although the image symmetry along the vertical axis was unchanged, the left and right hemifaces appeared to be equally salient [t(17) = 0.17–0.46, P = 0.33–0.43].

The highly sensitive technique used with laboratory monkeys was not appropriate for pet dogs, so the preferential looking paradigm was used to compare the gaze preferences of 17 owner-volunteered dogs while they viewed human, dog and monkey faces and symmetrical object images. On average, the dogs spent 43–47% of the 5 s image presentation time inspecting the different types of face and object images (Table 1), and no significant difference in viewing time was observed across these image categories (ANOVA, F (6,118) = 0.51, P = 0.80).

Analysis of gaze preference showed a significant main effect of a general LGB for image viewing (first gaze direction: F (1,237) = 20.59, P = 9.28E-6; viewing time: F (1,237) = 14.95, P = 1.45E-4). Paired one-tailed t tests further revealed that the left sides of both upright and inverted human faces had a higher probability of being the first region inspected by the dogs [65 and 67% for upright and inverted human faces; t(16) = 2.99 and 3.18, P = 0.004 and 0.003, P rep = 0.97 and 0.97; Fig. 2c]. There was no significant difference in the probability of first inspection between the two sides of dog faces, monkey faces or object images [t(16) = 0.27–1.12, P = 0.14–0.40]. An analysis of the averaged proportion of viewing time towards each side of the images within a trial showed that only the left side of upright human faces attracted significantly longer inspection [62% of total viewing time; t(16) = 2.67, P = 0.008, P rep = 0.95; Fig. 2d]. The dogs spent a similar amount of time looking at both sides of the images while viewing inverted human faces, both upright and inverted dog or monkey faces, and object images [t(16) = 0.52–1.69, P = 0.06–0.30].

Discussion

Our observations show that gaze asymmetry is not restricted to humans and could have broader adaptive significance. Laboratory-raised monkeys showed a LGB towards faces of conspecifics and humans, while pet dogs only demonstrated a LGB towards human faces, but not towards monkey or dog faces, nor towards object images. We suggest that these specific results are compatible with the animals' normal communicative strategies, given monkeys' reliance on social cues and dogs' unique evolutionary and ontogenetic history. All dogs in this study were well socialised to both people and other dogs. We therefore argue that the bias towards human faces alone cannot be explained simply in terms of a lack of exposure to conspecifics, but that it may have a more fundamental phylogenetic origin. The ability to extract information from human faces and respond appropriately could have had a selective advantage during the process of domestication, especially as the emotional content of these faces may be of immediate adaptive behavioural significance. Indeed, recent studies have shown that the owner's right hemiface (the left hemiface from the viewer's perspective) can express a range of emotional expressions more accurately and, more importantly, can specifically express the negative emotion of evoked anger more intensely (e.g. Indersmitten and Gur 2003). As the LGB directs the viewer's attention to this side of the face image, it could help the viewer detect and recognize biologically important information more quickly and precisely in faces of functional significance.

The maintenance of the bias by dogs towards inverted human faces may also be specifically important for this species. Dogs will frequently roll over and look up at human faces during initial social exchanges as an appeasement gesture, and the ability to read the human face in this context may be important for establishing whether appeasement has succeeded. If the LGB has its origins in a right hemisphere specialisation for face processing, it would be surprising if the behaviour were reversed when the dog views a face upside down, although this would allow preferential inspection of the right side of the viewee's face. Dogs may not show a bias towards monkey faces because of their unfamiliarity or irrelevance compared with human faces, although the differentiating criteria remain to be established. However, the failure to show a LGB towards dog faces might reflect a reduced dependence on facial processing in the initial assessment of conspecifics, with the greater facial asymmetry of this species and non-facial greeting cues, including olfactory cues and visual cues from body postures, perhaps being of greater significance.

General discussion

In this comparative study, by presenting object images and faces of different species in upright and inverted orientations, we systematically examined the face-related LGB, defined by a higher probability of first gaze and a higher proportion of viewing time directed at the left hemiface, in human infants, human adults, rhesus monkeys and domestic dogs. While human infants showed a more general bias towards the left side of a visual image (mostly for upright images), adults demonstrated a very specific LGB towards upright human faces only. Laboratory-reared monkeys showed a selective LGB towards upright human and monkey faces, while pet dogs only attended preferentially to the left side of human faces. Taken together, our results suggest that the face-specific LGB is not apparent in human infants but develops over time; they also show that the LGB is not a human-specific phenomenon but seems to have broader adaptive value to social species.

Interestingly, both human adults and dogs demonstrated a LGB towards a broader range of face types at the earliest stage of face viewing (human and monkey faces in both orientations for human observers; upright and inverted human faces for dogs; Figs. 1c, 2c), adding further support to the hypothesis that the initial gaze bias for faces is automatic and internally driven (Leonards and Scott-Samuel 2005), probably initiated by the gist configuration of the face stimuli. After initial analysis, the LGB could be refined towards more efficient processing of biologically relevant faces (in our case, upright human faces for both adult human participants and pet dogs; Figs. 1d, 2d), and would be less evident for irrelevant or inverted faces, as face inversion alters the global configuration of facial features and reduces the efficiency and accuracy of face processing (Valentine 1988; Rossion and Gauthier 2002). A recent correlational study in humans also suggests that only an overall leftward face scanning bias (i.e. the total number of left hemiface fixations within a trial), rather than the initial gaze bias, correlates with perceptual processing of facial cues (Butler et al. 2005).

In our study, we did not observe a consistent gaze bias towards object images in human adults and dogs, or towards inverted faces in monkeys. This observation rules out a general preferential attention towards the left visual field and the extensively practised left-to-right scanning bias in humans as sufficient explanations for the LGB phenomenon. A recent recording of human saccadic eye movements in face processing also demonstrated that the initial gaze bias is most evident while exploring upright faces, and is less evident or absent while exploring inverted faces and symmetrical non-face object or landscape images (Leonards and Scott-Samuel 2005). Taken together, the face- and orientation-sensitive LGB observed here is most likely due to a lateralised right hemisphere bias for face processing, as revealed by studies of brain imaging and of patients with focal brain lesions (e.g. Farah and Aguirre 1999; Bukach et al. 2006; Grossmann and Johnson 2007).

Unlike in humans and monkeys, relatively little is known about cerebral lateralization in dogs from neuroimaging approaches. However, some behavioural studies of paw preference (Tan 1987; Wells 2003; Quaranta et al. 2004; Branson and Rogers 2006; Poyser et al. 2006) and tail-wagging responses (Quaranta et al. 2007) have suggested a functional brain asymmetry in dogs, which may even correlate with their immune system function (Quaranta et al. 2006, 2008). The right hemisphere also has greater mass, and this appears to be independent of laterality in certain forms of motor behaviour (Tan and Çalişkan 1987), which would be consistent with laterality in certain perceptual processes. Our observation of a LGB towards human faces provides new evidence supporting this hypothesis, with brain lateralization apparent in face processing in dogs. Furthermore, this prominent gaze asymmetry could be a useful non-invasive tool in the wider study of some aspects of social cognition (e.g. facial signalling) in those species that exhibit the bias.

As the recognition of facial expression is a crucial part of social cognition, right hemisphere dominance in emotional processing, such as in detecting facial movements (e.g. lip smacking; Ferrari et al. 2003; Leslie et al. 2004) and in judging negative emotions (e.g. fear and anger; Asthana and Mandal 2001; Indersmitten and Gur 2003), may also steer the initiation of gaze asymmetry towards faces. Overall, it seems likely that the affective and semantic information contained in faces of adaptive behavioural significance for the species concerned is the most likely determinant of the face-related LGB. In other words, the LGB may not just be initiated by the gist configuration of the faces in an automatic fashion, but may be actively engaged in the processing of relevant facial cues. Consequently, the amplitude of the LGB could well be affected by the different types of facial information. This issue is currently under investigation by our group.