To which stimuli do humans respond

Receptors are specialised cells that detect a stimulus.

Their job is to convert the stimulus into electrical signals in nerve cells. Some receptors can detect several different stimuli, but they are usually specialised to detect one type of stimulus. A sense organ is a group of receptors gathered together with some other structures.

The other structures help the receptors to work more efficiently. An example of this is the eye.

Both AU5 and AU20 (i.e. …). Another example of muscular variance that might reflect the reported difference in the displayed facial actions relates to the human-dog frustration comparison.

Dogs lack a zygomaticus minor muscle, which produces AU14 in humans. This specific movement is one of the core action units of the human frustration facial expression. Given the lack of these specific muscles in dogs, it is possible that other muscles are activated to produce reliable fear or frustration cues in this species. This appears to be the case for fear, since AD (Action Descriptor) production had unique characteristics in this context.

However, we did not identify an equivalent in the case of frustration, as discussed earlier. Given the low number of specific facial actions produced in association with each emotion, we suggest that dogs do not display a composite facial expression in which several facial actions are integrated into a stereotypical display, as is observed in humans. Instead, dogs seem to produce isolated actions in response to specific emotionally-competent stimuli. Given well-known human perception biases, the results relating to our second hypothesis illustrate the danger of broad descriptions and over-reliance on holistic categorisation of facial behaviour, which can lead to biases in the perception of the facial expressions of a different species.

However, our study refutes this hypothesis, as the homologies between humans and dogs seem to be limited to the underlying musculature rather than to its activation. In other domestic animals, such as horses [72] and sheep [73], the basic facial musculature also appears to be well conserved [26,74], but the emotional displays appear to diverge from what is observed in humans.

It is worth noting that in both of these species (and arguably most domestic animals) the ears appear to be widely used as part of emotional signals, whereas in humans the ear muscles are largely vestigial.

Close analysis of aural signals may therefore be a fruitful line of enquiry in the future. In the case of the dog-human comparison, and unlike the chimpanzee-human comparison, facial expressions seem to be species-specific.

Phylogenetic inertia was likely involved in the development of the well-conserved facial musculature of dogs and humans [47], but there are clearly differences in the way this musculature is used to produce facial expressions associated with the intentional or unintentional communication of emotion.

Our finding that humans and dogs display distinctively different facial expressions has important theoretical and practical implications for human-dog emotion understanding and its underlying cognitive mechanisms. Given that facial movement production differed between humans and dogs in our study, it is unlikely that a mental simulation strategy could be useful or adopted by both species when interpreting heterospecific expressive facial signals.

Instead, an implicit social learning strategy would be a more meaningful way to establish functional human-dog emotion understanding. In our study, most emotional facial actions produced by dogs were in the mouth area, using the tongue or the opening of the mouth, and none were in the eye area; this is an important difference from humans, who produce eye-region AUs in many of the prototypical facial expressions. Previous studies strongly suggest that humans read dog communication as they would read other humans (an own-species bias [87]), and our results indicate this is potentially problematic when it comes to evaluating the emotional content of these signals.

This leads to another important issue: all studies to date on human perception of dogs lack bias-free empirical validation. It remains to be seen to what extent the current findings can be generalised across all dog breeds and other emotional contexts, given the variation in facial morphology, and possibly muscular functionality, between breeds. Nonetheless, we did establish that features such as variation in general head shape may be less important than changes in jowl structure, given the location of the key AUs used to express the emotions studied.

While the stimuli were selected for their fundamental emotional equivalence between the two species, the resulting expression might have been moderated by other factors associated with their processing, which were not controlled for.

Nevertheless, the critical feature is that they should largely share a final common emotional process, whose expression we set out to measure. Thus, future studies might aim to compare dogs of different breeds and morphologies, incorporate the multimodality of emotional signals, and test these hypotheses in different settings.

The videos used in this study were selected from online databases (www.…). Only one individual per video was coded.

Whenever more than one individual in the same video fulfilled all the necessary criteria (see section b), one individual was randomly selected and coded. The sample consisted of 100 individuals in total (50 humans and 50 family dogs), distributed equally between the emotional contexts (happiness, positive anticipation, fear, frustration and relaxation; Table 1). For detailed information on individuals, please see Supplementary material S1. The four categories of emotion were defined according to literature sources (Table 1).

The videos were chosen on the basis of stimulus quality. Only videos with minimal editing, high image quality (at least …p), good lighting, and good visibility of faces were selected. The putative emotion-eliciting stimulus had to be apparent and clearly identifiable for at least part of the video. To ensure that the emotionally-competent stimuli were present in the dog videos and were impacting the dogs' behaviour, the first author of this study (CC) selected, blinded, and randomised the video order.
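As an illustration of this kind of blinding and randomisation step (the paper does not describe its tooling; the file names and seed below are hypothetical), clips can be shuffled and assigned opaque codes so that the second rater never sees the original labels:

```python
import random

# Hypothetical clip list; the real videos and any seed are not reported.
videos = ["dog_fear_01.mp4", "dog_happy_02.mp4", "dog_frust_03.mp4"]

rng = random.Random(42)   # fixed seed makes the blinding reproducible
order = videos[:]
rng.shuffle(order)

# Map opaque codes to the original files; only the key holder can unblind.
blinding_key = {f"clip_{i:03d}": name for i, name in enumerate(order)}
for code in blinding_key:
    print(code)           # the blinded rater receives only these codes
```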

Another author (DM, a specialist in veterinary behavioural medicine) then relabelled all videos according to the emotion likely triggered by the stimulus. Only videos with full agreement were included in the study. Before starting the FACS coding, one or more neutral-face frames were selected for each individual. The number of facial actions was identified by watching the videos frame-by-frame, logged in BORIS, and extracted for analysis.
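To make the unit of analysis concrete: the dependent variable used later is a rate of facial movements. A minimal sketch of that extraction step, assuming a BORIS-style event export with hypothetical column names (this is not the authors' actual pipeline):

```python
import pandas as pd

# Hypothetical event log: one row per coded facial action occurrence.
events = pd.DataFrame({
    "video_id": ["dog01", "dog01", "dog01", "dog02"],
    "action":   ["AD137", "AU26", "AD137", "EAD104"],
    "time_s":   [2.4, 3.1, 8.9, 1.2],
})
durations_s = {"dog01": 30.0, "dog02": 25.0}  # clip lengths in seconds

# Count each action per video, then convert counts to events per minute.
counts = events.groupby(["video_id", "action"]).size().rename("n").reset_index()
counts["rate_per_min"] = counts.apply(
    lambda row: row["n"] / durations_s[row["video_id"]] * 60.0, axis=1
)
print(counts)
```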

Since morphological differences could potentially impact upon the production of facial actions in dogs, we used cephalic type (brachycephalic, mesaticephalic, dolichocephalic), jowl length (none, short, long), ear morphology (floppy, partially erect, erect), and breed (as per the British Kennel Club) as control variables. We only analysed the effects of these control variables on the facial actions that were significant in our main analysis.
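A sketch of how such control variables might be recorded per dog (the breed assignments here are illustrative examples, not data from the study):

```python
# Hypothetical control-variable records following the categories above.
controls = {
    "dog01": {"cephalic_type": "brachycephalic",
              "jowl_length": "long",
              "ear_morphology": "floppy",
              "breed": "Boxer"},
    "dog02": {"cephalic_type": "mesaticephalic",
              "jowl_length": "none",
              "ear_morphology": "erect",
              "breed": "German Shepherd Dog"},
}
for dog, attrs in controls.items():
    print(dog, attrs)
```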

As an additional control, we categorised all the videos for perceived level of arousal (five-point Likert scale: 1 = very low, 2 = low, 3 = moderate, 4 = high, 5 = very high) to ensure arousal was not a confounding variable for emotion. These control variables allowed us to increase the internal validity of our study, since they exclude alternative explanations for the causal relationship between the independent and dependent variables. Human videos were not coded for control or reliability purposes; instead, our results were compared with the respective facial movements reported in the established literature.

Thus, we included 30 human facial actions and 22 dog facial actions in the statistical analysis (Supplementary Table S2). Since all variables violated the assumptions of normality and homogeneity of variance in at least one of the groups, non-parametric tests were used throughout the analysis. To investigate whether dogs produced differential facial actions in response to specific emotional triggers, we compared the facial actions displayed between emotions, and between each emotion and the control (relaxation) videos, using Kruskal-Wallis tests (adjusted for ties) followed by post-hoc pairwise multiple comparison tests with Dunn-Bonferroni correction.
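As a minimal sketch of this comparison in Python (illustrative data, not study results; scipy's kruskal applies the tie correction, and the scikit-posthocs package supplies Dunn's test with Bonferroni adjustment):

```python
import pandas as pd
from scipy import stats
import scikit_posthocs as sp

# Hypothetical rates of one facial action, one row per video.
df = pd.DataFrame({
    "rate":    [0.0, 0.4, 1.2, 2.1, 1.8, 0.9, 0.1, 0.3, 0.0, 1.5],
    "emotion": ["relaxation", "relaxation", "fear", "fear",
                "frustration", "frustration", "happiness", "happiness",
                "anticipation", "anticipation"],
})

# Kruskal-Wallis across the emotion categories (tie-corrected).
groups = [g["rate"].to_numpy() for _, g in df.groupby("emotion")]
h_stat, p_value = stats.kruskal(*groups)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")

# Post-hoc pairwise Dunn tests with Bonferroni correction.
if p_value < 0.05:
    print(sp.posthoc_dunn(df, val_col="rate", group_col="emotion",
                          p_adjust="bonferroni"))
```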

For our second hypothesis, where we were interested in assessing whether dogs were using the same facial actions as humans in response to an equivalent emotional trigger, we performed Mann-Whitney tests (with exact p-values) on the homologous facial actions between species. For both hypotheses, the rate of facial movements was used as the dependent variable and the emotion category as the independent variable.
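A sketch of one such between-species comparison (the values are illustrative, not study data):

```python
from scipy import stats

# Hypothetical per-video rates of one homologous facial action in the
# same emotional context for each species.
human_rates = [0.0, 0.5, 1.1, 0.0, 0.8]
dog_rates   = [1.9, 2.3, 0.7, 1.5, 2.8]

# Exact two-sided Mann-Whitney U test (exact p-values suit small samples).
u_stat, p_value = stats.mannwhitneyu(human_rates, dog_rates,
                                     alternative="two-sided",
                                     method="exact")
print(f"U = {u_stat}, exact p = {p_value:.4f}")
```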

Additionally, we calculated effect sizes to consider and minimise the risk of a type II error in our interpretation of the results. To analyse the potential effect of the control variables (cephalic type, ear morphology, jowl length, breed, and arousal level) on the facial actions, we used Kruskal-Wallis tests (adjusted for ties) and post-hoc pairwise multiple comparison tests with Dunn-Bonferroni correction.
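The excerpt does not name the effect-size measure used; one common choice for Mann-Whitney comparisons is the rank-biserial correlation, which falls out of the U statistic directly:

```python
def rank_biserial(u: float, n1: int, n2: int) -> float:
    """Rank-biserial correlation for a Mann-Whitney U test (-1 to 1).

    A small |r| alongside a non-significant p-value supports the
    interpretation that an effect is genuinely negligible, guarding
    against type II errors.
    """
    return 1.0 - 2.0 * u / (n1 * n2)

# Continuing the illustrative example above: U = 3 with n1 = n2 = 5.
print(rank_biserial(3, 5, 5))  # 0.76, a large effect
```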

For ear morphology and jowl length, we only performed the analysis for the relevant facial actions. We thank the School of Psychology technicians for software support, and Daniel Pincheira Donoso for his comments and discussion of points in earlier versions of the manuscript. All authors provided critical revisions and approved the final version.


Abstract

The commonality of facial expressions of emotion has been studied in different species since Darwin, with most of the research focusing on closely related primate species.

Introduction

The common origin of emotions has long been a subject of scientific interest [1], with different emotional responses producing a diverse range of communicative elements, especially through the face.

Table 1. Emotion, brain system [44], definition of emotion, trigger stimuli and context analysed for both species.

Fear. Context (humans): during presentation of an animal or experience in a park ride. Trigger (dogs): experience of thunderstorms, visualisation of specific objects. Context (dogs): during thunderstorms or presentation of the objects.
Frustration. Trigger (humans): possibility of gaining a high monetary reward with its subsequent loss(es). Trigger (dogs): visualisation of a desired resource (toy, food, space) that is or becomes inaccessible. Context: after the first attempt to gain access to the resource and during its subsequent denials.
Positive anticipation. Definition: period from the signalling of a reward till the moment before receiving the reward [99]. Trigger (humans): visualisation of food, unwrapping a gift. Context (humans): from visual presentation of a food item till the moment before eating; from visual presentation of a gift till revelation of the item. Context (dogs): after trigger presentation till the moment before obtaining food or leaving home.
Relaxation. Definition: only observed in the absence of immediate fitness threats and highly dependent on all proximate needs being fulfilled.
Happiness. Trigger (humans): gain of a high monetary reward.


