Contextual and social cues may dominate natural visual search
Published online by Cambridge University Press: 24 May 2017
Abstract
A framework in which only the size of the functional visual field of fixations can vary can hardly explain natural visual-search behavior. In real-world search tasks, context guides eye movements, and task-irrelevant social stimuli may capture gaze.
Type: Open Peer Commentary
Copyright © Cambridge University Press 2017
Target article
The impending demise of the item in visual search
Related commentaries (30)
An appeal against the item's death sentence: Accounting for diagnostic data patterns with an item-based model of visual search
Analysing real-world visual search tasks helps explain what the functional visual field is, and what its neural mechanisms are
Chances and challenges for an active visual search perspective
Cognitive architecture enables comprehensive predictive models of visual search
Contextual and social cues may dominate natural visual search
Don't admit defeat: A new dawn for the item in visual search
Eye movements are an important part of the story, but not the whole story
Feature integration, attention, and fixations during visual search
Fixations are not all created equal: An objection to mindless visual search
Gaze-contingent manipulation of the FVF demonstrates the importance of fixation duration for explaining search behavior
How functional are functional viewing fields?
Item-based selection is in good shape in visual compound search: A view from electrophysiology
Looking further! The importance of embedding visual search in action
Mathematical fixation: Search viewed through a cognitive lens
Oh, the number of things you will process (in parallel)!
Parallel attentive processing and pre-attentive guidance
Scanning movements during haptic search: similarity with fixations during visual search
Searching for unity: Real-world versus item-based visual search in age-related eye disease
Set size slope still does not distinguish parallel from serial search
Task implementation and top-down control in continuous search
The FVF framework and target prevalence effects
The FVF might be influenced by object-based attention
The “item” as a window into how prior knowledge guides visual search
Those pernicious items
Until the demise of the functional field of view
What fixations reveal about oculomotor scanning behavior in visual search
Where the item still rules supreme: Time-based selection, enumeration, pre-attentive processing and the target template?
Why the item will remain the unit of attentional selection in visual search
“I am not dead yet!” – The Item responds to Hulleman & Olivers
“Target-absent” decisions in cancer nodule detection are more efficient than “target-present” decisions!
Author response
On the brink: The demise of the item in visual search moves closer