The social difficulties that are a hallmark of autism spectrum disorder (ASD) are thought to arise, at least in part, from atypical attention toward stimuli and their features. To investigate this hypothesis comprehensively, we characterized 700 complex natural scene images with a novel three-layered saliency model that incorporated pixel-level (e.g., contrast), object-level (e.g., shape), and semantic-level attributes (e.g., faces) on 5,551 annotated objects. Compared with matched controls, people with ASD had a stronger image center bias regardless of object distribution, reduced saliency for faces and for locations indicated by social gaze, and yet a general increase in pixel-level saliency at the expense of semantic-level saliency. These results were further corroborated by direct analysis of fixation characteristics and investigation of feature interactions. Our results quantify, for the first time, atypical visual attention in ASD across multiple levels and categories of objects.
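The three-layered model combines pixel-, object-, and semantic-level feature maps to predict where observers fixate. As a rough illustration only (not the authors' actual model, which learns per-feature weights from fixation data), a weighted linear combination of normalized layer maps could be sketched as follows; the function names and the uniform default weights are assumptions for this sketch:

```python
import numpy as np

def normalize(saliency_map):
    """Rescale a saliency map to [0, 1]; return zeros if the map is flat."""
    rng = saliency_map.max() - saliency_map.min()
    if rng == 0:
        return np.zeros_like(saliency_map, dtype=float)
    return (saliency_map - saliency_map.min()) / rng

def combine_saliency(pixel_map, object_map, semantic_map,
                     weights=(1.0, 1.0, 1.0)):
    """Illustrative weighted sum of the three feature layers.

    Each input is a 2-D array over image locations; the real model
    instead learns weights for many attributes within each layer.
    """
    layers = (pixel_map, object_map, semantic_map)
    combined = sum(w * normalize(m) for w, m in zip(weights, layers))
    return normalize(combined)
```

Down-weighting the semantic layer relative to the pixel layer in such a combination is one simple way to mimic the ASD pattern reported above (increased pixel-level saliency at the expense of semantic-level saliency).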
Paper
Shuo Wang*, Ming Jiang*, Xavier Morin Duchesne, Elizabeth A. Laugeson, Daniel P. Kennedy, Ralph Adolphs, and Qi Zhao, "Atypical Visual Saliency in Autism Spectrum Disorder Quantified through Model-based Eye Tracking," Neuron, Volume 88, Issue 3, Pages 604-616, November 2015.
[pdf] [bib]
(*Equal authorship)
Preview
Laurent Itti, "New Eye-Tracking Techniques May Revolutionize Mental Health Screening," Neuron, Volume 88, Issue 3, Pages 442-444, November 2015. [pdf]
Related Work
Juan Xu, Ming Jiang, Shuo Wang, Mohan Kankanhalli, and Qi Zhao, "Predicting Human Gaze Beyond Pixels," Journal of Vision, Volume 14, Issue 1, Article 28, Pages 1-20, January 2014. [pdf] [bib] [project]