Recurrent neural circuits for perceptual grouping

Bibliographic Details
Published in: Journal of Vision (Charlottesville, Va.), Vol. 22, No. 14, p. 3185
Main Author: Serre, Thomas
Format: Journal Article
Language: English
Published: Association for Research in Vision and Ophthalmology, 5 December 2022
ISSN: 1534-7362
Description
Summary: Neurons in the visual cortex are sensitive to context: Responses to stimuli presented within their classical receptive fields (CRFs) are modulated by stimuli in their surrounding extra-classical receptive fields (eCRFs). However, the circuits underlying these contextual effects are not well understood, and little is known about how these circuits drive perception during everyday vision. We tackle these questions by approximating circuit-level eCRF models with a differentiable discrete-time recurrent neural network that is trainable with gradient descent. After optimizing the model's synaptic connectivity and dynamics for object contour detection in natural images, the neural-circuit model rivals human observers on the task with far better sample efficiency than state-of-the-art computer vision approaches. Notably, the model also exhibits CRF and eCRF phenomena typically associated with primate vision. The model's ability to accurately detect object contours critically depends on these contextual effects, which are absent in ablated versions of the model. Finally, we derive testable predictions about the neural mechanisms responsible for contextual integration and illustrate their importance for accurate and efficient perceptual grouping.
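
To make the modeling approach concrete, the sketch below shows one way a differentiable, discrete-time recurrent circuit of this kind can be written in PyTorch: a feedforward CRF drive is modulated at every timestep by recurrent excitatory and inhibitory horizontal connections standing in for the eCRF surround. This is an illustrative reconstruction under stated assumptions, not the authors' published model; the layer sizes, kernel widths, activation choices, and the RecurrentECRFCell name are all hypothetical.

    # Minimal sketch (not the paper's actual architecture) of a differentiable,
    # discrete-time recurrent circuit: feedforward CRF drive modulated by
    # recurrent surround excitation and inhibition (eCRF). All sizes and time
    # constants below are illustrative assumptions.
    import torch
    import torch.nn as nn

    class RecurrentECRFCell(nn.Module):  # hypothetical name
        def __init__(self, channels=16, crf_size=5, ecrf_size=11, timesteps=8):
            super().__init__()
            self.timesteps = timesteps
            # Feedforward drive from the classical receptive field.
            self.crf = nn.Conv2d(1, channels, crf_size, padding=crf_size // 2)
            # Horizontal connections carrying surround excitation and inhibition.
            self.excite = nn.Conv2d(channels, channels, ecrf_size, padding=ecrf_size // 2)
            self.inhibit = nn.Conv2d(channels, channels, ecrf_size, padding=ecrf_size // 2)
            # Learnable integration rate for the discrete-time dynamics.
            self.alpha = nn.Parameter(torch.tensor(0.1))

        def forward(self, image):
            drive = self.crf(image)            # CRF response to the stimulus
            h = torch.zeros_like(drive)        # recurrent state
            for _ in range(self.timesteps):
                surround = torch.relu(self.excite(h)) - torch.relu(self.inhibit(h))
                # Euler-style update: the state relaxes toward the contextually
                # modulated drive, so gradients flow through the unrolled dynamics.
                h = h + self.alpha * (torch.relu(drive + surround) - h)
            return h

    model = RecurrentECRFCell()
    out = model(torch.randn(1, 1, 64, 64))     # toy 64x64 grayscale input
    print(out.shape)                           # torch.Size([1, 16, 64, 64])

Because the update is an ordinary differentiable computation unrolled over time, the circuit's "synaptic" connectivity and dynamics can be fit end to end with gradient descent on a contour-detection loss (readout head omitted here), which is the property the summary highlights.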
DOI: 10.1167/jov.22.14.3185