Beyond task-optimized neural models: constraints from embodied cognition
Kaushik Lakshminarasimhan, Columbia University, United States; Akis Stavropoulos, Dora Angelaki, New York University, United States
Session:
Contributed Talks 4 (Lecture)
Location:
Grand Ballroom A-C
Presentation Time:
Sun, 28 Aug, 12:25 - 12:45 Pacific Time (UTC -8)
Abstract:
Generic neural networks optimized for task performance are often remarkably successful at predicting neural activity in animals. However, neural mechanisms dictate not only task performance but also how a particular task is solved. Can we deduce mechanisms from cognitive strategies? To find out, we asked humans and monkeys to perform a challenging task in which they steered to a remembered goal location by integrating self-motion in a virtual environment lacking position cues. Although this task requires only mentally tracking one’s position relative to the goal, participants physically tracked this latent task variable with their gaze, an instance of embodied cognition. Restraining eye movements worsened task performance, suggesting that embodiment plays a computational role. These findings are well explained by a neural model with tuned bidirectional connections between oculomotor circuits and the circuits that integrate sensory input. In contrast to other task-optimized models, this model correctly predicted that the leading principal components of monkey posterior parietal cortex activity should encode the animal's position relative to the goal. These results explain the computational significance of motor signals in evidence-integrating circuits and suggest that plasticity between those circuits might enable efficient learning of complex tasks via embodied cognition.
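The abstract describes the model only conceptually, so the sketch below is a minimal, hypothetical illustration (not the authors' implementation) of the kind of architecture it points to: a leaky integrator that tracks position relative to the goal from self-motion velocity, coupled bidirectionally with a simple oculomotor stage whose gaze output follows that latent estimate, followed by PCA and a linear decode of the integrator's activity. All parameters, connection matrices (W_in, W_gaze, W_back), and coupling gains are illustrative assumptions.

```python
# Hypothetical sketch (assumptions, not the authors' code): an integrator
# accumulates self-motion to track position relative to a remembered goal,
# with bidirectional coupling to an oculomotor (gaze) stage. PCA then asks
# whether the leading components of integrator activity encode the latent
# goal-relative position, mirroring the analysis described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

n_units = 64            # integrator units (assumed size)
n_trials = 50
n_steps = 200
dt = 0.02               # simulation step (s)
leak = 0.1              # integrator leak rate
g_back = 0.05           # gaze -> integrator feedback gain (assumed)

# Random input/readout weights; in the abstract's model these are tuned connections.
W_in = rng.normal(0, 1.0 / np.sqrt(2), size=(n_units, 2))            # velocity -> integrator
W_gaze = rng.normal(0, 1.0 / np.sqrt(n_units), size=(2, n_units))    # integrator -> gaze
W_back = W_gaze.T                                                    # gaze -> integrator feedback

states, rel_positions = [], []
for _ in range(n_trials):
    goal = rng.uniform(-1, 1, size=2)     # remembered goal location
    pos = np.zeros(2)                     # true position (no visual position cues)
    x = W_in @ goal                       # initial state encodes goal relative to start
    gaze = W_gaze @ x
    for _ in range(n_steps):
        # Steering policy: head toward the goal with some motor noise.
        vel = 0.5 * (goal - pos) + 0.05 * rng.normal(size=2)
        pos = pos + vel * dt
        # Integrate self-motion to update the goal-relative estimate; the gaze
        # feedback partially offsets the leak (one assumed way embodiment could help).
        x = x + dt * (-leak * x - W_in @ vel + g_back * W_back @ gaze)
        gaze = W_gaze @ x                 # gaze tracks the latent estimate
        states.append(x.copy())
        rel_positions.append(goal - pos)  # latent task variable

X = np.array(states)
Y = np.array(rel_positions)

# PCA on integrator activity: do the leading components encode goal-relative position?
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:2].T                       # scores on the top two principal components

# Linear decode of relative position from the top two PCs (least squares).
coef, *_ = np.linalg.lstsq(pcs, Y, rcond=None)
r2 = 1 - ((Y - pcs @ coef) ** 2).sum() / ((Y - Y.mean(axis=0)) ** 2).sum()
print(f"R^2 of goal-relative position decoded from top 2 PCs: {r2:.2f}")
```

In this toy setup the gaze signal is just a linear readout fed back into the integrator; the abstract's claim is stronger, namely that such bidirectional coupling is tuned and computationally necessary, which the sketch does not attempt to demonstrate.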