Optimal encoding of prior information in noisy working memory systems
Hua-Dong Xiong, The University of Arizona, United States; Xue-Xin Wei, The University of Texas at Austin, United States
Posters 3
Pacific Ballroom H-O
Sat, 27 Aug, 19:30 - 21:30 Pacific Time (UTC -7)
The brain adapts to the statistical regularities of the environment to improve behavior. While many theories describe efficient coding in feedforward processing, little is known about how prior information is encoded through recurrent computation in neural systems, which is critical for cognition. Here we investigate this question in the context of working memory by optimizing recurrent neural networks (RNNs) to perform a working memory (WM) task with different noise levels and stimulus priors. We found that, with increasing neural noise, the attractor dynamics in RNNs transform from continuous to discrete. Moreover, to encode stimulus statistics, RNNs generally allocate more attractor states to more frequent stimuli, leading to increased encoding precision. The resulting neural representations exhibit systematic deviations from previous theories of efficient coding. Our results reveal novel mechanistic insights into how prior information is encoded through recurrent computations.
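The setup described above (a noisy rate RNN holding a stimulus in memory over a delay, with stimuli drawn from a non-uniform prior) can be sketched as follows. This is a toy illustration only: the network size, time constants, noise level, random weights, and the specific four-point stimulus prior are all assumptions for demonstration, not the authors' actual training setup (the study optimizes the weights, whereas here they are random).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and constants; the abstract does not specify these.
N, T, dt, tau = 64, 100, 0.02, 0.1
sigma = 0.1  # neural noise level, one of the levels varied in the study

W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))  # recurrent weights (random, untrained)
B = rng.normal(0, 1.0, (N, 2))               # input weights for (cos, sin) of the stimulus

def simulate(theta, sigma):
    """Run the noisy rate RNN: stimulus on briefly, then a memory delay."""
    r = np.zeros(N)
    u = np.array([np.cos(theta), np.sin(theta)])
    traj = []
    for t in range(T):
        inp = u if t < 10 else np.zeros(2)          # stimulus period, then delay
        noise = sigma * np.sqrt(dt) * rng.normal(size=N)
        r = r + (dt / tau) * (-r + np.tanh(W @ r + B @ inp)) + noise
        traj.append(r.copy())
    return np.array(traj)

# Non-uniform stimulus prior: some angles occur more frequently than others.
angles = np.array([0.0, np.pi / 2, np.pi, 3 * np.pi / 2])
prior = np.array([0.4, 0.1, 0.4, 0.1])
theta = rng.choice(angles, p=prior)
traj = simulate(theta, sigma)
print(traj.shape)
```

In the actual study, W and B would be optimized so that the stimulus can be decoded after the delay; analyzing the trained dynamics (e.g., where trajectories settle under different noise levels) is what reveals the continuous-to-discrete attractor transition.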