Relevance, uncertainty, and expectations affect categorization
Janaki Sheth, Jared Collina, Konrad Kording, Yale Cohen, Maria Geffen, University of Pennsylvania, United States
Session: Posters 1 (Poster)
Location: Pacific Ballroom H-O
Presentation Time: Thu, 25 Aug, 19:30 - 21:30 Pacific Time (UTC -8)
Abstract:
Auditory categorization, e.g., identifying a song in a noisy environment, is difficult. After all, only some of the sounds that we hear may be relevant and part of the song. Thus, we need to appropriately weight and integrate over the different sounds that we hear. At the same time, we must constantly account for our internal sensory noise. Lastly, early verses in a song generally predict later verses, making our ability to learn over time crucial to accurate decision-making. Despite this complexity underlying categorization, previous studies have usually examined the effects of relevance, sensory noise, and expectations in isolation. Here, we test how these factors combine to affect our decisions by formalizing multi-tone sound categorization as a Bayesian model and testing it with new behavioral experiments. We find that participants are sensitive to relevance and that the history of categories affects their expectations. However, there is substantial diversity among participants, both in their measure of relevance and in their expectations. Thus, our model yields participant-specific, tone-by-tone estimates of relevance, sensory noise, and expectations, providing variables with which to understand how the brain categorizes sounds.
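The abstract does not spell out the model, so the following is only a minimal illustrative sketch of a Bayesian observer of the general kind described: each tone may be relevant (drawn from the category distribution) or an irrelevant distractor, sensory noise corrupts every tone, and a history-dependent prior carries expectations across trials. It is not the authors' implementation, and all parameter names and values (MU_LOW, P_RELEVANT, the distractor distribution, the learning rate, etc.) are hypothetical assumptions.

# Illustrative sketch (assumed parameters, not the authors' model) of Bayesian
# multi-tone categorization with relevance, sensory noise, and expectations.
import numpy as np
from scipy.stats import norm

# Assumed generative parameters, in arbitrary log-frequency units.
MU_LOW, MU_HIGH = -1.0, 1.0   # means of the "low" and "high" categories
SIGMA_CAT = 0.5               # spread of relevant tones around the category mean
MU_DIS, SIGMA_DIS = 0.0, 3.0  # broad distribution of irrelevant (distractor) tones
SIGMA_SENS = 0.3              # internal sensory noise added to every tone
P_RELEVANT = 0.7              # assumed probability that any given tone is relevant

def tone_likelihood(y, mu_cat):
    """Likelihood of observed tones under a category, marginalizing over
    whether each tone was relevant (category-drawn) or a distractor."""
    relevant = P_RELEVANT * norm.pdf(y, mu_cat, np.sqrt(SIGMA_CAT**2 + SIGMA_SENS**2))
    distract = (1 - P_RELEVANT) * norm.pdf(y, MU_DIS, np.sqrt(SIGMA_DIS**2 + SIGMA_SENS**2))
    return relevant + distract

def categorize(tones, prior_high=0.5):
    """Posterior probability that a multi-tone trial belongs to the 'high' category."""
    log_high = np.log(prior_high) + np.sum(np.log(tone_likelihood(tones, MU_HIGH)))
    log_low = np.log(1 - prior_high) + np.sum(np.log(tone_likelihood(tones, MU_LOW)))
    return 1.0 / (1.0 + np.exp(log_low - log_high))

def update_prior(prior_high, observed_category, learning_rate=0.1):
    """History-dependent expectation: nudge the category prior toward the
    category experienced on the previous trial (exponential smoothing)."""
    target = 1.0 if observed_category == "high" else 0.0
    return (1 - learning_rate) * prior_high + learning_rate * target

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # One trial: mostly high-category tones plus one distractor, with sensory noise.
    tones = np.array([1.2, 0.8, 1.1, -2.5]) + rng.normal(0, SIGMA_SENS, 4)
    print("P(high) =", categorize(tones, prior_high=0.5))
    # Expectations shift after observing a 'high' trial.
    print("Updated prior:", update_prior(0.5, "high"))

In this sketch, relevance enters as a mixture weight in each tone's likelihood, sensory noise inflates the likelihood variance, and expectations are a trial-by-trial prior updated from category history; fitting such parameters per participant would yield the kind of tone-by-tone estimates the abstract describes.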