Revealing the Feature Dimensions Driving Similarity Judgements of Natural Scenes
Peter Brotherwood, Ian Charest, Université de Montréal, Canada; Andrey Barsky, Jasper Van Den Bosch, University of Birmingham, United Kingdom; Kendrick Kay, University of Minnesota, United States
Posters 1
Pacific Ballroom H-O
Thu, 25 Aug, 19:30 - 21:30 Pacific Time (UTC -8)
Revealing the feature dimensions used to distinguish and recognise natural scenes is key to understanding how we perceive, recognise, and adapt to changes in our surrounding environment. Previous work revealed the feature dimensions driving similarity judgements of a large stimulus set from the THINGS dataset, using a triplet odd-one-out similarity task. Here, we developed a similar approach to reveal the feature dimensions driving similarity judgements of individual participants for stimuli from the Natural Scenes Dataset (NSD). Using similarity judgements collected with a multiple-arrangements method, we train sparse, positive similarity embeddings to predict pairwise similarities over a set of 100 diverse natural scenes. The resulting embeddings reproduce each participant's similarity judgements with high accuracy: the average Spearman's correlation between embedding-predicted representational dissimilarity matrices (RDMs) and RDMs produced by traditional iterative weighted averaging of similarity judgements is 0.88. Inspection of the resulting embeddings reveals the key feature dimensions driving these similarity judgements.
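The evaluation described above can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline: the embedding here is random, the sparsity threshold and the dot-product similarity convention are assumptions, and the "reference" RDM is a synthetic stand-in for the RDM obtained by iterative weighted averaging of participants' judgements.

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
n_scenes, n_dims = 100, 8  # 100 natural scenes; embedding width is hypothetical

# Sparse, non-negative embedding: each row holds one scene's feature weights.
embedding = rng.random((n_scenes, n_dims))
embedding[embedding < 0.6] = 0.0  # zero out small weights to enforce sparsity

# One simple convention: similarity = dot product of scene embeddings,
# dissimilarity = max similarity minus similarity.
sim = embedding @ embedding.T
predicted_rdm = sim.max() - sim
np.fill_diagonal(predicted_rdm, 0.0)

# Synthetic reference RDM (stand-in for the iteratively averaged RDM),
# built as a noisy, symmetrised copy of the predicted RDM.
noise = 0.05 * np.abs(rng.standard_normal((n_scenes, n_scenes)))
reference_rdm = predicted_rdm + (noise + noise.T) / 2
np.fill_diagonal(reference_rdm, 0.0)

# Compare the RDMs' lower triangles with Spearman's rank correlation.
rho, _ = spearmanr(squareform(predicted_rdm, checks=False),
                   squareform(reference_rdm, checks=False))
print(f"Spearman's rho: {rho:.2f}")
```

In the actual study the embedding weights would be fitted to the behavioural judgements (rather than random), and the reference RDM would come from the multiple-arrangements data.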