Voxel-wise Encoding Models with Hierarchical Task-optimized Brain Atlas
Huzheng Yang, Shi Gu, University of Electronic Science and Technology of China, China; Yuanning Li, ShanghaiTech University, China
Posters 3
Pacific Ballroom H-O
Sat, 27 Aug, 19:30 - 21:30 Pacific Time (UTC -7)
For end-to-end trained voxel-wise encoding models, sharing parameters across a large number of voxels is common practice to reduce overfitting caused by single-voxel noise; it also exploits the underlying network interactions between voxels. However, voxels that do not share the same learning dynamics and convergence speed suffer under a global early-stopping criterion. To address this challenge, we propose a novel brain atlas named Hierarchical Task-optimized ROI: we first extract task-optimized voxel embeddings from the encoding model, then cluster voxels by their embeddings, since similar embeddings imply similar learning dynamics and convergence speeds. Applying this atlas to a pre-trained SwinTransformer model, we achieve new state-of-the-art results on the Algonauts 2021 Challenge full-track whole-brain prediction task, improving explainable variance explained from 0.3715 to 0.3857 without access to pre-defined anatomical ROI information, and to 0.3917 when ensembling with anatomical ROIs.
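The clustering step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding matrix is random stand-in data, and the choice of Ward-linkage agglomerative clustering and the number of ROIs are assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
n_voxels, emb_dim = 1000, 64
# Hypothetical stand-in for task-optimized voxel embeddings,
# e.g. the encoding model's per-voxel readout weights.
voxel_embeddings = rng.normal(size=(n_voxels, emb_dim))

# Hierarchically cluster voxels into ROIs; voxels in the same ROI
# are assumed to share learning dynamics, so each ROI could receive
# its own early-stopping decision during encoding-model training.
n_rois = 20
clusterer = AgglomerativeClustering(n_clusters=n_rois, linkage="ward")
roi_labels = clusterer.fit_predict(voxel_embeddings)

# Collect voxel indices per ROI.
rois = [np.flatnonzero(roi_labels == k) for k in range(n_rois)]
print(len(rois), sum(len(r) for r in rois))
```

Agglomerative clustering is a natural fit here because it yields a hierarchy of ROIs, matching the "Hierarchical" aspect of the proposed atlas; in practice the embeddings would come from the trained model rather than random data.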