Transfer learning in a 3D-CNN is beneficial for small sample sizes in HCP task data
Philipp Seidel, Jens V. Schwarzbach, Regensburg University, Germany
Session: Posters 3 (Poster)
Location: Pacific Ballroom H-O
Presentation Time: Sat, 27 Aug, 19:30 - 21:30 Pacific Time (UTC -8)
Abstract:
Deep neural networks (DNNs) have become a powerful tool for decoding brain activity from (functional) magnetic resonance imaging (fMRI) data. However, brain imaging data are often not available in the quantities necessary for good training and testing performance. We trained DNNs with t-statistical maps from the Human Connectome Project (HCP) in a leave-one-task-out approach and report the degree to which a particular DNN architecture profits from transfer learning (TL) as a function of task and sample size. Our results suggest that TL boosts prediction performance at small sample sizes for those HCP tasks that can also be learned successfully from scratch. This helps us better estimate the lower limit of the sample size necessary for acceptable classification performance when decoding brain states.
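The leave-one-task-out evaluation described above can be sketched as a simple cross-validation loop over the HCP task battery. This is an illustrative sketch, not the authors' code; the task names below are the standard HCP task labels, assumed here for illustration.

```python
def leave_one_task_out(tasks):
    """Yield (pretraining_tasks, held_out_task) splits.

    In a transfer-learning setup, a network would be pre-trained on
    all tasks except one, then fine-tuned and evaluated on the
    held-out task with a small sample.
    """
    for i, held_out in enumerate(tasks):
        pretrain = tasks[:i] + tasks[i + 1:]
        yield pretrain, held_out

# Standard HCP task battery (labels assumed for illustration)
HCP_TASKS = ["WM", "GAMBLING", "MOTOR", "LANGUAGE",
             "SOCIAL", "RELATIONAL", "EMOTION"]

for pretrain, held_out in leave_one_task_out(HCP_TASKS):
    print(f"pre-train on {len(pretrain)} tasks, transfer to {held_out}")
```

Each of the seven folds holds out one task, so every task serves once as the transfer target while the remaining six provide the pre-training data.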