Perceptography: using machine learning to peek into the subjective experience
Elia Shahbazi, Timothy Ma, Arash Afraz, National Institutes of Health (NIH), United States
Session: Posters 1 (Poster)
Location: Pacific Ballroom H-O
Presentation Time: Thu, 25 Aug, 19:30 - 21:30 Pacific Time (UTC -8)
Abstract:
Local stimulation in high-level cortical visual areas perturbs the contents of visual perception (Azadi et al., 2022). Anecdotal qualitative reports by human patients constitute the main body of evidence in this area of research (Penfield, 1960; Parvizi et al., 2012). Nonhuman primate studies can quantitatively characterize the perceptual events induced by stimulation, but in the absence of verbal descriptions these characterizations are very low dimensional (Afraz, 2006). Here we introduce a novel methodology, perceptography, that allows taking pictures of complex visual events induced by optogenetic stimulation of the macaque inferior temporal (IT) cortex. We trained the animals to detect and report cortical stimulation impulses while fixating on a seed image. In a closed-loop paradigm, our feature-extraction deep network used the monkeys' behavioral responses to guide a Generative Adversarial Network (GAN) to evolve and optimize image perturbations that the animal would falsely report as brain stimulation. This paradigm produced altered images that induced a 55-85% false alarm rate, dramatically higher than the baseline of 3-7%. We named these images perceptograms, since the perceptual state of viewing them is difficult for the animals to discriminate from the state induced by cortical stimulation.
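The closed-loop idea described in the abstract can be sketched in simplified form. The code below is an illustrative toy, not the authors' pipeline: `generate_image`, `false_alarm_prob`, the latent dimensionality, and the evolutionary search are all assumptions standing in for the real GAN generator and the monkey's behavioral reports, which here are simulated by a distance-based probability.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT_DIM = 16  # toy latent size; the real GAN latent space is much larger

def generate_image(latent):
    """Toy stand-in for the GAN generator: maps a latent code to image features."""
    return np.tanh(latent)

# Hypothetical "stimulation-like" percept in feature space. In the actual
# experiment this is unknown and is discovered only through the animal's reports.
target = generate_image(rng.normal(size=LATENT_DIM))

def false_alarm_prob(image):
    """Simulated behavioral readout: images closer to the stimulation-like
    percept yield a higher probability of a false 'stimulation' report."""
    dist = np.linalg.norm(image - target)
    return 1.0 / (1.0 + dist)  # in (0, 1]

def evolve_perceptogram(generations=200, pop_size=32, sigma=0.3, decay=0.99):
    """Simple (1+lambda)-style evolutionary search guided by the simulated
    behavioral response, loosely analogous to the closed-loop paradigm."""
    best = rng.normal(size=LATENT_DIM)
    best_fa = false_alarm_prob(generate_image(best))
    for _ in range(generations):
        # Propose perturbed candidates around the current best image.
        candidates = best + sigma * rng.normal(size=(pop_size, LATENT_DIM))
        scores = [false_alarm_prob(generate_image(c)) for c in candidates]
        i = int(np.argmax(scores))
        if scores[i] > best_fa:
            best, best_fa = candidates[i], scores[i]
        sigma *= decay  # shrink the search radius over time
    return best, best_fa

latent, fa = evolve_perceptogram()
print(f"final simulated false-alarm rate: {fa:.2f}")
```

In the real paradigm the fitness signal comes from the animal's trial-by-trial detection reports rather than a known target, so the search must tolerate a noisy, low-bandwidth objective; the sketch above replaces that with a deterministic proxy purely for illustration.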