Conversion of ConvNets to Spiking Neural Networks With Less Than One Spike per Neuron
Javier López-Randulfe, Nico Reeb, Alois Knoll, Technical University of Munich, Germany
Session: Posters 3 (Poster)
Location: Pacific Ballroom H-O
Presentation Time: Sat, 27 Aug, 19:30 - 21:30 Pacific Time (UTC -8)
Abstract:
Spiking neural networks can leverage the high efficiency of temporal coding by converting architectures previously trained with the backpropagation algorithm. In this work, we present the application of a time-coded neuron model for converting classic artificial neural networks that reduces the computational complexity of the synaptic connections. By adapting the ReLU activation function, the network achieved a sparsity of 0.142 spikes per neuron. The classification of handwritten digits from the MNIST dataset shows that the neuron model can convert convolutional neural networks with several hidden layers.
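
As a rough illustration of how a ReLU activation can be mapped to at most one spike per neuron, the sketch below implements a simple time-to-first-spike encoding in Python/NumPy. The abstract does not specify the authors' actual neuron model or conversion scheme, so the threshold theta, the coding window t_max, the ttfs_encode helper, and the latency mapping are all assumptions made here for illustration only.

```python
import numpy as np

# Hypothetical sketch (not the authors' method): time-to-first-spike encoding
# of ReLU activations. Larger activations fire earlier; activations below a
# threshold emit no spike at all, which is what keeps the average spike count
# per neuron below one.

T_MAX = 1.0   # assumed length of the coding window (arbitrary time units)
THETA = 0.1   # assumed firing threshold on the ReLU output

def relu(x):
    return np.maximum(x, 0.0)

def ttfs_encode(activations, t_max=T_MAX, theta=THETA):
    """Map ReLU activations to first-spike times (np.inf means no spike)."""
    a = relu(activations)
    times = np.full(a.shape, np.inf)          # no spike by default
    active = a > theta
    a_max = a.max() if a.max() > 0 else 1.0   # avoid division by zero
    times[active] = t_max * (1.0 - a[active] / a_max)
    return times

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layer_output = rng.normal(0.0, 0.2, size=1000)   # toy pre-activations
    spike_times = ttfs_encode(layer_output)
    spikes_per_neuron = np.isfinite(spike_times).mean()
    print(f"spikes per neuron: {spikes_per_neuron:.3f}")
```

In this toy example the spike count depends only on the chosen threshold and the input distribution, so it does not reproduce the 0.142 spikes per neuron reported in the paper; it only shows why thresholding the adapted ReLU yields fewer than one spike per neuron.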