Abstract

Spiking neural networks (SNNs) are nature's versatile solution to fault-tolerant, energy-efficient signal processing. To translate these benefits into hardware, a growing number of neuromorphic spiking NN processors have attempted to emulate biological NNs. These developments have created an imminent need for methods and tools that enable such systems to solve real-world signal processing problems. Like conventional NNs, SNNs can be trained on real, domain-specific data; however, their training requires overcoming a number of challenges linked to their binary and dynamical nature. This article elucidates step-by-step the problems typically encountered when training SNNs and guides the reader through the key concepts of synaptic plasticity and data-driven learning in the spiking setting. Accordingly, it gives an overview of existing approaches and provides an introduction to surrogate gradient (SG) methods specifically, as a particularly flexible and efficient way to overcome these challenges.

Keywords

Spiking neural network, Computer science, Neuromorphic engineering, Artificial intelligence, Artificial neural network, Machine learning, Signal processing, Digital signal processing, Computer hardware

Publication Info

Year
2019
Type
article
Volume
36
Issue
6
Pages
51-63
Citations
1181
Access
Closed

Cite This

Emre Neftci, Hesham Mostafa, Friedemann Zenke (2019). Surrogate Gradient Learning in Spiking Neural Networks: Bringing the Power of Gradient-Based Optimization to Spiking Neural Networks. IEEE Signal Processing Magazine, 36(6), 51-63. https://doi.org/10.1109/msp.2019.2931595

Identifiers

DOI
10.1109/msp.2019.2931595