A supervised learning algorithm for SNNs is proposed, using spike sequences that carry complex spatio-temporal information. We explore an error back-propagation method for SNNs based on gradient descent. The chain rule shows mathematically that it suffices to update the SNN's synaptic weights directly with an optimizer. Utilizing the TensorFlow…
tensorflow
optimizer
supervised-learning
gradient-descent
spiking-neural-network
sar-image-classification
spike-sequence
synaptic-weight
Updated Jan 15, 2022 - Python
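The description's core idea, applying the chain rule with a surrogate in place of the non-differentiable spike function so an optimizer can update synaptic weights directly, can be sketched in a few lines. This is an illustrative NumPy toy (all names, constants, and the rectangular surrogate are assumptions, not taken from the repository), not the repository's TensorFlow implementation:

```python
import numpy as np

# Toy spiking layer: one neuron, four input samples (values are illustrative).
x = np.array([[0.2], [0.6], [1.0], [1.4]])  # input features, shape (4, 1)
w = np.array([1.0])                         # synaptic weight to be learned
theta = 1.0                                 # firing threshold
target_rate = 0.5                           # desired mean firing rate
lr = 1.0                                    # learning rate

def forward(w):
    v = x @ w                               # membrane potential per sample
    spikes = (v > theta).astype(float)      # non-differentiable spike step
    return v, spikes

for _ in range(20):
    v, spikes = forward(w)
    err = spikes.mean() - target_rate       # dL/ds for a rate-matching loss
    # Chain rule: dL/dw = dL/ds * ds/dv * dv/dw.  The step function's
    # ds/dv is zero almost everywhere, so substitute a rectangular
    # surrogate gradient that is 1 within 0.5 of the threshold.
    surrogate = (np.abs(v - theta) < 0.5).astype(float)
    grad = err * (surrogate[:, None] * x).mean(axis=0)
    w -= lr * grad                          # direct optimizer-style update

_, spikes = forward(w)
print(spikes.mean())                        # mean firing rate after training
```

In a TensorFlow version, the same chain-rule substitution would typically be done with a custom gradient for the spike function, after which a standard optimizer can apply the weight updates unchanged.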