Bibliography

[Bellec2018]

Bellec, G., Salaj, D., Subramoney, A., Legenstein, R., & Maass, W. (2018). Long short-term memory and learning-to-learn in networks of spiking neurons. Advances in Neural Information Processing Systems, 31, 787–797.

[Bellec2020]

Bellec, G., Scherr, F., Subramoney, A., Hajek, E., Salaj, D., Legenstein, R., & Maass, W. (2020). A solution to the learning dilemma for recurrent networks of spiking neurons. Nature Communications, 11(1), 3625. https://doi.org/10.1038/s41467-020-17236-y

[Diehl2015]

Diehl, P. U., Neil, D., Binas, J., Cook, M., Liu, S.-C., & Pfeiffer, M. (2015). Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. 2015 International Joint Conference on Neural Networks (IJCNN), 1–8. https://doi.org/10.1109/IJCNN.2015.7280696

[Kingma2014]

Kingma, D. P., & Ba, J. L. (2015). Adam: A method for stochastic optimization. 3rd International Conference on Learning Representations (ICLR 2015), 1–15.

[Kriener2021]

Kriener, L., Göltz, J., & Petrovici, M. A. (2021). The Yin-Yang dataset (arXiv:2102.08211). arXiv. https://arxiv.org/abs/2102.08211

[Nowotny2024]

Nowotny, T., Turner, J. P., & Knight, J. C. (2022). Loss shaping enhances exact gradient learning with EventProp in Spiking Neural Networks (arXiv:2212.01232). arXiv. https://arxiv.org/abs/2212.01232

[Stockl2021]

Stöckl, C., & Maass, W. (2021). Optimized spiking neurons can classify images with high accuracy through temporal coding with two spikes. Nature Machine Intelligence, 3(3), 230–238. https://doi.org/10.1038/s42256-021-00311-4

[Wunderlich2021]

Wunderlich, T. C., & Pehle, C. (2021). Event-based backpropagation can compute exact gradients for spiking neural networks. Scientific Reports, 11(1), 12829. https://doi.org/10.1038/s41598-021-91786-z