CS-QCFS: Bridging the performance gap in ultra-low latency spiking neural networks

Neural Netw. 2025 Jan 1:184:107076. doi: 10.1016/j.neunet.2024.107076. Online ahead of print.

Abstract

Spiking Neural Networks (SNNs) are at the forefront of computational neuroscience, emulating the nuanced dynamics of biological systems. Among SNN training methods, conversion from ANNs to SNNs has generated significant interest due to its potential for creating energy-efficient and biologically plausible models. However, existing conversion methods often require long time-steps to ensure that the converted SNNs achieve performance comparable to the original ANNs. In this paper, we thoroughly investigate the ANN-SNN conversion process and identify two critical issues: the frequently overlooked heterogeneity across channels and the emergence of negative thresholds, both of which lead to the problem of long time-steps. To address these issues, we introduce a novel activation function, the Channel-wise Softplus Quantization Clip-Floor-Shift (CS-QCFS) activation function. This function effectively handles the disparities between channels and maintains positive thresholds. This innovation enables us to achieve high-performance SNNs, particularly at ultra-low time-steps. Our experimental results demonstrate that the proposed method achieves state-of-the-art performance on CIFAR datasets. For instance, we achieve a top-1 accuracy of 95.86% on CIFAR-10 and 74.83% on CIFAR-100 with only 1 time-step.

Keywords: ANN-SNN conversion; Spiking neural networks; Ultra-low latency.
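To make the idea concrete, below is a minimal NumPy sketch of a channel-wise QCFS-style activation in the spirit of the abstract: each channel gets its own threshold (addressing cross-channel heterogeneity), and a softplus reparameterization keeps every threshold strictly positive. This is an illustrative reconstruction based on the standard QCFS form, y = (lambda/L) * clip(floor(x*L/lambda + 1/2), 0, L), not the paper's exact implementation; the function name `cs_qcfs`, the parameter `raw_lam`, and the quantization level `L=4` are assumptions for the example.

```python
import numpy as np

def softplus(x):
    """Numerically simple softplus: log(1 + exp(x)) > 0 for all x."""
    return np.log1p(np.exp(x))

def cs_qcfs(x, raw_lam, L=4):
    """Hypothetical channel-wise QCFS-style activation.

    x        : input of shape (N, C, H, W)
    raw_lam  : unconstrained per-channel parameters, shape (C,)
    L        : number of quantization levels (proxy for SNN time-steps)
    """
    # Softplus guarantees a strictly positive threshold for every channel,
    # mirroring the abstract's goal of avoiding negative thresholds.
    lam = softplus(raw_lam).reshape(1, -1, 1, 1) + 1e-6
    # Clip-floor-shift quantization: floor(x*L/lam + 0.5), clipped to [0, L].
    y = np.clip(np.floor(x * L / lam + 0.5), 0, L)
    # Rescale back so the output lies in [0, lam] per channel.
    return y * lam / L
```

In an ANN-to-SNN pipeline, an activation of this shape would replace ReLU during ANN training so that, after conversion, each output value corresponds to an integer spike count in at most `L` time-steps.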