Initialization Methods Lower Energy Needs of Spiking Neural Networks for Land Cover Classification
Magda Zajaczkowska (Loughborough University)
Spiking Neural Networks (SNNs) have received considerable attention as an energy-efficient alternative to Artificial Neural Networks (ANNs), and they are particularly attractive for machine learning applications in space. Many learning algorithms have been developed to build energy-efficient SNNs and applied to real-world problems. However, the impact of initialization choices on the energy efficiency of SNNs has not been investigated thoroughly. In this paper, we study how the initial values of neuronal time constants and weights, as well as the reset mechanism for the membrane potential of neurons, affect the energy requirements of trained SNNs. For this purpose, we trained several SNNs with different initialization choices on the land cover classification problem using the EuroSAT dataset. We observed that lower initial weights and higher initial neuronal time constants yield SNNs that generate fewer spikes without any loss in classification performance. The choice of reset mechanism did not have a significant impact on the number of spikes generated by trained SNNs.
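To make the quantities studied above concrete, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron and counts its spikes. It is an illustrative toy, not the paper's training setup: the function name `simulate_lif` and all parameter values are assumptions. It exposes the three initialization choices discussed in the abstract: the membrane time constant `tau`, the input weight `w`, and the reset mechanism (soft "subtract" reset versus hard "zero" reset).

```python
import math

def simulate_lif(inputs, tau=20.0, w=0.5, v_th=1.0, reset="subtract", dt=1.0):
    """Simulate one LIF neuron over a list of input currents; return spike count.

    tau   : membrane time constant controlling how fast the potential leaks
    w     : weight scaling applied to each input
    v_th  : firing threshold
    reset : "subtract" (soft reset: v -= v_th) or "zero" (hard reset: v = 0)
    """
    v = 0.0
    spikes = 0
    decay = math.exp(-dt / tau)  # per-step leak factor derived from tau
    for x in inputs:
        v = v * decay + w * x    # leak the membrane, then integrate the input
        if v >= v_th:            # threshold crossing emits a spike
            spikes += 1
            v = v - v_th if reset == "subtract" else 0.0
    return spikes

# Smaller weights produce fewer threshold crossings for the same drive,
# mirroring the direction of the effect reported in the abstract.
constant_drive = [2.0] * 10
print(simulate_lif(constant_drive, w=1.0))  # → 10
print(simulate_lif(constant_drive, w=0.1))  # → 1
```

In a full network, spike counts like this one are summed over all neurons and timesteps to estimate energy cost, since each spike corresponds to a discrete synaptic event on neuromorphic hardware.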