FNS allows efficient event-driven spiking neural network simulations based on a neuron model supporting spike latency.
ABSTRACT: Neural modelling tools are increasingly employed to describe, explain, and predict the human brain's behavior. Among them, spiking neural networks (SNNs) enable the simulation of neural activity at the level of single neurons, but their use is often limited by the processing power and memory they require. Emerging applications with tight energy budgets (e.g. implanted neuroprostheses) motivate the search for strategies that capture the relevant principles of neuronal dynamics in reduced, efficient models. The recent Leaky Integrate-and-Fire with Latency (LIFL) spiking neuron model combines realistic neuronal features with computational efficiency, a combination that makes it appealing for SNN-based brain modelling. In this paper we introduce FNS, the first LIFL-based SNN framework, which combines spiking/synaptic modelling with the event-driven approach, allowing the definition of heterogeneous neuron groups and multi-scale connectivity, with delayed connections and plastic synapses. FNS supports precise, multi-threaded simulations, integrating a novel parallelization strategy and a mechanism of periodic dumping. We evaluate the performance of FNS in terms of simulation time and memory usage, and compare it with that of neuronal models with a similar neurocomputational profile implemented in NEST, showing that FNS performs better on both measures. FNS can be advantageously used to explore the interaction within and between populations of spiking neurons, even over long time scales and on limited hardware.
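The abstract refers to event-driven simulation of a leaky integrate-and-fire neuron model with spike latency (LIFL). Purely as an illustrative sketch of that general idea, and not of the FNS code base or its actual API, the following Python fragment shows a minimal event-driven loop in which neuron states are updated lazily at event times and a state-dependent latency separates threshold crossing from spike emission; all names, constants, and the latency rule are assumptions chosen for readability.

```python
# Illustrative sketch only: event-driven LIF with a state-dependent spike
# latency, in the spirit of the LIFL model described in the abstract.
# Not the FNS implementation; all details below are assumptions.
import heapq
import math

class Neuron:
    def __init__(self, tau=20.0, v_th=1.0):
        self.tau = tau          # membrane time constant (ms)
        self.v_th = v_th        # firing threshold
        self.v = 0.0            # membrane potential
        self.t_last = 0.0       # time of last update (lazy, event-driven state)
        self.targets = []       # list of (target_index, weight, axonal_delay)

    def decay_to(self, t):
        """Advance the neuron's state only when an event touches it."""
        self.v *= math.exp(-(t - self.t_last) / self.tau)
        self.t_last = t

    def receive(self, t, w):
        """Apply a synaptic event; return a spike latency if threshold is crossed."""
        self.decay_to(t)
        self.v += w
        if self.v >= self.v_th:
            # Hypothetical latency rule: stronger supra-threshold drive fires sooner.
            latency = 1.0 / (1.0 + (self.v - self.v_th))
            self.v = 0.0        # reset after scheduling the spike
            return latency
        return None

def simulate(neurons, initial_events, t_end=100.0):
    """Process (time, neuron_index, weight) events from a priority queue."""
    queue = list(initial_events)
    heapq.heapify(queue)
    spikes = []
    while queue:
        t, i, w = heapq.heappop(queue)
        if t > t_end:
            break
        latency = neurons[i].receive(t, w)
        if latency is not None:
            t_spike = t + latency
            spikes.append((t_spike, i))
            for j, w_ij, delay in neurons[i].targets:
                heapq.heappush(queue, (t_spike + delay, j, w_ij))
    return spikes

# Usage: two neurons, the first driving the second through a delayed connection.
if __name__ == "__main__":
    n = [Neuron(), Neuron()]
    n[0].targets = [(1, 1.2, 2.0)]
    print(simulate(n, [(0.0, 0, 1.5)]))
```

Because the queue only ever holds spike and input events, computation is spent exclusively where activity occurs, which is the efficiency argument the abstract makes for the event-driven approach.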
SUBMITTER: Susi G
PROVIDER: S-EPMC8190312 | biostudies-literature
REPOSITORIES: biostudies-literature