Abstract
We evolve both the topology and the synaptic weights of very small recurrent spiking neural networks in the presence of noise on the membrane potential, at a level similar to that observed in biological neurons. The task of the networks is to recognise three signals in a particular order (the pattern ABC) in a continuous input stream in which each signal occurs with equal probability. The networks consist of adaptive exponential integrate-and-fire neurons and are limited to either three or four interneurons and one output neuron, with recurrent and self-connections allowed only for interneurons. Our results show that spiking neural networks evolved in the presence of noise are robust to changes in neuronal parameters. We propose a procedure to approximate, separately for each neuronal parameter, the range from which the parameters can be sampled while preserving, at least for some networks, a high true positive rate and a low false discovery rate. After assigning states of the neurons to states of the network that correspond to states of a finite state transducer, we show that this simple but non-trivial computational task of temporal pattern recognition can be accomplished in a variety of ways.
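The abstract does not give the evolved network or parameter values, so the sketch below is only a rough illustration of the neuron model and the kind of membrane-potential noise involved: a single adaptive exponential integrate-and-fire (AdEx) neuron integrated with forward Euler, with additive Gaussian noise on the membrane potential. The parameter values are the standard Brette-Gerstner defaults, and the noise amplitude `noise_sigma` and input current are assumptions made for this example, not values from the paper.

```python
import numpy as np

def simulate_adex(I_ext, dt=0.1, noise_sigma=1.0, seed=0):
    """Forward-Euler simulation of an AdEx neuron with Gaussian noise
    added to the membrane potential at every step (illustrative only)."""
    rng = np.random.default_rng(seed)
    # Standard AdEx parameters (Brette & Gerstner 2005); units: pF, nS, mV, ms, pA
    C, g_L, E_L = 281.0, 30.0, -70.6
    V_T, Delta_T = -50.4, 2.0
    a, tau_w, b = 4.0, 144.0, 80.5
    V_reset, V_peak = -70.6, 20.0

    V, w = E_L, 0.0
    trace, spikes = [], []
    for step, I in enumerate(I_ext):
        # Membrane and adaptation dynamics
        dV = (-g_L * (V - E_L)
              + g_L * Delta_T * np.exp((V - V_T) / Delta_T)
              - w + I) / C
        dw = (a * (V - E_L) - w) / tau_w
        # Euler update plus additive membrane noise (Euler-Maruyama scaling)
        V += dt * dV + noise_sigma * np.sqrt(dt) * rng.standard_normal()
        w += dt * dw
        if V >= V_peak:  # spike: reset membrane, increment adaptation current
            spikes.append(step * dt)
            V, w = V_reset, w + b
        trace.append(V)
    return np.array(trace), spikes

# Example: constant 800 pA input for 200 ms
trace, spikes = simulate_adex(np.full(2000, 800.0))
print(f"{len(spikes)} spikes, first at {spikes[0]:.1f} ms" if spikes else "no spikes")
```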
| Original language | Undefined/Unknown |
| --- | --- |
| Title of host publication | Artificial Neural Networks and Machine Learning -- ICANN 2018 |
| Editors | Věra Kurková, Yannis Manolopoulos, Barbara Hammer, Lazaros Iliadis, Ilias Maglogiannis |
| Place of publication | Cham |
| Publisher | Springer Nature |
| Pages | 322-331 |
| Number of pages | 10 |
| ISBN (Print) | 978-3-030-01418-6 |
| Publication status | Published - 2018 |