University of Hertfordshire

The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns. / Yaqoob, Muhammad; Steuber, Volker; Wróbel, Borys.

Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings. ed. / Igor V. Tetko; Pavel Karpov; Fabian Theis; Vera Kurková. Springer Verlag, 2019. p. 758-771 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11727 LNCS).

Harvard

Yaqoob, M, Steuber, V & Wróbel, B 2019, The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns. in IV Tetko, P Karpov, F Theis & V Kurková (eds), Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11727 LNCS, Springer Verlag, pp. 758-771, 28th International Conference on Artificial Neural Networks, ICANN 2019, Munich, Germany, 17/09/19. https://doi.org/10.1007/978-3-030-30487-4_59

APA

Yaqoob, M., Steuber, V., & Wróbel, B. (2019). The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns. In I. V. Tetko, P. Karpov, F. Theis, & V. Kurková (Eds.), Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings (pp. 758-771). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11727 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-30487-4_59

Vancouver

Yaqoob M, Steuber V, Wróbel B. The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns. In Tetko IV, Karpov P, Theis F, Kurková V, editors, Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings. Springer Verlag. 2019. p. 758-771. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-030-30487-4_59

Author

Yaqoob, Muhammad ; Steuber, Volker ; Wróbel, Borys. / The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns. Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings. editor / Igor V. Tetko ; Pavel Karpov ; Fabian Theis ; Vera Kurková. Springer Verlag, 2019. pp. 758-771 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).

Bibtex

@inproceedings{f6b8021634da4e9e9ed11b4fb3ca5723,
title = "The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns",
abstract = "Biological and artificial spiking neural networks process information by changing their states in response to the temporal patterns of input and of the activity of the network itself. Here we analyse very small networks, evolved to recognize three signals in a specific pattern (ABC) in a continuous temporal stream of signals (..CABCACB..). This task can be accomplished by networks with just four neurons (three interneurons and one output). We show that evolving the networks in the presence of noise and variation of the intervals of silence between signals biases the solutions towards networks that can maintain their states (a form of memory), while the majority of networks evolved without variable intervals between signals cannot do so. We demonstrate that in most networks, the evolutionary process leads to the presence of superfluous connections that can be pruned without affecting the ability of the networks to perform the task and, if the unpruned network can maintain memory, so does the pruned network. We then analyse how these small networks can perform their tasks, using a paradigm of finite state transducers. This analysis shows that self-excitatory loops (autapses) in these networks are crucial for both the recognition of the pattern and for memory maintenance.",
keywords = "Artificial evolution, Complex networks, Ex-loops, Finite state transducer, Genetic algorithm, Minimal cognition, Self-loops, Spiking neural networks, Temporal pattern recognition",
author = "Muhammad Yaqoob and Volker Steuber and Borys Wr{\'o}bel",
year = "2019",
month = sep,
day = "9",
doi = "10.1007/978-3-030-30487-4_59",
language = "English",
isbn = "9783030304867",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
volume = "11727 LNCS",
publisher = "Springer Verlag",
pages = "758--771",
editor = "Tetko, {Igor V.} and Pavel Karpov and Fabian Theis and Vera Kurkov{\'a}",
booktitle = "Artificial Neural Networks and Machine Learning – ICANN 2019",
address = "Germany",
note = "28th International Conference on Artificial Neural Networks, ICANN 2019 ; Conference date: 17-09-2019 through 19-09-2019",
}
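The BibTeX record above is machine-readable. As a rough illustration (deliberately not using a dedicated BibTeX library), the quoted `key = "value"` fields can be pulled out with a few lines of Python; note that this simple pattern ignores brace-delimited and unquoted values (such as `month = sep`), which full BibTeX allows:

```python
import re

# Abridged copy of the entry above, for demonstration only.
bibtex = r'''
@inproceedings{f6b8021634da4e9e9ed11b4fb3ca5723,
title = "The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns",
year = "2019",
doi = "10.1007/978-3-030-30487-4_59",
pages = "758--771",
}
'''

# Collect every `key = "value"` pair into a dict.
fields = dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', bibtex))

print(fields["doi"])    # 10.1007/978-3-030-30487-4_59
print(fields["pages"])  # 758--771
```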

RIS

TY - GEN

T1 - The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns

AU - Yaqoob, Muhammad

AU - Steuber, Volker

AU - Wróbel, Borys

PY - 2019/9/9

Y1 - 2019/9/9

N2 - Biological and artificial spiking neural networks process information by changing their states in response to the temporal patterns of input and of the activity of the network itself. Here we analyse very small networks, evolved to recognize three signals in a specific pattern (ABC) in a continuous temporal stream of signals (..CABCACB..). This task can be accomplished by networks with just four neurons (three interneurons and one output). We show that evolving the networks in the presence of noise and variation of the intervals of silence between signals biases the solutions towards networks that can maintain their states (a form of memory), while the majority of networks evolved without variable intervals between signals cannot do so. We demonstrate that in most networks, the evolutionary process leads to the presence of superfluous connections that can be pruned without affecting the ability of the networks to perform the task and, if the unpruned network can maintain memory, so does the pruned network. We then analyse how these small networks can perform their tasks, using a paradigm of finite state transducers. This analysis shows that self-excitatory loops (autapses) in these networks are crucial for both the recognition of the pattern and for memory maintenance.

AB - Biological and artificial spiking neural networks process information by changing their states in response to the temporal patterns of input and of the activity of the network itself. Here we analyse very small networks, evolved to recognize three signals in a specific pattern (ABC) in a continuous temporal stream of signals (..CABCACB..). This task can be accomplished by networks with just four neurons (three interneurons and one output). We show that evolving the networks in the presence of noise and variation of the intervals of silence between signals biases the solutions towards networks that can maintain their states (a form of memory), while the majority of networks evolved without variable intervals between signals cannot do so. We demonstrate that in most networks, the evolutionary process leads to the presence of superfluous connections that can be pruned without affecting the ability of the networks to perform the task and, if the unpruned network can maintain memory, so does the pruned network. We then analyse how these small networks can perform their tasks, using a paradigm of finite state transducers. This analysis shows that self-excitatory loops (autapses) in these networks are crucial for both the recognition of the pattern and for memory maintenance.

KW - Artificial evolution

KW - Complex networks

KW - Ex-loops

KW - Finite state transducer

KW - Genetic algorithm

KW - Minimal cognition

KW - Self-loops

KW - Spiking neural networks

KW - Temporal pattern recognition

UR - http://www.scopus.com/inward/record.url?scp=85072863158&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-30487-4_59

DO - 10.1007/978-3-030-30487-4_59

M3 - Conference contribution

AN - SCOPUS:85072863158

SN - 9783030304867

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 758

EP - 771

BT - Artificial Neural Networks and Machine Learning – ICANN 2019

A2 - Tetko, Igor V.

A2 - Karpov, Pavel

A2 - Theis, Fabian

A2 - Kurková, Vera

PB - Springer Verlag

T2 - 28th International Conference on Artificial Neural Networks, ICANN 2019

Y2 - 17 September 2019 through 19 September 2019

ER -
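The RIS record is line-oriented: each line is a two-character tag, the delimiter `  - ` (two spaces, hyphen, space), and a value, with repeatable tags such as `AU` and `KW`. A minimal reader, sketched here under the assumption that the export uses the standard two-space delimiter, might look like:

```python
from collections import defaultdict

# Abridged copy of the record above, for demonstration only.
ris = """\
TY  - GEN
T1  - The Importance of Self-excitation in Spiking Neural Networks Evolved to Recognize Temporal Patterns
AU  - Yaqoob, Muhammad
AU  - Steuber, Volker
AU  - Wróbel, Borys
DO  - 10.1007/978-3-030-30487-4_59
SP  - 758
EP  - 771
ER  -
"""

record = defaultdict(list)
for line in ris.splitlines():
    # Accept only well-formed "TAG  - value" lines; accumulate repeatable tags.
    if len(line) >= 6 and line[2:6] == "  - ":
        record[line[:2]].append(line[6:])

print(record["AU"])     # ['Yaqoob, Muhammad', 'Steuber, Volker', 'Wróbel, Borys']
print(record["DO"][0])  # 10.1007/978-3-030-30487-4_59
```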