A Triple-Memristor Hopfield Neural Network With Space Multi-Structure Attractors and Space Initial-Offset Behaviors

Hairong Lin, Chunhua Wang, Fei Yu, Qinghui Hong, Cong Xu, Yichuang Sun

Research output: Contribution to journal › Article › peer-review


Abstract

Memristors have recently shown great promise for constructing memristive neural networks with complex dynamics. This paper proposes a Hopfield neural network with three memristive coupling synaptic weights. The resulting triple-memristor Hopfield neural network (TM-HNN) exhibits complex dynamical behaviors that have not been observed in previous Hopfield-type neural networks, including space multi-structure chaotic attractors and space initial-offset coexisting behaviors. Bifurcation diagrams, Lyapunov exponents, phase portraits, Poincaré maps, and basins of attraction are used to reveal and characterize these dynamics. Theoretical analysis and numerical simulation show that the number of space multi-structure attractors can be adjusted by changing the control parameters of the memristors, and that the position of the space coexisting attractors can be shifted by switching the initial states of the memristors. These unique dynamical behaviors give rise to extreme multistability, making the TM-HNN well suited to chaos-based applications. Moreover, a digital hardware platform is developed, on which the space multi-structure attractors and the space coexisting attractors are experimentally demonstrated. Finally, we design a pseudo-random number generator to explore a potential application of the proposed TM-HNN.
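The abstract does not give the TM-HNN's equations, so the following is only a minimal, illustrative sketch of the kind of study it describes: a three-neuron Hopfield-type network whose synaptic couplings are modulated by flux-controlled memristors with a quadratic memductance, integrated numerically, with a toy bit-extraction step standing in for a chaos-based pseudo-random number generator. The weight matrix, memristor model, parameter values, and quantization rule are all assumptions, not the authors' model.

```python
# Illustrative sketch only (NOT the paper's exact model): a three-neuron
# Hopfield-type network with three flux-controlled memristive couplings,
# memductance W(phi) = a - b*phi**2, integrated by fixed-step RK4.
import numpy as np

a, b, k = 1.0, 0.1, 1.5          # assumed memristor parameters and coupling gain

# Assumed constant part of the synaptic weight matrix (hypothetical values).
W0 = np.array([[ 0.0, -1.4,  1.2],
               [ 1.1,  0.0, -0.8],
               [-1.5,  0.9,  0.0]])

def memductance(phi):
    """Quadratic flux-controlled memductance, a common textbook choice."""
    return a - b * phi**2

def rhs(state):
    """Right-hand side of the coupled neuron/flux ODEs (illustrative form)."""
    x, phi = state[:3], state[3:]
    act = np.tanh(x)
    # Each memristor modulates one extra coupling term on its neuron.
    dx = -x + W0 @ act + k * memductance(phi) * act
    dphi = act                    # memristor fluxes driven by neuron outputs
    return np.concatenate([dx, dphi])

def simulate(x0, phi0, dt=1e-3, steps=200_000):
    """Fixed-step RK4 integration; returns the full trajectory."""
    s = np.concatenate([x0, phi0]).astype(float)
    traj = np.empty((steps, 6))
    for n in range(steps):
        k1 = rhs(s)
        k2 = rhs(s + 0.5 * dt * k1)
        k3 = rhs(s + 0.5 * dt * k2)
        k4 = rhs(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[n] = s
    return traj

# Switching the memristor initial fluxes phi0 is how one would probe
# initial-offset (coexisting) behavior in a model of this kind.
traj = simulate(x0=np.array([0.1, 0.0, 0.0]), phi0=np.array([0.0, 1.0, -1.0]))

# Toy chaos-based PRNG: quantize one state variable after a transient.
samples = traj[50_000::10, 0]
bits = (np.floor(np.abs(samples) * 1e6) % 2).astype(np.uint8)
print(bits[:32])
```

A practical chaos-based generator such as the one reported in the paper would additionally involve hardware-oriented discretization, post-processing, and statistical testing (e.g., the NIST suite); the snippet above only illustrates the simulation-and-quantization idea.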
Original language: English
Number of pages: 10
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Early online date: 20 Jun 2023
DOIs
Publication status: E-pub ahead of print - 20 Jun 2023
