High performance associative memory models and symmetric connections

N. Davey, R.G. Adams, Stephen Hunt

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Two existing high capacity training rules for the standard Hopfield architecture associative memory are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (only point attractors) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint in learning. The mean size of attractor basins of trained patterns and the mean time for learning convergence are analysed for the networks that arise from these learning rules, in both their asymmetric and symmetric instantiations. It is concluded that a symmetry constraint does not have any adverse effect on performance, and that it offers benefits in learning time and in network dynamics.
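The contrast between the two instantiations can be illustrated with a minimal sketch. The paper's exact training rules are not reproduced here; the code below shows a generic perceptron-style rule for a Hopfield-type memory, in which a unit's weights are nudged whenever its local field fails to support the stored pattern with a margin. The function name, the margin `T`, and the learning rate `eta` are illustrative choices, not taken from the paper. The `symmetric` flag mirrors every update onto the transposed entry, so the weight matrix stays symmetric throughout training.

```python
import numpy as np

def train_perceptron_hopfield(patterns, symmetric=False, T=1.0, eta=0.1,
                              max_epochs=1000):
    """Perceptron-style training of a Hopfield-type associative memory.

    patterns: (P, N) array with entries in {-1, +1}.
    If `symmetric` is True, each row update is mirrored onto the
    corresponding column, so W remains symmetric.  This is a sketch of
    the general technique, not the paper's specific rules.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(max_epochs):
        stable = True
        for x in patterns:
            for i in range(N):
                h_i = W[i] @ x                  # local field at unit i
                if x[i] * h_i < T:              # margin not yet reached
                    stable = False
                    dw = (eta / N) * x[i] * x   # perceptron-style correction
                    dw[i] = 0.0                 # keep zero self-connection
                    W[i] += dw
                    if symmetric:
                        W[:, i] += dw           # mirrored update keeps W = W.T
        if stable:                              # every pattern stored with margin
            return W
    return W
```

At low loading (far fewer patterns than units) both variants converge, and each trained pattern is a fixed point of the retrieval dynamics, i.e. `sign(W @ x) == x` elementwise. The asymmetric variant updates each row independently; the symmetric variant couples rows and columns, which is exactly the constraint whose cost and benefit the paper analyses.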
Original language: English
Title of host publication: Proceedings of the International ICSC Congress on Intelligent Systems and Applications (ISA 2000): Symposium on Computational Intelligence (CI 2000)
Place of publication: Wollongong, Australia
Publication status: Published - 2000


