University of Hertfordshire

High performance associative memory models and symmetric connections

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Documents

  • 900907 — Accepted author manuscript, 49 KB, PDF document

Original language: English
Title of host publication: Proceedings of the International ICSC Congress on Intelligent Systems and Applications (ISA 2000): Symposium on Computational Intelligence (CI 2000)
Place of publication: Wollongong, Australia
Pages: 326-331
Volume: 2
Publication status: Published - 2000

Abstract

Two existing high-capacity training rules for the standard Hopfield-architecture associative memory are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (only point attractors) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint during learning. The mean size of the attractor basins of trained patterns and the mean time to learning convergence are analysed for the networks that arise from these learning rules, in both their asymmetric and symmetric instantiations. It is concluded that a symmetry constraint has no adverse effect on performance, and that it offers benefits in learning time and in network dynamics.
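As a rough illustration of the idea discussed in the abstract, the sketch below trains a Hopfield-style network with a perceptron learning rule and an optional symmetry constraint: whenever a row of the weight matrix is updated, the matching column receives the mirrored update so the matrix stays symmetric. This is a minimal sketch under assumed conventions (±1 patterns, zero diagonal, a simple epoch loop), not the exact procedure used in the paper.

```python
import numpy as np

def train_perceptron_hopfield(patterns, epochs=100, lr=1.0, symmetric=True):
    """Perceptron-style training for a Hopfield-type associative memory.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    If symmetric=True, every row update is mirrored onto the matching
    column, so the weight matrix W remains symmetric throughout.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for xi in patterns:
            h = W @ xi  # local fields for this pattern
            for i in range(N):
                # perceptron condition: local field must align with target bit
                if xi[i] * h[i] <= 0.0:
                    stable = False
                    dw = lr * xi[i] * xi
                    dw[i] = 0.0          # keep zero self-connection
                    W[i] += dw           # standard (asymmetric) update
                    if symmetric:
                        W[:, i] += dw    # mirrored update keeps W symmetric
                        W[i, i] = 0.0
        if stable:  # all trained patterns are fixed points
            break
    return W
```

With the mirrored update, each correction also strengthens the reciprocal connection, so `W == W.T` holds at every step and the trained patterns, once learning converges, are point attractors of the usual asynchronous dynamics.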

ID: 422428