Abstract
Two existing high-capacity training rules for the standard Hopfield architecture associative memory are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (only point attractors) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint in learning. The mean size of attractor basins of trained patterns and the mean time for learning convergence are analysed for the networks that arise from these learning rules, in both the asymmetric and symmetric instantiations. It is concluded that a symmetry constraint does not have any adverse effect on performance, but that it does offer benefits in learning time and in network dynamics.
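The abstract does not reproduce the two training rules themselves. As a minimal sketch of the general idea, the following assumes a generic perceptron-style stability update for Hopfield weights: whenever a unit's local field disagrees with its target pattern, its incoming weights receive a Hebbian-style correction. The `symmetric` flag illustrates one way a symmetry constraint can be imposed during learning, by mirroring every row update onto the corresponding column. All names and parameters here are illustrative, not taken from the paper.

```python
import numpy as np

def train_hopfield_perceptron(patterns, symmetric=False, max_epochs=1000, margin=0.0):
    """Perceptron-style training of a Hopfield weight matrix (illustrative sketch).

    patterns: array of shape (P, N) with entries in {-1, +1}.
    If `symmetric` is True, each update to row i is mirrored onto column i,
    so the weight matrix stays symmetric throughout training.
    """
    P, N = patterns.shape
    w = np.zeros((N, N))
    for _ in range(max_epochs):
        converged = True
        for xi in patterns:
            h = w @ xi                       # local fields at every unit
            unstable = xi * h <= margin      # units not (sufficiently) stable
            for i in np.flatnonzero(unstable):
                delta = xi[i] * xi / N       # Hebbian-style correction for row i
                delta[i] = 0.0               # keep zero self-coupling
                w[i, :] += delta             # plain perceptron update: row only
                if symmetric:
                    w[:, i] += delta         # mirror update preserves symmetry
                converged = False
        if converged:                        # every pattern stable: done
            break
    return w
```

Without the mirror update, only the row of the unstable unit changes, which is what makes the resulting matrix asymmetric; with it, w[i, j] and w[j, i] always receive identical increments, so the point-attractor guarantee for symmetric networks is retained.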
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the International ICSC Congress on Intelligent Systems and Applications (ISA 2000): Symposium on Computational Intelligence (CI 2000) |
| Place of publication | Wollongong, Australia |
| Pages | 326-331 |
| Volume | 2 |
| Publication status | Published - 2000 |