Abstract
The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield-architecture associative memory model, trained using a non-Hebbian learning rule, are examined. This paper reports experimental investigations into the effect of dilution on factors such as pattern stability and attractor performance. It is concluded that these networks maintain a reasonable level of performance at fairly high dilution rates.
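To make the idea of symmetric dilution concrete, the sketch below removes a random fraction of the connections of a Hopfield network while keeping the weight matrix symmetric (a weight and its mirror image are deleted together), then checks how many stored patterns remain stable. This is an illustrative reconstruction, not the paper's method: it uses the standard Hebbian outer-product rule rather than the (unspecified) non-Hebbian rule studied in the paper, and random removal is only one plausible dilution scheme; all function names and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hebbian(patterns):
    """Outer-product (Hebbian) weights for a Hopfield network.

    Illustration only -- the paper trains with a non-Hebbian rule,
    which is not reproduced here.
    """
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)          # no self-connections
    return w

def dilute_symmetric(w, dilution):
    """Zero a random fraction of weights, keeping w symmetric:
    w[i, j] and w[j, i] are removed together."""
    n = w.shape[0]
    keep = np.triu(rng.random((n, n)) >= dilution, k=1)
    keep = keep | keep.T              # mirror the upper triangle
    return w * keep

def recall(w, state, max_sweeps=100):
    """Asynchronous updates until the state stops changing."""
    state = state.copy()
    for _ in range(max_sweeps):
        prev = state.copy()
        for i in rng.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
        if np.array_equal(state, prev):
            break
    return state

# Example: store random +/-1 patterns, dilute 50% of the weights,
# and count how many stored patterns are still fixed points.
patterns = rng.choice([-1, 1], size=(5, 100))
w = dilute_symmetric(train_hebbian(patterns), dilution=0.5)
stable = sum(np.array_equal(recall(w, p), p) for p in patterns)
print(f"{stable} of {len(patterns)} patterns stable after dilution")
```

Varying the `dilution` parameter and starting `recall` from noisy versions of the stored patterns gives a rough feel for the two quantities the paper measures: pattern stability and the size of the basins of attraction.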
Original language | English |
---|---|
Title of host publication | Applications and Science in Soft Computing - Advances in Intelligent and Soft Computing, Vol. 24 |
Editors | A. Lotfi, J.M. Garibaldi |
Publisher | Springer |
Pages | 23-30 |
ISBN (Print) | 978-3-540-40856-7 |
Publication status | Published - 2004 |
Keywords
- Hopfield Networks
- Basins of Attraction