The consequences of two techniques for symmetrically diluting the weights of the standard Hopfield associative memory model, trained using a non-Hebbian learning rule, are examined. This paper reports experimental investigations into the effect of dilution on pattern stability and attractor performance. It is concluded that these networks maintain a reasonable level of performance at fairly high dilution rates.
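To make the idea of symmetric dilution concrete, the following is a minimal sketch: a fraction of the off-diagonal weights of a Hopfield network is zeroed in symmetric pairs (if w_ij is cut, w_ji is cut too). For illustration the weights here are built with the standard Hebbian rule; the paper itself trains with a non-Hebbian rule, and the dilution rate, pattern count, and network size below are arbitrary choices, not the paper's experimental settings.

```python
import numpy as np

def symmetric_dilution(W, rate, rng):
    """Zero a fraction `rate` of the off-diagonal weight pairs,
    symmetrically: whenever W[i, j] is cut, W[j, i] is cut too."""
    n = W.shape[0]
    iu = np.triu_indices(n, k=1)                 # upper-triangle pairs (i < j)
    n_pairs = len(iu[0])
    cut = rng.choice(n_pairs, size=int(rate * n_pairs), replace=False)
    mask = np.ones(n_pairs)
    mask[cut] = 0.0                              # pairs selected for removal
    M = np.zeros((n, n))
    M[iu] = mask
    M = M + M.T                                  # symmetric mask, zero diagonal
    return W * M

def recall(W, state, steps=10):
    """Synchronous Hopfield update: s <- sign(W s)."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1, -1)
    return state

# Hebbian weights for illustration only (the paper uses a non-Hebbian rule)
rng = np.random.default_rng(0)
n, p = 50, 3
patterns = rng.choice([-1, 1], size=(p, n))
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

# Dilute half of the weight pairs, then test whether a stored pattern
# is still a fixed point of the diluted network
Wd = symmetric_dilution(W, rate=0.5, rng=rng)
stable = np.array_equal(recall(Wd, patterns[0].copy()), patterns[0])
```

Checking `np.allclose(Wd, Wd.T)` confirms the dilution preserved symmetry, which is what keeps the diluted network's energy function well defined.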
Title of host publication: Applications and Science in Soft Computing (Advances in Intelligent and Soft Computing, Vol. 24)
Editors: A. Lotfi, J.M. Garibaldi
Publication status: Published - 2004
- Hopfield Networks
- Basins of Attraction