Abstract
The consequences of diluting the weights of the standard Hopfield associative memory architecture, trained using perceptron-like learning rules, are examined. A proportion of the network's weights is removed; this can be done either symmetrically or asymmetrically, and both methods are investigated. This paper reports experimental investigations into the consequences of dilution for capacity, training times, and the size of basins of attraction. It is concluded that these networks maintain reasonable performance at fairly high dilution rates.
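The symmetric and asymmetric dilution schemes mentioned above can be sketched as masks applied to a weight matrix. This is a minimal illustration, not the paper's exact procedure: it assumes a NumPy weight matrix with zero self-connections, and the function name `dilute` and the kept/removed bookkeeping are illustrative.

```python
import numpy as np

def dilute(W, p, symmetric=True, rng=None):
    """Zero out a proportion p of off-diagonal weights.

    Symmetric dilution removes w_ij and w_ji together, preserving
    the symmetry of the connectivity pattern; asymmetric dilution
    removes each direction independently.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = W.shape[0]
    keep = rng.random((n, n)) >= p          # True where a weight survives
    off_diag = ~np.eye(n, dtype=bool)
    if symmetric:
        upper = np.triu(keep, k=1)          # decide once per pair (i < j)
        keep = upper | upper.T              # mirror the decision to (j, i)
    else:
        keep = keep & off_diag              # independent decisions per direction
    return W * keep

# Example: dilute half the weights of a random 100-unit network.
rng = np.random.default_rng(0)
W = rng.standard_normal((100, 100))
np.fill_diagonal(W, 0.0)
W_sym = dilute(W, 0.5, symmetric=True, rng=rng)
# Under symmetric dilution the zero pattern stays symmetric:
assert np.array_equal(W_sym == 0, (W_sym == 0).T)
```

With symmetric dilution the surviving connectivity graph remains undirected, matching the standard Hopfield assumption `w_ij = w_ji`; asymmetric dilution breaks this, which is part of what the experiments compare.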
Original language | English
---|---
Title of host publication | Proceedings of the International Conference on Neural Information Processing
Subtitle of host publication | ICONIP'01
Pages | 597-602
Volume | 2
Publication status | Published - 2001