High performance associative memory models and sign constraints

N. Davey, R.G. Adams

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



The consequences of imposing a sign constraint on the standard Hopfield-architecture associative memory model, trained using perceptron-like learning rules, are examined. Such constrained learning rules have been shown to have at most half the capacity of their unconstrained versions. This paper reports experimental investigations into the consequences of constraining the sign of the network weights in terms of capacity, training time, and the size of the basins of attraction. It is concluded that the capacity is roughly half the theoretical maximum, that training times are much increased, and that the attractor basins are significantly reduced in size.
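The training scheme the abstract describes can be illustrated with a minimal sketch: each unit is trained with a zero-margin perceptron rule, and after each epoch the weights are projected onto the sign constraint by zeroing any weight whose sign disagrees with a prescribed sign matrix. The sign matrix `S`, the network size, the load, and the learning rate below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50   # number of units (illustrative)
P = 3    # number of stored patterns; a low load, well below capacity

# Random bipolar training patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hypothetical sign constraint: each weight w_ij must satisfy sign(w_ij) == S_ij
# (or be zero). A random choice stands in for whatever constraint is imposed.
S = rng.choice([-1, 1], size=(N, N))

W = np.zeros((N, N))
lr = 0.1
for epoch in range(1000):
    updates = 0
    for xi in patterns:
        h = W @ xi
        for i in range(N):
            if xi[i] * h[i] <= 0:          # unit i is not stable on this pattern
                W[i] += lr * xi[i] * xi / N  # perceptron-style correction
                W[i, i] = 0.0                # no self-connections
                updates += 1
    # Projection step: zero every weight that violates the sign constraint
    W[W * S < 0] = 0.0
    if updates == 0:
        break  # all patterns stable and constraint already satisfied
```

After convergence every stored pattern is a fixed point of the constrained network, but, consistent with the paper's findings, the projection step typically lengthens training compared with the unconstrained rule.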
Original language: English
Title of host publication: Procs of NNA'01: 2001 WSES Int Conf on Neural Networks & Applications
Publication status: Published - 2001


