Abstract

An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the δ rule of Rumelhart, Hinton, and Williams to adaptively modify the synaptic weights. The new network bears a resemblance to the master/slave network of Lapedes and Farber but is architecturally simpler.
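The learning rule described in the abstract, recurrent back-propagation, adjusts the weights using error signals evaluated at a fixed point of the network dynamics rather than by unrolling the network in time. Below is a minimal NumPy sketch of that scheme; the network size, input pattern, target, learning rate, and the direct linear solve for the adjoint variables are illustrative assumptions, not the paper's exact procedure (Pineda obtains the adjoint fixed point by relaxing a second, linear network).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a small fully recurrent net with 2 input units,
# 3 hidden units, and 1 output unit (6 units total).
n = 6
inputs, outputs = [0, 1], [5]

W = rng.normal(scale=0.3, size=(n, n))   # asymmetric weights: W[i, j] = w_ij
eta = 0.1                                # learning rate (illustrative value)

def g(u):                                # graded (logistic sigmoid) activation
    return 1.0 / (1.0 + np.exp(-u))

def relax(W, I, steps=200, dt=0.1):
    """Relax dx/dt = -x + g(W x) + I to its fixed point x*."""
    x = np.zeros(n)
    for _ in range(steps):
        x += dt * (-x + g(W @ x) + I)
    return x

# One learning step on a single pattern (T is the target for the output unit).
I = np.zeros(n)
I[inputs] = [0.3, 0.8]                   # external input drives the input units
T = 0.9

x = relax(W, I)                          # forward relaxation to the fixed point
u = W @ x
gp = g(u) * (1.0 - g(u))                 # g'(u) for the logistic sigmoid

e = np.zeros(n)
e[outputs] = T - x[outputs]              # error injected at output units only

# Adjoint fixed point: solve (I - W^T diag(g')) y = e. In the paper this y
# is obtained by relaxing a second linear network; a direct solve is
# equivalent for this small example.
y = np.linalg.solve(np.eye(n) - W.T @ np.diag(gp), e)

# Gradient-descent update: dE/dw_rs = -y_r g'(u_r) x*_s.
W += eta * np.outer(y * gp, x)
```

Repeating the forward relaxation, adjoint solve, and weight update over a set of input/target patterns drives the fixed-point outputs toward their targets.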

Keywords

Generalization, Hopfield network, Artificial neural network, Computer science, Recurrent neural network, Artificial intelligence, Backpropagation, Types of artificial neural networks, Mathematics

Related Publications

Long Short-Term Memory

Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We brief...

1997 · Neural Computation · 90,535 citations

Publication Info

Year: 1987
Type: Article
Volume: 59
Issue: 19
Pages: 2229-2232
Citations: 949
Access: Closed


Citation Metrics

949 citations (OpenAlex)

Cite This

Fernando J. Pineda (1987). Generalization of back-propagation to recurrent neural networks. Physical Review Letters, 59(19), 2229-2232. https://doi.org/10.1103/physrevlett.59.2229

Identifiers

DOI: 10.1103/physrevlett.59.2229