Abstract
Many neural network learning procedures compute gradients of the errors on the output layer of units after they have settled to their final values. We describe a procedure for finding ∂E/∂w_ij, where E is an error functional of the temporal trajectory of the states of a continuous recurrent network and w_ij are the weights of that network. Computing these quantities allows one to perform gradient descent in the weights to minimize E. Simulations in which networks are taught to move through limit cycles are shown. This type of recurrent network seems particularly suited for temporally continuous domains, such as signal processing, control, and speech.
Publication Info
- Year: 1989
- Type: article
- Volume: 1
- Issue: 2
- Pages: 263-269
- Citations: 674
- Access: Closed
Identifiers
- DOI: 10.1162/neco.1989.1.2.263