Abstract

In the first part of this paper, a regular recurrent neural network (RNN) is extended to a bidirectional recurrent neural network (BRNN). The BRNN can be trained without the limitation of using input information only up to a preset future frame. This is accomplished by training it simultaneously in the positive and negative time directions. The structure and training procedure of the proposed network are explained. In regression and classification experiments on artificial data, the proposed structure gives better results than other approaches. For real data, classification experiments for phonemes from the TIMIT database show the same tendency. In the second part of this paper, it is shown how the proposed bidirectional structure can be easily modified to allow efficient estimation of the conditional posterior probability of complete symbol sequences without making any explicit assumption about the shape of the distribution. For this part, experiments on real data are reported.
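The core idea of the abstract can be illustrated with a minimal sketch: one RNN reads the sequence in the positive time direction, a second reads it in the negative direction, and their hidden states are combined per time step so every output depends on both past and future context. This is an illustrative toy forward pass in NumPy (all weight names and sizes here are hypothetical), not the paper's exact training setup.

```python
import numpy as np

def rnn_pass(x, Wx, Wh, b):
    """Run a simple tanh RNN over a sequence x of shape (T, d_in);
    return all hidden states, shape (T, d_h)."""
    d_h = Wh.shape[0]
    h = np.zeros(d_h)
    states = []
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ Wx + h @ Wh + b)
        states.append(h)
    return np.stack(states)

def brnn_forward(x, params_fwd, params_bwd):
    """Bidirectional pass: run one RNN forward in time, another over the
    reversed sequence, re-align the backward states, and concatenate them
    so each time step sees both past and future input context."""
    h_f = rnn_pass(x, *params_fwd)               # positive time direction
    h_b = rnn_pass(x[::-1], *params_bwd)[::-1]   # negative time direction, re-aligned
    return np.concatenate([h_f, h_b], axis=1)    # shape (T, 2 * d_h)

# Toy example with random weights (hypothetical sizes, no training).
rng = np.random.default_rng(0)
T, d_in, d_h = 5, 3, 4
x = rng.standard_normal((T, d_in))
make_params = lambda: (0.1 * rng.standard_normal((d_in, d_h)),
                       0.1 * rng.standard_normal((d_h, d_h)),
                       np.zeros(d_h))
out = brnn_forward(x, make_params(), make_params())
print(out.shape)  # (5, 8): each step carries forward + backward state
```

In the paper's classification setting, a shared output layer would map each concatenated state to per-frame class probabilities; here the two direction-specific parameter sets stand in for the forward and backward halves of the network.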

Keywords

TIMIT, Recurrent neural network, Computer science, Artificial neural network, Frame (networking), Artificial intelligence, Pattern recognition (psychology), Symbol (formal), Algorithm, Machine learning, Hidden Markov model

Publication Info

Year
1997
Type
article
Volume
45
Issue
11
Pages
2673-2681
Citations
9385
Access
Closed

Citation Metrics

9385 (OpenAlex)

Cite This

Mike Schuster, Kuldip K. Paliwal (1997). Bidirectional recurrent neural networks. IEEE Transactions on Signal Processing, 45(11), 2673-2681. https://doi.org/10.1109/78.650093

Identifiers

DOI
10.1109/78.650093