Abstract
Variational methods have been previously explored as a tractable approximation to Bayesian inference for neural networks. However, the approaches proposed so far have only been applicable to a few simple network architectures. This paper introduces an easy-to-implement stochastic variational method (or equivalently, minimum description length loss function) that can be applied to most neural networks. Along the way it revisits several common regularisers from a variational perspective. It also provides a simple pruning heuristic that can both drastically reduce the number of network weights and lead to improved generalisation. Experimental results are provided for a hierarchical multidimensional recurrent neural network applied to the TIMIT speech corpus.
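The core idea summarised above, a diagonal Gaussian posterior over the weights trained against an MDL-style loss (expected negative log-likelihood plus a KL "model cost" term), with pruning based on each weight's signal-to-noise ratio, can be sketched in a few lines. The sketch below is a minimal illustration under stated assumptions, not the paper's exact method: it covers a single linear layer, assumes a fixed zero-mean Gaussian prior, and uses the reparameterisation trick as the gradient estimator rather than the paper's own derivation; the names `BayesLinear`, `prior_sigma`, and the pruning threshold are hypothetical.

```python
# Minimal sketch of stochastic variational training for one linear layer,
# assuming a diagonal Gaussian posterior q(w) = N(mu, sigma^2) and a fixed
# zero-mean Gaussian prior p(w) = N(0, prior_sigma^2).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesLinear(nn.Module):
    def __init__(self, n_in, n_out, prior_sigma=1.0):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(n_out, n_in))          # posterior means
        self.log_sigma = nn.Parameter(torch.full((n_out, n_in), -3.0))  # log std devs
        self.prior_sigma = prior_sigma

    def forward(self, x):
        # Sample weights from the posterior (reparameterisation trick).
        sigma = self.log_sigma.exp()
        w = self.mu + sigma * torch.randn_like(sigma)
        return F.linear(x, w)

    def kl(self):
        # KL(q || p) between diagonal Gaussians: the "model cost" term
        # of the variational / MDL objective.
        sigma = self.log_sigma.exp()
        return (torch.log(self.prior_sigma / sigma)
                + (sigma ** 2 + self.mu ** 2) / (2 * self.prior_sigma ** 2)
                - 0.5).sum()

# Toy training loop: minimise sampled negative log-likelihood plus the KL term
# (here crudely scaled by the batch size).
layer = BayesLinear(20, 2)
x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
for _ in range(100):
    opt.zero_grad()
    loss = F.cross_entropy(layer(x), y) + layer.kl() / 32
    loss.backward()
    opt.step()

# Pruning heuristic: weights with low signal-to-noise ratio |mu| / sigma
# contribute little and are candidates for removal (threshold is illustrative).
prune_mask = (layer.mu.abs() / layer.log_sigma.exp()) < 1.0
```

In this sketch the KL term plays the role of the regulariser that the abstract relates to common weight penalties, and the signal-to-noise mask is one plausible reading of the pruning heuristic, not the paper's exact criterion.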
Publication Info
- Year: 2011
- Type: article
- Volume: 24
- Pages: 2348-2356
- Citations: 1061
- Access: Closed