Abstract

This paper employs the Auto-Encoding Variational Bayes (AEVB) estimator, based on the Stochastic Gradient Variational Bayes (SGVB) estimator, which is designed to optimize recognition models under intractable posterior distributions and large-scale datasets. The method is applied to the MNIST dataset and extended to a Dynamic Bayesian Network (DBN) in the time-series setting. The paper covers Bayesian inference, variational methods, and the combination of Variational Autoencoders (VAEs) with variational techniques, placing emphasis on the reparameterization trick as the key to efficient gradient-based optimization. AEVB employs a VAE to approximate intractable posterior distributions.
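The reparameterization trick highlighted in the abstract can be sketched briefly: instead of sampling the latent variable z directly from the encoder's Gaussian q(z|x) (which blocks gradients), one samples external noise eps ~ N(0, I) and computes z = mu + sigma * eps, so gradients flow through mu and log-variance. The function and variable names below are illustrative, not taken from the paper; the analytic KL term follows the standard diagonal-Gaussian form from Kingma and Welling's original derivation.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample z ~ N(mu, sigma^2) via z = mu + sigma * eps,
    eps ~ N(0, I): the randomness is external, so gradients
    can flow through mu and log_var during optimization."""
    eps = rng.standard_normal(np.shape(mu))
    sigma = np.exp(0.5 * np.asarray(log_var))
    return mu + sigma * eps

def gaussian_kl(mu, log_var):
    """Analytic KL(q(z|x) || N(0, I)) for a diagonal-Gaussian
    encoder; this is the regularization term of the ELBO."""
    mu = np.asarray(mu)
    log_var = np.asarray(log_var)
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

rng = np.random.default_rng(0)
mu = np.zeros(4)
log_var = np.zeros(4)          # sigma = 1 everywhere
z = reparameterize(mu, log_var, rng)
print(z.shape)                 # (4,)
print(gaussian_kl(mu, log_var))  # 0.0 when q already equals N(0, I)
```

In a full VAE, `gaussian_kl` would be combined with a reconstruction log-likelihood term and minimized by stochastic gradient descent over the encoder and decoder parameters.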

Keywords

Bayes' theorem, Encoding (memory), Computer science, Bayesian probability, Mathematics, Computational biology, Artificial intelligence, Algorithm, Biology

Publication Info

Year: 2024
Type: article
Volume: 2
Issue: 1
Citations: 977
Access: Closed


Citation Metrics

Citations: 977 (OpenAlex)
Cite This

Yan-Kun Chen, Jingxuan Liu, Lingyun Peng et al. (2024). Auto-Encoding Variational Bayes. Cambridge Explorations in Arts and Sciences, 2(1). https://doi.org/10.61603/ceas.v2i1.33

Identifiers

DOI: 10.61603/ceas.v2i1.33