Abstract

In a 1935 paper and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null is one-half. Although there has been much discussion of Bayesian hypothesis testing in the context of criticism of P-values, less attention has been given to the Bayes factor as a practical tool of applied statistics. In this article we review and discuss the uses of Bayes factors in the context of five scientific applications in genetics, sports, ecology, sociology, and psychology. We emphasize the following points:

• From Jeffreys' Bayesian viewpoint, the purpose of hypothesis testing is to evaluate the evidence in favor of a scientific theory.
• Bayes factors offer a way of evaluating evidence in favor of a null hypothesis.
• Bayes factors provide a way of incorporating external information into the evaluation of evidence about a hypothesis.
• Bayes factors are very general and do not require alternative models to be nested.
• Several techniques are available for computing Bayes factors, including asymptotic approximations that are easy to compute using the output from standard packages that maximize likelihoods.
• In "nonstandard" statistical models that do not satisfy common regularity conditions, it can be technically simpler to calculate Bayes factors than to derive non-Bayesian significance tests.
• The Schwarz criterion (or BIC) gives a rough approximation to the logarithm of the Bayes factor, which is easy to use and does not require evaluation of prior distributions (see the sketch after this list).
• When one is interested in estimation or prediction, Bayes factors may be converted to weights to be attached to various models so that a composite estimate or prediction may be obtained that takes account of structural or model uncertainty.
• Algorithms have been proposed that allow model uncertainty to be taken into account when the class of models initially considered is very large.
• Bayes factors are useful for guiding an evolutionary model-building process.
• It is important, and feasible, to assess the sensitivity of conclusions to the prior distributions used.
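As a rough worked illustration of two of the points above (the Schwarz/BIC approximation and the conversion of Bayes factors to model weights), here is a minimal Python sketch. It is not taken from the paper; the log-likelihood values, parameter counts, and sample size are hypothetical placeholders. It uses the standard definitions: the Bayes factor is B10 = p(D | H1) / p(D | H0), and the Schwarz criterion S = log p(D | th1, H1) - log p(D | th0, H0) - (1/2)(d1 - d0) log n, evaluated at the maximum likelihood estimates, approximates log B10.

import math

def schwarz_criterion(loglik, n_params, n_obs):
    # Penalized maximized log-likelihood: log L - (d / 2) * log n.
    return loglik - 0.5 * n_params * math.log(n_obs)

def approx_log_bayes_factor(loglik1, d1, loglik0, d0, n_obs):
    # Schwarz approximation to log B10 = log p(D | H1) - log p(D | H0).
    return (schwarz_criterion(loglik1, d1, n_obs)
            - schwarz_criterion(loglik0, d0, n_obs))

def model_weights(approx_log_marginals):
    # Turn approximate log marginal likelihoods into posterior model
    # probabilities, assuming equal prior probabilities on the models.
    m = max(approx_log_marginals)
    unnorm = [math.exp(v - m) for v in approx_log_marginals]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Hypothetical numbers: two models fit by maximum likelihood to n = 100 points.
n = 100
log_b10 = approx_log_bayes_factor(loglik1=-182.3, d1=4,
                                  loglik0=-188.9, d0=2, n_obs=n)
print("2 * log(B10) approx.:", round(2 * log_b10, 2))
print("Model weights (H0, H1):",
      [round(w, 2) for w in model_weights([schwarz_criterion(-188.9, 2, n),
                                           schwarz_criterion(-182.3, 4, n)])])

On the interpretive scale used by Kass and Raftery, values of 2 log B10 between 2 and 6 count as "positive" evidence; the toy numbers above give roughly 4, and the corresponding model weights could be used directly for model-averaged estimation or prediction.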

Keywords

Bayes' theorem, Statistics, Mathematics, Computer science, Bayesian probability

Related Publications

Theory of Probability

Jeffreys' Theory of Probability, first published in 1939, was the first attempt to develop a fundamental theory of scientific inference based on Bayesian statistics. Hi...

1998 Nature 6694 citations

Testing Precise Hypotheses

Testing of precise (point or small interval) hypotheses is reviewed, with special emphasis placed on exploring the dramatic conflict between conditional measures (Bayes factors ...

1987 Statistical Science 676 citations

Bayesian inference in ecology

Bayesian inference is an important statistical tool that is increasingly being used by ecologists. In a Bayesian analysis, information available before a study is condu...

2004 Ecology Letters 770 citations

Publication Info

Year: 1995
Type: Article
Volume: 90
Issue: 430
Pages: 773-795
Citations: 11631 (OpenAlex)
Access: Closed

Cite This

Robert E. Kass, Adrian E. Raftery (1995). Bayes Factors. Journal of the American Statistical Association, 90(430), 773-795. https://doi.org/10.1080/01621459.1995.10476572

Identifiers

DOI
10.1080/01621459.1995.10476572