Abstract

We use simulation studies, whose design is realistic for educational and medical research (as well as other fields of inquiry), to compare Bayesian and likelihood-based methods for fitting variance-components (VC) and random-effects logistic regression (RELR) models. The likelihood (and approximate likelihood) approaches we examine are based on the methods most widely used in current applied multilevel (hierarchical) analyses: maximum likelihood (ML) and restricted ML (REML) for Gaussian outcomes, and marginal and penalized quasi-likelihood (MQL and PQL) for Bernoulli outcomes. Our Bayesian methods use Markov chain Monte Carlo (MCMC) estimation, with adaptive hybrid Metropolis-Gibbs sampling for RELR models, and several diffuse prior distributions ($\Gamma^{-1}(\epsilon, \epsilon)$ and $U(0, \frac{1}{\epsilon})$ priors for variance components). For evaluation criteria we consider bias of point estimates and nominal versus actual coverage of interval estimates in repeated sampling. In two-level VC models we find that (a) both likelihood-based and Bayesian approaches can be made to produce approximately unbiased estimates, although the automatic manner in which REML accomplishes this is an advantage, but (b) both approaches had difficulty achieving nominal coverage in small samples and with small values of the intraclass correlation. With the three-level RELR models we examine we find that (c) quasi-likelihood methods for estimating random-effects variances perform badly with respect to bias and coverage in the example we simulated, and (d) Bayesian diffuse-prior methods lead to well-calibrated point and interval RELR estimates.
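As an aside on the $\Gamma^{-1}(\epsilon, \epsilon)$ prior mentioned above: in a Gibbs sampler for a Gaussian multilevel model, this prior is conjugate, so the variance component can be updated by a direct inverse-gamma draw. The sketch below illustrates that standard conjugate update (the function name, settings, and simulated data are our own, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_sigma2_u(u, eps, rng):
    """One Gibbs draw of a variance component under a Gamma^{-1}(eps, eps)
    prior: given cluster effects u_1..u_J ~ N(0, sigma2_u), the full
    conditional is Gamma^{-1}(eps + J/2, eps + sum(u^2)/2)."""
    shape = eps + len(u) / 2.0
    rate = eps + 0.5 * np.sum(u ** 2)
    # An inverse-gamma draw is the reciprocal of a gamma draw with
    # the same shape and scale = 1/rate.
    return 1.0 / rng.gamma(shape, 1.0 / rate)

# With many clusters the draws concentrate near the true variance (1.0 here)
u = rng.normal(0.0, 1.0, size=5000)
draws = np.array([draw_sigma2_u(u, 1e-3, rng) for _ in range(1000)])
print(round(float(draws.mean()), 2))
```

With few clusters, by contrast, the choice of $\epsilon$ (and of inverse-gamma versus uniform priors) materially affects the posterior, which is the sensitivity the paper investigates.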
While it is true that the likelihood-based methods we study are considerably faster computationally than MCMC, (i) steady improvements in recent years in both hardware speed and efficiency of Monte Carlo algorithms and (ii) the lack of calibration of likelihood-based methods in some common hierarchical settings combine to make MCMC-based Bayesian fitting of multilevel models an attractive approach, even with rather large data sets. Other analytic strategies based on less approximate likelihood methods are also possible but would benefit from further study of the type summarized here.
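The repeated-sampling evaluation of bias described in the abstract can be sketched for a balanced two-level VC model, where the ANOVA (method-of-moments) estimator of the between-cluster variance coincides with REML. This is a minimal illustration under our own simulation settings, not the paper's actual design:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_vc(J, n, sigma2_u, sigma2_e, rng):
    """Draw one balanced two-level data set: y_ij = mu + u_j + e_ij (mu = 0)."""
    u = rng.normal(0.0, np.sqrt(sigma2_u), size=J)        # cluster effects
    e = rng.normal(0.0, np.sqrt(sigma2_e), size=(J, n))   # level-1 residuals
    return u[:, None] + e

def anova_estimates(y):
    """ANOVA variance-component estimates (equal to REML in the balanced case)."""
    J, n = y.shape
    group_means = y.mean(axis=1)
    msb = n * np.sum((group_means - y.mean()) ** 2) / (J - 1)
    msw = np.sum((y - group_means[:, None]) ** 2) / (J * (n - 1))
    return (msb - msw) / n, msw   # (sigma2_u_hat, sigma2_e_hat)

# Repeated-sampling assessment of bias, in the spirit of the paper's design
J, n, s2u, s2e, reps = 10, 5, 1.0, 1.0, 2000
ests = np.array([anova_estimates(simulate_vc(J, n, s2u, s2e, rng))
                 for _ in range(reps)])
bias_u = ests[:, 0].mean() - s2u
bias_e = ests[:, 1].mean() - s2e
print(f"bias(sigma2_u) = {bias_u:+.3f}, bias(sigma2_e) = {bias_e:+.3f}")
```

The same simulation loop extends to the paper's second criterion, nominal versus actual coverage, by recording for each replication whether an interval estimate for $\sigma^2_u$ contains the true value.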

Keywords

Marginal likelihood, Gibbs sampling, Markov chain Monte Carlo, Restricted maximum likelihood, Statistics, Mathematics, Bayesian probability, Random effects model, Prior probability, Likelihood function, Estimation theory


Publication Info

Year
2006
Type
article
Volume
1
Issue
3
Citations
607
Access
Closed

Cite This

William J. Browne, David Draper (2006). A comparison of Bayesian and likelihood-based methods for fitting multilevel models. Bayesian Analysis, 1(3). https://doi.org/10.1214/06-ba117

Identifiers

DOI
10.1214/06-ba117