
This expands on Section 4.4.1 of our book. The basic random effects meta-analysis extends the common effect model by allowing studies to sample from their own populations, so that the true contrasts or intervention effects in the study populations differ. It seems reasonable to represent this between-study variation or heterogeneity with a normal distribution because of the Central Limit Theorem (discussed in Chapter 4 of our book). This produces a multilevel model:
model {
  theta ~ ADD_YOUR_PRIOR_HERE
  tau ~ ADD_YOUR_PRIOR_HERE
  tau_prec <- 1 / (tau*tau)
  for (j in 1:m) {
    u[j] ~ dnorm(0, tau_prec)
    prec_logor[j] <- 1 / (se_logor[j]*se_logor[j])
    logor[j] ~ dnorm(theta + u[j], prec_logor[j])
  }
  oddsratio <- exp(theta)
}

Here, we again start with a contrast-based model, using the log odds ratio as an example of a study statistic with an asymptotically normal sampling distribution. We also treat the study estimates of the standard errors as known and fixed. That makes the model a Bayesian analogue of the frequentist DerSimonian-Laird estimator.
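To make that analogy concrete, here is a sketch of the DerSimonian-Laird moment estimator of the between-study variance tau-squared. The function name and the input data are our own illustrative choices, not from the book; the formula is the standard one based on Cochran's Q.

```python
def dersimonian_laird(logor, se_logor):
    """DerSimonian-Laird moment estimator of the between-study variance tau^2."""
    w = [1 / s**2 for s in se_logor]  # inverse-variance weights
    # common-effect (fixed-effect) pooled estimate
    theta_fe = sum(wi * yi for wi, yi in zip(w, logor)) / sum(w)
    # Cochran's Q heterogeneity statistic
    Q = sum(wi * (yi - theta_fe)**2 for wi, yi in zip(w, logor))
    k = len(logor)
    denom = sum(w) - sum(wi**2 for wi in w) / sum(w)
    return max(0.0, (Q - (k - 1)) / denom)  # truncated at zero

# hypothetical log odds ratios and standard errors
tau2 = dersimonian_laird([0.2, -0.1, 0.5, 0.3], [0.15, 0.2, 0.25, 0.1])
print(tau2)
```

Note the truncation at zero: when Q falls below its degrees of freedom, the moment estimator would be negative, and the DL convention is to report zero heterogeneity.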
We have included only the basic implementation of the prior and likelihood, omitting the storage of the study-specific parameters theta+u[j] for graphics or for prior and posterior predictive checking. We illustrate those methods in another post.
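The model above expects the data logor and se_logor to be supplied. As a sketch, these can be computed from each study's 2x2 table using the standard formulas (the table values below are hypothetical, and the function name is our own):

```python
import math

def log_odds_ratio(a, b, c, d):
    """Log odds ratio and its standard error from a 2x2 table:
    a, b = events / non-events in one arm; c, d = events / non-events in the other."""
    logor = math.log((a * d) / (b * c))
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return logor, se

# hypothetical 2x2 tables, one per study
tables = [(12, 88, 20, 80), (8, 92, 15, 85)]
logor, se_logor = zip(*(log_odds_ratio(*t) for t in tables))
print(logor, se_logor)
```

With zero cells, a continuity correction (e.g. adding 0.5 to each cell) is commonly applied before using these formulas.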
Funnel behaviour
MCMC algorithms face a challenge, common in multilevel models, when the heterogeneity standard deviation approaches zero; this is referred to as the funnel (sometimes Neal's funnel) problem.
Although Stan flags problematic iterations of its algorithm as "divergent transitions", BUGS/JAGS will just get periodically stuck in the funnel where tau is small, without alerting you to the problem. One way to avoid it is to use a prior on tau that gives low prior density to values close to zero. Half-normal, half-t, or half-Cauchy priors are prone to the problem, because they place their highest density at values close to zero, while gamma, lognormal, chi-squared, or bespoke elicited priors can avoid it, but only if you are willing to be more informative about tau. The slow-mixing traceplots in Figure 2.12 of the book are in fact caused by funnel behaviour.
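The contrast in prior density near zero is easy to check numerically. A sketch, with illustrative scale and shape parameters of our own choosing: a half-normal prior has its maximum density at tau = 0, while a gamma prior with shape greater than 1 has zero density there.

```python
import math

def half_normal_pdf(x, sigma=1.0):
    """Density of |N(0, sigma^2)| for x >= 0; maximised at x = 0."""
    return math.sqrt(2 / math.pi) / sigma * math.exp(-x**2 / (2 * sigma**2))

def gamma_pdf(x, shape=2.0, rate=1.0):
    """Gamma density; for shape > 1 the density is zero at x = 0."""
    return rate**shape * x**(shape - 1) * math.exp(-rate * x) / math.gamma(shape)

# compare prior density at small versus moderate values of tau
for tau in (0.01, 0.1, 0.5):
    print(tau, round(half_normal_pdf(tau), 4), round(gamma_pdf(tau), 4))
```

The half-normal piles density onto the region where the funnel lives, while the gamma(2, 1) pushes it away; the price of the latter is a more informative statement about plausible heterogeneity.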
Including uncertainty in the standard errors
When the sampling distribution is asymptotically normal, the natural way to acknowledge uncertainty in the standard error is to swap the normal likelihood for the t-distribution, using the dt() function in BUGS and JAGS with n[j]-2 degrees of freedom. This model is analogous to the Sidik-Jonkman estimator with the Hartung-Knapp adjustment, which we address in Section 4.6 of the book. It also has the effect of making the meta-analysis a little more robust to outlier studies.
model {
  theta ~ ADD_YOUR_PRIOR_HERE
  tau ~ ADD_YOUR_PRIOR_HERE
  tau_prec <- 1 / (tau*tau)
  for (j in 1:m) {
    u[j] ~ dnorm(0, tau_prec)
    prec_logor[j] <- 1 / (se_logor[j]*se_logor[j])
    logor[j] ~ dt(theta + u[j], prec_logor[j], n[j]-2)
  }
  oddsratio <- exp(theta)
}

This Student's t distribution is the exact likelihood for means, but is heuristic for log odds ratios, chosen simply for its heavier tails; other options could be tried, including changing the degrees of freedom of the t.
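To see how much extra tail weight the t contributes, here is a sketch comparing the normal and t densities at an outlying value; the location-scale t density is written out from its standard formula, and the choice of df = 8 (i.e. a study with n = 10) is purely illustrative.

```python
import math

def norm_pdf(x, mu=0.0, sd=1.0):
    """Normal density with mean mu and standard deviation sd."""
    z = (x - mu) / sd
    return math.exp(-0.5 * z * z) / (sd * math.sqrt(2 * math.pi))

def t_pdf(x, mu=0.0, sd=1.0, df=8):
    """Location-scale Student-t density with df degrees of freedom."""
    z = (x - mu) / sd
    c = math.gamma((df + 1) / 2) / (math.gamma(df / 2) * math.sqrt(df * math.pi))
    return c / sd * (1 + z * z / df) ** (-(df + 1) / 2)

# at a study estimate 4 scale units from the mean, the t likelihood assigns
# far more density than the normal, so the outlier pulls the pooled estimate less
print(t_pdf(4.0) / norm_pdf(4.0))
```

This is the mechanism behind the robustness remark above: an outlying study is less surprising under the t likelihood, so it exerts less leverage on theta.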