Albert and Chib introduced a complete Bayesian method for analyzing data arising from the generalized linear model, using a Gibbs sampling algorithm facilitated by latent variables. Recently, Cowles proposed an alternative algorithm to accelerate the convergence of the Albert-Chib algorithm. The novelty of this latter algorithm lies in using a Hastings algorithm to generate the latent variables and bin boundary parameters jointly, rather than individually from their respective full conditionals. In the same spirit, we reparameterize the cumulative-link generalized linear model to accelerate the convergence of Cowles' algorithm even further. One important advantage of our method is that for the three-bin problem it does not require the Hastings algorithm. For problems with more than three bins, where the Hastings algorithm is still required, we provide a proposal density based on the Dirichlet distribution, which is more natural than the truncated normal density used in the competing algorithm. Moreover, using diagnostic procedures recommended in the literature for Markov chain Monte Carlo algorithms (both single and multiple runs), we show that our algorithm is substantially better than Cowles' algorithm: it converges faster and exhibits smaller autocorrelations between the iterates. Using the probit link function, extensive results are presented for the three-bin and the five-bin multinomial ordinal data problems.
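The Dirichlet proposal for the bin boundaries mentioned above can be sketched as follows. This is a minimal illustration, not the paper's exact parameterization: it assumes the reparameterization rescales the cut-points so that the spacings between consecutive boundaries lie on the unit simplex, which makes a joint Dirichlet proposal natural.

```python
import numpy as np

rng = np.random.default_rng(0)

def propose_cutpoints(alpha):
    """Propose interior cut-points for a K-bin cumulative-link model.

    Assumption (for illustration only): the cut-points are rescaled so
    that the K spacings between consecutive boundaries sum to one. The
    spacings can then be proposed jointly from a Dirichlet(alpha)
    distribution, instead of drawing each boundary from a truncated
    normal one at a time.
    """
    spacings = rng.dirichlet(alpha)       # K positive spacings summing to 1
    cutpoints = np.cumsum(spacings)[:-1]  # K-1 strictly increasing interior cut-points in (0, 1)
    return cutpoints
```

A candidate drawn this way is automatically ordered and bounded, so no rejection step is needed to enforce monotonicity of the boundaries; the candidate would then be accepted or rejected with the usual Hastings ratio.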