Conditional predictive inference for beta regression model with autoregressive errors

Abstract

In this chapter, we study a partially linear model with autoregressive beta-distributed errors [6] from the Bayesian point of view. Our proposal also provides a useful method to determine the optimal order of the autoregressive process through an adaptive procedure based on the conditional predictive ordinate (CPO) statistic [9]. In this context, the linear predictor of the beta regression model, g(μ_t), incorporates an unknown smooth function of the auxiliary time covariate t and a sequence of autoregressive errors ε_t, i.e., g(μ_t) = x_t^⊤ β + f(t) + ε_t, for t = 1, . . . , T, where x_t is a k × 1 vector of nonstochastic explanatory variable values and β is a k × 1 fixed parameter vector. Furthermore, these models have a convenient hierarchical representation that allows an easy implementation of a Markov chain Monte Carlo (MCMC) scheme. We also propose to modify the traditional conditional predictive ordinate (CPO) to obtain what we call the autoregressive CPO, which is computed for each new observation using only the data from previous time periods.
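As a rough, non-authoritative illustration (the chapter itself gives no code), the Python sketch below simulates data from a model of this form under assumed specifics (logit link, AR(1) errors, a sinusoidal stand-in for the unknown smooth function f, arbitrary parameter values) and shows the usual Monte Carlo harmonic-mean estimator of the CPO from posterior draws. The autoregressive CPO proposed in the chapter differs in that, for each t, it uses only posterior draws based on observations up to time t − 1; the sketch only notes this in a comment.

```python
import numpy as np
from scipy import stats
from scipy.special import expit  # inverse logit link

rng = np.random.default_rng(0)

# --- Simulate from a partially linear beta model with AR(1) errors ---
# Illustrative values only; the chapter treats general AR(p) errors and
# selects the order adaptively within a Bayesian MCMC scheme.
T, k = 200, 2
phi, sigma_eps, prec = 0.6, 0.15, 50.0          # AR coefficient, innovation sd, beta precision
X = rng.normal(size=(T, k))                      # nonstochastic covariates x_t
beta = np.array([0.8, -0.5])                     # fixed parameter vector
f_t = 0.5 * np.sin(2 * np.pi * np.arange(T) / T) # stand-in for the unknown smooth function f(t)

eps = np.zeros(T)                                # AR(1) errors on the predictor scale
for t in range(1, T):
    eps[t] = phi * eps[t - 1] + rng.normal(scale=sigma_eps)

eta = X @ beta + f_t + eps                       # linear predictor g(mu_t)
mu = expit(eta)                                  # logit link
y = rng.beta(mu * prec, (1 - mu) * prec)         # beta-distributed responses

# --- Monte Carlo CPO from posterior draws (harmonic-mean form) ---
# Given MCMC draws of (mu_t, precision), CPO_t ≈ 1 / mean_m[ 1 / f(y_t | draw m) ].
# The chapter's autoregressive CPO instead uses, for each t, only draws
# obtained from the data up to time t - 1.
def cpo(y, mu_draws, prec_draws):
    """mu_draws: (M, T) posterior means; prec_draws: (M,) precisions."""
    a = mu_draws * prec_draws[:, None]
    b = (1 - mu_draws) * prec_draws[:, None]
    dens = stats.beta.pdf(y[None, :], a, b)      # f(y_t | theta^(m)), shape (M, T)
    return 1.0 / np.mean(1.0 / dens, axis=0)     # harmonic mean over draws
```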

Citation (APA)

Ferreira, G., Navarrete, J. P., Castro, L. M., & de Castro, M. (2015). Conditional predictive inference for beta regression model with autoregressive errors. In Springer Proceedings in Mathematics and Statistics (Vol. 118, pp. 357–366). Springer New York LLC. https://doi.org/10.1007/978-3-319-12454-4_30
