In this contribution, we extend the existing theory of minimum mean squared error prediction (best prediction). This extension is motivated by the need to handle models in which the parameter vectors have real-valued and/or integer-valued entries. New classes of predictors are introduced, based on the principle of equivariance. Equivariant prediction is developed for the real-parameter case, the integer-parameter case, and the mixed integer/real case. The best predictors within these classes are identified and shown to outperform best linear (unbiased) prediction, in terms of both mean squared error and error variance. We show that, in the context of linear model prediction, best predictors and best estimators come in pairs, and we exploit this property by also identifying the corresponding best estimators. All of the best equivariant estimators are shown to be more precise than the best linear unbiased estimator. Although no restrictions are placed on the probability distributions of the random vectors, the Gaussian case is derived separately. The best predictors are also compared with least-squares predictors, in particular with the integer-based least-squares predictor introduced in Teunissen (J Geodesy, in press, 2006). © Springer-Verlag 2007.
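To make the mixed integer/real setting concrete, the following is a minimal Monte-Carlo sketch of a toy linear model y = Aa + Bb + e with an integer-valued parameter a and a real-valued parameter b. It compares the float least-squares estimate of b with a simple round-then-condition estimate (plain rounding of a, not the paper's best equivariant estimator); the design matrices, noise level, and rounding rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative toy model (assumed, not from the paper):
#   y = A*a + B*b + e,  a integer-valued, b real-valued, e ~ N(0, sigma^2 I)
A = np.array([[1.0], [2.0], [1.0]])
B = np.array([[0.5], [1.0], [2.0]])
a_true, b_true = 3, 1.5
sigma = 0.1

M = np.hstack([A, B])
N = np.linalg.inv(M.T @ M) @ M.T  # least-squares solution operator

n_trials = 2000
err_float = np.zeros(n_trials)
err_fixed = np.zeros(n_trials)

for t in range(n_trials):
    y = A.ravel() * a_true + B.ravel() * b_true + sigma * rng.standard_normal(3)
    a_hat, b_hat = N @ y            # float least-squares solution
    a_fix = np.round(a_hat)         # naive integer fixing by rounding
    # re-estimate b conditioned on the fixed integer value
    b_fix = np.linalg.lstsq(B, y - A.ravel() * a_fix, rcond=None)[0][0]
    err_float[t] = (b_hat - b_true) ** 2
    err_fixed[t] = (b_fix - b_true) ** 2

mse_float = err_float.mean()
mse_fixed = err_fixed.mean()
print(mse_float, mse_fixed)
```

With this well-conditioned design and small noise, rounding almost always recovers the true integer, and the conditioned estimate of b has a smaller empirical mean squared error than the float solution, mirroring (in a crude way) the abstract's claim that exploiting the integer nature of part of the parameter vector improves precision.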
CITATION STYLE
Teunissen, P. J. G. (2007). Best prediction in linear models with mixed integer/real unknowns: Theory and application. Journal of Geodesy, 81(12), 759–780. https://doi.org/10.1007/s00190-007-0140-6