Gaussian processes are popular and effective Bayesian methods for classification and regression. Constructing sparse Gaussian processes is an active research topic, since exact Gaussian processes incur cubic time complexity in the size of the training set. Inspired by multi-task learning, we argue that selecting subsets for multiple Gaussian processes simultaneously is more suitable than selecting them separately. In this paper, we propose an improved multi-task sparsity regularizer that effectively regularizes the subset selection of multiple tasks for multi-task sparse Gaussian processes. Specifically, building on the multi-task sparsity regularizer proposed in [12], we make two improvements: 1) replacing a subset of points with a rough global structure when measuring the global consistency of a point; 2) normalizing each dimension of every data set before sparsification. We combine the regularizer with two methods to demonstrate its effectiveness, and experimental results on four real data sets show its superiority.
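The cubic cost and the subset-selection idea the abstract refers to can be sketched as follows. This is a minimal illustration only: the random subset stands in for the paper's regularized selection, and the z-score normalization is one plausible reading of the per-dimension normalization described above, not the authors' exact scheme.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(X, y, X_star, noise=0.1):
    # Exact GP regression mean: solving with the n x n Gram matrix costs O(n^3).
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    return rbf_kernel(X_star, X) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
X_star = np.linspace(-3, 3, 50)[:, None]

# Standardize each input dimension before sparsification (z-score
# normalization; an assumption about the exact scheme in the paper).
Xn = (X - X.mean(0)) / X.std(0)
Xn_star = (X_star - X.mean(0)) / X.std(0)

full = gp_posterior_mean(Xn, y, Xn_star)
idx = rng.choice(len(X), size=40, replace=False)  # stand-in for a learned subset
sparse = gp_posterior_mean(Xn[idx], y[idx], Xn_star)
print(np.max(np.abs(full - sparse)))
```

Training on the 40-point subset replaces the 200 x 200 solve with a 40 x 40 one; the paper's contribution is choosing such subsets jointly across tasks rather than at random as done here.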
CITATION STYLE
Zhu, J., & Sun, S. (2014). Multi-task sparse Gaussian processes with improved multi-task sparsity regularization. In Communications in Computer and Information Science (Vol. 483, pp. 54–62). Springer Verlag. https://doi.org/10.1007/978-3-662-45646-0_6