Computing the exact solution of Gaussian process regression requires O(N³) operations for direct methods and O(N²) for iterative ones, since it involves a densely populated kernel matrix of size N × N, where N denotes the number of data points. This makes large-scale learning problems intractable for standard techniques. We propose an alternative approach: the kernel matrix is replaced by a data-sparse approximation, called an ℋ²-matrix. This matrix can be represented by only O(Nm) units of storage, where m is a parameter controlling the accuracy of the approximation, and computing the ℋ²-matrix scales as O(Nm log N). Practical experiments demonstrate that our scheme leads to significant reductions in storage requirements and computing times for large data sets in lower-dimensional spaces. © Springer-Verlag Berlin Heidelberg 2007.
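To illustrate the scaling issue the abstract describes, the sketch below contrasts an exact GP regression solve, which builds the dense N × N kernel matrix (O(N²) storage, O(N³) Cholesky), with a simple rank-m Nyström surrogate that stores only O(Nm) kernel entries. This is not the paper's ℋ²-matrix construction (which uses a hierarchical block partition of the kernel matrix); it is only a minimal stand-in for the low-rank idea, and all function names, the RBF kernel choice, and the parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, length_scale=1.0):
    # Squared-exponential kernel; materializes a dense |X| x |Y| block.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior_mean_exact(X, y, X_star, noise=1e-2):
    # Exact GP regression: dense N x N kernel matrix (O(N^2) storage)
    # factored by Cholesky (O(N^3) time).
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf_kernel(X_star, X) @ alpha

def gp_posterior_mean_nystroem(X, y, X_star, m=15, noise=1e-2):
    # Rank-m Nystroem surrogate: only the N x m and m x m kernel blocks
    # are ever formed (O(Nm) storage), and the Woodbury identity turns
    # the solve into an m x m problem (O(N m^2) time).
    rng = np.random.default_rng(0)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    C = rbf_kernel(X, X[idx])          # N x m cross-kernel block
    W = rbf_kernel(X[idx], X[idx])     # m x m landmark block
    # Woodbury: (C W^-1 C^T + noise*I)^-1 y
    #   = (y - C (noise*W + C^T C)^-1 C^T y) / noise
    A = noise * W + C.T @ C            # m x m system, never N x N
    alpha = (y - C @ np.linalg.solve(A, C.T @ y)) / noise
    return rbf_kernel(X_star, X) @ alpha
```

Note the design point this sketch shares with the paper's approach: the full kernel matrix is never assembled, so both storage and solve cost become linear in N for fixed approximation rank m.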
Börm, S., & Garcke, J. (2007). Approximating Gaussian processes with ℋ2-matrices. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4701 LNAI, pp. 42–53). https://doi.org/10.1007/978-3-540-74958-5_8