Abstract
In this paper, we focus on improving the performance of the Nyström-based kernel SVM. Although the Nyström approximation has been studied extensively and its application to kernel classification has been demonstrated in several studies, a potentially large gap remains between the performance of a classifier learned with the Nyström approximation and that of one learned with the original kernel. In this work, we make novel contributions to bridging this gap without substantially increasing the training cost, by proposing a refined Nyström-based kernel classifier. We adopt a two-step approach: in the first step, we learn a sufficiently good dual solution; in the second step, we use the obtained dual solution to construct a new set of bases for the Nyström approximation and re-train a refined classifier. Our approach to learning a good dual solution is based on a sparsity-regularized dual formulation with the Nyström approximation, which can be solved with the same time complexity as the standard formulation. We justify the approach by establishing a theoretical guarantee on the error of the dual solution learned in the first step with respect to the optimal dual solution, under appropriate conditions. Experimental results demonstrate that (i) the dual solution obtained by our approach in the first step is closer to the optimal solution and yields improved prediction performance; and (ii) re-training the model with the obtained dual solution in the second step further improves the performance.
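To make the two-step procedure concrete, the following is a minimal sketch in NumPy. It is an illustrative assumption, not the paper's algorithm: an ordinary least-squares fit stands in for the SVM solver (and its sparse-regularized dual variant), and a simple residual-based heuristic stands in for the paper's dual-solution-guided basis construction. Only the Nyström feature map itself (Φ = K(X, L) W^{-1/2} with W = K(L, L)) follows the standard construction.

```python
import numpy as np

def nystrom_features(X, landmarks, gamma=1.0):
    """Nystrom feature map Phi = K(X, L) @ W^{-1/2}, where W = K(L, L) (RBF kernel)."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    C = rbf(X, landmarks)
    W = rbf(landmarks, landmarks)
    # Symmetric pseudo-inverse square root of W via eigendecomposition, for stability.
    vals, vecs = np.linalg.eigh(W)
    keep = vals > 1e-10
    W_inv_sqrt = (vecs[:, keep] / np.sqrt(vals[keep])) @ vecs[:, keep].T
    return C @ W_inv_sqrt

rng = np.random.default_rng(0)
# Toy binary problem: two well-separated Gaussian blobs in 2D.
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])

# Step 1: Nystrom features from m uniformly sampled landmarks, then a rough solution
# (least-squares here as a stand-in for the sparse-regularized dual SVM solver).
idx = rng.choice(len(X), size=10, replace=False)
Phi = nystrom_features(X, X[idx])
w = np.linalg.lstsq(Phi, y, rcond=None)[0]
scores = Phi @ w

# Step 2 (refinement): re-select landmarks guided by the step-1 solution.
# Here: points the rough model fits worst -- a heuristic stand-in for the paper's
# construction of new bases from the learned dual variables.
idx2 = np.argsort(np.abs(y - scores))[-10:]
Phi2 = nystrom_features(X, X[idx2])
w2 = np.linalg.lstsq(Phi2, y, rcond=None)[0]
acc = np.mean(np.sign(Phi2 @ w2) == y)
```

Both steps cost the same order of work (one m-landmark Nyström factorization plus one linear fit), which mirrors the abstract's claim that the refinement does not substantially increase training cost.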
Li, Z., Yang, T., Zhang, L., & Jin, R. (2016). Fast and accurate refined Nyström based kernel SVM. In 30th AAAI Conference on Artificial Intelligence, AAAI 2016 (pp. 1830–1836). AAAI Press. https://doi.org/10.1609/aaai.v30i1.10244