Abstract
The generalization performance of kernel methods is largely determined by the kernel, but spectral representations of stationary kernels are both input-independent and output-independent, which limits their applicability to complicated tasks. In this paper, we propose an efficient learning framework that integrates the search for suitable kernels with model training. Using non-stationary spectral kernels and backpropagation w.r.t. the objective, we obtain favorable spectral representations that depend on both inputs and outputs. Further, based on Rademacher complexity, we derive data-dependent generalization error bounds, in which we investigate the effect of these factors and introduce regularization terms to improve performance. Extensive experimental results validate the effectiveness of the proposed algorithm and coincide with our theoretical findings.
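The core idea of learning spectral representations via backpropagation can be sketched with a toy example. The snippet below is a minimal, hedged illustration and not the paper's exact method: it uses a stationary-style random-Fourier feature map with a single trainable frequency matrix `Omega` (the paper's non-stationary kernels use richer parameterizations), and all names, data, and hyperparameters are illustrative assumptions. Gradients of the task objective are propagated through the feature map so that the learned kernel adapts to both inputs and outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative): learn y = sin(3x) on [-2, 2].
n, d, D = 200, 1, 50                 # samples, input dim, number of spectral features
X = rng.uniform(-2, 2, (n, d))
y = np.sin(3 * X[:, 0])

# Trainable spectral parameters (frequencies Omega, phases b) and linear weights w.
Omega = rng.normal(0.0, 1.0, (D, d))
b = rng.uniform(0.0, 2 * np.pi, D)
w = np.zeros(D)

lr = 0.05
loss_history = []
for _ in range(500):
    # Random-Fourier-style feature map phi(x) = sqrt(2/D) * cos(Omega x + b);
    # making Omega and b trainable is what lets the kernel adapt to the data.
    Z = X @ Omega.T + b
    Phi = np.sqrt(2.0 / D) * np.cos(Z)
    pred = Phi @ w
    resid = pred - y
    loss_history.append(np.mean(resid ** 2))

    # Manual backpropagation of the squared loss through the feature map.
    g_pred = 2.0 * resid / n                         # dL/dpred
    g_w = Phi.T @ g_pred                             # dL/dw
    g_Phi = np.outer(g_pred, w)                      # dL/dPhi
    g_Z = -np.sqrt(2.0 / D) * np.sin(Z) * g_Phi     # dL/dZ (chain rule through cos)
    g_Omega = g_Z.T @ X                              # dL/dOmega
    g_b = g_Z.sum(axis=0)                            # dL/db

    # Joint gradient-descent update of the kernel and the predictor.
    w -= lr * g_w
    Omega -= lr * g_Omega
    b -= lr * g_b

print(loss_history[0], loss_history[-1])
```

Because the frequencies are updated by the same objective that trains the predictor, the resulting spectral representation is data-dependent, which is the property the abstract highlights; the regularization terms suggested by the generalization bounds would be added to this loss in the full framework.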
Citation
Li, J., Liu, Y., & Wang, W. (2020). Automated spectral kernel learning. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 4618–4625). AAAI press. https://doi.org/10.1609/aaai.v34i04.5892