In this study, the computational properties of a kernel-based least-squares density-ratio estimator are investigated from the viewpoint of condition numbers. The condition number of the Hessian matrix of the loss function is closely related to the convergence rate of optimization and to numerical stability. Using smoothed-analysis techniques, we theoretically demonstrate that the kernel least-squares method has a smaller condition number than other M-estimators, which implies that it has desirable computational properties. In addition, an alternative formulation of the kernel least-squares estimator that possesses an even smaller condition number is presented. The validity of the theoretical analysis is verified through numerical experiments.
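To make the object of the analysis concrete, the following is a minimal sketch of the quantities involved, assuming the standard kernel least-squares (uLSIF-type) formulation of density-ratio estimation; the notation is illustrative and not quoted from the paper. The density ratio r(x) = p_nu(x)/p_de(x) is modeled as r_alpha(x) = sum_l alpha_l K(x, x_l), and alpha is obtained by minimizing a regularized quadratic loss:

\[
\widehat{J}(\alpha) = \tfrac{1}{2}\,\alpha^{\top}\widehat{H}\alpha - \widehat{h}^{\top}\alpha + \tfrac{\lambda}{2}\,\|\alpha\|^{2},
\qquad
\widehat{H}_{\ell\ell'} = \frac{1}{n_{\mathrm{de}}}\sum_{i=1}^{n_{\mathrm{de}}} K(x_{i}^{\mathrm{de}}, x_{\ell})\,K(x_{i}^{\mathrm{de}}, x_{\ell'}),
\qquad
\widehat{h}_{\ell} = \frac{1}{n_{\mathrm{nu}}}\sum_{j=1}^{n_{\mathrm{nu}}} K(x_{j}^{\mathrm{nu}}, x_{\ell}).
\]

The Hessian of this loss is \(\widehat{H} + \lambda I\), and the quantity studied is its condition number

\[
\kappa\bigl(\widehat{H} + \lambda I\bigr) = \frac{\lambda_{\max}(\widehat{H}) + \lambda}{\lambda_{\min}(\widehat{H}) + \lambda},
\]

which governs the convergence rate of gradient-type optimization and the sensitivity of the solution to numerical error.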
Kanamori, T., Suzuki, T., & Sugiyama, M. (2013). Computational complexity of kernel-based density-ratio estimation: A condition number analysis. Machine Learning, 90(3), 431–460. https://doi.org/10.1007/s10994-012-5323-6