Piecewise sparse linear classification via factorized asymptotic Bayesian inference


Abstract

Piecewise sparse linear regression models using factorized asymptotic Bayesian inference (a.k.a. FAB/HME) have recently been employed in practical applications in many industries as a core algorithm of the Heterogeneous Mixture Learning technology. Such applications include sales forecasting in retail stores, energy demand prediction for buildings in smart cities, parts demand prediction to optimize inventory, and so on. This paper extends FAB/HME to classification and makes two essential improvements. First, we derive a refined version of the factorized information criterion which offers a better approximation of the Bayesian marginal log-likelihood. Second, we introduce an analytic quadratic lower bounding technique into the EM-like iterative optimization process of FAB/HME, which drastically reduces computational cost. Experimental results show the advantages of our piecewise sparse linear classification over state-of-the-art piecewise linear models.
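The abstract does not spell out the quadratic bound used inside the EM-like loop. The standard analytic quadratic lower bound for logistic log-likelihoods is the Jaakkola-Jordan bound, log σ(x) ≥ log σ(ξ) + (x − ξ)/2 − λ(ξ)(x² − ξ²) with λ(ξ) = tanh(ξ/2)/(4ξ); whether FAB/HME uses exactly this form is an assumption here, but it illustrates the technique: a bound quadratic in x (and hence in the weights), tight at x = ±ξ, that turns each iteration into a closed-form quadratic problem.

```python
import numpy as np

def log_sigmoid(x):
    # Numerically stable log σ(x) = -log(1 + exp(-x)).
    return -np.logaddexp(0.0, -x)

def jj_lambda(xi):
    # λ(ξ) = tanh(ξ/2) / (4ξ), with the limit value 1/8 at ξ = 0.
    xi = np.asarray(xi, dtype=float)
    return np.where(np.abs(xi) < 1e-8, 0.125, np.tanh(xi / 2.0) / (4.0 * xi))

def jj_lower_bound(x, xi):
    # Quadratic-in-x lower bound on log σ(x); equality holds at x = ±ξ.
    # Because λ(ξ) > 0, the bound is a concave quadratic, so maximizing
    # it over linear-model weights is a closed-form least-squares step.
    return log_sigmoid(xi) + (x - xi) / 2.0 - jj_lambda(xi) * (x**2 - xi**2)
```

In an EM-like scheme, ξ is updated per data point at each iteration (the tightness condition gives ξ = |x|), after which the model weights are re-estimated against the resulting quadratic surrogate.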

Citation (APA)

Fujimaki, R., Yamaguchi, Y., & Eto, R. (2016). Piecewise sparse linear classification via factorized asymptotic Bayesian inference. Transactions of the Japanese Society for Artificial Intelligence, 31(6). https://doi.org/10.1527/tjsai.AI30-I
