The explosion of streaming data poses challenges to feature learning methods such as linear discriminant analysis (LDA). Many existing LDA algorithms cannot efficiently update incrementally as samples arrive sequentially in various forms. First, we propose a new fast batch LDA learning algorithm (FLDA/QR) that uses the cluster centers to solve a lower triangular system, accelerated by Cholesky factorization. To exploit the intrinsically incremental structure of this matrix computation, we further develop an exact incremental algorithm (IFLDA/QR). The Gram-Schmidt process with reorthogonalization in IFLDA/QR saves substantial space and time compared with the rank-one QR updating used by most existing methods. IFLDA/QR can handle streaming data containing 1) new labeled samples from existing classes, 2) samples from an entirely new (novel) class, and, more significantly, 3) a chunk of samples mixing 1) and 2). Both theoretical analysis and numerical experiments demonstrate much lower space and time costs (2–10 times faster) than the state of the art, with comparable classification accuracy.
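The core incremental step described above can be illustrated with a small sketch. The function below appends a new vector to an existing orthonormal basis using classical Gram-Schmidt with one reorthogonalization pass; this is the kind of update the abstract attributes to IFLDA/QR, but the paper's exact update rule and data structures are not given here, so the function name and interface are illustrative assumptions.

```python
import numpy as np

def gs_reorth_append(Q, v, eps=1e-10):
    """Append vector v to the orthonormal basis Q (n x k), returning an
    n x (k+1) orthonormal basis, or Q unchanged if v lies in span(Q).

    Sketch of a Gram-Schmidt step with one reorthogonalization pass
    (the "twice is enough" heuristic); the actual IFLDA/QR update in
    the paper may differ in detail.
    """
    # First orthogonalization pass: remove the components of v along Q.
    w = v - Q @ (Q.T @ v)
    # Reorthogonalize once to recover orthogonality lost to rounding.
    w = w - Q @ (Q.T @ w)
    norm = np.linalg.norm(w)
    if norm < eps:
        # v is numerically dependent on the current basis; nothing to add.
        return Q
    return np.hstack([Q, (w / norm)[:, None]])
```

Compared with a rank-one QR update of the full data matrix, such a step only touches the stored basis `Q` and the incoming vector, which is consistent with the space and time savings the abstract claims.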
Wang, Y., Fan, X., Luo, Z., Wang, T., Min, M., & Luo, J. (2017). Fast online incremental learning on mixture streaming data. In 31st AAAI Conference on Artificial Intelligence, AAAI 2017 (pp. 2739–2745). AAAI Press. https://doi.org/10.1609/aaai.v31i1.10874