The problem of embedding arises in many machine learning applications under the assumption that a small number of factors of variation suffices to capture the "semantics" of the original high-dimensional data. Most existing embedding algorithms aim to maintain the locality-preserving property. In this study, inspired by the remarkable success of representation learning and deep learning, we propose a framework of embedding with autoencoder regularization (EAER for short), which naturally combines embedding with autoencoding. In this framework, the original data are embedded into a lower-dimensional space, represented by the output of the autoencoder's hidden layer, so the resulting representation not only maintains the locality-preserving property but can also easily be reverted to its original form. This is guaranteed by the joint minimization of the embedding loss and the autoencoder reconstruction error. It is worth mentioning that, instead of operating in batch mode as most previous embedding algorithms do, the proposed framework produces an inductive embedding model and thus supports incremental embedding efficiently. To show the effectiveness of EAER, we adapt this joint learning framework to three canonical embedding algorithms and apply them to both synthetic and real-world data sets. The experimental results show that each EAER adaptation outperforms its original counterpart. Moreover, compared with existing incremental embedding algorithms, EAER performs incremental embedding more efficiently and more effectively. © 2013 Springer-Verlag.
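As a rough sketch of this joint objective (the notation below is ours, not taken from the paper: f and g denote the autoencoder's encoder and decoder, L_emb a generic locality-preserving embedding loss, and lambda a trade-off weight), the training criterion can be written as

\[
\min_{f,\, g} \;\; \mathcal{L}_{\mathrm{emb}}\bigl(f(x_1), \dots, f(x_n)\bigr) \;+\; \lambda \sum_{i=1}^{n} \bigl\lVert x_i - g\bigl(f(x_i)\bigr) \bigr\rVert^2 ,
\]

where the hidden-layer outputs f(x_i) serve as the low-dimensional embedding, the first term enforces locality preservation, and the second term (the reconstruction error) lets the embedded points be mapped back to their original forms.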
CITATION STYLE
Yu, W., Zeng, G., Luo, P., Zhuang, F., He, Q., & Shi, Z. (2013). Embedding with autoencoder regularization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8190 LNAI, pp. 208–223). https://doi.org/10.1007/978-3-642-40994-3_14