Deep Recursive Embedding for High-Dimensional Data


Abstract

Embedding high-dimensional data onto a low-dimensional manifold is of both theoretical and practical value. In this article, we propose to combine deep neural networks (DNN) with mathematics-guided embedding rules for high-dimensional data embedding. We introduce a generic deep embedding network (DEN) framework, which is able to learn a parametric mapping from high-dimensional space to low-dimensional space, guided by well-established objectives such as Kullback-Leibler (KL) divergence minimization. We further propose a recursive strategy, called deep recursive embedding (DRE), to make use of the latent data representations for boosted embedding performance. We exemplify the flexibility of DRE with different architectures and loss functions, and benchmark our method against the two most popular embedding methods, namely, t-distributed stochastic neighbor embedding (t-SNE) and uniform manifold approximation and projection (UMAP). The proposed DRE method can map out-of-sample data and scale to extremely large datasets. Experiments on a range of public datasets demonstrated improved embedding performance in terms of local and global structure preservation, compared with other state-of-the-art embedding methods. Code is available at https://github.com/tao-Aimi/DeepRecursiveEmbedding.
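The KL-divergence objective that guides the parametric mapping can be illustrated with a t-SNE-style loss: Gaussian affinities in the high-dimensional space, Student-t affinities over the embedding, and KL(P||Q) as the quantity to minimize. The sketch below is a minimal NumPy illustration under those assumptions; the function names and the fixed-bandwidth Gaussian are ours for clarity, not the authors' implementation (which uses perplexity-calibrated affinities and a trained network for the mapping).

```python
import numpy as np

def pairwise_sq_dists(X):
    # Squared Euclidean distances between all rows of X.
    s = np.sum(X ** 2, axis=1)
    D = s[:, None] + s[None, :] - 2.0 * (X @ X.T)
    return np.maximum(D, 0.0)  # clip tiny negatives from round-off

def high_dim_affinities(X, sigma=1.0):
    # Gaussian affinities P over the input space (fixed bandwidth here;
    # t-SNE calibrates per-point bandwidths via perplexity).
    P = np.exp(-pairwise_sq_dists(X) / (2.0 * sigma ** 2))
    np.fill_diagonal(P, 0.0)   # no self-affinity
    return P / P.sum()         # normalize to a joint distribution

def low_dim_affinities(Y):
    # Student-t (one degree of freedom) affinities Q over the embedding,
    # as in t-SNE, to counter the crowding problem.
    Q = 1.0 / (1.0 + pairwise_sq_dists(Y))
    np.fill_diagonal(Q, 0.0)
    return Q / Q.sum()

def kl_divergence(P, Q, eps=1e-12):
    # KL(P || Q): the embedding objective minimized w.r.t. the mapping's
    # parameters (here Y would be the output of the embedding network).
    mask = P > 0
    return float(np.sum(P[mask] * np.log((P[mask] + eps) / (Q[mask] + eps))))

# Toy usage: project 5-D points to 2-D by truncation and score the embedding.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))
loss = kl_divergence(high_dim_affinities(X), low_dim_affinities(X[:, :2]))
```

In DEN/DRE this loss is backpropagated through a neural network that produces Y from X, so the learned mapping generalizes to out-of-sample points, unlike classic non-parametric t-SNE.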

Citation (APA)

Zhou, Z., Zu, X., Wang, Y., Lelieveldt, B. P. F., & Tao, Q. (2022). Deep Recursive Embedding for High-Dimensional Data. IEEE Transactions on Visualization and Computer Graphics, 28(2), 1237–1248. https://doi.org/10.1109/TVCG.2021.3122388
