A Framework for Interactive Visual Interpretation of Remote Sensing Data

Abstract

Machine learning methods have shown tremendous success in understanding Earth observation data; recently, however, there has been a rising demand for explainable machine learning approaches. Researchers have found interpretable visualizations greatly helpful in understanding how a model works. In this research, we propose a framework for interactive and interpretable visualization of remote sensing data using two machine learning models and an Elasticsearch (ES) database. Two explainable machine learning models, namely bag-of-visual-words (BoVW) and latent Dirichlet allocation (LDA), are chosen to model the data in an unsupervised manner and produce a textual representation. The textualized remote sensing data are stored in an ES database. The framework offers fast content-based search functionalities that exploit the full-text query capabilities of ES on the respective representations, and it also provides an efficient storage mechanism for the data.
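
To make the pipeline concrete, the following Python sketch (not the authors' implementation) illustrates the core idea: local patch descriptors are quantized into a bag-of-visual-words vocabulary, each patch is "textualized" as a string of visual-word ids, and the resulting pseudo-documents are indexed in Elasticsearch so that content-based retrieval reduces to an ordinary full-text match query. The random descriptors, the index name rs_patches, the field name bovw_text, and all parameters are illustrative assumptions; the paper's LDA representation would index topic labels derived from these visual-word documents in the same way.

```python
# Hedged sketch: BoVW textualization of remote sensing patches plus
# Elasticsearch indexing and full-text retrieval. Descriptors, index
# name, and field name are illustrative, not taken from the paper.
import numpy as np
from sklearn.cluster import MiniBatchKMeans
from elasticsearch import Elasticsearch  # 8.x client assumed

rng = np.random.default_rng(0)
# Stand-in for local descriptors (e.g., SIFT) extracted from patches:
# 100 patches, each with 50 descriptors of dimension 64.
patches = [rng.normal(size=(50, 64)) for _ in range(100)]

# 1. Learn the visual vocabulary (BoVW codebook) by clustering all
#    descriptors; each cluster center acts as one "visual word".
kmeans = MiniBatchKMeans(n_clusters=256, random_state=0)
kmeans.fit(np.vstack(patches))

# 2. Textualize a patch: map every descriptor to its nearest visual
#    word and join the word ids into a pseudo-document, "w17 w203 ...".
def textualize(descriptors):
    words = kmeans.predict(descriptors)
    return " ".join(f"w{w}" for w in words)

docs = [textualize(p) for p in patches]

# 3. Store the textualized patches in an Elasticsearch index.
es = Elasticsearch("http://localhost:9200")
for i, doc in enumerate(docs):
    es.index(index="rs_patches", id=str(i), document={"bovw_text": doc})

# 4. Content-based search: textualize the query patch and reuse ES's
#    full-text scoring (BM25) to rank patches by shared visual words.
query_text = textualize(patches[0])
resp = es.search(index="rs_patches",
                 query={"match": {"bovw_text": query_text}})
for hit in resp["hits"]["hits"][:5]:
    print(hit["_id"], hit["_score"])
```

Because ES scores match queries with BM25, patches sharing many visual words with the query rank highest, which is what turns a standard full-text engine into the content-based image search the abstract describes.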

Citation (APA)

Karmakar, C., & Datcu, M. (2022). A Framework for Interactive Visual Interpretation of Remote Sensing Data. IEEE Geoscience and Remote Sensing Letters, 19. https://doi.org/10.1109/LGRS.2022.3161959
