Bag of World Anchors for Instant Large-Scale Localization

Abstract

In this work, we present a novel scene description to perform large-scale localization using only geometric constraints. Our work extends compact world anchors with a search data structure to efficiently perform localization and pose estimation of mobile augmented reality devices across multiple platforms (e.g., HoloLens 2, iPad). The algorithm uses a bag-of-words approach to characterize distinct scenes (e.g., rooms). Since the individual scene representations rely on compact geometric (rather than appearance-based) features, the resulting search structure is very lightweight and fast, lending itself to deployment on mobile devices. We present a set of experiments demonstrating the accuracy, performance and scalability of our novel localization method. In addition, we describe several use cases demonstrating how efficient cross-platform localization facilitates sharing of augmented reality experiences.
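The abstract describes scene retrieval as a bag-of-words step over compact geometric (rather than appearance-based) anchor descriptors. As a rough illustration of how such retrieval can work, the sketch below clusters descriptors into "geometric words", builds a normalized histogram per scene, and returns the best-matching stored scene. The descriptor layout, vocabulary size, and cosine scoring here are assumptions chosen for illustration, not the paper's actual implementation.

```python
# Minimal bag-of-words scene retrieval sketch (NumPy only).
# Illustrative assumptions: fixed-length geometric descriptors per anchor,
# a small k-means vocabulary, and cosine similarity for scene scoring.
import numpy as np

def build_vocabulary(descriptors, k=32, iters=20, seed=0):
    """Cluster anchor descriptors into k 'geometric words' via plain k-means."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each descriptor to its nearest center, then recompute centers.
        dists = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = descriptors[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def bow_histogram(descriptors, centers):
    """Quantize one scene's descriptors and return an L2-normalized word histogram."""
    dists = np.linalg.norm(descriptors[:, None] - centers[None], axis=2)
    hist = np.bincount(dists.argmin(axis=1), minlength=len(centers)).astype(float)
    return hist / (np.linalg.norm(hist) + 1e-12)

def localize(query_descriptors, scene_histograms, centers):
    """Return the index of the stored scene whose histogram best matches the query."""
    q = bow_histogram(query_descriptors, centers)
    scores = scene_histograms @ q  # cosine similarity (histograms are normalized)
    return int(scores.argmax())
```

In such a scheme the per-scene representation is just a short histogram, which is why a purely geometric bag-of-words index can stay lightweight enough for on-device search; pose estimation against the retrieved scene's anchors would follow as a separate step.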

Citation (APA)

Reyes-Aviles, F., Fleck, P., Schmalstieg, D., & Arth, C. (2023). Bag of World Anchors for Instant Large-Scale Localization. IEEE Transactions on Visualization and Computer Graphics, 29(11), 4730–4739. https://doi.org/10.1109/TVCG.2023.3320264
