Vision-Based Indoor Positioning (VBIP) - An Indoor AR Navigation System with a Virtual Tour Guide

Abstract

In this paper, we describe the structure of a Vision-Based Indoor Positioning (VBIP) system, a pure software solution that requires no hardware deployment. We evaluated VBIP by using it to navigate visitors through three different buildings, along a route more than 350 m long, during the Science and Technology Festival 2018 at Waseda University. This large-scale experiment exposed the limitations of an algorithm designed solely around human behavior and motivated us to revise the algorithm to exploit natural features in the environment. A follow-up experiment confirmed the improvement: VBIP now reduces the drift error of VIO (Visual-Inertial Odometry) to around 1.4% over more than 350 m of tracking. Based on these results, we believe that VBIP is one step closer to perfection.
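The abstract reports its key result, a residual drift of about 1.4% over a route of more than 350 m, without reproducing the correction algorithm. As a rough illustration only, the Python sketch below (all names hypothetical; the translation-only landmark correction is an assumption, not the authors' method) shows how relative drift is typically computed and how recognizing a natural feature with a known map position could re-anchor a drifting VIO estimate.

    import numpy as np

    def drift_error_percent(estimated_end, true_end, path_length_m):
        # Relative drift: final position error divided by distance travelled.
        err = np.linalg.norm(np.asarray(estimated_end) - np.asarray(true_end))
        return 100.0 * err / path_length_m

    def correct_with_landmark(vio_position, landmark_map_pos, landmark_vio_pos):
        # Shift the VIO estimate so the observed landmark coincides with its
        # surveyed map position; the same offset can be applied to later readings.
        offset = np.asarray(landmark_map_pos) - np.asarray(landmark_vio_pos)
        return np.asarray(vio_position) + offset, offset

    # A 1.4% drift over 350 m corresponds to roughly 4.9 m of end-point error:
    print(drift_error_percent([351.0, 4.9], [351.0, 0.0], 350.0))  # ~1.4

Under this reading, the reported figure means a 350 m walk ends within about 5 m of the true position before any landmark re-anchoring is applied.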

CITATION STYLE

APA

Tsai, H. Y., Kuwahara, Y., Ieiri, Y., & Hishiyama, R. (2019). Vision-based indoor positioning (VBIP) - An indoor AR navigation system with a virtual tour guide. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11677 LNCS, pp. 96-109). Springer Verlag. https://doi.org/10.1007/978-3-030-28011-6_7
