Collaboration calibration and three-dimensional localization in multi-view system


Abstract

In this research, the authors address collaboration calibration and real-time three-dimensional (3D) localization in a multi-view system. A 3D localization method is proposed that fuses two-dimensional (2D) image coordinates from multiple views and provides the 3D spatial location in real time, a fundamental step toward obtaining the 3D location of a moving object in computer vision. An improved common perpendicular centroid algorithm is presented to reduce the side effects of shadow detection and improve localization accuracy. Collaboration calibration is used to generate the intrinsic and extrinsic parameters of the multi-view cameras synchronously. The experimental results show that the algorithm achieves accurate positioning in indoor multi-view monitoring while reducing computational complexity.
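The abstract does not give the algorithmic details, but the core geometric step in this kind of multi-view fusion can be illustrated as follows: each calibrated camera maps a 2D detection to a viewing ray in world coordinates, and the 3D position is estimated near the common perpendicular between rays from different views. The sketch below is a minimal, hypothetical Python illustration under a simple pinhole-camera assumption; the function names `backproject_ray` and `common_perpendicular_midpoint` are not from the paper, and the paper's improved common perpendicular centroid algorithm and shadow handling are not reproduced here.

```python
import numpy as np

def backproject_ray(K, R, t, pixel):
    """Return (origin, unit direction) of the viewing ray in world coordinates.

    Assumes a pinhole model with intrinsics K (3x3) and extrinsics R (3x3), t (3,)
    such that x_cam = R @ X_world + t.
    """
    C = -R.T @ t                                  # camera centre in world frame
    uv1 = np.array([pixel[0], pixel[1], 1.0])     # homogeneous pixel coordinate
    d = R.T @ np.linalg.inv(K) @ uv1              # ray direction in world frame
    return C, d / np.linalg.norm(d)

def common_perpendicular_midpoint(o1, d1, o2, d2):
    """Midpoint of the common perpendicular segment between two (generally skew) rays."""
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w0 = o1 - o2
    d_, e_ = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        # Nearly parallel rays: fix s = 0 and project onto the second ray
        s, u = 0.0, e_ / c
    else:
        s = (b * e_ - c * d_) / denom
        u = (a * e_ - b * d_) / denom
    p1 = o1 + s * d1
    p2 = o2 + u * d2
    return 0.5 * (p1 + p2)

# Usage sketch: fuse one 2D detection from each of two calibrated views.
# K1, R1, t1, K2, R2, t2 and the pixel coordinates are placeholders.
if __name__ == "__main__":
    K1 = K2 = np.array([[800.0, 0.0, 320.0],
                        [0.0, 800.0, 240.0],
                        [0.0, 0.0, 1.0]])
    R1, t1 = np.eye(3), np.zeros(3)
    R2 = np.array([[0.0, 0.0, -1.0],
                   [0.0, 1.0,  0.0],
                   [1.0, 0.0,  0.0]])
    t2 = np.array([0.0, 0.0, 2.0])
    o1, d1 = backproject_ray(K1, R1, t1, (350.0, 250.0))
    o2, d2 = backproject_ray(K2, R2, t2, (300.0, 245.0))
    print(common_perpendicular_midpoint(o1, d1, o2, d2))
```

With more than two views, one natural extension consistent with the "centroid" idea is to average the midpoints obtained from all camera pairs, although the paper's exact weighting and refinement are not specified in the abstract.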

Citation (APA)

Feng, S., Wu, C., Zhang, Y., & Shen, S. (2018). Collaboration calibration and three-dimensional localization in multi-view system. International Journal of Advanced Robotic Systems, 15(6). https://doi.org/10.1177/1729881418813778
