Capturing dynamic textured surfaces of moving targets

Abstract

We present an end-to-end system for reconstructing complete watertight and textured models of moving subjects such as clothed humans and animals, using only three or four handheld sensors. The heart of our framework is a new pairwise registration algorithm that minimizes, using a particle swarm strategy, an alignment error metric based on mutual visibility and occlusion. We show that this algorithm reliably registers partial scans with as little as 15% overlap without requiring any initial correspondences, and outperforms alternative global registration algorithms. This registration algorithm allows us to reconstruct moving subjects from free-viewpoint video produced by consumer-grade sensors, without extensive sensor calibration, constrained capture volume, expensive arrays of cameras, or templates of the subject geometry.
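
The core registration step described above amounts to a stochastic search over rigid transforms. Below is a minimal sketch, not the authors' implementation: it assumes a user-supplied alignment_error function standing in for the paper's visibility- and occlusion-based metric, an Euler-angle parameterization of rotation, and scan coordinates normalized to roughly the unit cube. All function and parameter names here are hypothetical.

import numpy as np

def euler_to_matrix(rx, ry, rz):
    # Rotation matrix from Euler angles (radians), Z*Y*X convention.
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def apply_transform(params, points):
    # params = (rx, ry, rz, tx, ty, tz); points is an (N, 3) array.
    R = euler_to_matrix(*params[:3])
    return points @ R.T + params[3:]

def pso_register(src, dst, alignment_error, n_particles=64, n_iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    # Basic particle swarm over 6-DoF transforms of `src` onto `dst`,
    # minimizing the (assumed) alignment_error(transformed_src, dst).
    rng = np.random.default_rng(seed)
    # Assumed search bounds: full rotation range, translations in [-1, 1]
    # (i.e. scans pre-normalized to the unit cube).
    lo = np.array([-np.pi, -np.pi, -np.pi, -1.0, -1.0, -1.0])
    hi = -lo
    pos = rng.uniform(lo, hi, size=(n_particles, 6))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_cost = np.array([alignment_error(apply_transform(p, src), dst) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    gbest_cost = pbest_cost.min()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 6))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        cost = np.array([alignment_error(apply_transform(p, src), dst) for p in pos])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
        if pbest_cost.min() < gbest_cost:
            gbest = pbest[np.argmin(pbest_cost)].copy()
            gbest_cost = pbest_cost.min()
    return gbest, gbest_cost

A toy alignment_error, e.g. mean nearest-neighbour distance computed with scipy.spatial.cKDTree, is enough to exercise the search; the paper's metric differs in that it also penalizes free-space violations derived from mutual visibility and occlusion, which is what lets it handle scan pairs with as little as 15% overlap and no initial correspondences.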

Cite (APA)

Wang, R., Wei, L., Vouga, E., Huang, Q., Ceylan, D., Medioni, G., & Li, H. (2016). Capturing dynamic textured surfaces of moving targets. In Lecture Notes in Computer Science (Vol. 9911, pp. 271–288). Springer. https://doi.org/10.1007/978-3-319-46478-7_17
