Conditions for segmentation of 2D translations of 3D objects

Abstract

Various computer vision applications involve the recovery and estimation of multiple motions from images of dynamic scenes. Since the exact nature of the objects' motions and the camera parameters are often not known a priori, the most general motion model (the fundamental matrix) is applied. Although the estimation of a fundamental matrix and its use for motion segmentation are well understood, the conditions governing the feasibility of segmentation for different types of motions are yet to be discovered. In this paper, we study the feasibility of separating the 2D translations of 3D objects in a dynamic scene. We show that successful segmentation of 2D translations depends on the magnitude of the translations, the average distance between the camera and the objects, the focal length of the camera, and the level of noise. An extensive set of controlled experiments using both synthetic and real images was conducted to validate the proposed constraints. In addition, for a given camera, we quantified the conditions for successful segmentation of 2D translations in terms of the magnitude of those translations and the average distance between the camera and the objects in motion. These results are of particular importance for practitioners designing solutions to computer vision problems. © 2009 Springer Berlin Heidelberg.
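The dependence on translation magnitude, camera-object distance, focal length, and noise can be illustrated with basic pinhole-camera geometry (a minimal sketch, not the paper's own method): a point at depth Z that translates by T parallel to the image plane moves by roughly f·|T|/Z pixels in the image, so two translations are separable only when the difference of their image-plane displacements exceeds a few standard deviations of the localization noise. All numbers and the helper function below are hypothetical, chosen only to make the scaling concrete.

import numpy as np

def image_displacement(f_px, t_mag, depth):
    """Approximate image-plane displacement (pixels) of a point at
    distance `depth` (m) translating by `t_mag` (m) parallel to the
    image plane, under an ideal pinhole camera with focal length
    `f_px` in pixels. Hypothetical helper, for illustration only."""
    return f_px * t_mag / depth

# Assumed values: 800 px focal length, objects ~5 m from the camera,
# two translations of 2 cm and 5 cm, ~0.5 px feature-localization noise.
f_px, depth, sigma = 800.0, 5.0, 0.5
d1 = image_displacement(f_px, 0.02, depth)   # 3.2 px
d2 = image_displacement(f_px, 0.05, depth)   # 8.0 px

# Heuristic separability check: the two motions can be distinguished
# only if their image displacements differ by more than a few noise
# standard deviations.
separable = abs(d2 - d1) > 3.0 * sigma
print(f"d1={d1:.1f}px, d2={d2:.1f}px, separable={separable}")

Under these assumed numbers the displacements differ by 4.8 px against 0.5 px of noise, so segmentation is feasible; halving the focal length or doubling the depth shrinks that margin proportionally, which is the scaling behavior the paper's constraints quantify.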

Citation (APA)

Basah, S. N., Bab-Hadiashar, A., & Hoseinnezhad, R. (2009). Conditions for segmentation of 2D translations of 3D objects. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5716 LNCS, pp. 82–91). https://doi.org/10.1007/978-3-642-04146-4_11
