Synchronized ego-motion recovery of two face-to-face cameras


Abstract

A movie captured by a wearable camera affixed to an actor's body gives audiences a sense of being immersed in the movie. The raw footage from a wearable camera, however, needs stabilization because ego-motion introduces jitter. Conventional approaches often fail to estimate ego-motion accurately when moving objects occupy the image and the background region provides too few feature pairs. To address this problem, we propose a new approach that uses an additional synchronized video captured by a camera attached to the foreground object (another actor). Formally, we model this sensor system as two face-to-face moving cameras and derive the relations among four views, namely two consecutive views from each camera. The proposed solution has two steps: first, we calibrate the extrinsic relationship between the two cameras using an AX = XB formulation; second, we estimate the motion using the calibration matrix. Experiments verify that this approach can recover from failures of the conventional approach and provides acceptable stabilization results on real data. © Springer-Verlag Berlin Heidelberg 2007.
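The AX = XB equation in the first step is the classical hand-eye calibration problem: A and B are the relative motions of the two cameras between consecutive frames, and X is the fixed rigid transform between them. The paper does not give its solver here, so the sketch below shows one standard least-squares approach (an axis-alignment solution for rotation via SVD, then stacked linear least squares for translation), assumed for illustration only; all function names are hypothetical.

```python
import numpy as np

def rodrigues(axis, angle):
    # Rotation matrix from an axis-angle pair (Rodrigues' formula).
    k = np.asarray(axis, float)
    k = k / np.linalg.norm(k)
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)

def log_so3(R):
    # Axis-angle vector of a rotation matrix (matrix logarithm on SO(3)).
    theta = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    if theta < 1e-9:
        return np.zeros(3)
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta * w / (2 * np.sin(theta))

def solve_axxb(As, Bs):
    """Least-squares X satisfying A_i X = X B_i for 4x4 rigid transforms.

    Rotation block: R_A R_X = R_X R_B implies log(R_A) = R_X log(R_B),
    so R_X is found by aligning the rotation axes with an SVD (Kabsch).
    """
    H = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        a, b = log_so3(A[:3, :3]), log_so3(B[:3, :3])
        H += np.outer(b, a)
    U, _, Vt = np.linalg.svd(H)
    Rx = Vt.T @ U.T
    if np.linalg.det(Rx) < 0:  # guard against a reflection
        Rx = Vt.T @ np.diag([1, 1, -1]) @ U.T
    # Translation block: (R_A - I) t_X = R_X t_B - t_A, stacked over all pairs.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    v = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(M, v, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

At least two motion pairs with non-parallel rotation axes are needed for a unique X; with noisy real data, more pairs improve the least-squares estimate.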

Citation (APA)

Cui, J., Yagi, Y., Zha, H., Mukaigawa, Y., & Kondo, K. (2007). Synchronized ego-motion recovery of two face-to-face cameras. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4843 LNCS, pp. 544–554). Springer Verlag. https://doi.org/10.1007/978-3-540-76386-4_51
