The space around the body not only expands the interaction space of a mobile device beyond its small screen, but also enables users to utilize their kinesthetic sense. Therefore, body-centric peephole interaction has gained considerable attention. To support its practical implementation, we propose OddEyeCam, a vision-based method that tracks the 3D location of a mobile device in an absolute, wide, and continuous manner with respect to the body of a user in both static and mobile environments. OddEyeCam tracks the body of a user using a wide-view RGB camera and obtains precise depth information using a narrow-view depth camera from a smartphone close to the body. We quantitatively evaluated OddEyeCam through an accuracy test and two user studies. The accuracy test showed that the average tracking accuracy of OddEyeCam was 4.17 cm and 4.47 cm in 3D space while a participant was standing and walking, respectively. In the first user study, we implemented various interaction scenarios and observed that OddEyeCam was well received by the participants. In the second user study, we observed that the peephole target acquisition task performed using our system followed Fitts' law. We also analyzed the performance of OddEyeCam using the obtained measurements and observed that the participants completed the tasks with sufficient speed and accuracy.
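The Fitts' law analysis mentioned above can be illustrated with a minimal sketch. This assumes the common Shannon formulation of Fitts' law; the constants `a` and `b` below are placeholder values for illustration, not regression coefficients reported in the paper.

```python
import math

def index_of_difficulty(distance, width):
    """Index of difficulty (bits) for a target of size `width` at `distance`
    (Shannon formulation: ID = log2(D/W + 1))."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.2, b=0.15):
    """Predicted movement time (seconds) under Fitts' law, MT = a + b * ID.
    The intercept `a` and slope `b` here are illustrative, not measured values."""
    return a + b * index_of_difficulty(distance, width)

# Farther or smaller targets yield a higher ID and a longer predicted time.
print(index_of_difficulty(30, 3))   # D = 30 cm, W = 3 cm
print(movement_time(30, 3))
```

In a peephole target acquisition study such as the one described, movement times measured for targets at varying `D` and `W` would be regressed against ID to obtain the actual `a` and `b` and to check the linear fit.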
Kim, D., Park, K., & Lee, G. (2020). OddEyeCam: A sensing technique for body-centric peephole interaction using WFoV RGB and NFoV depth cameras. In UIST 2020 - Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (pp. 85–97). Association for Computing Machinery, Inc. https://doi.org/10.1145/3379337.3415889