Abstract
Real-time 3D pose estimation of multiple people is a challenging problem that is essential for various virtual reality (VR) applications. In this paper, we propose a real-time multi-person 3D pose estimation system for live streaming of 3D animation using multiple RGB-D cameras. The proposed system comprises several edge devices connected to a central server over a network. Each edge device performs 2D pose detection and depth sensing locally and transmits the results to the central server, which carries out 3D pose reconstruction. The central server first aligns the coordinate frames of the multiple cameras to a common world plane. It then matches the 2D pose detections across the cameras to individual persons based on distance. Finally, the 3D poses are reconstructed via multi-view triangulation. The proposed system is capable of real-time processing. To demonstrate that the system estimates multi-person 3D poses in real time, we implement prototypes and show that it can drive live streaming of 3D animation on platforms such as PC and Web.
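The abstract's final reconstruction step, multi-view triangulation, can be illustrated with a minimal sketch. The paper does not publish its implementation, so the following is a generic Direct Linear Transform (DLT) triangulation of a single joint from its 2D detections in several calibrated views; the function name, the toy camera poses, and the use of NumPy are all assumptions for illustration, not the authors' code.

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Triangulate one 3D joint from its 2D detections in multiple views
    using the Direct Linear Transform (DLT).

    proj_mats : list of 3x4 camera projection matrices (world -> image)
    points_2d : list of (u, v) image coordinates, one per view
    """
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]·X) = P[0]·X and v*(P[2]·X) = P[1]·X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # Least-squares solution: right singular vector for the smallest
    # singular value of A.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Two toy cameras (hypothetical calibration) observing the point (0, 0, 5):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])           # camera at origin
P2 = np.hstack([np.eye(3), [[-1.0], [0.0], [0.0]]])     # shifted 1 m along x
X_true = np.array([0.0, 0.0, 5.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
X_rec = triangulate([P1, P2], [uv1, uv2])
print(np.round(X_rec, 3))
```

In the full system this per-joint triangulation would run after the cross-camera matching step, once each detected 2D joint has been associated with a person.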
Citation
Hwang, T., Kim, J., Kim, M., & Kim, M. (2023). A Real-time Multi-Person 3D Pose Estimation System from Multiple RGB-D Views for Live Streaming of 3D Animation. In International Conference on Intelligent User Interfaces, Proceedings IUI (pp. 105–107). Association for Computing Machinery. https://doi.org/10.1145/3581754.3584144