Depth sensing is important in many applications, such as Augmented Reality (AR), eXtended Reality (XR), and the Metaverse. For 3D reconstruction, a depth map can be acquired with a stereo camera or a Time-of-Flight (ToF) sensor. We used both sensors complementarily to improve the accuracy of the reconstructed 3D information. First, we applied a generalized multi-camera calibration method that uses both color and depth information. Next, the depth maps from the two sensors were fused by a 3D registration and reprojection approach. Then, hole filling was applied to refine the depth map obtained from the ToF-stereo fused data. Finally, a surface reconstruction technique was used to generate mesh data from the ToF-stereo fused point-cloud data. The proposed procedure was implemented and tested with real-world data and compared with various algorithms to validate its effectiveness.
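The fusion and hole-filling steps described above can be sketched minimally in NumPy. This is a hypothetical illustration, not the authors' implementation: it assumes the two depth maps have already been registered and reprojected into the same image grid, that invalid pixels are encoded as zero, and it uses a simple complementary-fusion rule (prefer whichever sensor is valid, average where both are) with neighborhood-mean hole filling.

```python
import numpy as np

def fuse_depth(tof: np.ndarray, stereo: np.ndarray) -> np.ndarray:
    """Fuse two pre-aligned depth maps (zeros mark invalid pixels):
    take the valid sensor's value, and average where both are valid.
    This fusion rule is an illustrative assumption, not the paper's."""
    fused = np.where(tof > 0, tof, stereo)
    both = (tof > 0) & (stereo > 0)
    fused[both] = 0.5 * (tof[both] + stereo[both])
    return fused

def fill_holes(depth: np.ndarray, k: int = 1) -> np.ndarray:
    """Fill remaining zero-valued holes with the mean of valid
    neighbors inside a (2k+1) x (2k+1) window around each hole."""
    out = depth.copy()
    h, w = depth.shape
    for y, x in zip(*np.nonzero(depth == 0)):
        y0, y1 = max(0, y - k), min(h, y + k + 1)
        x0, x1 = max(0, x - k), min(w, x + k + 1)
        patch = depth[y0:y1, x0:x1]
        valid = patch[patch > 0]
        if valid.size:
            out[y, x] = valid.mean()
    return out
```

In practice the registration step would be done with a 3D registration algorithm (e.g. ICP on the two point clouds) before any per-pixel fusion, and the fused point cloud would then be meshed by a surface reconstruction method.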
Jung, S., Lee, Y. S., Lee, Y., & Lee, K. T. (2022). 3D Reconstruction Using 3D Registration-Based ToF-Stereo Fusion. Sensors, 22(21). https://doi.org/10.3390/s22218369