This study proposed a gaze-controlled method for the visualization, navigation, and retrofitting of large point cloud data (PCD) produced by unmanned aerial vehicles (UAVs) equipped with laser range scanners. For this purpose, the estimated human gaze point was used to interact with a head-mounted display (HMD) to visualize the PCD and computer-aided design (CAD) models. Virtual water treatment plant pipeline models were retrofitted against the PCD of the actual pipelines. In this application, the objective was to use gaze data to interact with the HMD so that the virtual retrofitting process could be performed by navigating with the eye gaze alone. It was inferred that integrating eye gaze tracking for visualization and interaction with the HMD could improve both the speed and functionality of human-computer interaction. A usability study was conducted to compare the speed of the proposed method against mouse-based retrofitting. In addition, immersion, interface quality, and accuracy were analyzed using an appropriate questionnaire, and user learning was evaluated by having participants repeat the experiment over several iterations. Finally, a survey experiment verified whether the method induced any negative psychological effects, such as cybersickness, general discomfort, fatigue, headache, eye strain, or difficulty concentrating.
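To make the gaze-based interaction concrete, the core primitive such a system needs is mapping an estimated gaze ray from the HMD onto a target point in the cloud. The sketch below is purely illustrative and not taken from the paper: it casts a gaze ray from an assumed origin and direction and returns the point-cloud point closest to that ray, which could then drive selection or navigation.

```python
import math

def gaze_pick(origin, direction, points):
    """Return the point-cloud point nearest to the gaze ray.

    origin, direction: 3-tuples in the HMD's world frame
                       (direction need not be normalized).
    points: iterable of 3-tuples (the point cloud).
    Points behind the viewer are ignored; returns None if
    no point lies in front of the gaze origin.
    """
    # Normalize the gaze direction.
    norm = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / norm for c in direction)

    best, best_dist = None, float("inf")
    for p in points:
        # Vector from gaze origin to the candidate point.
        v = tuple(p[i] - origin[i] for i in range(3))
        # Parameter of the closest point on the ray.
        t = sum(v[i] * d[i] for i in range(3))
        if t < 0:
            continue  # behind the viewer
        # Perpendicular distance from the point to the ray.
        foot = tuple(origin[i] + t * d[i] for i in range(3))
        dist = math.dist(p, foot)
        if dist < best_dist:
            best, best_dist = p, dist
    return best
```

A real implementation would query a spatial index (e.g. an octree) instead of scanning every point, and would smooth the gaze estimate over time before picking; this linear scan only shows the geometric idea.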
Kumar, B. N. P., Adithya, B., Chethana, B., Patil, A. K., & Chai, Y. H. (2018). Gaze-controlled virtual retrofitting of UAV-scanned point cloud data. Symmetry, 10(12). https://doi.org/10.3390/sym10120674