Abstract
We present a novel pipeline for localizing a free-roaming eye tracker within a LiDAR-based 3D reconstructed scene with a high level of accuracy. By combining reconstruction algorithms that leverage the complementary strengths of global and local capture methods with user-assisted refinement, we reduce the drift errors associated with Dense-SLAM techniques. Our framework supports region-of-interest (ROI) annotation, gaze statistics generation, and visualization of gaze in 3D from an immersive first-person or third-person perspective. This approach offers unique insight into viewers' problem-solving and search-task strategies and is highly applicable to complex static environments such as crime scenes.
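To make the core idea of 3D gaze point localization concrete, the following is a minimal sketch (not the authors' implementation) of how a 2D gaze point from the eye tracker's scene camera could be lifted into a LiDAR-based reconstruction, assuming the scene camera's intrinsics and its 6-DoF pose in the reconstruction frame are already estimated. All function names and parameters are illustrative assumptions; the gaze ray is simply intersected with the reconstructed point cloud as a stand-in for a proper mesh ray cast.

# Illustrative sketch only: back-project a gaze pixel into the reconstruction
# frame and find the nearest reconstructed surface point along that ray.
# Assumes known camera intrinsics K and pose (R_wc, t_wc); names are hypothetical.
import numpy as np

def gaze_ray_in_world(gaze_px, K, R_wc, t_wc):
    """Back-project a 2D gaze point (pixels) into a world-space ray.

    gaze_px : (u, v) gaze coordinates in the scene-camera image.
    K       : 3x3 camera intrinsic matrix.
    R_wc    : 3x3 rotation of the camera in the reconstruction (world) frame.
    t_wc    : 3-vector camera position in the reconstruction frame.
    """
    u, v = gaze_px
    # Direction of the gaze pixel in the camera frame (at unit depth).
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate into the world frame and normalize.
    d_world = R_wc @ d_cam
    d_world /= np.linalg.norm(d_world)
    return t_wc, d_world  # ray origin and direction

def localize_gaze_on_cloud(origin, direction, points, max_perp_dist=0.02):
    """Approximate the 3D gaze point as the closest reconstructed LiDAR point
    lying near the gaze ray (a simple stand-in for a mesh ray cast)."""
    rel = points - origin                       # vectors from ray origin to each point
    t = rel @ direction                         # signed distance along the ray
    ahead = t > 0                               # keep only points in front of the camera
    perp = np.linalg.norm(rel - np.outer(t, direction), axis=1)
    candidates = ahead & (perp < max_perp_dist)
    if not candidates.any():
        return None
    # The nearest candidate along the ray is taken as the fixated surface point.
    idx = np.where(candidates)[0][np.argmin(t[candidates])]
    return points[idx]

In a full pipeline, this per-frame 3D gaze point would then be aggregated over time for ROI statistics and rendered inside the reconstruction for first-person or third-person playback.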
Citation
Pieszala, J., Diaz, G., Pelz, J., Speir, J., & Bailey, R. (2016). 3D Gaze Point Localization and Visualization Using LiDAR-based 3D reconstructions. In Eye Tracking Research and Applications Symposium (ETRA) (Vol. 14, pp. 201–204). Association for Computing Machinery. https://doi.org/10.1145/2857491.2857545