This paper is devoted to extending the previously created unified pipeline for conducting eye-tracking-based experiments in a virtual reality environment. In our previous work, we proposed using the SciVi semantic data mining platform, Unreal Engine, and the HTC Vive Pro Eye head-mounted display to study the reading process in immersive virtual reality. The extension proposed here enables handling of so-called polycode stimuli: compound visual objects consisting of individual parts that carry different semantics for the viewer. To segment polycode stimuli into areas of interest (the areas where the informant’s eye gaze is tracked), we adopt the Creative Maps Studio vector graphics editor. To integrate Creative Maps Studio into the existing pipeline, we created plugins for the SciVi platform that load and handle the segmented stimuli, place them in virtual reality scenes, collect the corresponding eye gaze tracking data, and perform visual analysis of the collected data. To analyze the eye gaze tracks, we utilize a circular graph that provides a comprehensive visualization of the hierarchical areas of interest (mapping them to color-coded graph nodes grouped into a hierarchy with the help of a multilevel circular scale) and of the corresponding eye movements (mapped to graph edges). We tested our pipeline on two different stimuli: an advertising poster and the painting “The Appearance of Christ Before the People” by A. Ivanov (1857).
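The circular-graph visualization described above rests on a simple idea: each gaze fixation is assigned to the deepest matching area of interest in the stimulus hierarchy, and consecutive fixations falling into different areas form weighted transitions that become graph edges. The following Python sketch illustrates this idea only; the `AOI` class, `locate`, and `gaze_to_edges` are hypothetical names introduced for illustration and do not reflect the actual SciVi plugin or Creative Maps Studio data formats.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

# Hypothetical hierarchical area of interest (AOI): a rectangular region of the
# stimulus with optional nested child regions, as produced by segmentation.
@dataclass
class AOI:
    name: str
    bounds: Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in stimulus coordinates
    children: List["AOI"] = field(default_factory=list)

    def locate(self, x: float, y: float) -> Optional[str]:
        """Return the name of the deepest AOI containing the gaze point, if any."""
        x_min, y_min, x_max, y_max = self.bounds
        if not (x_min <= x <= x_max and y_min <= y <= y_max):
            return None
        for child in self.children:
            hit = child.locate(x, y)
            if hit is not None:
                return hit
        return self.name


def gaze_to_edges(root: AOI, fixations: List[Tuple[float, float]]) -> Dict[Tuple[str, str], int]:
    """Aggregate consecutive fixations into weighted AOI-to-AOI transitions (graph edges)."""
    edges: Dict[Tuple[str, str], int] = {}
    previous = None
    for x, y in fixations:
        current = root.locate(x, y)
        if current is None:
            continue  # gaze landed outside every AOI
        if previous is not None and previous != current:
            edges[(previous, current)] = edges.get((previous, current), 0) + 1
        previous = current
    return edges


if __name__ == "__main__":
    # Toy stimulus: a poster with a headline and an image, the image containing a logo.
    poster = AOI("poster", (0.0, 0.0, 1.0, 1.0), [
        AOI("headline", (0.0, 0.8, 1.0, 1.0)),
        AOI("image", (0.0, 0.0, 1.0, 0.8), [AOI("logo", (0.7, 0.05, 0.95, 0.25))]),
    ])
    fixations = [(0.5, 0.9), (0.4, 0.5), (0.8, 0.1), (0.5, 0.9)]
    print(gaze_to_edges(poster, fixations))
    # {('headline', 'image'): 1, ('image', 'logo'): 1, ('logo', 'headline'): 1}
```

In the actual pipeline, the resulting transition weights would drive the edges of the circular graph, while the AOI hierarchy itself would be laid out on the multilevel circular scale; this sketch only shows the aggregation step under the stated assumptions.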