Multi-modal interfaces that provide haptic access to statistical line graphs, combined with verbal assistance, are proposed as an effective tool for meeting the needs of visually impaired people. Graphs not only present data; they also elicit the extraction of second-order entities (such as maxima or trends), which are closely linked to the shape properties of the graphs. In an experimental study, we investigated collaborative joint activities between haptic explorers of graphs and verbal assistants who helped the explorers conceptualize local and non-local second-order concepts. The assistants must decide not only what to say but, in particular, when to say it. Based on the empirical data from this experiment, we describe in the present paper the design of a feature set for characterizing patterns of haptic exploration, one able to signal the need for verbal assistance during the course of haptic exploration. We employed a supervised classification algorithm, namely the J48 decision tree. The constructed features, ranging from basic (low-level) user-action features to complex (high-level) conceptual features, were categorized into four feature sets. All feature-set combinations achieved a high level of accuracy. The best results in terms of sensitivity and specificity were achieved by adding the low-level graphical features.
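As a rough illustration of the classification setup described above, the sketch below trains a decision tree to predict "assistance needed" moments from a mix of low-level user-action and low-level graphical features. The paper used Weka's J48 (an implementation of C4.5); scikit-learn's `DecisionTreeClassifier` with the entropy criterion serves here as an approximate stand-in, and the feature names, labeling rule, and data are purely hypothetical, not taken from the study.

```python
# Hedged sketch: decision-tree classification of haptic-exploration features.
# Stand-in for Weka's J48; all features and labels below are illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 200

# Hypothetical low-level user-action features
# (e.g. exploration speed, number of direction changes).
speed = rng.uniform(0.0, 1.0, n)
direction_changes = rng.integers(0, 10, n)
# Hypothetical low-level graphical feature
# (e.g. local slope of the explored line segment).
local_slope = rng.uniform(-1.0, 1.0, n)

X = np.column_stack([speed, direction_changes, local_slope])
# Toy labeling rule standing in for annotated "assistance needed" events:
# slow exploration near steep segments tends to need verbal help.
y = ((speed < 0.4) & (np.abs(local_slope) > 0.5)).astype(int)

# Entropy criterion approximates C4.5's information-gain-based splitting.
clf = DecisionTreeClassifier(criterion="entropy", max_depth=4, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

Because the tree can recover the deterministic toy rule, cross-validated accuracy is high here; with real exploration data, class overlap and annotation noise would lower it.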
Citation
Alaçam, Ö., Acartürk, C., & Habel, C. (2015). Haptic exploration patterns in virtual line-graph comprehension. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9187, pp. 403–414). Springer Verlag. https://doi.org/10.1007/978-3-319-20898-5_39