We live in a society that depends on high-tech devices for assistance with everyday tasks, from transportation and health care to communication and entertainment. Tedious tactile input interfaces to these devices waste our time. Appropriate use of natural hand gestures can enable more efficient communication, provided the underlying meaning is understood. Overcoming the challenges of natural hand gesture understanding is therefore vital to meeting the needs of these increasingly pervasive devices in our everyday lives. This paper presents a graph-based approach to understanding the meaning of hand gestures by associating dynamic hand gestures with known concepts and relevant knowledge. Conceptual-level processing is emphasized to robustly handle the noise and ambiguity introduced during gesture generation, data acquisition, and low-level recognition. A simple recognition stage helps relax the scalability limitations of conventional stochastic language models. Experimental results show that this graph-based approach successfully understands the meaning of ambiguous phrases consisting of three to five hand gestures. The presented approximate graph-matching technique for understanding human hand gestures supports practical and efficient communication of complex intent to the increasingly pervasive high-tech devices in our society. © 2005 IEEE.
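The abstract does not give the paper's actual algorithm, but the general idea of approximate graph matching, scoring a noisy query graph of recognized gestures against a stored concept graph, can be sketched as follows. All names, the greedy matching strategy, and the similarity scores here are illustrative assumptions, not details from the paper:

```python
# Illustrative sketch only: a greedy approximate graph match between a
# recognized gesture-phrase graph and a stored concept graph.
# Graphs are dicts mapping node label -> set of neighbour labels.

def match_score(query, concept, node_sim):
    """Greedily align query nodes to concept nodes, then reward
    query edges that are preserved under the alignment."""
    unmatched = set(concept)
    mapping = {}
    total = 0.0
    for q in query:
        best, best_s = None, 0.0
        for c in unmatched:
            s = node_sim(q, c)
            if s > best_s:
                best, best_s = c, s
        if best is not None:           # tolerate unmatchable nodes
            mapping[q] = best
            unmatched.discard(best)
            total += best_s
    # bonus for each query edge whose endpoints map to adjacent concept nodes
    for q, nbrs in query.items():
        for n in nbrs:
            if q in mapping and n in mapping and mapping[n] in concept[mapping[q]]:
                total += 0.5
    return total

def sim(a, b):
    # crude label similarity, tolerating noisy low-level recognition
    if a == b:
        return 1.0
    return 0.8 if a.startswith(b[:4]) or b.startswith(a[:4]) else 0.0

# toy gesture phrase: "point" -> "move" -> "stop"
query = {"point": {"move"}, "move": {"stop"}, "stop": set()}
# stored concept graph with noisy/variant labels
concept = {"pointing": {"move"}, "move": {"halt"}, "halt": set()}

score = match_score(query, concept, sim)
```

A higher score indicates a closer conceptual match; ambiguity is resolved by comparing the query against several stored concept graphs and selecting the best-scoring one.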
CITATION STYLE
Miners, B. W., Basir, O. A., & Kamel, M. S. (2005). Understanding hand gestures using approximate graph matching. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 35(2), 239–248. https://doi.org/10.1109/TSMCA.2005.843378