Detecting Gestures Through a Gesture-Based Interface to Teach Introductory Programming Concepts


Abstract

The goal of this research is to find an algorithm capable of recognizing gestures drawn in a visual, gesture-driven interface used to teach introductory programming concepts. Our system combines components from Google’s Blockly, a visual programming language with a drag-and-drop puzzle-piece interface, and Microsoft’s Xbox Kinect, which is used for skeletal tracking. We focus on two supervised machine learning clustering algorithms, centroid matching and medoid matching, to detect gestures.
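To illustrate the general idea behind centroid and medoid matching, the sketch below shows a minimal nearest-representative gesture classifier in Python. It assumes each gesture has been converted into a fixed-length feature vector (for example, derived from Kinect skeletal joint positions); the function names, the Euclidean distance metric, and the dictionary-based model are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch of nearest-centroid / nearest-medoid gesture classification.
# Assumes each gesture sample is already a fixed-length feature vector.
import numpy as np

def centroid(vectors):
    # Centroid: the mean of a class's training feature vectors.
    return np.mean(vectors, axis=0)

def medoid(vectors):
    # Medoid: the training vector with the smallest total distance
    # to all other vectors in the same class.
    dists = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    return vectors[np.argmin(dists.sum(axis=1))]

def fit(training_data, representative=centroid):
    # training_data: {gesture_label: array of shape (n_samples, n_features)}
    # Returns one representative vector per gesture class.
    return {label: representative(np.asarray(v)) for label, v in training_data.items()}

def classify(model, feature_vector):
    # Assign the gesture whose representative is nearest in Euclidean distance.
    return min(model, key=lambda label: np.linalg.norm(model[label] - feature_vector))
```

Under this sketch, switching between the two algorithms is just a matter of passing `centroid` or `medoid` as the `representative` argument to `fit`; classification of a new gesture is a single nearest-neighbor lookup against the per-class representatives.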

Citation (APA)

Streeter, L., & Gauch, J. (2020). Detecting Gestures Through a Gesture-Based Interface to Teach Introductory Programming Concepts. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12182 LNCS, pp. 137–153). Springer. https://doi.org/10.1007/978-3-030-49062-1_9
