Web Accessibility Evaluation with the Crowd: Using Glance to Rapidly Code User Testing Video


Abstract

Evaluating the results of user accessibility testing on the web can take a significant amount of time, training, and effort. Some of this work can be offloaded to others by coding video data from user tests to systematically extract meaning from subtle human actions and emotions. However, traditional video coding methods are slow. We have created Glance, a tool that uses the crowd to let researchers rapidly query, sample, and analyze large video datasets for behavioral events that are hard to detect automatically. In this abstract, we discuss how Glance can quickly code video of users with special needs interacting with a website, coding for whether the website conforms to accessibility guidelines, in order to evaluate how accessible the website is and where potential problems lie.
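The workflow the abstract describes, in which multiple crowd workers code each video segment for a behavioral event and their judgments are aggregated, can be sketched as follows. This is an illustrative reconstruction, not Glance's actual API: the function names, the per-segment label lists, and the majority-vote aggregation are all assumptions introduced here for clarity.

```python
from collections import Counter

def majority_label(labels):
    """Return the most common label among crowd workers for one segment."""
    return Counter(labels).most_common(1)[0][0]

def code_video(segments):
    """segments: dict mapping segment id -> list of worker labels (bools,
    True = guideline violation observed). Returns the segment ids the
    crowd majority flagged as violations."""
    return [seg for seg, labels in segments.items()
            if majority_label(labels)]

# Hypothetical example: three workers each coded 10-second segments for
# one accessibility criterion (e.g., a form field missing a visible label).
segments = {
    "00:00-00:10": [True, True, False],    # majority reports a violation
    "00:10-00:20": [False, False, False],  # no violation reported
    "00:20-00:30": [True, True, True],     # unanimous violation
}
print(code_video(segments))  # -> ['00:00-00:10', '00:20-00:30']
```

Majority voting is only one possible aggregation; weighting workers by agreement history or requiring unanimity are common alternatives when coding subtle events.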

Author-supplied keywords

  • accessibility
  • crowdsourcing
  • data analysis
  • video


Authors

  • Mitchell Gordon
