Web accessibility evaluation with the crowd: Using Glance to rapidly code user testing video


Abstract

Evaluating the results of user accessibility testing on the web can take a significant amount of time, training, and effort. Some of this work can be offloaded to others by coding video data from user tests to systematically extract meaning from subtle human actions and emotions. However, traditional video coding methods can take a considerable amount of time. We have created Glance, a tool that uses the crowd to allow researchers to rapidly query, sample, and analyze large video datasets for behavioral events that are hard to detect automatically. In this abstract, we discuss how Glance can be used to quickly code video of users with special needs interacting with a website, coding for whether the website conforms to accessibility guidelines, in order to evaluate how accessible the website is and where potential problems lie.

Citation (APA)

Gordon, M. (2014). Web accessibility evaluation with the crowd: Using Glance to rapidly code user testing video. In ASSETS '14 - Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility (pp. 339–340). Association for Computing Machinery. https://doi.org/10.1145/2661334.2661412
